2025-07-02 13:05:22,121 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG:   host = dmidlkprdls04.svr.luc.edu/192.168.158.4
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 3.1.1.7.3.1.0-197
STARTUP_MSG:   classpath = /var/run/cloudera-scm-agent/process/163-hdfs-DATANODE:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/aws-java-sdk-bundle-1.12.720.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-hdfs-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-plugin-classloader-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-yarn-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/azure-data-lake-store-sdk-2.3.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang-2.6.jar:/opt/cloudera
/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib
/hadoop/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jline-3.22.0.jar:/opt
/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jul-to-slf4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/logredactor-2.0.16.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.6037
1244/lib/hadoop/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-reload4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-api-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.
1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-socks-4.1.100.Final.j
ar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/wildfly-openssl-2.1.4.ClouderaFinal.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper-jute.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.
cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//ozone-filesystem-hadoop3-1.3.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-thrift.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-scala_2.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-protobuf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-jackson.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hado
op/.//parquet-hadoop.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-generator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-format-structures.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-encoding.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-column.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/avro-1.11.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib
/hadoop-hdfs/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-
7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs
/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jline-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/json-simple-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-util-2.0.3.jar:/opt/cloudera/pa
rcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/leveldbjni-cldr-1.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hd
fs/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/
lib/hadoop-hdfs/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-jute-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-tests.jar:/opt
/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//asm-5.0.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjweaver-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop
-mapreduce/.//azure-storage-7.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//checker-compat-qual-2.5.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-slf4j-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-system-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//google-extensions-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-keyvault-core-1.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//accessors-smart-2.4.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parc
els/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels
/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen.
jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ojalgo-43.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//kafka-clients-2.8.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-core-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-abfs-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//forbiddenapis-3.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-intg-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-api-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//zstd-jni-1.4.9-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-i18n.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-s3-lib-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//javax.activation-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//bundle-2.23.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//json-smart-2.4.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-util-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-shell.jar:/opt/cloudera/
parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-cloud-bindings.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-s3-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjrt-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/swagger-annotations-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/objenesis-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-client-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.activation-api-1.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-dataformat-yaml-2.9.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcutil-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcprov-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcpkix-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/HikariCP-java7-2.4.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/snakeyaml-2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jsonschema2pojo-core-1.0.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/joda-time-2.10.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jna-5.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-guice-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/javax.inject-1.jar:/opt/clouder
a/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-module-jaxb-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-json-provider-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-base-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-servlet-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/fst-2.50.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/ehcache-3.3.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/dnsjava-2.1.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/codemodel-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p
0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-3.1.1.7.3.1.0
-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager-1.0.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-3.1.1.7.3.1.0-197.jar:/opt/cloudera/cm/lib/plugins/event-publish-7.13.1-shaded.jar:/opt/cloudera/cm/lib/plugins/tt-instrumentation-7.13.1.jar STARTUP_MSG: build = git@github.infra.cloudera.com:CDH/hadoop.git -r 31a42fb39494f541ffae15c3c61185deeeacca86; compiled by 'jenkins' on 2024-12-04T01:09Z STARTUP_MSG: java = 1.8.0_432 ************************************************************/ 2025-07-02 13:05:22,206 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT] 2025-07-02 13:05:22,620 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d1/dfs/dn 2025-07-02 13:05:22,626 INFO 
org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d2/dfs/dn 2025-07-02 13:05:22,626 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d3/dfs/dn 2025-07-02 13:05:22,627 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d4/dfs/dn 2025-07-02 13:05:22,788 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties 2025-07-02 13:05:22,912 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s). 2025-07-02 13:05:22,913 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started 2025-07-02 13:05:23,337 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling 2025-07-02 13:05:23,363 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576 2025-07-02 13:05:23,371 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled. 2025-07-02 13:05:23,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is dmidlkprdls04.svr.luc.edu 2025-07-02 13:05:23,373 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. 
Disabling file IO profiling 2025-07-02 13:05:23,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 4294967296 2025-07-02 13:05:23,407 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /192.168.158.4:9866 2025-07-02 13:05:23,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s 2025-07-02 13:05:23,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50 2025-07-02 13:05:23,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s 2025-07-02 13:05:23,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50 2025-07-02 13:05:23,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Listening on UNIX domain socket: /var/run/hdfs-sockets/dn 2025-07-02 13:05:23,462 INFO org.eclipse.jetty.util.log: Logging initialized @2533ms to org.eclipse.jetty.util.log.Slf4jLog 2025-07-02 13:05:23,581 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret 2025-07-02 13:05:23,590 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined 2025-07-02 13:05:23,598 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter) 2025-07-02 13:05:23,601 INFO org.apache.hadoop.security.HttpCrossOriginFilterInitializer: CORS filter not enabled. 
Please set hadoop.http.cross-origin.enabled to 'true' to enable it 2025-07-02 13:05:23,602 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context datanode 2025-07-02 13:05:23,602 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context logs 2025-07-02 13:05:23,602 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context static 2025-07-02 13:05:23,641 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 45477 2025-07-02 13:05:23,642 INFO org.eclipse.jetty.server.Server: jetty-9.4.54.v20240208; built: 2024-02-08T19:42:39.027Z; git: cef3fbd6d736a21e7d541a5db490381d95a2047d; jvm 1.8.0_432-b06 2025-07-02 13:05:23,694 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0 2025-07-02 13:05:23,694 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults 2025-07-02 13:05:23,697 INFO org.eclipse.jetty.server.session: node0 Scavenging every 600000ms 2025-07-02 13:05:23,732 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. 
Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret 2025-07-02 13:05:23,738 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@6cea706c{logs,/logs,file:///var/log/hadoop-hdfs/,AVAILABLE} 2025-07-02 13:05:23,740 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@21ec5d87{static,/static,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/static/,AVAILABLE} 2025-07-02 13:05:23,888 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@4d157787{datanode,/,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode/,AVAILABLE}{file:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode} 2025-07-02 13:05:23,902 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@51f49060{HTTP/1.1, (http/1.1)}{localhost:45477} 2025-07-02 13:05:23,902 INFO org.eclipse.jetty.server.Server: Started @2973ms 2025-07-02 13:05:24,189 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /192.168.158.4:9864 2025-07-02 13:05:24,201 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor 2025-07-02 13:05:24,202 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hdfs 2025-07-02 13:05:24,202 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup 2025-07-02 13:05:24,271 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler 2025-07-02 13:05:24,293 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867 2025-07-02 13:05:24,343 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /192.168.158.4:9867 2025-07-02 13:05:24,379 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null 2025-07-02 13:05:24,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices: 2025-07-02 13:05:24,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 starting to offer service 2025-07-02 13:05:24,412 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting 2025-07-02 13:05:24,412 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting 2025-07-02 13:05:25,606 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-02 13:05:26,608 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-02 13:05:26,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode during handshakeBlock pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-02 13:05:26,751 INFO org.apache.hadoop.hdfs.server.common.Storage: Using 4 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=4, dataDirs=4) 2025-07-02 13:05:26,759 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d1/dfs/dn/in_use.lock acquired by nodename 228336@dmidlkprdls04.svr.luc.edu 2025-07-02 13:05:26,760 INFO org.apache.hadoop.hdfs.server.common.Storage: Storage directory with location [DISK]file:/hdfs/d1/dfs/dn is not formatted for namespace 1949837783. Formatting... 
2025-07-02 13:05:26,762 INFO org.apache.hadoop.hdfs.server.common.Storage: Generated new storageID DS-4d75b0ca-10b9-4609-8e68-77a4f5414078 for directory /hdfs/d1/dfs/dn 2025-07-02 13:05:26,771 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d2/dfs/dn/in_use.lock acquired by nodename 228336@dmidlkprdls04.svr.luc.edu 2025-07-02 13:05:26,771 INFO org.apache.hadoop.hdfs.server.common.Storage: Storage directory with location [DISK]file:/hdfs/d2/dfs/dn is not formatted for namespace 1949837783. Formatting... 2025-07-02 13:05:26,772 INFO org.apache.hadoop.hdfs.server.common.Storage: Generated new storageID DS-9da8ed8d-5505-4aa6-9202-3a757c515004 for directory /hdfs/d2/dfs/dn 2025-07-02 13:05:26,774 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d3/dfs/dn/in_use.lock acquired by nodename 228336@dmidlkprdls04.svr.luc.edu 2025-07-02 13:05:26,775 INFO org.apache.hadoop.hdfs.server.common.Storage: Storage directory with location [DISK]file:/hdfs/d3/dfs/dn is not formatted for namespace 1949837783. Formatting... 2025-07-02 13:05:26,775 INFO org.apache.hadoop.hdfs.server.common.Storage: Generated new storageID DS-0cdbc609-0d97-4a42-88e7-a1d84207e95a for directory /hdfs/d3/dfs/dn 2025-07-02 13:05:26,778 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d4/dfs/dn/in_use.lock acquired by nodename 228336@dmidlkprdls04.svr.luc.edu 2025-07-02 13:05:26,778 INFO org.apache.hadoop.hdfs.server.common.Storage: Storage directory with location [DISK]file:/hdfs/d4/dfs/dn is not formatted for namespace 1949837783. Formatting... 
2025-07-02 13:05:26,779 INFO org.apache.hadoop.hdfs.server.common.Storage: Generated new storageID DS-66da5157-6664-49c0-94d4-de2a951e5d1e for directory /hdfs/d4/dfs/dn 2025-07-02 13:05:26,814 INFO org.apache.hadoop.hdfs.server.common.Storage: Analyzing storage directories for bpid BP-58625395-192.168.158.1-1751479517048 2025-07-02 13:05:26,814 INFO org.apache.hadoop.hdfs.server.common.Storage: Locking is disabled for /hdfs/d1/dfs/dn/current/BP-58625395-192.168.158.1-1751479517048 2025-07-02 13:05:26,815 INFO org.apache.hadoop.hdfs.server.common.Storage: Block pool storage directory for location [DISK]file:/hdfs/d1/dfs/dn and block pool id BP-58625395-192.168.158.1-1751479517048 is not formatted. Formatting ... 2025-07-02 13:05:26,816 INFO org.apache.hadoop.hdfs.server.common.Storage: Formatting block pool BP-58625395-192.168.158.1-1751479517048 directory /hdfs/d1/dfs/dn/current/BP-58625395-192.168.158.1-1751479517048/current 2025-07-02 13:05:26,840 INFO org.apache.hadoop.hdfs.server.common.Storage: Analyzing storage directories for bpid BP-58625395-192.168.158.1-1751479517048 2025-07-02 13:05:26,840 INFO org.apache.hadoop.hdfs.server.common.Storage: Locking is disabled for /hdfs/d2/dfs/dn/current/BP-58625395-192.168.158.1-1751479517048 2025-07-02 13:05:26,840 INFO org.apache.hadoop.hdfs.server.common.Storage: Block pool storage directory for location [DISK]file:/hdfs/d2/dfs/dn and block pool id BP-58625395-192.168.158.1-1751479517048 is not formatted. Formatting ... 
2025-07-02 13:05:26,840 INFO org.apache.hadoop.hdfs.server.common.Storage: Formatting block pool BP-58625395-192.168.158.1-1751479517048 directory /hdfs/d2/dfs/dn/current/BP-58625395-192.168.158.1-1751479517048/current 2025-07-02 13:05:26,862 INFO org.apache.hadoop.hdfs.server.common.Storage: Analyzing storage directories for bpid BP-58625395-192.168.158.1-1751479517048 2025-07-02 13:05:26,862 INFO org.apache.hadoop.hdfs.server.common.Storage: Locking is disabled for /hdfs/d3/dfs/dn/current/BP-58625395-192.168.158.1-1751479517048 2025-07-02 13:05:26,862 INFO org.apache.hadoop.hdfs.server.common.Storage: Block pool storage directory for location [DISK]file:/hdfs/d3/dfs/dn and block pool id BP-58625395-192.168.158.1-1751479517048 is not formatted. Formatting ... 2025-07-02 13:05:26,862 INFO org.apache.hadoop.hdfs.server.common.Storage: Formatting block pool BP-58625395-192.168.158.1-1751479517048 directory /hdfs/d3/dfs/dn/current/BP-58625395-192.168.158.1-1751479517048/current 2025-07-02 13:05:26,882 INFO org.apache.hadoop.hdfs.server.common.Storage: Analyzing storage directories for bpid BP-58625395-192.168.158.1-1751479517048 2025-07-02 13:05:26,883 INFO org.apache.hadoop.hdfs.server.common.Storage: Locking is disabled for /hdfs/d4/dfs/dn/current/BP-58625395-192.168.158.1-1751479517048 2025-07-02 13:05:26,883 INFO org.apache.hadoop.hdfs.server.common.Storage: Block pool storage directory for location [DISK]file:/hdfs/d4/dfs/dn and block pool id BP-58625395-192.168.158.1-1751479517048 is not formatted. Formatting ... 
2025-07-02 13:05:26,883 INFO org.apache.hadoop.hdfs.server.common.Storage: Formatting block pool BP-58625395-192.168.158.1-1751479517048 directory /hdfs/d4/dfs/dn/current/BP-58625395-192.168.158.1-1751479517048/current 2025-07-02 13:05:26,885 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Setting up storage: nsid=1949837783;bpid=BP-58625395-192.168.158.1-1751479517048;lv=-57;nsInfo=lv=-64;cid=cluster72;nsid=1949837783;c=1751479517048;bpid=BP-58625395-192.168.158.1-1751479517048;dnuuid=null 2025-07-02 13:05:26,888 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Generated and persisted new Datanode UUID 92aa408f-509e-4bad-98e0-6e83f0a34a12 2025-07-02 13:05:26,903 INFO org.apache.hadoop.conf.Configuration.deprecation: No unit for dfs.datanode.lock-reporting-threshold-ms(300) assuming MILLISECONDS 2025-07-02 13:05:26,906 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: The datanode lock is a read write lock 2025-07-02 13:05:26,955 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added new volume: DS-4d75b0ca-10b9-4609-8e68-77a4f5414078 2025-07-02 13:05:26,955 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added volume - [DISK]file:/hdfs/d1/dfs/dn, StorageType: DISK 2025-07-02 13:05:26,957 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added new volume: DS-9da8ed8d-5505-4aa6-9202-3a757c515004 2025-07-02 13:05:26,957 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added volume - [DISK]file:/hdfs/d2/dfs/dn, StorageType: DISK 2025-07-02 13:05:26,959 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added new volume: DS-0cdbc609-0d97-4a42-88e7-a1d84207e95a 2025-07-02 13:05:26,959 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added volume - [DISK]file:/hdfs/d3/dfs/dn, StorageType: DISK 2025-07-02 13:05:26,962 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: 
Added new volume: DS-66da5157-6664-49c0-94d4-de2a951e5d1e 2025-07-02 13:05:26,962 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added volume - [DISK]file:/hdfs/d4/dfs/dn, StorageType: DISK 2025-07-02 13:05:26,972 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Registered FSDatasetState MBean 2025-07-02 13:05:26,979 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding block pool BP-58625395-192.168.158.1-1751479517048 2025-07-02 13:05:26,980 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Scanning block pool BP-58625395-192.168.158.1-1751479517048 on volume /hdfs/d1/dfs/dn... 2025-07-02 13:05:26,982 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Scanning block pool BP-58625395-192.168.158.1-1751479517048 on volume /hdfs/d4/dfs/dn... 2025-07-02 13:05:26,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Scanning block pool BP-58625395-192.168.158.1-1751479517048 on volume /hdfs/d3/dfs/dn... 2025-07-02 13:05:26,980 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Scanning block pool BP-58625395-192.168.158.1-1751479517048 on volume /hdfs/d2/dfs/dn... 
2025-07-02 13:05:27,060 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time taken to scan block pool BP-58625395-192.168.158.1-1751479517048 on /hdfs/d4/dfs/dn: 78ms
2025-07-02 13:05:27,062 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time taken to scan block pool BP-58625395-192.168.158.1-1751479517048 on /hdfs/d2/dfs/dn: 78ms
2025-07-02 13:05:27,062 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time taken to scan block pool BP-58625395-192.168.158.1-1751479517048 on /hdfs/d3/dfs/dn: 79ms
2025-07-02 13:05:27,063 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time taken to scan block pool BP-58625395-192.168.158.1-1751479517048 on /hdfs/d1/dfs/dn: 81ms
2025-07-02 13:05:27,064 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Total time to scan all replicas for block pool BP-58625395-192.168.158.1-1751479517048: 85ms
2025-07-02 13:05:27,067 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding replicas to map for block pool BP-58625395-192.168.158.1-1751479517048 on volume /hdfs/d1/dfs/dn...
2025-07-02 13:05:27,067 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding replicas to map for block pool BP-58625395-192.168.158.1-1751479517048 on volume /hdfs/d2/dfs/dn...
2025-07-02 13:05:27,067 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.BlockPoolSlice: Replica Cache file: /hdfs/d1/dfs/dn/current/BP-58625395-192.168.158.1-1751479517048/current/replicas doesn't exist
2025-07-02 13:05:27,067 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding replicas to map for block pool BP-58625395-192.168.158.1-1751479517048 on volume /hdfs/d3/dfs/dn...
2025-07-02 13:05:27,067 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.BlockPoolSlice: Replica Cache file: /hdfs/d2/dfs/dn/current/BP-58625395-192.168.158.1-1751479517048/current/replicas doesn't exist
2025-07-02 13:05:27,068 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.BlockPoolSlice: Replica Cache file: /hdfs/d3/dfs/dn/current/BP-58625395-192.168.158.1-1751479517048/current/replicas doesn't exist
2025-07-02 13:05:27,068 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding replicas to map for block pool BP-58625395-192.168.158.1-1751479517048 on volume /hdfs/d4/dfs/dn...
2025-07-02 13:05:27,069 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.BlockPoolSlice: Replica Cache file: /hdfs/d4/dfs/dn/current/BP-58625395-192.168.158.1-1751479517048/current/replicas doesn't exist
2025-07-02 13:05:27,071 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time to add replicas to map for block pool BP-58625395-192.168.158.1-1751479517048 on volume /hdfs/d1/dfs/dn: 4ms
2025-07-02 13:05:27,071 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time to add replicas to map for block pool BP-58625395-192.168.158.1-1751479517048 on volume /hdfs/d2/dfs/dn: 4ms
2025-07-02 13:05:27,072 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time to add replicas to map for block pool BP-58625395-192.168.158.1-1751479517048 on volume /hdfs/d4/dfs/dn: 3ms
2025-07-02 13:05:27,072 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time to add replicas to map for block pool BP-58625395-192.168.158.1-1751479517048 on volume /hdfs/d3/dfs/dn: 4ms
2025-07-02 13:05:27,074 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Total time to add all replicas to map for block pool BP-58625395-192.168.158.1-1751479517048: 7ms
2025-07-02 13:05:27,074 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for /hdfs/d1/dfs/dn
2025-07-02 13:05:27,087 INFO org.apache.hadoop.hdfs.server.datanode.checker.DatasetVolumeChecker: Scheduled health check for volume /hdfs/d1/dfs/dn
2025-07-02 13:05:27,090 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for /hdfs/d2/dfs/dn
2025-07-02 13:05:27,090 INFO org.apache.hadoop.hdfs.server.datanode.checker.DatasetVolumeChecker: Scheduled health check for volume /hdfs/d2/dfs/dn
2025-07-02 13:05:27,090 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for /hdfs/d3/dfs/dn
2025-07-02 13:05:27,091 INFO org.apache.hadoop.hdfs.server.datanode.checker.DatasetVolumeChecker: Scheduled health check for volume /hdfs/d3/dfs/dn
2025-07-02 13:05:27,091 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for /hdfs/d4/dfs/dn
2025-07-02 13:05:27,092 INFO org.apache.hadoop.hdfs.server.datanode.checker.DatasetVolumeChecker: Scheduled health check for volume /hdfs/d4/dfs/dn
2025-07-02 13:05:27,095 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: Now scanning bpid BP-58625395-192.168.158.1-1751479517048 on volume /hdfs/d3/dfs/dn
2025-07-02 13:05:27,096 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: Now scanning bpid BP-58625395-192.168.158.1-1751479517048 on volume /hdfs/d4/dfs/dn
2025-07-02 13:05:27,097 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: Now scanning bpid BP-58625395-192.168.158.1-1751479517048 on volume /hdfs/d2/dfs/dn
2025-07-02 13:05:27,096 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: Now scanning bpid BP-58625395-192.168.158.1-1751479517048 on volume /hdfs/d1/dfs/dn
2025-07-02 13:05:27,103 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d2/dfs/dn, DS-9da8ed8d-5505-4aa6-9202-3a757c515004): finished scanning block pool BP-58625395-192.168.158.1-1751479517048
2025-07-02 13:05:27,104 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d3/dfs/dn, DS-0cdbc609-0d97-4a42-88e7-a1d84207e95a): finished scanning block pool BP-58625395-192.168.158.1-1751479517048
2025-07-02 13:05:27,104 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d4/dfs/dn, DS-66da5157-6664-49c0-94d4-de2a951e5d1e): finished scanning block pool BP-58625395-192.168.158.1-1751479517048
2025-07-02 13:05:27,106 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d1/dfs/dn, DS-4d75b0ca-10b9-4609-8e68-77a4f5414078): finished scanning block pool BP-58625395-192.168.158.1-1751479517048
2025-07-02 13:05:27,110 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: Periodic Directory Tree Verification scan starting at 7/2/25 6:11 PM with interval of 21600000ms
2025-07-02 13:05:27,123 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool BP-58625395-192.168.158.1-1751479517048 (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 beginning handshake with NN
2025-07-02 13:05:27,131 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d1/dfs/dn, DS-4d75b0ca-10b9-4609-8e68-77a4f5414078): no suitable block pools found to scan. Waiting 1814399964 ms.
2025-07-02 13:05:27,131 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d2/dfs/dn, DS-9da8ed8d-5505-4aa6-9202-3a757c515004): no suitable block pools found to scan. Waiting 1814399964 ms.
2025-07-02 13:05:27,131 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d3/dfs/dn, DS-0cdbc609-0d97-4a42-88e7-a1d84207e95a): no suitable block pools found to scan. Waiting 1814399964 ms.
2025-07-02 13:05:27,131 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d4/dfs/dn, DS-66da5157-6664-49c0-94d4-de2a951e5d1e): no suitable block pools found to scan. Waiting 1814399964 ms.
2025-07-02 13:05:27,241 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool Block pool BP-58625395-192.168.158.1-1751479517048 (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 successfully registered with NN
2025-07-02 13:05:27,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: For namenode dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
2025-07-02 13:05:27,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting IBR Task Handler.
2025-07-02 13:05:27,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0xa0ec881d8e5c3389, containing 4 storage report(s), of which we sent 4. The reports had 0 total blocks and used 1 RPC(s). This took 7 msec to generate and 117 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-02 13:05:27,506 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-58625395-192.168.158.1-1751479517048
2025-07-02 14:14:03,468 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0xa0ec881d8e5c338a, containing 4 storage report(s), of which we sent 4. The reports had 0 total blocks and used 1 RPC(s). This took 0 msec to generate and 5 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-02 14:14:03,468 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-58625395-192.168.158.1-1751479517048
2025-07-02 14:41:04,563 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: RECEIVED SIGNAL 15: SIGTERM
2025-07-02 14:41:04,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at dmidlkprdls04.svr.luc.edu/192.168.158.4
************************************************************/
2025-07-02 15:19:07,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG: host = dmidlkprdls04.svr.luc.edu/192.168.158.4
STARTUP_MSG: args = []
STARTUP_MSG: version = 3.1.1.7.3.1.0-197
STARTUP_MSG: classpath = /var/run/cloudera-scm-agent/process/227-hdfs-DATANODE:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/aws-java-sdk-bundle-1.12.720.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-hdfs-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-plugin-classloader-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-yarn-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/azure-data-lake-store-sdk-2.3.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.
cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-annotations-2.12.7.jar:/opt/cloud
era/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.6037124
4/lib/hadoop/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jline-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jul-to-slf4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-config-2
.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/logredactor-2.0.16.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-
reload4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-api-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/l
ib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/wildfly-openssl-2.1.4.ClouderaFinal.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper-jute.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-shaded.jar:/opt/clouder
a/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//ozone-filesystem-hadoop3-1.3.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1
.p0.60371244/lib/hadoop/.//parquet-thrift.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-scala_2.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-protobuf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-jackson.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-generator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-format-structures.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-encoding.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-column.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/avro-1.11.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/opt/cloudera/par
cels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpcore-4.4.13
.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/
lib/hadoop-hdfs/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jline-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/json-simple-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3
.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/leveldbjni-cldr-1.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoo
p-hdfs/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/par
cels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zook
eeper-jute-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.6
0371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//asm-5.0.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjweaver-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-storage-7.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//checker-compat-qual-2.5.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-slf4j-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-system-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//google-extensions-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-keyvault-core-1.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//accessors-smart-2.4.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin-3.1.1.
7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-c
lient-hs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack-3.1.1
.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ojalgo-43.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//kafka-clients-2.8.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-core-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-abfs-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//forbiddenapis-3.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-intg-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-api-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//zstd-jni-1.4.9-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-i18n.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//r
anger-raz-s3-lib-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//javax.activation-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//bundle-2.23.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//json-smart-2.4.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-util-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-shell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-cloud-bindings.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-s3-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjrt-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/swagger-annotations-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/objenesis-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-client-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.activation-api-1.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-dataformat-yaml-2.9.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcutil-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcprov-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcpkix-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/HikariCP-java7-2.4.12.jar:/opt/cloudera/parcels/CDH-7.3.1-
1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/snakeyaml-2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jsonschema2pojo-core-1.0.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/joda-time-2.10.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jna-5.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-guice-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-module-jaxb-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-json-provider-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-base-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-servlet-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/fst-2.50.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/ehcache-3.3.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/dnsjava-2.1.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/codemodel-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera/p
arcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/li
b/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager-1.0.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-3.1.1.7.3.1.0-197.jar:/opt/cloudera/cm/lib/plugins/event-publish-7.13.1-shaded.jar:/opt/cloudera/cm/lib/plugins/tt-instrumentation-7.13.1.jar 
STARTUP_MSG: build = git@github.infra.cloudera.com:CDH/hadoop.git -r 31a42fb39494f541ffae15c3c61185deeeacca86; compiled by 'jenkins' on 2024-12-04T01:09Z
STARTUP_MSG: java = 1.8.0_432
************************************************************/
2025-07-02 15:19:08,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2025-07-02 15:19:08,380 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d1/dfs/dn
2025-07-02 15:19:08,386 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d2/dfs/dn
2025-07-02 15:19:08,386 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d3/dfs/dn
2025-07-02 15:19:08,387 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d4/dfs/dn
2025-07-02 15:19:08,532 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties
2025-07-02 15:19:08,635 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2025-07-02 15:19:08,635 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2025-07-02 15:19:08,914 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-02 15:19:08,937 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576
2025-07-02 15:19:08,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled.
2025-07-02 15:19:08,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is dmidlkprdls04.svr.luc.edu
2025-07-02 15:19:08,946 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-02 15:19:08,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 4294967296
2025-07-02 15:19:08,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /192.168.158.4:9866
2025-07-02 15:19:08,979 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-02 15:19:08,979 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-02 15:19:08,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-02 15:19:08,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-02 15:19:08,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Listening on UNIX domain socket: /var/run/hdfs-sockets/dn
2025-07-02 15:19:09,149 INFO org.eclipse.jetty.util.log: Logging initialized @2368ms to org.eclipse.jetty.util.log.Slf4jLog
2025-07-02 15:19:09,312 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-02 15:19:09,322 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined
2025-07-02 15:19:09,333 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2025-07-02 15:19:09,336 INFO org.apache.hadoop.security.HttpCrossOriginFilterInitializer: CORS filter not enabled. Please set hadoop.http.cross-origin.enabled to 'true' to enable it
2025-07-02 15:19:09,338 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context datanode
2025-07-02 15:19:09,338 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context logs
2025-07-02 15:19:09,338 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context static
2025-07-02 15:19:09,384 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 39851
2025-07-02 15:19:09,386 INFO org.eclipse.jetty.server.Server: jetty-9.4.54.v20240208; built: 2024-02-08T19:42:39.027Z; git: cef3fbd6d736a21e7d541a5db490381d95a2047d; jvm 1.8.0_432-b06
2025-07-02 15:19:09,440 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0
2025-07-02 15:19:09,440 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults
2025-07-02 15:19:09,444 INFO org.eclipse.jetty.server.session: node0 Scavenging every 660000ms
2025-07-02 15:19:09,471 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-02 15:19:09,477 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@48c40605{logs,/logs,file:///var/log/hadoop-hdfs/,AVAILABLE}
2025-07-02 15:19:09,478 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@6cea706c{static,/static,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/static/,AVAILABLE}
2025-07-02 15:19:09,591 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@2c444798{datanode,/,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode/,AVAILABLE}{file:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode}
2025-07-02 15:19:09,614 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@532721fd{HTTP/1.1, (http/1.1)}{localhost:39851}
2025-07-02 15:19:09,614 INFO org.eclipse.jetty.server.Server: Started @2833ms
2025-07-02 15:19:09,926 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /192.168.158.4:9864
2025-07-02 15:19:09,937 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor
2025-07-02 15:19:09,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hdfs
2025-07-02 15:19:09,939 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
2025-07-02 15:19:10,008 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
2025-07-02 15:19:10,030 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867
2025-07-02 15:19:10,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /192.168.158.4:9867
2025-07-02 15:19:10,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null
2025-07-02 15:19:10,139 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices:
2025-07-02 15:19:10,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 starting to offer service
2025-07-02 15:19:10,164 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2025-07-02 15:19:10,164 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting
2025-07-02 15:19:11,356 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-02 15:19:12,357 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-02 15:19:12,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode during handshake Block pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-02 15:19:12,704 INFO org.apache.hadoop.hdfs.server.common.Storage: Using 4 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=4, dataDirs=4)
2025-07-02 15:19:12,711 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d1/dfs/dn/in_use.lock acquired by nodename 256878@dmidlkprdls04.svr.luc.edu
2025-07-02 15:19:12,714 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d1/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d1/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:19:12,719 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d2/dfs/dn/in_use.lock acquired by nodename 256878@dmidlkprdls04.svr.luc.edu
2025-07-02 15:19:12,719 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d2/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d2/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:19:12,720 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d3/dfs/dn/in_use.lock acquired by nodename 256878@dmidlkprdls04.svr.luc.edu
2025-07-02 15:19:12,721 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d3/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d3/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:19:12,722 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d4/dfs/dn/in_use.lock acquired by nodename 256878@dmidlkprdls04.svr.luc.edu
2025-07-02 15:19:12,722 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d4/dfs/dn java.io.IOException: Incompatible clusterIDs in /hdfs/d4/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72 at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407) at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387) at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559) at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753) at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865) at java.lang.Thread.run(Thread.java:750) 2025-07-02 15:19:12,725 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Initialization failed for Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Exiting. java.io.IOException: All specified directories have failed to load. 
at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:560) at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753) at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865) at java.lang.Thread.run(Thread.java:750) 2025-07-02 15:19:12,726 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Ending block pool service for: Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-02 15:19:12,730 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Removed Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) 2025-07-02 15:19:14,731 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Exiting Datanode 2025-07-02 15:19:14,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG: /************************************************************ SHUTDOWN_MSG: Shutting down DataNode at dmidlkprdls04.svr.luc.edu/192.168.158.4 ************************************************************/ 2025-07-02 15:19:19,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG: /************************************************************ STARTUP_MSG: Starting DataNode STARTUP_MSG: host = dmidlkprdls04.svr.luc.edu/192.168.158.4 STARTUP_MSG: args = [] STARTUP_MSG: version = 3.1.1.7.3.1.0-197 STARTUP_MSG: classpath = 
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-jute-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.
3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//asm-5.0.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjweaver-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-storage-7.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//checker-compat-qual-2.5.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-slf4j-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.
//flogger-system-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//google-extensions-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-keyvault-core-1.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//accessors-smart-2.4.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-
client-jobclient.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming-3.1.1.7.3.1.0-197.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ojalgo-43.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//kafka-clients-2.8.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-core-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-abfs-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//forbiddenapis-3.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-intg-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-api-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//zstd-jni-1.4.9-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-i18n.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-s3-lib-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//javax.activation-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//bundle-2.23.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//json-smart-2.4.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-util-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-shell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-cloud-bindings.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-s3-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjrt-1.9.6.jar:/opt/cloudera/parcels/
CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/swagger-annotations-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/objenesis-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-client-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.activation-api-1.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-dataformat-yaml-2.9.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcutil-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcprov-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcpkix-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/HikariCP-java7-2.4.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/snakeyaml-2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jsonschema2pojo-core-1.0.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/joda-time-2.10.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jna-5.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-guice-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-module-jaxb-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-json-provider-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-base-2.12.7.jar:/opt/clo
udera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-servlet-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/fst-2.50.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/ehcache-3.3.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/dnsjava-2.1.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/codemodel-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371
244/lib/hadoop-yarn/.//hadoop-yarn-server-router.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop
-yarn/.//hadoop-yarn-server-sharedcachemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager-1.0.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-3.1.1.7.3.1.0-197.jar:/opt/cloudera/cm/lib/plugins/event-publish-7.13.1-shaded.jar:/opt/cloudera/cm/lib/plugins/tt-instrumentation-7.13.1.jar
STARTUP_MSG: build = git@github.infra.cloudera.com:CDH/hadoop.git -r 31a42fb39494f541ffae15c3c61185deeeacca86; compiled by 'jenkins' on 2024-12-04T01:09Z
STARTUP_MSG: java = 1.8.0_432
************************************************************/
2025-07-02 15:19:19,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2025-07-02 15:19:19,498 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d1/dfs/dn
2025-07-02 15:19:19,504 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d2/dfs/dn
2025-07-02 15:19:19,505 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d3/dfs/dn
2025-07-02 15:19:19,505 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d4/dfs/dn
2025-07-02 15:19:19,676 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties
2025-07-02 15:19:19,788 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2025-07-02 15:19:19,788 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2025-07-02 15:19:20,068 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-02 15:19:20,089 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576
2025-07-02 15:19:20,096 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled.
2025-07-02 15:19:20,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is dmidlkprdls04.svr.luc.edu
2025-07-02 15:19:20,097 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-02 15:19:20,104 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 4294967296
2025-07-02 15:19:20,129 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /192.168.158.4:9866
2025-07-02 15:19:20,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-02 15:19:20,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-02 15:19:20,135 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-02 15:19:20,135 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-02 15:19:20,135 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Listening on UNIX domain socket: /var/run/hdfs-sockets/dn
2025-07-02 15:19:20,312 INFO org.eclipse.jetty.util.log: Logging initialized @2392ms to org.eclipse.jetty.util.log.Slf4jLog
2025-07-02 15:19:20,461 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-02 15:19:20,470 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined
2025-07-02 15:19:20,481 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2025-07-02 15:19:20,485 INFO org.apache.hadoop.security.HttpCrossOriginFilterInitializer: CORS filter not enabled. Please set hadoop.http.cross-origin.enabled to 'true' to enable it
2025-07-02 15:19:20,486 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context datanode
2025-07-02 15:19:20,487 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context logs
2025-07-02 15:19:20,487 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context static
2025-07-02 15:19:20,532 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 45939
2025-07-02 15:19:20,533 INFO org.eclipse.jetty.server.Server: jetty-9.4.54.v20240208; built: 2024-02-08T19:42:39.027Z; git: cef3fbd6d736a21e7d541a5db490381d95a2047d; jvm 1.8.0_432-b06
2025-07-02 15:19:20,587 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0
2025-07-02 15:19:20,587 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults
2025-07-02 15:19:20,590 INFO org.eclipse.jetty.server.session: node0 Scavenging every 660000ms
2025-07-02 15:19:20,623 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-02 15:19:20,628 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@6cea706c{logs,/logs,file:///var/log/hadoop-hdfs/,AVAILABLE}
2025-07-02 15:19:20,630 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@21ec5d87{static,/static,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/static/,AVAILABLE}
2025-07-02 15:19:20,749 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@4d157787{datanode,/,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode/,AVAILABLE}{file:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode}
2025-07-02 15:19:20,761 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@51f49060{HTTP/1.1, (http/1.1)}{localhost:45939}
2025-07-02 15:19:20,761 INFO org.eclipse.jetty.server.Server: Started @2841ms
2025-07-02 15:19:21,043 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /192.168.158.4:9864
2025-07-02 15:19:21,055 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor
2025-07-02 15:19:21,057 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hdfs
2025-07-02 15:19:21,057 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
2025-07-02 15:19:21,130 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
2025-07-02 15:19:21,152 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867
2025-07-02 15:19:21,205 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /192.168.158.4:9867
2025-07-02 15:19:21,246 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null
2025-07-02 15:19:21,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices:
2025-07-02 15:19:21,275 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 starting to offer service
2025-07-02 15:19:21,283 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2025-07-02 15:19:21,283 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting
2025-07-02 15:19:21,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode during handshakeBlock pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-02 15:19:21,572 INFO org.apache.hadoop.hdfs.server.common.Storage: Using 4 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=4, dataDirs=4)
2025-07-02 15:19:21,580 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d1/dfs/dn/in_use.lock acquired by nodename 257451@dmidlkprdls04.svr.luc.edu
2025-07-02 15:19:21,584 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d1/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d1/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:19:21,590 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d2/dfs/dn/in_use.lock acquired by nodename 257451@dmidlkprdls04.svr.luc.edu
2025-07-02 15:19:21,590 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d2/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d2/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:19:21,591 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d3/dfs/dn/in_use.lock acquired by nodename 257451@dmidlkprdls04.svr.luc.edu
2025-07-02 15:19:21,592 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d3/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d3/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:19:21,593 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d4/dfs/dn/in_use.lock acquired by nodename 257451@dmidlkprdls04.svr.luc.edu
2025-07-02 15:19:21,593 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d4/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d4/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:19:21,598 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Initialization failed for Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Exiting.
java.io.IOException: All specified directories have failed to load.
at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:560) at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753) at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865) at java.lang.Thread.run(Thread.java:750) 2025-07-02 15:19:21,598 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Ending block pool service for: Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-02 15:19:21,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Removed Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) 2025-07-02 15:19:23,600 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Exiting Datanode 2025-07-02 15:19:23,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG: /************************************************************ SHUTDOWN_MSG: Shutting down DataNode at dmidlkprdls04.svr.luc.edu/192.168.158.4 ************************************************************/ 2025-07-02 15:19:28,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG: /************************************************************ STARTUP_MSG: Starting DataNode STARTUP_MSG: host = dmidlkprdls04.svr.luc.edu/192.168.158.4 STARTUP_MSG: args = [] STARTUP_MSG: version = 3.1.1.7.3.1.0-197 STARTUP_MSG: classpath = 
3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//asm-5.0.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjweaver-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-storage-7.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//checker-compat-qual-2.5.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-slf4j-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.
//flogger-system-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//google-extensions-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-keyvault-core-1.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//accessors-smart-2.4.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-
client-jobclient.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming-3.1.1.7.3.1.0-197.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ojalgo-43.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//kafka-clients-2.8.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-core-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-abfs-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//forbiddenapis-3.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-intg-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-api-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//zstd-jni-1.4.9-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-i18n.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-s3-lib-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//javax.activation-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//bundle-2.23.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//json-smart-2.4.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-util-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-shell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-cloud-bindings.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-s3-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjrt-1.9.6.jar:/opt/cloudera/parcels/
CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/swagger-annotations-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/objenesis-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-client-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.activation-api-1.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-dataformat-yaml-2.9.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcutil-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcprov-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcpkix-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/HikariCP-java7-2.4.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/snakeyaml-2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jsonschema2pojo-core-1.0.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/joda-time-2.10.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jna-5.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-guice-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-module-jaxb-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-json-provider-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-base-2.12.7.jar:/opt/clo
udera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-servlet-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/fst-2.50.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/ehcache-3.3.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/dnsjava-2.1.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/codemodel-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371
244/lib/hadoop-yarn/.//hadoop-yarn-server-router.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop
-yarn/.//hadoop-yarn-server-sharedcachemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager-1.0.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-3.1.1.7.3.1.0-197.jar:/opt/cloudera/cm/lib/plugins/event-publish-7.13.1-shaded.jar:/opt/cloudera/cm/lib/plugins/tt-instrumentation-7.13.1.jar STARTUP_MSG: build = git@github.infra.cloudera.com:CDH/hadoop.git -r 31a42fb39494f541ffae15c3c61185deeeacca86; compiled by 'jenkins' on 2024-12-04T01:09Z STARTUP_MSG: java = 1.8.0_432 ************************************************************/ 2025-07-02 15:19:29,059 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT] 2025-07-02 15:19:29,420 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d1/dfs/dn 2025-07-02 15:19:29,427 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d2/dfs/dn 2025-07-02 15:19:29,427 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d3/dfs/dn 2025-07-02 15:19:29,427 INFO 
org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d4/dfs/dn
2025-07-02 15:19:29,584 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties
2025-07-02 15:19:29,687 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2025-07-02 15:19:29,687 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2025-07-02 15:19:29,967 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-02 15:19:29,990 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576
2025-07-02 15:19:29,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled.
2025-07-02 15:19:29,998 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is dmidlkprdls04.svr.luc.edu
2025-07-02 15:19:29,999 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-02 15:19:30,004 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 4294967296
2025-07-02 15:19:30,139 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /192.168.158.4:9866
2025-07-02 15:19:30,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-02 15:19:30,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-02 15:19:30,149 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-02 15:19:30,150 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-02 15:19:30,150 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Listening on UNIX domain socket: /var/run/hdfs-sockets/dn
2025-07-02 15:19:30,222 INFO org.eclipse.jetty.util.log: Logging initialized @2426ms to org.eclipse.jetty.util.log.Slf4jLog
2025-07-02 15:19:30,363 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-02 15:19:30,372 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined
2025-07-02 15:19:30,381 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2025-07-02 15:19:30,384 INFO org.apache.hadoop.security.HttpCrossOriginFilterInitializer: CORS filter not enabled. Please set hadoop.http.cross-origin.enabled to 'true' to enable it
2025-07-02 15:19:30,385 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context datanode
2025-07-02 15:19:30,385 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context logs
2025-07-02 15:19:30,385 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context static
2025-07-02 15:19:30,427 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 43323
2025-07-02 15:19:30,429 INFO org.eclipse.jetty.server.Server: jetty-9.4.54.v20240208; built: 2024-02-08T19:42:39.027Z; git: cef3fbd6d736a21e7d541a5db490381d95a2047d; jvm 1.8.0_432-b06
2025-07-02 15:19:30,479 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0
2025-07-02 15:19:30,479 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults
2025-07-02 15:19:30,481 INFO org.eclipse.jetty.server.session: node0 Scavenging every 660000ms
2025-07-02 15:19:30,518 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-02 15:19:30,524 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@1b11ef33{logs,/logs,file:///var/log/hadoop-hdfs/,AVAILABLE}
2025-07-02 15:19:30,525 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@2f2bf0e2{static,/static,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/static/,AVAILABLE}
2025-07-02 15:19:30,643 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@6ebd78d1{datanode,/,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode/,AVAILABLE}{file:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode}
2025-07-02 15:19:30,653 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@7fb9f71f{HTTP/1.1, (http/1.1)}{localhost:43323}
2025-07-02 15:19:30,654 INFO org.eclipse.jetty.server.Server: Started @2858ms
2025-07-02 15:19:30,935 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /192.168.158.4:9864
2025-07-02 15:19:30,946 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor
2025-07-02 15:19:30,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hdfs
2025-07-02 15:19:30,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
2025-07-02 15:19:31,013 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
2025-07-02 15:19:31,034 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867
2025-07-02 15:19:31,081 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /192.168.158.4:9867
2025-07-02 15:19:31,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null
2025-07-02 15:19:31,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices:
2025-07-02 15:19:31,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 starting to offer service
2025-07-02 15:19:31,144 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting
2025-07-02 15:19:31,145 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2025-07-02 15:19:31,408 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode during handshakeBlock pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-02 15:19:31,413 INFO org.apache.hadoop.hdfs.server.common.Storage: Using 4 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=4, dataDirs=4)
2025-07-02 15:19:31,420 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d1/dfs/dn/in_use.lock acquired by nodename 258076@dmidlkprdls04.svr.luc.edu
2025-07-02 15:19:31,424 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d1/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d1/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:19:31,429 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d2/dfs/dn/in_use.lock acquired by nodename 258076@dmidlkprdls04.svr.luc.edu
2025-07-02 15:19:31,430 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d2/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d2/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:19:31,431 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d3/dfs/dn/in_use.lock acquired by nodename 258076@dmidlkprdls04.svr.luc.edu
2025-07-02 15:19:31,432 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d3/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d3/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:19:31,433 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d4/dfs/dn/in_use.lock acquired by nodename 258076@dmidlkprdls04.svr.luc.edu
2025-07-02 15:19:31,433 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d4/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d4/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:19:31,436 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Initialization failed for Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Exiting.
java.io.IOException: All specified directories have failed to load.
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:560)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:19:31,436 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Ending block pool service for: Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-02 15:19:31,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Removed Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12)
2025-07-02 15:19:33,438 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Exiting Datanode
2025-07-02 15:19:33,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at dmidlkprdls04.svr.luc.edu/192.168.158.4
************************************************************/
2025-07-02 15:19:39,894 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG: host = dmidlkprdls04.svr.luc.edu/192.168.158.4
STARTUP_MSG: args = []
STARTUP_MSG: version = 3.1.1.7.3.1.0-197
STARTUP_MSG: classpath = 
//flogger-system-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//google-extensions-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-keyvault-core-1.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//accessors-smart-2.4.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-
client-jobclient.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming-3.1.1.7.3.1.0-197.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ojalgo-43.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//kafka-clients-2.8.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-core-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-abfs-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//forbiddenapis-3.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-intg-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-api-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//zstd-jni-1.4.9-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-i18n.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-s3-lib-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//javax.activation-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//bundle-2.23.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//json-smart-2.4.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-util-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-shell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-cloud-bindings.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-s3-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjrt-1.9.6.jar:/opt/cloudera/parcels/
CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/swagger-annotations-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/objenesis-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-client-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.activation-api-1.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-dataformat-yaml-2.9.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcutil-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcprov-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcpkix-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/HikariCP-java7-2.4.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/snakeyaml-2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jsonschema2pojo-core-1.0.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/joda-time-2.10.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jna-5.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-guice-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-module-jaxb-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-json-provider-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-base-2.12.7.jar:/opt/clo
udera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-servlet-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/fst-2.50.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/ehcache-3.3.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/dnsjava-2.1.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/codemodel-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371
244/lib/hadoop-yarn/.//hadoop-yarn-server-router.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop
-yarn/.//hadoop-yarn-server-sharedcachemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager-1.0.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-3.1.1.7.3.1.0-197.jar:/opt/cloudera/cm/lib/plugins/event-publish-7.13.1-shaded.jar:/opt/cloudera/cm/lib/plugins/tt-instrumentation-7.13.1.jar
STARTUP_MSG: build = git@github.infra.cloudera.com:CDH/hadoop.git -r 31a42fb39494f541ffae15c3c61185deeeacca86; compiled by 'jenkins' on 2024-12-04T01:09Z
STARTUP_MSG: java = 1.8.0_432
************************************************************/
2025-07-02 15:19:39,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2025-07-02 15:19:40,350 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d1/dfs/dn
2025-07-02 15:19:40,357 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d2/dfs/dn
2025-07-02 15:19:40,357 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d3/dfs/dn
2025-07-02 15:19:40,358 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d4/dfs/dn
2025-07-02 15:19:40,519 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties
2025-07-02 15:19:40,619 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2025-07-02 15:19:40,619 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2025-07-02 15:19:40,912 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-02 15:19:41,051 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576
2025-07-02 15:19:41,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled.
2025-07-02 15:19:41,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is dmidlkprdls04.svr.luc.edu
2025-07-02 15:19:41,063 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-02 15:19:41,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 4294967296
2025-07-02 15:19:41,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /192.168.158.4:9866
2025-07-02 15:19:41,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-02 15:19:41,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-02 15:19:41,110 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-02 15:19:41,110 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-02 15:19:41,111 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Listening on UNIX domain socket: /var/run/hdfs-sockets/dn
2025-07-02 15:19:41,163 INFO org.eclipse.jetty.util.log: Logging initialized @2480ms to org.eclipse.jetty.util.log.Slf4jLog
2025-07-02 15:19:41,293 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-02 15:19:41,302 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined
2025-07-02 15:19:41,313 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2025-07-02 15:19:41,317 INFO org.apache.hadoop.security.HttpCrossOriginFilterInitializer: CORS filter not enabled. Please set hadoop.http.cross-origin.enabled to 'true' to enable it
2025-07-02 15:19:41,318 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context datanode
2025-07-02 15:19:41,318 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context logs
2025-07-02 15:19:41,318 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context static
2025-07-02 15:19:41,363 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 33517
2025-07-02 15:19:41,365 INFO org.eclipse.jetty.server.Server: jetty-9.4.54.v20240208; built: 2024-02-08T19:42:39.027Z; git: cef3fbd6d736a21e7d541a5db490381d95a2047d; jvm 1.8.0_432-b06
2025-07-02 15:19:41,423 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0
2025-07-02 15:19:41,423 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults
2025-07-02 15:19:41,427 INFO org.eclipse.jetty.server.session: node0 Scavenging every 600000ms
2025-07-02 15:19:41,459 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-02 15:19:41,464 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@1b11ef33{logs,/logs,file:///var/log/hadoop-hdfs/,AVAILABLE}
2025-07-02 15:19:41,466 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@2f2bf0e2{static,/static,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/static/,AVAILABLE}
2025-07-02 15:19:41,588 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@6ebd78d1{datanode,/,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode/,AVAILABLE}{file:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode}
2025-07-02 15:19:41,600 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@7fb9f71f{HTTP/1.1, (http/1.1)}{localhost:33517}
2025-07-02 15:19:41,601 INFO org.eclipse.jetty.server.Server: Started @2918ms
2025-07-02 15:19:41,883 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /192.168.158.4:9864
2025-07-02 15:19:41,894 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor
2025-07-02 15:19:41,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hdfs
2025-07-02 15:19:41,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
2025-07-02 15:19:41,969 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
2025-07-02 15:19:41,990 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867
2025-07-02 15:19:42,037 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /192.168.158.4:9867
2025-07-02 15:19:42,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null
2025-07-02 15:19:42,081 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices:
2025-07-02 15:19:42,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 starting to offer service
2025-07-02 15:19:42,104 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2025-07-02 15:19:42,104 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting
2025-07-02 15:19:42,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode during handshakeBlock pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-02 15:19:42,398 INFO org.apache.hadoop.hdfs.server.common.Storage: Using 4 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=4, dataDirs=4)
2025-07-02 15:19:42,406 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d1/dfs/dn/in_use.lock acquired by nodename 258587@dmidlkprdls04.svr.luc.edu
2025-07-02 15:19:42,410 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d1/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d1/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:19:42,415 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d2/dfs/dn/in_use.lock acquired by nodename 258587@dmidlkprdls04.svr.luc.edu
2025-07-02 15:19:42,415 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d2/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d2/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:19:42,416 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d3/dfs/dn/in_use.lock acquired by nodename 258587@dmidlkprdls04.svr.luc.edu
2025-07-02 15:19:42,417 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d3/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d3/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:19:42,418 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d4/dfs/dn/in_use.lock acquired by nodename 258587@dmidlkprdls04.svr.luc.edu
2025-07-02 15:19:42,418 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d4/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d4/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:19:42,422 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Initialization failed for Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Exiting.
java.io.IOException: All specified directories have failed to load.
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:560)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:19:42,422 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Ending block pool service for: Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-02 15:19:42,424 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Removed Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12)
2025-07-02 15:19:44,424 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Exiting Datanode
2025-07-02 15:19:44,431 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at dmidlkprdls04.svr.luc.edu/192.168.158.4
************************************************************/
2025-07-02 15:22:43,800 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG: host = dmidlkprdls04.svr.luc.edu/192.168.158.4
STARTUP_MSG: args = []
STARTUP_MSG: version = 3.1.1.7.3.1.0-197
STARTUP_MSG: classpath =
/var/run/cloudera-scm-agent/process/234-hdfs-DATANODE:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/aws-java-sdk-bundle-1.12.720.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-hdfs-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-plugin-classloader-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-yarn-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/azure-data-lake-store-sdk-2.3.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/ha
doop/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-api-2.2.11.jar:/opt/cloud
era/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jline-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr311-a
pi-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jul-to-slf4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/logredactor-2.0.16.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-buffer-4.1.100.Final.jar:/o
pt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-reload4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-api-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-kqueue-4.1.100.Fina
l.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/wildfly-openssl-2.1.4.ClouderaFinal.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3
.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper-jute.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//ozone-filesystem-hadoop3-1.3.0.7.3.1.0-197.
jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-thrift.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-scala_2.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-protobuf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-jackson.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-generator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-format-structures.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-encoding.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-column.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/avro-1.11.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-logging-1.1.3.j
ar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.
60371244/lib/hadoop-hdfs/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j
line-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/json-simple-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/leveldbjni-cldr-1.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/par
cels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nett
y-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-jute-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.
3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//asm-5.0.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjweaver-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-storage-7.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//checker-compat-qual-2.5.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-slf4j-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.
//flogger-system-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//google-extensions-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-keyvault-core-1.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//accessors-smart-2.4.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-
client-jobclient.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming-3.1.1.7.3.1.0-197.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ojalgo-43.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//kafka-clients-2.8.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-core-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-abfs-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//forbiddenapis-3.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-intg-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-api-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//zstd-jni-1.4.9-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-i18n.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-s3-lib-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//javax.activation-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//bundle-2.23.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//json-smart-2.4.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-util-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-shell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-cloud-bindings.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-s3-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjrt-1.9.6.jar:/opt/cloudera/parcels/
CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/swagger-annotations-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/objenesis-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-client-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.activation-api-1.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-dataformat-yaml-2.9.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcutil-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcprov-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcpkix-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/HikariCP-java7-2.4.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/snakeyaml-2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jsonschema2pojo-core-1.0.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/joda-time-2.10.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jna-5.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-guice-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-module-jaxb-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-json-provider-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-base-2.12.7.jar:/opt/clo
udera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-servlet-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/fst-2.50.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/ehcache-3.3.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/dnsjava-2.1.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/codemodel-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371
244/lib/hadoop-yarn/.//hadoop-yarn-server-router.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop
-yarn/.//hadoop-yarn-server-sharedcachemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager-1.0.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-3.1.1.7.3.1.0-197.jar:/opt/cloudera/cm/lib/plugins/event-publish-7.13.1-shaded.jar:/opt/cloudera/cm/lib/plugins/tt-instrumentation-7.13.1.jar
STARTUP_MSG: build = git@github.infra.cloudera.com:CDH/hadoop.git -r 31a42fb39494f541ffae15c3c61185deeeacca86; compiled by 'jenkins' on 2024-12-04T01:09Z
STARTUP_MSG: java = 1.8.0_432
************************************************************/
2025-07-02 15:22:43,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2025-07-02 15:22:44,261 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d1/dfs/dn
2025-07-02 15:22:44,267 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d2/dfs/dn
2025-07-02 15:22:44,268 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d3/dfs/dn
2025-07-02 15:22:44,268 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d4/dfs/dn
2025-07-02 15:22:44,431 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties
2025-07-02 15:22:44,547 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2025-07-02 15:22:44,547 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2025-07-02 15:22:44,838 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-02 15:22:44,862 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576
2025-07-02 15:22:44,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled.
2025-07-02 15:22:44,870 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is dmidlkprdls04.svr.luc.edu
2025-07-02 15:22:44,870 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-02 15:22:44,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 4294967296
2025-07-02 15:22:45,032 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /192.168.158.4:9866
2025-07-02 15:22:45,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-02 15:22:45,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-02 15:22:45,040 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-02 15:22:45,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-02 15:22:45,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Listening on UNIX domain socket: /var/run/hdfs-sockets/dn
2025-07-02 15:22:45,099 INFO org.eclipse.jetty.util.log: Logging initialized @2456ms to org.eclipse.jetty.util.log.Slf4jLog
2025-07-02 15:22:45,239 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-02 15:22:45,249 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined
2025-07-02 15:22:45,259 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2025-07-02 15:22:45,262 INFO org.apache.hadoop.security.HttpCrossOriginFilterInitializer: CORS filter not enabled. Please set hadoop.http.cross-origin.enabled to 'true' to enable it
2025-07-02 15:22:45,263 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context datanode
2025-07-02 15:22:45,263 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context logs
2025-07-02 15:22:45,264 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context static
2025-07-02 15:22:45,312 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 46389
2025-07-02 15:22:45,314 INFO org.eclipse.jetty.server.Server: jetty-9.4.54.v20240208; built: 2024-02-08T19:42:39.027Z; git: cef3fbd6d736a21e7d541a5db490381d95a2047d; jvm 1.8.0_432-b06
2025-07-02 15:22:45,372 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0
2025-07-02 15:22:45,372 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults
2025-07-02 15:22:45,375 INFO org.eclipse.jetty.server.session: node0 Scavenging every 600000ms
2025-07-02 15:22:45,402 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-02 15:22:45,407 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@1b11ef33{logs,/logs,file:///var/log/hadoop-hdfs/,AVAILABLE}
2025-07-02 15:22:45,409 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@2f2bf0e2{static,/static,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/static/,AVAILABLE}
2025-07-02 15:22:45,531 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@6ebd78d1{datanode,/,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode/,AVAILABLE}{file:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode}
2025-07-02 15:22:45,545 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@7fb9f71f{HTTP/1.1, (http/1.1)}{localhost:46389}
2025-07-02 15:22:45,546 INFO org.eclipse.jetty.server.Server: Started @2903ms
2025-07-02 15:22:45,840 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /192.168.158.4:9864
2025-07-02 15:22:45,849 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor
2025-07-02 15:22:45,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hdfs
2025-07-02 15:22:45,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
2025-07-02 15:22:45,915 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
2025-07-02 15:22:45,937 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867
2025-07-02 15:22:45,981 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /192.168.158.4:9867
2025-07-02 15:22:46,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null
2025-07-02 15:22:46,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices:
2025-07-02 15:22:46,039 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 starting to offer service
2025-07-02 15:22:46,046 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2025-07-02 15:22:46,046 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting
2025-07-02 15:22:46,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode during handshake Block pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-02 15:22:46,361 INFO org.apache.hadoop.hdfs.server.common.Storage: Using 4 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=4, dataDirs=4)
2025-07-02 15:22:46,369 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d1/dfs/dn/in_use.lock acquired by nodename 259773@dmidlkprdls04.svr.luc.edu
2025-07-02 15:22:46,373 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d1/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d1/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:22:46,379 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d2/dfs/dn/in_use.lock acquired by nodename 259773@dmidlkprdls04.svr.luc.edu
2025-07-02 15:22:46,379 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d2/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d2/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:22:46,381 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d3/dfs/dn/in_use.lock acquired by nodename 259773@dmidlkprdls04.svr.luc.edu
2025-07-02 15:22:46,381 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d3/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d3/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:22:46,382 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d4/dfs/dn/in_use.lock acquired by nodename 259773@dmidlkprdls04.svr.luc.edu
2025-07-02 15:22:46,383 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d4/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d4/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:22:46,386 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Initialization failed for Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Exiting.
java.io.IOException: All specified directories have failed to load.
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:560)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:22:46,386 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Ending block pool service for: Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-02 15:22:46,390 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Removed Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12)
2025-07-02 15:22:48,390 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Exiting Datanode
2025-07-02 15:22:48,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at dmidlkprdls04.svr.luc.edu/192.168.158.4
************************************************************/
2025-07-02 15:22:52,805 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG: host = dmidlkprdls04.svr.luc.edu/192.168.158.4
STARTUP_MSG: args = []
STARTUP_MSG: version = 3.1.1.7.3.1.0-197
STARTUP_MSG: classpath =
/var/run/cloudera-scm-agent/process/234-hdfs-DATANODE:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/aws-java-sdk-bundle-1.12.720.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-hdfs-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-plugin-classloader-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-yarn-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/azure-data-lake-store-sdk-2.3.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/ha
doop/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-api-2.2.11.jar:/opt/cloud
era/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jline-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr311-a
pi-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jul-to-slf4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/logredactor-2.0.16.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-buffer-4.1.100.Final.jar:/o
pt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-reload4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-api-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-kqueue-4.1.100.Fina
l.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/wildfly-openssl-2.1.4.ClouderaFinal.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3
.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper-jute.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//ozone-filesystem-hadoop3-1.3.0.7.3.1.0-197.
jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-thrift.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-scala_2.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-protobuf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-jackson.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-generator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-format-structures.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-encoding.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-column.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/avro-1.11.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-logging-1.1.3.j
ar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.
60371244/lib/hadoop-hdfs/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j
line-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/json-simple-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/leveldbjni-cldr-1.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/par
cels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nett
y-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-jute-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.
3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//asm-5.0.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjweaver-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-storage-7.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//checker-compat-qual-2.5.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-slf4j-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.
//flogger-system-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//google-extensions-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-keyvault-core-1.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//accessors-smart-2.4.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-
client-jobclient.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming-3.1.1.7.3.1.0-197.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ojalgo-43.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//kafka-clients-2.8.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-core-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-abfs-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//forbiddenapis-3.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-intg-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-api-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//zstd-jni-1.4.9-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-i18n.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-s3-lib-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//javax.activation-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//bundle-2.23.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//json-smart-2.4.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-util-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-shell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-cloud-bindings.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-s3-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjrt-1.9.6.jar:/opt/cloudera/parcels/
CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/swagger-annotations-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/objenesis-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-client-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.activation-api-1.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-dataformat-yaml-2.9.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcutil-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcprov-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcpkix-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/HikariCP-java7-2.4.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/snakeyaml-2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jsonschema2pojo-core-1.0.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/joda-time-2.10.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jna-5.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-guice-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-module-jaxb-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-json-provider-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-base-2.12.7.jar:/opt/clo
udera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-servlet-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/fst-2.50.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/ehcache-3.3.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/dnsjava-2.1.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/codemodel-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371
244/lib/hadoop-yarn/.//hadoop-yarn-server-router.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop
-yarn/.//hadoop-yarn-server-sharedcachemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager-1.0.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-3.1.1.7.3.1.0-197.jar:/opt/cloudera/cm/lib/plugins/event-publish-7.13.1-shaded.jar:/opt/cloudera/cm/lib/plugins/tt-instrumentation-7.13.1.jar
STARTUP_MSG: build = git@github.infra.cloudera.com:CDH/hadoop.git -r 31a42fb39494f541ffae15c3c61185deeeacca86; compiled by 'jenkins' on 2024-12-04T01:09Z
STARTUP_MSG: java = 1.8.0_432
************************************************************/
2025-07-02 15:22:52,893 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2025-07-02 15:22:53,239 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d1/dfs/dn
2025-07-02 15:22:53,246 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d2/dfs/dn
2025-07-02 15:22:53,246 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d3/dfs/dn
2025-07-02 15:22:53,247 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d4/dfs/dn
2025-07-02 15:22:53,409 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties
2025-07-02 15:22:53,520 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2025-07-02 15:22:53,520 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2025-07-02 15:22:53,933 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-02 15:22:53,963 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576
2025-07-02 15:22:53,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled.
2025-07-02 15:22:53,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is dmidlkprdls04.svr.luc.edu
2025-07-02 15:22:53,974 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-02 15:22:53,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 4294967296
2025-07-02 15:22:54,014 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /192.168.158.4:9866
2025-07-02 15:22:54,018 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-02 15:22:54,018 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-02 15:22:54,022 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-02 15:22:54,022 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-02 15:22:54,022 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Listening on UNIX domain socket: /var/run/hdfs-sockets/dn
2025-07-02 15:22:54,078 INFO org.eclipse.jetty.util.log: Logging initialized @2493ms to org.eclipse.jetty.util.log.Slf4jLog
2025-07-02 15:22:54,205 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-02 15:22:54,214 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined
2025-07-02 15:22:54,223 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2025-07-02 15:22:54,225 INFO org.apache.hadoop.security.HttpCrossOriginFilterInitializer: CORS filter not enabled. Please set hadoop.http.cross-origin.enabled to 'true' to enable it
2025-07-02 15:22:54,226 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context datanode
2025-07-02 15:22:54,226 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context logs
2025-07-02 15:22:54,226 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context static
2025-07-02 15:22:54,269 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 46475
2025-07-02 15:22:54,271 INFO org.eclipse.jetty.server.Server: jetty-9.4.54.v20240208; built: 2024-02-08T19:42:39.027Z; git: cef3fbd6d736a21e7d541a5db490381d95a2047d; jvm 1.8.0_432-b06
2025-07-02 15:22:54,321 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0
2025-07-02 15:22:54,321 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults
2025-07-02 15:22:54,325 INFO org.eclipse.jetty.server.session: node0 Scavenging every 600000ms
2025-07-02 15:22:54,357 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-02 15:22:54,364 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@1b11ef33{logs,/logs,file:///var/log/hadoop-hdfs/,AVAILABLE}
2025-07-02 15:22:54,366 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@2f2bf0e2{static,/static,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/static/,AVAILABLE}
2025-07-02 15:22:54,488 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@6ebd78d1{datanode,/,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode/,AVAILABLE}{file:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode}
2025-07-02 15:22:54,501 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@7fb9f71f{HTTP/1.1, (http/1.1)}{localhost:46475}
2025-07-02 15:22:54,502 INFO org.eclipse.jetty.server.Server: Started @2917ms
2025-07-02 15:22:54,823 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /192.168.158.4:9864
2025-07-02 15:22:54,832 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor
2025-07-02 15:22:54,835 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hdfs
2025-07-02 15:22:54,835 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
2025-07-02 15:22:54,904 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
2025-07-02 15:22:54,925 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867
2025-07-02 15:22:54,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /192.168.158.4:9867
2025-07-02 15:22:55,022 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null
2025-07-02 15:22:55,038 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices:
2025-07-02 15:22:55,059 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 starting to offer service
2025-07-02 15:22:55,076 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2025-07-02 15:22:55,076 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting
2025-07-02 15:22:55,362 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode during handshakeBlock pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-02 15:22:55,365 INFO org.apache.hadoop.hdfs.server.common.Storage: Using 4 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=4, dataDirs=4)
2025-07-02 15:22:55,372 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d1/dfs/dn/in_use.lock acquired by nodename 260285@dmidlkprdls04.svr.luc.edu
2025-07-02 15:22:55,375 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d1/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d1/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:22:55,380 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d2/dfs/dn/in_use.lock acquired by nodename 260285@dmidlkprdls04.svr.luc.edu
2025-07-02 15:22:55,380 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d2/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d2/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:22:55,381 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d3/dfs/dn/in_use.lock acquired by nodename 260285@dmidlkprdls04.svr.luc.edu
2025-07-02 15:22:55,382 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d3/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d3/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:22:55,383 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d4/dfs/dn/in_use.lock acquired by nodename 260285@dmidlkprdls04.svr.luc.edu
2025-07-02 15:22:55,383 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d4/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d4/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:22:55,385 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Initialization failed for Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Exiting.
java.io.IOException: All specified directories have failed to load.
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:560)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:22:55,386 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Ending block pool service for: Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-02 15:22:55,387 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Removed Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12)
2025-07-02 15:22:57,388 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Exiting Datanode
2025-07-02 15:22:57,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at dmidlkprdls04.svr.luc.edu/192.168.158.4
************************************************************/
2025-07-02 15:23:02,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG: host = dmidlkprdls04.svr.luc.edu/192.168.158.4
STARTUP_MSG: args = []
STARTUP_MSG: version = 3.1.1.7.3.1.0-197
STARTUP_MSG: classpath =
pi-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jul-to-slf4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/logredactor-2.0.16.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-buffer-4.1.100.Final.jar:/o
pt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-reload4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-api-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-kqueue-4.1.100.Fina
l.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/wildfly-openssl-2.1.4.ClouderaFinal.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3
.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper-jute.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//ozone-filesystem-hadoop3-1.3.0.7.3.1.0-197.
jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-thrift.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-scala_2.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-protobuf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-jackson.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-generator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-format-structures.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-encoding.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-column.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/avro-1.11.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-logging-1.1.3.j
ar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.
60371244/lib/hadoop-hdfs/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j
line-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/json-simple-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/leveldbjni-cldr-1.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/par
cels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nett
y-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-jute-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.
3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//asm-5.0.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjweaver-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-storage-7.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//checker-compat-qual-2.5.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-slf4j-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.
//flogger-system-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//google-extensions-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-keyvault-core-1.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//accessors-smart-2.4.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-
client-jobclient.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming-3.1.1.7.3.1.0-197.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ojalgo-43.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//kafka-clients-2.8.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-core-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-abfs-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//forbiddenapis-3.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-intg-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-api-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//zstd-jni-1.4.9-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-i18n.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-s3-lib-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//javax.activation-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//bundle-2.23.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//json-smart-2.4.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-util-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-shell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-cloud-bindings.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-s3-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjrt-1.9.6.jar:/opt/cloudera/parcels/
CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/swagger-annotations-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/objenesis-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-client-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.activation-api-1.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-dataformat-yaml-2.9.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcutil-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcprov-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcpkix-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/HikariCP-java7-2.4.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/snakeyaml-2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jsonschema2pojo-core-1.0.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/joda-time-2.10.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jna-5.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-guice-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-module-jaxb-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-json-provider-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-base-2.12.7.jar:/opt/clo
udera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-servlet-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/fst-2.50.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/ehcache-3.3.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/dnsjava-2.1.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/codemodel-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371
244/lib/hadoop-yarn/.//hadoop-yarn-server-router.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop
-yarn/.//hadoop-yarn-server-sharedcachemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager-1.0.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-3.1.1.7.3.1.0-197.jar:/opt/cloudera/cm/lib/plugins/event-publish-7.13.1-shaded.jar:/opt/cloudera/cm/lib/plugins/tt-instrumentation-7.13.1.jar STARTUP_MSG: build = git@github.infra.cloudera.com:CDH/hadoop.git -r 31a42fb39494f541ffae15c3c61185deeeacca86; compiled by 'jenkins' on 2024-12-04T01:09Z STARTUP_MSG: java = 1.8.0_432 ************************************************************/ 2025-07-02 15:23:02,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT] 2025-07-02 15:23:03,248 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d1/dfs/dn 2025-07-02 15:23:03,254 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d2/dfs/dn 2025-07-02 15:23:03,255 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d3/dfs/dn 2025-07-02 15:23:03,255 INFO 
org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d4/dfs/dn 2025-07-02 15:23:03,414 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties 2025-07-02 15:23:03,532 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s). 2025-07-02 15:23:03,532 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started 2025-07-02 15:23:03,928 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling 2025-07-02 15:23:03,955 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576 2025-07-02 15:23:03,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled. 2025-07-02 15:23:03,964 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is dmidlkprdls04.svr.luc.edu 2025-07-02 15:23:03,964 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. 
Disabling file IO profiling 2025-07-02 15:23:03,970 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 4294967296 2025-07-02 15:23:03,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /192.168.158.4:9866 2025-07-02 15:23:04,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s 2025-07-02 15:23:04,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50 2025-07-02 15:23:04,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s 2025-07-02 15:23:04,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50 2025-07-02 15:23:04,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Listening on UNIX domain socket: /var/run/hdfs-sockets/dn 2025-07-02 15:23:04,056 INFO org.eclipse.jetty.util.log: Logging initialized @2478ms to org.eclipse.jetty.util.log.Slf4jLog 2025-07-02 15:23:04,196 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret 2025-07-02 15:23:04,206 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined 2025-07-02 15:23:04,217 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter) 2025-07-02 15:23:04,221 INFO org.apache.hadoop.security.HttpCrossOriginFilterInitializer: CORS filter not enabled. 
Please set hadoop.http.cross-origin.enabled to 'true' to enable it 2025-07-02 15:23:04,222 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context datanode 2025-07-02 15:23:04,222 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context logs 2025-07-02 15:23:04,222 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context static 2025-07-02 15:23:04,263 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 42799 2025-07-02 15:23:04,265 INFO org.eclipse.jetty.server.Server: jetty-9.4.54.v20240208; built: 2024-02-08T19:42:39.027Z; git: cef3fbd6d736a21e7d541a5db490381d95a2047d; jvm 1.8.0_432-b06 2025-07-02 15:23:04,320 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0 2025-07-02 15:23:04,320 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults 2025-07-02 15:23:04,323 INFO org.eclipse.jetty.server.session: node0 Scavenging every 600000ms 2025-07-02 15:23:04,351 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. 
Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret 2025-07-02 15:23:04,356 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@1b11ef33{logs,/logs,file:///var/log/hadoop-hdfs/,AVAILABLE} 2025-07-02 15:23:04,358 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@2f2bf0e2{static,/static,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/static/,AVAILABLE} 2025-07-02 15:23:04,475 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@6ebd78d1{datanode,/,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode/,AVAILABLE}{file:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode} 2025-07-02 15:23:04,489 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@7fb9f71f{HTTP/1.1, (http/1.1)}{localhost:42799} 2025-07-02 15:23:04,489 INFO org.eclipse.jetty.server.Server: Started @2911ms 2025-07-02 15:23:04,798 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /192.168.158.4:9864 2025-07-02 15:23:04,808 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor 2025-07-02 15:23:04,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hdfs 2025-07-02 15:23:04,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup 2025-07-02 15:23:04,872 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler 2025-07-02 15:23:04,892 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867 2025-07-02 15:23:04,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /192.168.158.4:9867 2025-07-02 15:23:04,970 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null 2025-07-02 15:23:04,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices: 2025-07-02 15:23:04,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 starting to offer service 2025-07-02 15:23:05,004 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting 2025-07-02 15:23:05,004 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting 2025-07-02 15:23:05,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode during handshakeBlock pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-02 15:23:05,316 INFO org.apache.hadoop.hdfs.server.common.Storage: Using 4 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=4, dataDirs=4) 2025-07-02 15:23:05,326 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d1/dfs/dn/in_use.lock acquired by nodename 260796@dmidlkprdls04.svr.luc.edu 2025-07-02 15:23:05,331 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d1/dfs/dn java.io.IOException: Incompatible clusterIDs in /hdfs/d1/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72 at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407) at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387) at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559) at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753) at 
org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865) at java.lang.Thread.run(Thread.java:750) 2025-07-02 15:23:05,338 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d2/dfs/dn/in_use.lock acquired by nodename 260796@dmidlkprdls04.svr.luc.edu 2025-07-02 15:23:05,338 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d2/dfs/dn java.io.IOException: Incompatible clusterIDs in /hdfs/d2/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72 at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407) at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387) at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559) at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753) at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865) at java.lang.Thread.run(Thread.java:750) 2025-07-02 15:23:05,339 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d3/dfs/dn/in_use.lock acquired by nodename 260796@dmidlkprdls04.svr.luc.edu 
2025-07-02 15:23:05,340 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d3/dfs/dn java.io.IOException: Incompatible clusterIDs in /hdfs/d3/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72 at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407) at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387) at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559) at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753) at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865) at java.lang.Thread.run(Thread.java:750) 2025-07-02 15:23:05,341 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d4/dfs/dn/in_use.lock acquired by nodename 260796@dmidlkprdls04.svr.luc.edu 2025-07-02 15:23:05,341 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d4/dfs/dn java.io.IOException: Incompatible clusterIDs in /hdfs/d4/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72 at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407) at 
org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387) at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559) at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753) at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865) at java.lang.Thread.run(Thread.java:750) 2025-07-02 15:23:05,344 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Initialization failed for Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Exiting. java.io.IOException: All specified directories have failed to load. 
at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:560) at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753) at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865) at java.lang.Thread.run(Thread.java:750) 2025-07-02 15:23:05,344 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Ending block pool service for: Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-02 15:23:05,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Removed Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) 2025-07-02 15:23:07,346 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Exiting Datanode 2025-07-02 15:23:07,353 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG: /************************************************************ SHUTDOWN_MSG: Shutting down DataNode at dmidlkprdls04.svr.luc.edu/192.168.158.4 ************************************************************/ 2025-07-02 15:23:13,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG: /************************************************************ STARTUP_MSG: Starting DataNode STARTUP_MSG: host = dmidlkprdls04.svr.luc.edu/192.168.158.4 STARTUP_MSG: args = [] STARTUP_MSG: version = 3.1.1.7.3.1.0-197 STARTUP_MSG: classpath = 
/var/run/cloudera-scm-agent/process/234-hdfs-DATANODE:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/aws-java-sdk-bundle-1.12.720.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-hdfs-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-plugin-classloader-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-yarn-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/azure-data-lake-store-sdk-2.3.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/ha
doop/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-api-2.2.11.jar:/opt/cloud
era/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jline-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr311-a
pi-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jul-to-slf4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/logredactor-2.0.16.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-buffer-4.1.100.Final.jar:/o
pt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-reload4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-api-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-kqueue-4.1.100.Fina
l.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/wildfly-openssl-2.1.4.ClouderaFinal.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3
.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper-jute.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//ozone-filesystem-hadoop3-1.3.0.7.3.1.0-197.
jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-thrift.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-scala_2.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-protobuf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-jackson.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-generator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-format-structures.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-encoding.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-column.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/avro-1.11.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-logging-1.1.3.j
ar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.
60371244/lib/hadoop-hdfs/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j
line-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/json-simple-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/leveldbjni-cldr-1.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/par
cels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nett
y-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-jute-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.
3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//asm-5.0.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjweaver-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-storage-7.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//checker-compat-qual-2.5.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-slf4j-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.
//flogger-system-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//google-extensions-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-keyvault-core-1.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//accessors-smart-2.4.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-
client-jobclient.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming-3.1.1.7.3.1.0-197.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ojalgo-43.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//kafka-clients-2.8.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-core-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-abfs-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//forbiddenapis-3.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-intg-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-api-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//zstd-jni-1.4.9-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-i18n.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-s3-lib-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//javax.activation-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//bundle-2.23.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//json-smart-2.4.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-util-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-shell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-cloud-bindings.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-s3-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjrt-1.9.6.jar:/opt/cloudera/parcels/
CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/swagger-annotations-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/objenesis-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-client-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.activation-api-1.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-dataformat-yaml-2.9.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcutil-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcprov-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcpkix-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/HikariCP-java7-2.4.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/snakeyaml-2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jsonschema2pojo-core-1.0.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/joda-time-2.10.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jna-5.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-guice-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-module-jaxb-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-json-provider-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-base-2.12.7.jar:/opt/clo
udera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-servlet-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/fst-2.50.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/ehcache-3.3.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/dnsjava-2.1.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/codemodel-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371
244/lib/hadoop-yarn/.//hadoop-yarn-server-router.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop
-yarn/.//hadoop-yarn-server-sharedcachemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager-1.0.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-3.1.1.7.3.1.0-197.jar:/opt/cloudera/cm/lib/plugins/event-publish-7.13.1-shaded.jar:/opt/cloudera/cm/lib/plugins/tt-instrumentation-7.13.1.jar
STARTUP_MSG: build = git@github.infra.cloudera.com:CDH/hadoop.git -r 31a42fb39494f541ffae15c3c61185deeeacca86; compiled by 'jenkins' on 2024-12-04T01:09Z
STARTUP_MSG: java = 1.8.0_432
************************************************************/
2025-07-02 15:23:13,893 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2025-07-02 15:23:14,239 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d1/dfs/dn
2025-07-02 15:23:14,245 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d2/dfs/dn
2025-07-02 15:23:14,246 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d3/dfs/dn
2025-07-02 15:23:14,246 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d4/dfs/dn
2025-07-02 15:23:14,413 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties
2025-07-02 15:23:14,527 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2025-07-02 15:23:14,527 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2025-07-02 15:23:14,796 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-02 15:23:14,930 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576
2025-07-02 15:23:14,940 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled.
2025-07-02 15:23:14,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is dmidlkprdls04.svr.luc.edu
2025-07-02 15:23:14,942 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-02 15:23:14,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 4294967296
2025-07-02 15:23:14,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /192.168.158.4:9866
2025-07-02 15:23:14,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-02 15:23:14,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-02 15:23:14,988 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-02 15:23:14,988 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-02 15:23:14,988 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Listening on UNIX domain socket: /var/run/hdfs-sockets/dn
2025-07-02 15:23:15,028 INFO org.eclipse.jetty.util.log: Logging initialized @2448ms to org.eclipse.jetty.util.log.Slf4jLog
2025-07-02 15:23:15,141 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-02 15:23:15,149 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined
2025-07-02 15:23:15,160 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2025-07-02 15:23:15,162 INFO org.apache.hadoop.security.HttpCrossOriginFilterInitializer: CORS filter not enabled. Please set hadoop.http.cross-origin.enabled to 'true' to enable it
2025-07-02 15:23:15,163 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context datanode
2025-07-02 15:23:15,163 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context logs
2025-07-02 15:23:15,163 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context static
2025-07-02 15:23:15,214 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 43023
2025-07-02 15:23:15,217 INFO org.eclipse.jetty.server.Server: jetty-9.4.54.v20240208; built: 2024-02-08T19:42:39.027Z; git: cef3fbd6d736a21e7d541a5db490381d95a2047d; jvm 1.8.0_432-b06
2025-07-02 15:23:15,272 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0
2025-07-02 15:23:15,273 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults
2025-07-02 15:23:15,276 INFO org.eclipse.jetty.server.session: node0 Scavenging every 660000ms
2025-07-02 15:23:15,305 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-02 15:23:15,312 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@6cea706c{logs,/logs,file:///var/log/hadoop-hdfs/,AVAILABLE}
2025-07-02 15:23:15,314 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@21ec5d87{static,/static,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/static/,AVAILABLE}
2025-07-02 15:23:15,440 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@4d157787{datanode,/,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode/,AVAILABLE}{file:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode}
2025-07-02 15:23:15,454 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@51f49060{HTTP/1.1, (http/1.1)}{localhost:43023}
2025-07-02 15:23:15,455 INFO org.eclipse.jetty.server.Server: Started @2876ms
2025-07-02 15:23:15,772 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /192.168.158.4:9864
2025-07-02 15:23:15,784 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor
2025-07-02 15:23:15,786 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hdfs
2025-07-02 15:23:15,786 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
2025-07-02 15:23:15,863 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
2025-07-02 15:23:15,885 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867
2025-07-02 15:23:15,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /192.168.158.4:9867
2025-07-02 15:23:15,965 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null
2025-07-02 15:23:15,975 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices:
2025-07-02 15:23:15,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 starting to offer service
2025-07-02 15:23:15,996 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2025-07-02 15:23:15,996 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting
2025-07-02 15:23:16,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode during handshake Block pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-02 15:23:16,307 INFO org.apache.hadoop.hdfs.server.common.Storage: Using 4 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=4, dataDirs=4)
2025-07-02 15:23:16,314 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d1/dfs/dn/in_use.lock acquired by nodename 261365@dmidlkprdls04.svr.luc.edu
2025-07-02 15:23:16,318 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d1/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d1/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:23:16,323 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d2/dfs/dn/in_use.lock acquired by nodename 261365@dmidlkprdls04.svr.luc.edu
2025-07-02 15:23:16,323 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d2/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d2/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:23:16,325 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d3/dfs/dn/in_use.lock acquired by nodename 261365@dmidlkprdls04.svr.luc.edu
2025-07-02 15:23:16,325 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d3/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d3/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:23:16,326 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d4/dfs/dn/in_use.lock acquired by nodename 261365@dmidlkprdls04.svr.luc.edu
2025-07-02 15:23:16,327 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d4/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d4/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:23:16,330 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Initialization failed for Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Exiting.
java.io.IOException: All specified directories have failed to load.
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:560)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:23:16,331 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Ending block pool service for: Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-02 15:23:16,335 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Removed Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12)
2025-07-02 15:23:18,335 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Exiting Datanode
2025-07-02 15:23:18,341 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at dmidlkprdls04.svr.luc.edu/192.168.158.4
************************************************************/
2025-07-02 15:55:59,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG: host = dmidlkprdls04.svr.luc.edu/192.168.158.4
STARTUP_MSG: args = []
STARTUP_MSG: version = 3.1.1.7.3.1.0-197
STARTUP_MSG: classpath =
l.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/wildfly-openssl-2.1.4.ClouderaFinal.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3
.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper-jute.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//ozone-filesystem-hadoop3-1.3.0.7.3.1.0-197.
jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-thrift.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-scala_2.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-protobuf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-jackson.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-generator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-format-structures.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-encoding.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-column.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/avro-1.11.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-logging-1.1.3.j
ar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.
60371244/lib/hadoop-hdfs/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j
line-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/json-simple-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/leveldbjni-cldr-1.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/par
cels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nett
y-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-jute-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.
3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//asm-5.0.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjweaver-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-storage-7.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//checker-compat-qual-2.5.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-slf4j-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.
//flogger-system-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//google-extensions-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-keyvault-core-1.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//accessors-smart-2.4.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-
client-jobclient.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming-3.1.1.7.3.1.0-197.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ojalgo-43.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//kafka-clients-2.8.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-core-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-abfs-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//forbiddenapis-3.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-intg-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-api-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//zstd-jni-1.4.9-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-i18n.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-s3-lib-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//javax.activation-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//bundle-2.23.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//json-smart-2.4.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-util-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-shell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-cloud-bindings.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-s3-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjrt-1.9.6.jar:/opt/cloudera/parcels/
CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/swagger-annotations-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/objenesis-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-client-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.activation-api-1.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-dataformat-yaml-2.9.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcutil-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcprov-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcpkix-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/HikariCP-java7-2.4.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/snakeyaml-2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jsonschema2pojo-core-1.0.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/joda-time-2.10.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jna-5.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-guice-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-module-jaxb-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-json-provider-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-base-2.12.7.jar:/opt/clo
udera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-servlet-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/fst-2.50.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/ehcache-3.3.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/dnsjava-2.1.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/codemodel-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371
244/lib/hadoop-yarn/.//hadoop-yarn-server-router.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop
-yarn/.//hadoop-yarn-server-sharedcachemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager-1.0.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-3.1.1.7.3.1.0-197.jar:/opt/cloudera/cm/lib/plugins/event-publish-7.13.1-shaded.jar:/opt/cloudera/cm/lib/plugins/tt-instrumentation-7.13.1.jar
STARTUP_MSG: build = git@github.infra.cloudera.com:CDH/hadoop.git -r 31a42fb39494f541ffae15c3c61185deeeacca86; compiled by 'jenkins' on 2024-12-04T01:09Z
STARTUP_MSG: java = 1.8.0_432
************************************************************/
2025-07-02 15:55:59,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2025-07-02 15:55:59,503 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d1/dfs/dn
2025-07-02 15:55:59,510 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d2/dfs/dn
2025-07-02 15:55:59,511 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d3/dfs/dn
2025-07-02 15:55:59,511 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d4/dfs/dn
2025-07-02 15:55:59,660 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties
2025-07-02 15:55:59,784 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2025-07-02 15:55:59,785 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2025-07-02 15:56:00,106 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-02 15:56:00,130 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576
2025-07-02 15:56:00,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled.
2025-07-02 15:56:00,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is dmidlkprdls04.svr.luc.edu
2025-07-02 15:56:00,139 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-02 15:56:00,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 4294967296
2025-07-02 15:56:00,176 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /192.168.158.4:9866
2025-07-02 15:56:00,179 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-02 15:56:00,179 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-02 15:56:00,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-02 15:56:00,183 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-02 15:56:00,183 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Listening on UNIX domain socket: /var/run/hdfs-sockets/dn
2025-07-02 15:56:00,237 INFO org.eclipse.jetty.util.log: Logging initialized @3108ms to org.eclipse.jetty.util.log.Slf4jLog
2025-07-02 15:56:00,348 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-02 15:56:00,356 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined
2025-07-02 15:56:00,366 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2025-07-02 15:56:00,368 INFO org.apache.hadoop.security.HttpCrossOriginFilterInitializer: CORS filter not enabled. Please set hadoop.http.cross-origin.enabled to 'true' to enable it
2025-07-02 15:56:00,369 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context datanode
2025-07-02 15:56:00,369 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context logs
2025-07-02 15:56:00,369 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context static
2025-07-02 15:56:00,409 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 32839
2025-07-02 15:56:00,412 INFO org.eclipse.jetty.server.Server: jetty-9.4.54.v20240208; built: 2024-02-08T19:42:39.027Z; git: cef3fbd6d736a21e7d541a5db490381d95a2047d; jvm 1.8.0_432-b06
2025-07-02 15:56:00,445 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0
2025-07-02 15:56:00,445 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults
2025-07-02 15:56:00,447 INFO org.eclipse.jetty.server.session: node0 Scavenging every 660000ms
2025-07-02 15:56:00,467 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-02 15:56:00,471 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@1b11ef33{logs,/logs,file:///var/log/hadoop-hdfs/,AVAILABLE}
2025-07-02 15:56:00,473 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@2f2bf0e2{static,/static,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/static/,AVAILABLE}
2025-07-02 15:56:00,612 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@6ebd78d1{datanode,/,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode/,AVAILABLE}{file:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode}
2025-07-02 15:56:00,625 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@7fb9f71f{HTTP/1.1, (http/1.1)}{localhost:32839}
2025-07-02 15:56:00,625 INFO org.eclipse.jetty.server.Server: Started @3497ms
2025-07-02 15:56:00,904 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /192.168.158.4:9864
2025-07-02 15:56:00,915 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor
2025-07-02 15:56:00,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hdfs
2025-07-02 15:56:00,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
2025-07-02 15:56:00,981 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
2025-07-02 15:56:01,003 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867
2025-07-02 15:56:01,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /192.168.158.4:9867
2025-07-02 15:56:01,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null
2025-07-02 15:56:01,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices:
2025-07-02 15:56:01,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 starting to offer service
2025-07-02 15:56:01,115 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2025-07-02 15:56:01,116 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting
2025-07-02 15:56:01,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode during handshakeBlock pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-02 15:56:01,410 INFO org.apache.hadoop.hdfs.server.common.Storage: Using 4 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=4, dataDirs=4)
2025-07-02 15:56:01,420 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d1/dfs/dn/in_use.lock acquired by nodename 3757@dmidlkprdls04.svr.luc.edu
2025-07-02 15:56:01,424 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d1/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d1/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:56:01,430 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d2/dfs/dn/in_use.lock acquired by nodename 3757@dmidlkprdls04.svr.luc.edu
2025-07-02 15:56:01,431 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d2/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d2/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:56:01,433 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d3/dfs/dn/in_use.lock acquired by nodename 3757@dmidlkprdls04.svr.luc.edu
2025-07-02 15:56:01,434 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d3/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d3/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:56:01,435 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d4/dfs/dn/in_use.lock acquired by nodename 3757@dmidlkprdls04.svr.luc.edu
2025-07-02 15:56:01,436 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d4/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d4/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:56:01,439 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Initialization failed for Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Exiting.
java.io.IOException: All specified directories have failed to load.
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:560)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:56:01,440 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Ending block pool service for: Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-02 15:56:01,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Removed Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12)
2025-07-02 15:56:03,445 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Exiting Datanode
2025-07-02 15:56:03,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG: 
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at dmidlkprdls04.svr.luc.edu/192.168.158.4
************************************************************/
2025-07-02 15:56:07,817 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG: 
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG: host = dmidlkprdls04.svr.luc.edu/192.168.158.4
STARTUP_MSG: args = []
STARTUP_MSG: version = 3.1.1.7.3.1.0-197
STARTUP_MSG: classpath = 
/var/run/cloudera-scm-agent/process/259-hdfs-DATANODE:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/aws-java-sdk-bundle-1.12.720.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-hdfs-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-plugin-classloader-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-yarn-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/azure-data-lake-store-sdk-2.3.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/ha
doop/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-api-2.2.11.jar:/opt/cloud
era/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jline-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr311-a
pi-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jul-to-slf4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/logredactor-2.0.16.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-buffer-4.1.100.Final.jar:/o
pt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-reload4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-api-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-kqueue-4.1.100.Fina
l.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/wildfly-openssl-2.1.4.ClouderaFinal.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3
.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper-jute.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//ozone-filesystem-hadoop3-1.3.0.7.3.1.0-197.
jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-thrift.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-scala_2.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-protobuf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-jackson.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-generator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-format-structures.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-encoding.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-column.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/avro-1.11.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-logging-1.1.3.j
ar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.
60371244/lib/hadoop-hdfs/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j
line-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/json-simple-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/leveldbjni-cldr-1.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/par
cels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nett
y-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-jute-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.
3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//asm-5.0.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjweaver-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-storage-7.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//checker-compat-qual-2.5.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-slf4j-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.
//flogger-system-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//google-extensions-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-keyvault-core-1.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//accessors-smart-2.4.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-
client-jobclient.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming-3.1.1.7.3.1.0-197.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ojalgo-43.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//kafka-clients-2.8.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-core-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-abfs-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//forbiddenapis-3.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-intg-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-api-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//zstd-jni-1.4.9-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-i18n.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-s3-lib-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//javax.activation-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//bundle-2.23.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//json-smart-2.4.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-util-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-shell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-cloud-bindings.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-s3-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjrt-1.9.6.jar:/opt/cloudera/parcels/
CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/swagger-annotations-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/objenesis-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-client-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.activation-api-1.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-dataformat-yaml-2.9.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcutil-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcprov-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcpkix-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/HikariCP-java7-2.4.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/snakeyaml-2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jsonschema2pojo-core-1.0.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/joda-time-2.10.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jna-5.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-guice-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-module-jaxb-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-json-provider-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-base-2.12.7.jar:/opt/clo
udera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-servlet-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/fst-2.50.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/ehcache-3.3.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/dnsjava-2.1.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/codemodel-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371
244/lib/hadoop-yarn/.//hadoop-yarn-server-router.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop
-yarn/.//hadoop-yarn-server-sharedcachemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager-1.0.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-3.1.1.7.3.1.0-197.jar:/opt/cloudera/cm/lib/plugins/event-publish-7.13.1-shaded.jar:/opt/cloudera/cm/lib/plugins/tt-instrumentation-7.13.1.jar
STARTUP_MSG: build = git@github.infra.cloudera.com:CDH/hadoop.git -r 31a42fb39494f541ffae15c3c61185deeeacca86; compiled by 'jenkins' on 2024-12-04T01:09Z
STARTUP_MSG: java = 1.8.0_432
************************************************************/
2025-07-02 15:56:07,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2025-07-02 15:56:08,271 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d1/dfs/dn
2025-07-02 15:56:08,277 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d2/dfs/dn
2025-07-02 15:56:08,278 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d3/dfs/dn
2025-07-02 15:56:08,278 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d4/dfs/dn
2025-07-02 15:56:08,438 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties
2025-07-02 15:56:08,571 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2025-07-02 15:56:08,571 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2025-07-02 15:56:08,955 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-02 15:56:08,979 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576
2025-07-02 15:56:08,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled.
2025-07-02 15:56:08,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is dmidlkprdls04.svr.luc.edu
2025-07-02 15:56:08,987 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-02 15:56:08,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 4294967296
2025-07-02 15:56:09,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /192.168.158.4:9866
2025-07-02 15:56:09,027 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-02 15:56:09,027 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-02 15:56:09,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-02 15:56:09,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-02 15:56:09,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Listening on UNIX domain socket: /var/run/hdfs-sockets/dn
2025-07-02 15:56:09,080 INFO org.eclipse.jetty.util.log: Logging initialized @2495ms to org.eclipse.jetty.util.log.Slf4jLog
2025-07-02 15:56:09,196 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-02 15:56:09,206 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined
2025-07-02 15:56:09,214 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2025-07-02 15:56:09,216 INFO org.apache.hadoop.security.HttpCrossOriginFilterInitializer: CORS filter not enabled. Please set hadoop.http.cross-origin.enabled to 'true' to enable it
2025-07-02 15:56:09,217 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context datanode
2025-07-02 15:56:09,217 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context logs
2025-07-02 15:56:09,218 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context static
2025-07-02 15:56:09,259 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 44715
2025-07-02 15:56:09,261 INFO org.eclipse.jetty.server.Server: jetty-9.4.54.v20240208; built: 2024-02-08T19:42:39.027Z; git: cef3fbd6d736a21e7d541a5db490381d95a2047d; jvm 1.8.0_432-b06
2025-07-02 15:56:09,318 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0
2025-07-02 15:56:09,318 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults
2025-07-02 15:56:09,323 INFO org.eclipse.jetty.server.session: node0 Scavenging every 660000ms
2025-07-02 15:56:09,354 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-02 15:56:09,360 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@1b11ef33{logs,/logs,file:///var/log/hadoop-hdfs/,AVAILABLE}
2025-07-02 15:56:09,362 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@2f2bf0e2{static,/static,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/static/,AVAILABLE}
2025-07-02 15:56:09,495 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@6ebd78d1{datanode,/,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode/,AVAILABLE}{file:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode}
2025-07-02 15:56:09,508 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@7fb9f71f{HTTP/1.1, (http/1.1)}{localhost:44715}
2025-07-02 15:56:09,509 INFO org.eclipse.jetty.server.Server: Started @2923ms
2025-07-02 15:56:09,799 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /192.168.158.4:9864
2025-07-02 15:56:09,810 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor
2025-07-02 15:56:09,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hdfs
2025-07-02 15:56:09,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
2025-07-02 15:56:09,873 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
2025-07-02 15:56:09,894 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867
2025-07-02 15:56:09,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /192.168.158.4:9867
2025-07-02 15:56:09,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null
2025-07-02 15:56:09,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices:
2025-07-02 15:56:10,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 starting to offer service
2025-07-02 15:56:10,011 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2025-07-02 15:56:10,011 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting
2025-07-02 15:56:10,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode during handshake: Block pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-02 15:56:10,308 INFO org.apache.hadoop.hdfs.server.common.Storage: Using 4 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=4, dataDirs=4)
2025-07-02 15:56:10,317 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d1/dfs/dn/in_use.lock acquired by nodename 4268@dmidlkprdls04.svr.luc.edu
2025-07-02 15:56:10,320 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d1/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d1/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:56:10,326 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d2/dfs/dn/in_use.lock acquired by nodename 4268@dmidlkprdls04.svr.luc.edu
2025-07-02 15:56:10,327 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d2/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d2/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:56:10,328 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d3/dfs/dn/in_use.lock acquired by nodename 4268@dmidlkprdls04.svr.luc.edu
2025-07-02 15:56:10,328 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d3/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d3/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:56:10,329 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d4/dfs/dn/in_use.lock acquired by nodename 4268@dmidlkprdls04.svr.luc.edu
2025-07-02 15:56:10,330 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d4/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d4/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:56:10,333 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Initialization failed for Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Exiting.
java.io.IOException: All specified directories have failed to load.
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:560)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-02 15:56:10,334 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Ending block pool service for: Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-02 15:56:10,335 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Removed Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12)
2025-07-02 15:56:12,336 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Exiting Datanode
2025-07-02 15:56:12,343 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at dmidlkprdls04.svr.luc.edu/192.168.158.4
************************************************************/
2025-07-02 15:56:17,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG: host = dmidlkprdls04.svr.luc.edu/192.168.158.4
STARTUP_MSG: args = []
STARTUP_MSG: version = 3.1.1.7.3.1.0-197
STARTUP_MSG: classpath =
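[Editor's note] The four identical IOExceptions above share one root cause: every DataNode volume records clusterID cluster72 in its VERSION file, while the NameNode reports cluster87, so the DataNode refuses all four directories and exits. The clusterID lives in `<dataDir>/current/VERSION` on each volume (here /hdfs/d1..d4/dfs/dn). The sketch below shows how an operator might confirm the mismatch; to stay self-contained it fabricates a sample VERSION file in a temp directory rather than reading the real volumes, and the demo values are taken from the log, not from the live cluster:

```shell
# Each DataNode volume records its cluster membership in current/VERSION.
# On this host the real files would be /hdfs/d1/dfs/dn/current/VERSION, etc.
# Fabricate one for demonstration (field values mirror the log above):
demo_vol=$(mktemp -d)
mkdir -p "$demo_vol/current"
cat > "$demo_vol/current/VERSION" <<'EOF'
storageID=DS-demo
clusterID=cluster72
cTime=0
layoutVersion=-57
EOF

# Extract the clusterID the same way one would on a real volume:
cid=$(grep '^clusterID=' "$demo_vol/current/VERSION" | cut -d= -f2)
echo "datanode volume clusterID: $cid"
# Comparing this against the NameNode's clusterID (cluster87 in this log)
# reveals the mismatch behind the "Incompatible clusterIDs" failures.
```

Which remedy applies depends on why the IDs diverged: if the DataNode's block data is disposable (e.g. the NameNode was intentionally reformatted), the dn directories can be cleared so the DataNode re-registers under the NameNode's clusterID; if the data must be kept, the NameNode metadata matching cluster72 needs to be restored instead.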
/var/run/cloudera-scm-agent/process/259-hdfs-DATANODE:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/aws-java-sdk-bundle-1.12.720.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-hdfs-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-plugin-classloader-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-yarn-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/azure-data-lake-store-sdk-2.3.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/ha
doop/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-api-2.2.11.jar:/opt/cloud
era/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jline-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr311-a
pi-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jul-to-slf4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/logredactor-2.0.16.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-buffer-4.1.100.Final.jar:/o
pt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-reload4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-api-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-kqueue-4.1.100.Fina
l.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/wildfly-openssl-2.1.4.ClouderaFinal.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3
.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper-jute.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//ozone-filesystem-hadoop3-1.3.0.7.3.1.0-197.
jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-thrift.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-scala_2.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-protobuf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-jackson.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-generator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-format-structures.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-encoding.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-column.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/avro-1.11.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-logging-1.1.3.j
ar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.
60371244/lib/hadoop-hdfs/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j
line-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/json-simple-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/leveldbjni-cldr-1.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/par
cels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nett
y-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-jute-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.
3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//asm-5.0.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjweaver-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-storage-7.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//checker-compat-qual-2.5.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-slf4j-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.
//flogger-system-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//google-extensions-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-keyvault-core-1.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//accessors-smart-2.4.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-
client-jobclient.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming-3.1.1.7.3.1.0-197.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ojalgo-43.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//kafka-clients-2.8.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-core-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-abfs-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//forbiddenapis-3.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-intg-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-api-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//zstd-jni-1.4.9-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-i18n.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-s3-lib-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//javax.activation-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//bundle-2.23.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//json-smart-2.4.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-util-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-shell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-cloud-bindings.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-s3-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjrt-1.9.6.jar:/opt/cloudera/parcels/
CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/swagger-annotations-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/objenesis-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-client-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.activation-api-1.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-dataformat-yaml-2.9.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcutil-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcprov-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcpkix-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/HikariCP-java7-2.4.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/snakeyaml-2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jsonschema2pojo-core-1.0.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/joda-time-2.10.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jna-5.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-guice-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-module-jaxb-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-json-provider-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-base-2.12.7.jar:/opt/clo
udera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-servlet-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/fst-2.50.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/ehcache-3.3.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/dnsjava-2.1.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/codemodel-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371
244/lib/hadoop-yarn/.//hadoop-yarn-server-router.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop
-yarn/.//hadoop-yarn-server-sharedcachemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager-1.0.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-3.1.1.7.3.1.0-197.jar:/opt/cloudera/cm/lib/plugins/event-publish-7.13.1-shaded.jar:/opt/cloudera/cm/lib/plugins/tt-instrumentation-7.13.1.jar STARTUP_MSG: build = git@github.infra.cloudera.com:CDH/hadoop.git -r 31a42fb39494f541ffae15c3c61185deeeacca86; compiled by 'jenkins' on 2024-12-04T01:09Z STARTUP_MSG: java = 1.8.0_432 ************************************************************/ 2025-07-02 15:56:17,718 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT] 2025-07-02 15:56:18,067 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d1/dfs/dn 2025-07-02 15:56:18,073 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d2/dfs/dn 2025-07-02 15:56:18,073 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d3/dfs/dn 2025-07-02 15:56:18,074 INFO 
org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d4/dfs/dn 2025-07-02 15:56:18,246 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties 2025-07-02 15:56:18,345 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s). 2025-07-02 15:56:18,345 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started 2025-07-02 15:56:18,641 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling 2025-07-02 15:56:18,663 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576 2025-07-02 15:56:18,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled. 2025-07-02 15:56:18,671 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is dmidlkprdls04.svr.luc.edu 2025-07-02 15:56:18,671 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. 
Disabling file IO profiling 2025-07-02 15:56:18,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 4294967296 2025-07-02 15:56:18,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /192.168.158.4:9866 2025-07-02 15:56:18,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s 2025-07-02 15:56:18,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50 2025-07-02 15:56:18,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s 2025-07-02 15:56:18,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50 2025-07-02 15:56:18,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Listening on UNIX domain socket: /var/run/hdfs-sockets/dn 2025-07-02 15:56:18,891 INFO org.eclipse.jetty.util.log: Logging initialized @2414ms to org.eclipse.jetty.util.log.Slf4jLog 2025-07-02 15:56:19,027 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret 2025-07-02 15:56:19,036 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined 2025-07-02 15:56:19,045 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter) 2025-07-02 15:56:19,047 INFO org.apache.hadoop.security.HttpCrossOriginFilterInitializer: CORS filter not enabled. 
Please set hadoop.http.cross-origin.enabled to 'true' to enable it 2025-07-02 15:56:19,049 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context datanode 2025-07-02 15:56:19,049 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context logs 2025-07-02 15:56:19,049 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context static 2025-07-02 15:56:19,088 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 41527 2025-07-02 15:56:19,089 INFO org.eclipse.jetty.server.Server: jetty-9.4.54.v20240208; built: 2024-02-08T19:42:39.027Z; git: cef3fbd6d736a21e7d541a5db490381d95a2047d; jvm 1.8.0_432-b06 2025-07-02 15:56:19,142 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0 2025-07-02 15:56:19,142 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults 2025-07-02 15:56:19,147 INFO org.eclipse.jetty.server.session: node0 Scavenging every 600000ms 2025-07-02 15:56:19,174 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. 
Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret 2025-07-02 15:56:19,179 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@6cea706c{logs,/logs,file:///var/log/hadoop-hdfs/,AVAILABLE} 2025-07-02 15:56:19,180 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@21ec5d87{static,/static,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/static/,AVAILABLE} 2025-07-02 15:56:19,297 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@4d157787{datanode,/,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode/,AVAILABLE}{file:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode} 2025-07-02 15:56:19,310 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@51f49060{HTTP/1.1, (http/1.1)}{localhost:41527} 2025-07-02 15:56:19,310 INFO org.eclipse.jetty.server.Server: Started @2834ms 2025-07-02 15:56:19,593 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /192.168.158.4:9864 2025-07-02 15:56:19,603 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor 2025-07-02 15:56:19,604 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hdfs 2025-07-02 15:56:19,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup 2025-07-02 15:56:19,669 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler 2025-07-02 15:56:19,691 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867 2025-07-02 15:56:19,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /192.168.158.4:9867 2025-07-02 15:56:19,775 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null 2025-07-02 15:56:19,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices: 2025-07-02 15:56:19,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 starting to offer service 2025-07-02 15:56:19,812 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting 2025-07-02 15:56:19,813 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting 2025-07-02 15:56:20,096 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode during handshakeBlock pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-02 15:56:20,099 INFO org.apache.hadoop.hdfs.server.common.Storage: Using 4 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=4, dataDirs=4) 2025-07-02 15:56:20,106 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d1/dfs/dn/in_use.lock acquired by nodename 4892@dmidlkprdls04.svr.luc.edu 2025-07-02 15:56:20,109 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d1/dfs/dn java.io.IOException: Incompatible clusterIDs in /hdfs/d1/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72 at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407) at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387) at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559) at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753) at 
org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865) at java.lang.Thread.run(Thread.java:750) 2025-07-02 15:56:20,114 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d2/dfs/dn/in_use.lock acquired by nodename 4892@dmidlkprdls04.svr.luc.edu 2025-07-02 15:56:20,115 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d2/dfs/dn java.io.IOException: Incompatible clusterIDs in /hdfs/d2/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72 at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407) at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387) at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559) at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753) at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865) at java.lang.Thread.run(Thread.java:750) 2025-07-02 15:56:20,116 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d3/dfs/dn/in_use.lock acquired by nodename 4892@dmidlkprdls04.svr.luc.edu 
2025-07-02 15:56:20,117 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d3/dfs/dn java.io.IOException: Incompatible clusterIDs in /hdfs/d3/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72 at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407) at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387) at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559) at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753) at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865) at java.lang.Thread.run(Thread.java:750) 2025-07-02 15:56:20,118 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d4/dfs/dn/in_use.lock acquired by nodename 4892@dmidlkprdls04.svr.luc.edu 2025-07-02 15:56:20,118 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d4/dfs/dn java.io.IOException: Incompatible clusterIDs in /hdfs/d4/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72 at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407) at 
org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387) at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559) at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753) at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865) at java.lang.Thread.run(Thread.java:750) 2025-07-02 15:56:20,122 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Initialization failed for Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Exiting. java.io.IOException: All specified directories have failed to load. 
at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:560) at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753) at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865) at java.lang.Thread.run(Thread.java:750) 2025-07-02 15:56:20,122 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Ending block pool service for: Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-02 15:56:20,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Removed Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) 2025-07-02 15:56:22,124 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Exiting Datanode 2025-07-02 15:56:22,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG: /************************************************************ SHUTDOWN_MSG: Shutting down DataNode at dmidlkprdls04.svr.luc.edu/192.168.158.4 ************************************************************/ 2025-07-02 15:56:29,100 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG: /************************************************************ STARTUP_MSG: Starting DataNode STARTUP_MSG: host = dmidlkprdls04.svr.luc.edu/192.168.158.4 STARTUP_MSG: args = [] STARTUP_MSG: version = 3.1.1.7.3.1.0-197 STARTUP_MSG: classpath = 
/var/run/cloudera-scm-agent/process/259-hdfs-DATANODE:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/aws-java-sdk-bundle-1.12.720.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-hdfs-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-plugin-classloader-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-yarn-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/azure-data-lake-store-sdk-2.3.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/ha
doop/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-api-2.2.11.jar:/opt/cloud
era/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jline-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr311-a
pi-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jul-to-slf4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/logredactor-2.0.16.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-buffer-4.1.100.Final.jar:/o
pt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-reload4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-api-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-kqueue-4.1.100.Fina
l.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/wildfly-openssl-2.1.4.ClouderaFinal.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3
.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper-jute.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//ozone-filesystem-hadoop3-1.3.0.7.3.1.0-197.
jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-thrift.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-scala_2.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-protobuf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-jackson.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-generator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-format-structures.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-encoding.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-column.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/avro-1.11.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-logging-1.1.3.j
ar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.
60371244/lib/hadoop-hdfs/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j
line-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/json-simple-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/leveldbjni-cldr-1.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/par
cels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nett
y-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-jute-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.
3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//asm-5.0.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjweaver-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-storage-7.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//checker-compat-qual-2.5.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-slf4j-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.
//flogger-system-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//google-extensions-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-keyvault-core-1.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//accessors-smart-2.4.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-
client-jobclient.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming-3.1.1.7.3.1.0-197.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ojalgo-43.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//kafka-clients-2.8.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-core-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-abfs-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//forbiddenapis-3.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-intg-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-api-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//zstd-jni-1.4.9-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-i18n.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-s3-lib-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//javax.activation-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//bundle-2.23.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//json-smart-2.4.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-util-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-shell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-cloud-bindings.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-s3-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjrt-1.9.6.jar:/opt/cloudera/parcels/
CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/swagger-annotations-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/objenesis-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-client-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.activation-api-1.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-dataformat-yaml-2.9.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcutil-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcprov-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcpkix-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/HikariCP-java7-2.4.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/snakeyaml-2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jsonschema2pojo-core-1.0.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/joda-time-2.10.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jna-5.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-guice-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-module-jaxb-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-json-provider-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-base-2.12.7.jar:/opt/clo
udera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-servlet-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/fst-2.50.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/ehcache-3.3.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/dnsjava-2.1.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/codemodel-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371
244/lib/hadoop-yarn/.//hadoop-yarn-server-router.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop
-yarn/.//hadoop-yarn-server-sharedcachemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager-1.0.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-3.1.1.7.3.1.0-197.jar:/opt/cloudera/cm/lib/plugins/event-publish-7.13.1-shaded.jar:/opt/cloudera/cm/lib/plugins/tt-instrumentation-7.13.1.jar STARTUP_MSG: build = git@github.infra.cloudera.com:CDH/hadoop.git -r 31a42fb39494f541ffae15c3c61185deeeacca86; compiled by 'jenkins' on 2024-12-04T01:09Z STARTUP_MSG: java = 1.8.0_432 ************************************************************/ 2025-07-02 15:56:29,188 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT] 2025-07-02 15:56:29,506 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d1/dfs/dn 2025-07-02 15:56:29,512 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d2/dfs/dn 2025-07-02 15:56:29,512 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d3/dfs/dn 2025-07-02 15:56:29,513 INFO 
org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d4/dfs/dn 2025-07-02 15:56:29,666 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties 2025-07-02 15:56:29,766 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s). 2025-07-02 15:56:29,766 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started 2025-07-02 15:56:30,039 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling 2025-07-02 15:56:30,061 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576 2025-07-02 15:56:30,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled. 2025-07-02 15:56:30,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is dmidlkprdls04.svr.luc.edu 2025-07-02 15:56:30,069 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. 
Disabling file IO profiling 2025-07-02 15:56:30,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 4294967296 2025-07-02 15:56:30,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /192.168.158.4:9866 2025-07-02 15:56:30,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s 2025-07-02 15:56:30,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50 2025-07-02 15:56:30,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s 2025-07-02 15:56:30,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50 2025-07-02 15:56:30,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Listening on UNIX domain socket: /var/run/hdfs-sockets/dn 2025-07-02 15:56:30,161 INFO org.eclipse.jetty.util.log: Logging initialized @2199ms to org.eclipse.jetty.util.log.Slf4jLog 2025-07-02 15:56:30,449 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret 2025-07-02 15:56:30,462 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined 2025-07-02 15:56:30,471 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter) 2025-07-02 15:56:30,474 INFO org.apache.hadoop.security.HttpCrossOriginFilterInitializer: CORS filter not enabled. 
Please set hadoop.http.cross-origin.enabled to 'true' to enable it 2025-07-02 15:56:30,476 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context datanode 2025-07-02 15:56:30,476 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context logs 2025-07-02 15:56:30,476 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context static 2025-07-02 15:56:30,520 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 45291 2025-07-02 15:56:30,522 INFO org.eclipse.jetty.server.Server: jetty-9.4.54.v20240208; built: 2024-02-08T19:42:39.027Z; git: cef3fbd6d736a21e7d541a5db490381d95a2047d; jvm 1.8.0_432-b06 2025-07-02 15:56:30,578 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0 2025-07-02 15:56:30,578 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults 2025-07-02 15:56:30,581 INFO org.eclipse.jetty.server.session: node0 Scavenging every 600000ms 2025-07-02 15:56:30,606 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. 
Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret 2025-07-02 15:56:30,610 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@1b11ef33{logs,/logs,file:///var/log/hadoop-hdfs/,AVAILABLE} 2025-07-02 15:56:30,612 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@2f2bf0e2{static,/static,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/static/,AVAILABLE} 2025-07-02 15:56:30,755 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@6ebd78d1{datanode,/,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode/,AVAILABLE}{file:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode} 2025-07-02 15:56:30,769 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@7fb9f71f{HTTP/1.1, (http/1.1)}{localhost:45291} 2025-07-02 15:56:30,769 INFO org.eclipse.jetty.server.Server: Started @2807ms 2025-07-02 15:56:31,072 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /192.168.158.4:9864 2025-07-02 15:56:31,082 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor 2025-07-02 15:56:31,082 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hdfs 2025-07-02 15:56:31,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup 2025-07-02 15:56:31,145 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler 2025-07-02 15:56:31,165 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867 2025-07-02 15:56:31,217 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /192.168.158.4:9867 2025-07-02 15:56:31,251 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null 2025-07-02 15:56:31,261 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices: 2025-07-02 15:56:31,283 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 starting to offer service 2025-07-02 15:56:31,293 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting 2025-07-02 15:56:31,293 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting 2025-07-02 15:56:31,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode during handshakeBlock pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-02 15:56:31,595 INFO org.apache.hadoop.hdfs.server.common.Storage: Using 4 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=4, dataDirs=4) 2025-07-02 15:56:31,601 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d1/dfs/dn/in_use.lock acquired by nodename 5512@dmidlkprdls04.svr.luc.edu 2025-07-02 15:56:31,604 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d1/dfs/dn java.io.IOException: Incompatible clusterIDs in /hdfs/d1/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72 at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407) at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387) at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559) at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753) at 
org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865) at java.lang.Thread.run(Thread.java:750) 2025-07-02 15:56:31,610 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d2/dfs/dn/in_use.lock acquired by nodename 5512@dmidlkprdls04.svr.luc.edu 2025-07-02 15:56:31,611 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d2/dfs/dn java.io.IOException: Incompatible clusterIDs in /hdfs/d2/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72 at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407) at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387) at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559) at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753) at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865) at java.lang.Thread.run(Thread.java:750) 2025-07-02 15:56:31,612 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d3/dfs/dn/in_use.lock acquired by nodename 5512@dmidlkprdls04.svr.luc.edu 
2025-07-02 15:56:31,612 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d3/dfs/dn java.io.IOException: Incompatible clusterIDs in /hdfs/d3/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72 at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407) at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387) at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559) at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753) at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865) at java.lang.Thread.run(Thread.java:750) 2025-07-02 15:56:31,613 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d4/dfs/dn/in_use.lock acquired by nodename 5512@dmidlkprdls04.svr.luc.edu 2025-07-02 15:56:31,613 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d4/dfs/dn java.io.IOException: Incompatible clusterIDs in /hdfs/d4/dfs/dn: namenode clusterID = cluster87; datanode clusterID = cluster72 at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407) at 
org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387) at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559) at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753) at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865) at java.lang.Thread.run(Thread.java:750) 2025-07-02 15:56:31,618 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Initialization failed for Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Exiting. java.io.IOException: All specified directories have failed to load. 
at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:560) at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753) at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865) at java.lang.Thread.run(Thread.java:750) 2025-07-02 15:56:31,618 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Ending block pool service for: Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-02 15:56:31,620 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Removed Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) 2025-07-02 15:56:33,621 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Exiting Datanode 2025-07-02 15:56:33,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG: /************************************************************ SHUTDOWN_MSG: Shutting down DataNode at dmidlkprdls04.svr.luc.edu/192.168.158.4 ************************************************************/ 2025-07-07 10:20:47,307 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG: /************************************************************ STARTUP_MSG: Starting DataNode STARTUP_MSG: host = dmidlkprdls04.svr.luc.edu/192.168.158.4 STARTUP_MSG: args = [] STARTUP_MSG: version = 3.1.1.7.3.1.0-197 STARTUP_MSG: classpath = 
/var/run/cloudera-scm-agent/process/316-hdfs-DATANODE:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/aws-java-sdk-bundle-1.12.720.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-hdfs-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-plugin-classloader-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-yarn-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/azure-data-lake-store-sdk-2.3.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/ha
doop/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-api-2.2.11.jar:/opt/cloud
era/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jline-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr311-a
pi-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jul-to-slf4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/logredactor-2.0.16.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-buffer-4.1.100.Final.jar:/o
pt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-reload4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-api-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-kqueue-4.1.100.Fina
l.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/wildfly-openssl-2.1.4.ClouderaFinal.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3
.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper-jute.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//ozone-filesystem-hadoop3-1.3.0.7.3.1.0-197.
jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-thrift.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-scala_2.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-protobuf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-jackson.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-generator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-format-structures.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-encoding.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-column.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/avro-1.11.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-logging-1.1.3.j
ar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.
60371244/lib/hadoop-hdfs/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j
line-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/json-simple-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/leveldbjni-cldr-1.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/par
cels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nett
y-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-jute-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.
3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//asm-5.0.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjweaver-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-storage-7.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//checker-compat-qual-2.5.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-slf4j-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.
//flogger-system-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//google-extensions-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-keyvault-core-1.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//accessors-smart-2.4.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-
client-jobclient.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming-3.1.1.7.3.1.0-197.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ojalgo-43.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//kafka-clients-2.8.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-core-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-abfs-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//forbiddenapis-3.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-intg-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-api-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//zstd-jni-1.4.9-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-i18n.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-s3-lib-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//javax.activation-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//bundle-2.23.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//json-smart-2.4.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-util-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-shell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-cloud-bindings.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-s3-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjrt-1.9.6.jar:/opt/cloudera/parcels/
CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/swagger-annotations-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/objenesis-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-client-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.activation-api-1.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-dataformat-yaml-2.9.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcutil-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcprov-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcpkix-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/HikariCP-java7-2.4.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/snakeyaml-2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jsonschema2pojo-core-1.0.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/joda-time-2.10.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jna-5.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-guice-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-module-jaxb-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-json-provider-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-base-2.12.7.jar:/opt/clo
udera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-servlet-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/fst-2.50.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/ehcache-3.3.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/dnsjava-2.1.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/codemodel-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371
244/lib/hadoop-yarn/.//hadoop-yarn-server-router.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop
-yarn/.//hadoop-yarn-server-sharedcachemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager-1.0.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-3.1.1.7.3.1.0-197.jar:/opt/cloudera/cm/lib/plugins/event-publish-7.13.1-shaded.jar:/opt/cloudera/cm/lib/plugins/tt-instrumentation-7.13.1.jar
STARTUP_MSG: build = git@github.infra.cloudera.com:CDH/hadoop.git -r 31a42fb39494f541ffae15c3c61185deeeacca86; compiled by 'jenkins' on 2024-12-04T01:09Z
STARTUP_MSG: java = 1.8.0_432
************************************************************/
2025-07-07 10:20:47,391 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2025-07-07 10:20:47,741 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d1/dfs/dn
2025-07-07 10:20:47,748 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d2/dfs/dn
2025-07-07 10:20:47,748 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d3/dfs/dn
2025-07-07 10:20:47,749 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d4/dfs/dn
2025-07-07 10:20:47,915 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties
2025-07-07 10:20:48,042 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2025-07-07 10:20:48,042 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2025-07-07 10:20:48,472 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-07 10:20:48,497 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576
2025-07-07 10:20:48,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled.
2025-07-07 10:20:48,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is dmidlkprdls04.svr.luc.edu
2025-07-07 10:20:48,506 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-07 10:20:48,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 4294967296
2025-07-07 10:20:48,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /192.168.158.4:9866
2025-07-07 10:20:48,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-07 10:20:48,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-07 10:20:48,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-07 10:20:48,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-07 10:20:48,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Listening on UNIX domain socket: /var/run/hdfs-sockets/dn
2025-07-07 10:20:48,603 INFO org.eclipse.jetty.util.log: Logging initialized @2572ms to org.eclipse.jetty.util.log.Slf4jLog
2025-07-07 10:20:48,737 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-07 10:20:48,746 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined
2025-07-07 10:20:48,756 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2025-07-07 10:20:48,758 INFO org.apache.hadoop.security.HttpCrossOriginFilterInitializer: CORS filter not enabled. Please set hadoop.http.cross-origin.enabled to 'true' to enable it
2025-07-07 10:20:48,759 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context datanode
2025-07-07 10:20:48,760 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context logs
2025-07-07 10:20:48,760 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context static
2025-07-07 10:20:48,801 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 34031
2025-07-07 10:20:48,803 INFO org.eclipse.jetty.server.Server: jetty-9.4.54.v20240208; built: 2024-02-08T19:42:39.027Z; git: cef3fbd6d736a21e7d541a5db490381d95a2047d; jvm 1.8.0_432-b06
2025-07-07 10:20:48,861 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0
2025-07-07 10:20:48,861 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults
2025-07-07 10:20:48,865 INFO org.eclipse.jetty.server.session: node0 Scavenging every 600000ms
2025-07-07 10:20:48,900 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-07 10:20:48,906 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@1b11ef33{logs,/logs,file:///var/log/hadoop-hdfs/,AVAILABLE}
2025-07-07 10:20:48,908 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@2f2bf0e2{static,/static,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/static/,AVAILABLE}
2025-07-07 10:20:49,046 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@6ebd78d1{datanode,/,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode/,AVAILABLE}{file:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode}
2025-07-07 10:20:49,059 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@7fb9f71f{HTTP/1.1, (http/1.1)}{localhost:34031}
2025-07-07 10:20:49,059 INFO org.eclipse.jetty.server.Server: Started @3029ms
2025-07-07 10:20:49,384 WARN com.cloudera.cmf.event.publish.EventStorePublisherWithRetry: Failed to publish event: SimpleEvent{attributes={ROLE=[hdfs-DATANODE-b30a464b10a57fdd49ea734cd52a8291], HOSTS=[dmidlkprdls04.svr.luc.edu], ROLE_TYPE=[DATANODE], CATEGORY=[LOG_MESSAGE], EVENTCODE=[EV_LOG_EVENT], SERVICE=[hdfs], SERVICE_TYPE=[HDFS], LOG_LEVEL=[WARN], HOST_IDS=[5c33df90-d247-4c6d-b9e0-5908a423580a], SEVERITY=[IMPORTANT]}, content=Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret, timestamp=1751901648737}
java.io.IOException: Error connecting to dmidlkprdls01.svr.luc.edu/192.168.158.1:7184
    at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.getChannel(NettyTransceiver.java:269)
    at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.<init>(NettyTransceiver.java:197)
    at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.<init>(NettyTransceiver.java:120)
    at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.checkSpecificRequestor(AvroEventStorePublishProxy.java:120)
    at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.publishEvent(AvroEventStorePublishProxy.java:204)
    at com.cloudera.cmf.event.publish.EventStorePublisherWithRetry$PublishEventTask.run(EventStorePublisherWithRetry.java:242)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)
Caused by: com.cloudera.cmf.event.shaded.io.netty.channel.AbstractChannel$AnnotatedNoRouteToHostException: No route to host: dmidlkprdls01.svr.luc.edu/192.168.158.1:7184
Caused by: java.net.NoRouteToHostException: No route to host
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
    at com.cloudera.cmf.event.shaded.io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:337)
    at com.cloudera.cmf.event.shaded.io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:339)
    at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:776)
    at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
    at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
    at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
    at com.cloudera.cmf.event.shaded.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
    at com.cloudera.cmf.event.shaded.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:20:49,427 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /192.168.158.4:9864
2025-07-07 10:20:49,437 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor
2025-07-07 10:20:49,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hdfs
2025-07-07 10:20:49,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
2025-07-07 10:20:49,501 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
2025-07-07 10:20:49,518 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867
2025-07-07 10:20:49,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /192.168.158.4:9867
2025-07-07 10:20:49,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null
2025-07-07 10:20:49,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices:
2025-07-07 10:20:49,663 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool <registering> (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 starting to offer service
2025-07-07 10:20:49,671 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2025-07-07 10:20:49,671 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting
2025-07-07 10:20:50,844 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:20:51,845 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:20:52,847 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:20:53,849 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:20:54,851 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:20:55,852 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:20:56,854 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:20:57,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:20:58,857 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:20:59,859 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:21:00,861 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:21:01,863 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:21:02,865 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:21:03,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:21:04,868 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:21:05,870 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:21:06,664 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:21:06,670 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:21:06,671 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 45 more
2025-07-07 10:21:06,871 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:21:07,003 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:137)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:21:07,005 WARN org.apache.hadoop.hdfs.server.datanode.DataNode:
Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:137) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 10:21:07,006 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception 
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:137) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 49 more 2025-07-07 10:21:07,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:08,876 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:09,878 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:10,879 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:11,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:12,883 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:13,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:14,886 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:15,889 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:16,890 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:17,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:18,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:19,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:20,898 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:21,900 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:22,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:23,903 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:24,905 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:25,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:26,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:27,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:28,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:29,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:30,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:31,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:32,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:33,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:34,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:35,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:36,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:37,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:38,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:39,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:39,935 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 10:21:45,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:46,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:47,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:48,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:49,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:50,946 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:51,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:52,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:53,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:54,953 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:55,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:56,956 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:57,957 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:58,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:21:59,961 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:00,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:01,964 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:02,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:03,967 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:04,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:05,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:06,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:06,987 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 10:22:06,988 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 10:22:06,990 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 10:22:07,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:08,977 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:09,979 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:10,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:11,982 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:12,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:13,985 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:14,987 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:15,989 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:16,990 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:17,992 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:18,994 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:19,996 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:20,997 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:21,999 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:23,001 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:24,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:25,004 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:26,006 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:27,007 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:28,009 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:29,011 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:30,012 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:31,014 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:32,016 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:33,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:34,019 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:35,021 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:35,024 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 10:22:41,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:42,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:43,029 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:44,031 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:45,033 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:46,034 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:47,036 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:48,038 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:49,040 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:50,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:51,043 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:52,045 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:53,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:54,048 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:55,050 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:56,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:57,053 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:58,055 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:22:59,057 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:23:00,058 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:23:01,060 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:23:02,061 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:23:03,063 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:23:04,064 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:23:05,066 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:23:06,068 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:23:06,997 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:23:06,998 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:23:06,999 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 10:23:07,070 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:08,071 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:09,073 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:10,074 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:11,076 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:12,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:13,079 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:14,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:15,082 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:16,084 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:17,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:18,087 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:19,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:20,091 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:21,092 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:22,094 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:23,095 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:24,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:25,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:26,100 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:27,102 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:28,103 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:29,105 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:30,106 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:30,109 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 10:23:36,111 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:37,113 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:38,114 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:39,116 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:40,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:41,119 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:42,120 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:43,122 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:44,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:45,125 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:46,127 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:47,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:48,130 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:49,132 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:50,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:51,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:52,137 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:53,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:54,140 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:55,141 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:56,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:57,145 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:58,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:23:59,148 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:24:00,149 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:24:01,151 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:24:02,153 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:24:03,154 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:24:04,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:24:05,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:24:06,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:24:06,982 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:24:06,984 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 10:24:06,985 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 10:24:07,160 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:08,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:09,164 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:10,165 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:11,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:12,169 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:13,170 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:14,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:15,173 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:16,175 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:17,177 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:18,178 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:19,180 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:20,182 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:21,183 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:22,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:23,186 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:24,188 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:25,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:25,192 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 10:24:31,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:32,195 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:33,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:34,199 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:35,200 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:36,202 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:37,203 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:38,205 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:39,207 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:40,208 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:41,210 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:42,211 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:43,213 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:44,214 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:45,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:46,218 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:47,219 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:48,221 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:49,223 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:50,224 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:51,226 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:52,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:53,229 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:54,231 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:55,232 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:56,234 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:57,235 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:58,237 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:24:59,239 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:25:00,240 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:25:01,242 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:25:02,243 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:25:03,245 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:25:04,247 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:25:05,248 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:25:06,250 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:25:06,998 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:25:06,999 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:25:07,000 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 10:25:07,251 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:08,253 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:09,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:10,256 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:11,257 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:12,258 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:13,260 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:14,261 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:15,263 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:16,264 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:17,266 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:18,267 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:19,269 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:20,270 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:20,273 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 10:25:26,275 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:27,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:28,278 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:29,279 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:30,281 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:31,282 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:32,283 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:33,285 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:34,286 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:35,287 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:36,289 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:37,290 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:38,291 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:39,293 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:40,294 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:41,296 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:42,297 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:43,299 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:44,300 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:45,301 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:46,303 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:47,304 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:48,306 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:49,307 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:50,309 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:51,310 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:52,311 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:53,313 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:54,314 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:55,316 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:56,317 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:57,319 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:58,320 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:25:59,321 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:26:00,323 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:26:01,324 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:26:02,326 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:26:03,327 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:26:04,329 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:26:05,330 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:26:06,331 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:26:06,983 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:26:06,984 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 10:26:06,986 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 10:26:07,333 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:08,334 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:09,336 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:10,337 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:11,338 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:12,340 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:13,341 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:14,343 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:15,344 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:15,347 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 10:26:21,349 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:22,350 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:23,352 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:24,353 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:25,355 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:26,356 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:27,358 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:28,359 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:29,360 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:30,362 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:31,363 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:32,364 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:33,366 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:34,367 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:35,369 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:36,370 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:37,371 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:38,373 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:39,374 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:40,376 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:41,377 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:42,378 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:43,380 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:44,381 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:45,382 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:46,384 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:47,385 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:48,387 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:49,388 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:50,389 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:51,391 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:52,392 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:53,393 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:54,395 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:55,396 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:56,397 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:57,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:58,400 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:26:59,401 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:27:00,402 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:27:01,404 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:27:02,405 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:27:03,406 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:27:04,408 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:27:05,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:27:06,410 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:27:06,998 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:27:06,999 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:27:07,000 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 10:27:07,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:08,413 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:09,414 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:10,415 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:10,417 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 10:27:16,419 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:17,420 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:18,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:19,423 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:20,424 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:21,426 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:22,427 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:23,428 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:24,429 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:25,431 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:26,432 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:27,433 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:28,435 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:29,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:30,437 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:31,439 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:32,440 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:33,441 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:34,442 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:35,444 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:36,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:37,446 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:38,448 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:39,450 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:40,451 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:41,452 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:42,453 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:43,455 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:44,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:45,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:46,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:47,460 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:48,461 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:49,462 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:50,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:51,465 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:52,466 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:53,468 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:54,469 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:55,470 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:56,471 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:57,473 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:58,474 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:27:59,475 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:28:00,476 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:28:01,478 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:28:02,479 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:28:03,480 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:28:04,482 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:28:05,483 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:28:05,485 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 10:28:06,983 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:28:06,984 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:28:06,985 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at 
org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) 
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 10:28:11,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:12,488 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:13,489 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:14,491 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:15,492 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:16,493 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:17,494 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:18,496 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:19,497 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:20,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:21,500 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:22,501 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:23,502 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:24,504 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:25,505 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:26,506 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:27,507 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:28,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:29,510 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:30,512 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:31,513 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:32,514 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:33,515 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:34,517 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:35,518 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:36,520 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:37,521 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:38,522 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:39,523 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:40,525 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:41,526 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:42,527 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:43,529 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:44,530 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:45,531 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:46,532 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:47,534 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:48,535 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:49,536 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:50,538 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:51,539 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:52,541 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:53,542 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:54,543 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:55,545 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:56,546 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:57,547 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:58,549 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:28:59,550 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:00,551 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:00,553 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 10:29:06,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:06,997 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at 
java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at 
org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) 
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 10:29:06,998 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at 
java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at 
org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) 
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 10:29:07,000 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at 
org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 10:29:07,556 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:08,558 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:09,559 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:10,560 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:11,562 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:12,563 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:13,564 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:14,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:15,567 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:16,568 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:17,569 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:18,571 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:19,572 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:20,573 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:21,575 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:22,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:23,577 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:24,579 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:25,580 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:26,581 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:27,583 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:28,584 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:29,585 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:30,587 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:31,588 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:32,589 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:33,591 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:34,592 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:35,593 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:36,595 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:37,596 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:38,598 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:39,599 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:40,601 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:41,602 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:42,603 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:43,605 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:44,606 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:45,607 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:46,609 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:47,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:48,611 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:49,613 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:50,614 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:51,615 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:52,617 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:53,618 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:54,619 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:55,621 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:29:55,623 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 10:30:01,625 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:30:02,626 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:30:03,627 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:30:04,629 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:30:05,630 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:30:06,631 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:30:06,984 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 10:30:06,985 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:30:06,986 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 10:30:07,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:08,634 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:09,635 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:10,637 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:11,638 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:12,639 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:13,641 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:14,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:15,643 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:16,645 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:17,646 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:18,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:19,649 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:20,650 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:21,652 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:22,653 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:23,654 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:24,656 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:25,657 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:26,658 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:27,660 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:28,661 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:29,662 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:30,664 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:31,665 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:32,666 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:33,668 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:34,669 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:35,670 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:36,672 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:37,673 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:38,675 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:39,676 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:40,678 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:41,679 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:42,680 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:43,682 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:44,683 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:45,684 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:46,686 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:47,687 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:48,688 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:49,690 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:50,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:50,693 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 10:30:56,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:57,696 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:58,698 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:30:59,699 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:31:00,700 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:31:01,702 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:31:02,703 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:31:03,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:31:04,706 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:31:05,707 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:31:06,708 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:31:06,989 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:31:06,991 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:31:06,992 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 10:31:07,710 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:08,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:09,712 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:10,714 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:11,715 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:12,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:13,718 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:14,719 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:15,720 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:16,722 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:17,723 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:18,724 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:19,726 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:20,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:21,728 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:22,730 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:23,731 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:24,732 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:25,733 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:26,735 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:27,736 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:28,737 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:29,739 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:30,740 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:31,741 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:32,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:33,744 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:34,745 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:35,746 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:36,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:37,749 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:38,751 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:39,752 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:40,753 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:41,755 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:42,756 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:43,757 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:44,759 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:45,760 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:45,762 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 10:31:51,764 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:52,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:53,766 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:54,768 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:55,769 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:56,770 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:57,771 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:58,773 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:31:59,774 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:32:00,775 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:32:01,776 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:32:02,777 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:32:03,779 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:32:04,780 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:32:05,781 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:32:06,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:32:07,003 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 10:32:07,004 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:32:07,006 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 10:32:07,784 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:08,785 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:09,786 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:10,788 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:11,789 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:12,790 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:13,791 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:14,793 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:15,794 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:16,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:17,796 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:18,798 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:19,799 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:20,801 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:21,802 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:22,803 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:23,805 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:24,806 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:25,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:26,808 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:27,809 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:28,811 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:29,812 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:30,813 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:31,814 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:32,816 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:33,817 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:34,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:35,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:36,820 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:37,822 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:38,823 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:39,825 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:40,826 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:40,828 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 10:32:46,830 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:47,831 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:48,832 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:49,833 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:50,835 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:51,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:52,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:53,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:54,840 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:55,841 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:56,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:57,844 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:58,845 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:32:59,846 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:33:00,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:33:01,849 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:33:02,850 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:33:03,851 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:33:04,853 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:33:05,854 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:33:06,855 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:33:06,985 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:33:06,987 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:33:06,988 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at 
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 10:33:07,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:08,857 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:09,859 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:10,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:11,861 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:12,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:13,864 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:14,865 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:15,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:16,867 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:17,869 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:18,870 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:19,871 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:20,872 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:21,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:22,875 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:23,876 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:24,878 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:25,879 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:26,880 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:27,882 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:28,883 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:29,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:30,886 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:31,887 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:32,888 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:33,889 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:34,891 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:35,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:35,895 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 10:33:41,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:42,898 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:43,899 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:44,900 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:45,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:46,903 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:47,904 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:48,905 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:49,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:50,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:51,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:52,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:53,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:54,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:55,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:56,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:57,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:58,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:33:59,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:34:00,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:34:01,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:34:02,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:34:03,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:34:04,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:34:05,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:34:06,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:34:07,000 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 10:34:07,001 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:34:07,002 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 10:34:07,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:08,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:09,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:10,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:11,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:12,936 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:13,938 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:14,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:15,940 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:16,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:17,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:18,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:19,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:20,946 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:21,948 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:22,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:23,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:24,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:25,953 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:26,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:27,955 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:28,956 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:29,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:30,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:30,961 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 10:34:36,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:37,964 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:38,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:39,967 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:40,968 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:41,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:42,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:43,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:44,973 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:45,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:46,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:47,977 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:48,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:49,979 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:50,981 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:51,982 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:52,983 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:53,985 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:54,986 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:55,987 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:56,989 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:57,990 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:58,991 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:34:59,992 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:35:00,994 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:35:01,995 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:35:02,996 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:35:03,998 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:35:04,999 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:35:06,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:35:07,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:35:07,004 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:35:07,005 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:35:07,007 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 10:35:08,003 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:09,004 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:10,005 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:11,007 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:12,008 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:13,009 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:14,011 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:15,012 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:16,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:17,014 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:18,016 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:19,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:20,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:21,020 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:22,021 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:23,022 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:24,024 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:25,025 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:26,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:26,029 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 10:35:32,031 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:33,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:34,033 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:35,035 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:36,036 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:37,037 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:38,038 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:39,040 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:40,041 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:41,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:42,043 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:43,045 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:44,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:45,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:46,048 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:47,050 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:48,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:49,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:50,053 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:51,055 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:52,056 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:53,057 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:54,059 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:55,060 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:56,061 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:57,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:58,064 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:35:59,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:00,066 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:01,067 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:02,068 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:03,070 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:04,071 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:05,072 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:06,074 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:06,998 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at 
com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at 
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 10:36:06,999 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at 
com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at 
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 10:36:07,000 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at 
org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at 
org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 10:36:07,075 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:08,076 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:09,077 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:10,079 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:11,080 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:12,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:13,082 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:14,084 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:15,085 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:16,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:17,087 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:18,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:19,090 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:20,091 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:21,092 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:21,095 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 10:36:27,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:28,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:29,099 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:30,100 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:31,101 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:32,103 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:33,104 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:34,105 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:35,106 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:36,108 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:37,109 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:38,110 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:39,111 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:40,113 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:41,114 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:42,115 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:43,116 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:44,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:45,119 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:46,120 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:47,121 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:48,122 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:49,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:50,125 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:51,126 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:52,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:53,129 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:54,130 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:55,132 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:56,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:57,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:58,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:36:59,136 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:37:00,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:37:01,139 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:37:02,140 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:37:03,141 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:37:04,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:37:05,144 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:37:06,145 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:37:06,983 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at 
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at 
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 10:37:06,985 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at 
com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:37:06,986 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 10:37:07,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:08,147 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:09,149 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:10,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:11,151 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:12,152 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:13,154 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:14,155 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:15,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:16,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:16,160 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 10:37:22,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:23,163 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:24,164 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:25,165 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:26,166 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:27,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:28,169 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:29,170 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:30,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:31,173 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:32,174 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:33,175 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:34,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:35,178 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:36,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:37,180 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:38,181 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:39,183 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:40,184 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:41,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:42,186 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:43,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:44,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:45,190 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:46,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:47,192 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:48,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:49,195 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:50,196 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:51,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:52,199 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:53,200 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:54,201 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:55,203 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:56,204 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:57,205 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:58,207 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:37:59,208 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:00,209 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:01,210 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:02,211 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:03,213 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:04,214 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:05,215 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:06,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:06,998 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:38:06,999 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:38:07,000 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more
2025-07-07 10:38:07,217 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:08,219 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:09,220 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:10,221 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:11,222 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:11,224 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 10:38:17,226 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:18,227 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:19,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:20,229 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:21,231 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:22,232 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:23,233 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:24,234 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:25,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:26,237 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:27,238 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:28,239 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:29,241 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:30,242 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:31,243 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:32,244 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:33,246 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:34,247 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:35,248 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:36,249 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:37,250 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:38,253 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:39,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:40,255 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:41,256 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:42,258 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:43,259 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:44,260 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:45,261 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:46,263 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:47,264 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:48,265 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:49,266 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:50,268 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:51,269 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:52,270 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:53,271 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:54,273 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:55,274 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:56,275 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:57,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:58,278 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:38:59,279 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:39:00,280 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:39:01,281 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:39:02,283 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:39:03,284 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:39:04,285 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:39:05,286 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:39:06,288 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:39:06,289 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 10:39:06,987 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at 
com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at 
org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:39:06,989 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750)
2025-07-07 10:39:06,990 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at 
com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 10:39:12,292 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:13,293 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:14,294 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:15,296 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:16,297 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:17,298 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:18,299 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:19,301 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:20,302 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:21,303 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:22,304 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:23,305 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:24,307 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:25,308 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:26,309 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:27,310 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:28,312 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:29,313 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:30,314 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:31,315 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:32,316 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:33,318 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:34,319 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:35,320 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:36,321 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:37,323 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:38,324 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:39,325 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:40,326 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:41,328 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:42,329 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:43,330 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:44,331 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:45,332 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:46,334 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:47,335 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:48,336 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:49,338 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:50,339 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:51,340 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:52,342 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:53,343 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:54,344 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:55,345 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:56,346 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:57,347 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:58,349 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:39:59,350 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:00,351 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:01,352 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:01,355 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 10:40:07,002 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at 
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:40:07,003 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:40:07,005 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 10:40:07,357 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:40:08,358 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:40:09,359 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:40:10,361 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:11,362 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:12,363 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:13,364 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:14,366 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:15,367 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:16,368 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:17,369 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:18,371 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:19,372 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:20,373 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:21,374 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:22,376 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:23,377 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:24,378 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:25,380 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:26,381 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:27,382 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:28,383 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:29,385 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:30,386 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:31,387 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:32,388 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:33,390 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:34,391 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:35,392 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:36,393 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:37,395 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:38,396 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:39,398 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:40,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:41,400 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:42,401 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:43,403 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:44,404 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:45,405 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:46,407 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:47,408 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:48,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:49,411 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:50,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:51,413 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:52,415 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:53,416 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:54,417 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:55,418 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:56,419 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:40:56,421 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 10:41:02,423 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:41:03,424 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:41:04,425 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:41:05,427 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:41:06,428 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:41:06,989 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at 
com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at 
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 10:41:06,991 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at 
com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at 
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:41:06,992 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 10:41:07,429 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:08,431 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:09,432 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:10,433 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:11,435 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:12,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:13,437 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:14,438 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:15,440 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:16,441 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:17,442 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:18,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:19,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:20,446 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:21,447 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:22,449 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:23,450 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:24,451 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:25,452 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:26,454 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:27,455 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:28,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:29,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:30,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:31,460 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:32,461 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:33,462 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:34,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:35,465 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:36,466 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:37,468 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:38,469 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:39,471 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:40,472 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:41,473 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:42,475 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:43,476 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:44,477 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:45,478 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:46,480 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:47,481 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:48,482 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:49,484 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:50,485 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:51,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:51,488 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 10:41:57,490 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:58,491 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:41:59,492 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:42:00,493 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:42:01,495 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:42:02,496 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:42:03,497 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:42:04,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:42:05,500 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:42:06,501 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:42:07,006 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:42:07,007 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:42:07,008 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 10:42:07,502 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:42:08,503 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:42:09,505 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:42:10,506 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:42:11,507 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:42:12,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:13,510 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:14,511 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:15,513 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:16,514 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:17,516 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:18,517 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:19,518 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:20,520 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:21,521 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:22,522 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:23,524 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:24,525 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:25,526 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:26,528 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:27,529 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:28,530 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:29,532 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:30,533 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:31,534 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:32,535 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:33,537 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:34,538 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:35,539 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:36,540 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:37,542 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:38,543 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:39,545 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:40,546 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:41,547 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:42,548 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:43,550 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:44,551 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:45,552 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:46,554 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:46,555 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 10:42:52,557 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:53,559 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:54,560 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:55,561 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:56,563 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:57,564 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:58,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:42:59,567 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:00,568 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:01,569 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:02,570 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:03,572 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:04,573 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:05,574 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:06,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:06,987 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 10:43:06,988 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 10:43:06,989 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 10:43:07,577 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:08,578 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:09,580 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:10,581 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:11,582 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:12,584 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:13,585 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:14,586 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:15,588 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:16,589 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:17,590 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:18,592 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:19,593 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:20,594 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:21,596 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:22,597 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:23,598 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:24,600 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:25,601 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:26,602 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:27,604 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:28,605 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:29,606 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:30,607 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:31,609 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:32,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:33,611 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:34,612 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:35,614 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:36,615 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:37,616 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:38,618 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:39,620 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:40,621 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:41,622 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:41,624 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 10:43:47,626 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:48,627 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:49,628 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:50,630 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:51,631 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:52,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:53,634 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:54,635 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:55,636 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:56,638 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:57,639 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:58,640 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:43:59,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:44:00,643 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:44:01,644 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:44:02,646 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:44:03,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:44:04,648 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:44:05,650 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:44:06,651 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:44:07,001 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 10:44:07,003 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 10:44:07,004 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 10:44:07,652 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:08,653 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:09,655 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:10,656 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:11,657 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:12,659 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:13,660 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:14,661 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:15,662 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:16,664 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:17,665 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:18,666 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:19,668 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:20,669 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:21,670 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:22,672 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:23,673 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:24,674 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:25,676 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:26,677 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:27,678 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:28,680 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:29,681 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:30,682 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:31,684 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:32,685 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:33,686 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:34,688 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:35,689 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:36,690 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:36,693 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 10:44:42,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:43,696 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:44,697 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:45,698 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:46,700 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:47,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:48,702 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:49,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:50,705 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:51,706 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:52,708 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:53,709 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:54,710 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:55,712 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:56,713 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:57,714 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:58,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:44:59,717 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:45:00,718 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:45:01,720 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:45:02,721 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:45:03,722 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:45:04,724 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:45:05,725 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:45:06,726 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:45:06,991 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:45:06,992 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:45:06,994 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 10:45:07,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:45:08,729 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:45:09,730 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:45:10,731 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:45:11,733 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:45:12,734 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:45:13,735 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:14,737 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:15,738 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:16,739 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:17,740 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:18,742 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:19,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:20,744 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:21,746 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:22,747 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:23,749 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:24,750 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:25,751 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:26,752 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:27,754 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:28,755 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:29,756 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:30,758 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:31,759 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:31,761 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 10:45:37,763 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:38,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:39,766 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:40,767 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:41,768 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:42,770 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:43,771 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:44,773 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:45,774 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:46,775 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:47,777 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:48,778 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:49,779 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:50,780 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:51,782 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:52,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:53,785 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:54,786 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:55,787 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:56,789 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:57,790 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:58,791 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:45:59,793 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:46:00,794 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:46:01,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:46:02,797 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:46:03,798 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:46:04,799 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:46:05,801 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:46:06,802 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:46:06,986 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 10:46:06,987 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 10:46:06,988 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 10:46:07,803 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:08,805 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:09,806 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:10,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:11,808 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:12,810 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:13,811 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:14,812 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:15,814 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:16,815 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:17,816 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:18,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:19,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:20,820 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:21,822 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:22,823 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:23,824 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:24,825 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:25,827 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:26,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:26,830 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 10:46:32,832 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:33,833 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:34,835 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:35,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:36,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:37,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:38,840 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:39,841 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:40,842 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:41,844 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:42,845 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:43,846 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:44,847 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:45,849 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:46,850 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:47,851 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:48,853 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:49,854 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:50,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:51,857 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:52,858 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:53,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:54,861 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:55,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:56,864 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:57,865 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:58,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:46:59,868 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:00,869 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:01,870 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:02,872 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:03,873 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:04,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:05,876 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:06,877 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:07,006 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:47:07,007 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:47:07,008 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 10:47:07,878 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:08,880 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:09,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:10,882 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:11,883 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:12,885 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:13,886 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:14,887 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:15,889 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:16,890 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:17,891 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:18,893 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:19,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:20,895 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:21,897 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:21,899 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 10:47:27,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:28,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:29,904 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:30,905 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:31,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:32,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:33,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:34,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:35,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:36,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:37,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:38,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:39,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:40,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:41,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:42,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:43,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:44,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:45,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:47:46,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:47:47,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:47:48,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:47:49,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:47:50,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:47:51,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:47:52,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:47:53,936 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:47:54,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:47:55,938 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:47:56,940 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:47:57,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:47:58,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:47:59,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:48:00,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:48:01,946 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:48:02,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:48:03,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:48:04,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:48:05,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:48:06,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:06,992 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:48:06,994 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:48:06,995 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 10:48:07,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:08,955 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:09,957 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:10,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:11,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:12,960 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:13,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:14,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:15,964 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:16,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:16,968 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 10:48:22,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:23,971 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:24,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:25,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:26,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:27,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:28,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:29,979 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:30,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:31,982 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:32,983 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:33,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:34,985 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:35,987 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:36,988 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:37,989 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:38,990 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:39,992 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:40,993 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:41,994 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:42,996 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:43,997 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:44,998 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:46,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:47,001 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:48,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:49,003 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:50,005 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:51,006 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:52,007 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:53,009 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:48:54,010 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:48:55,012 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:48:56,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:48:57,014 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:48:58,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:48:59,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:00,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:01,019 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:02,021 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:03,022 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:04,023 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:05,025 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:06,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:07,003 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 10:49:07,004 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 10:49:07,007 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 10:49:07,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:08,030 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:09,031 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:10,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:11,033 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:12,035 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:12,037 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 10:49:18,039 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:19,041 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:20,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:21,043 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:22,044 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:23,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:24,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:25,048 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:26,050 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:27,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:28,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:29,053 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:30,055 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:31,056 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:32,057 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:33,058 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:34,060 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:35,061 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:36,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:37,064 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:38,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:39,066 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:40,067 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:41,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:42,070 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:43,071 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:44,073 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:45,074 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:46,075 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:47,076 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:48,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:49,079 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:50,080 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:51,082 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:52,083 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:53,084 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:54,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:55,087 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:56,088 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:57,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:58,090 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:49:59,092 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:00,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:01,094 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:02,095 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:03,096 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:04,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:05,099 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:06,100 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:06,989 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 10:50:06,990 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 10:50:06,991 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 10:50:07,101 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:07,103 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 10:50:13,104 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:14,106 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:15,107 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:16,108 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:17,109 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:18,110 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:19,112 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:20,113 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:21,114 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:22,115 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:23,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:24,118 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:25,119 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:26,120 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:27,122 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:28,123 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:29,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:30,125 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:31,126 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:32,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:33,129 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:34,130 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:35,131 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:36,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:37,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:38,136 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:50:39,137 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:50:40,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:50:41,139 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:50:42,141 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:50:43,142 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:50:44,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:50:45,144 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:50:46,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:50:47,147 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:50:48,148 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:50:49,149 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:50:50,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:50:51,151 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:50:52,153 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:50:53,154 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:50:54,155 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:50:55,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:50:56,158 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:50:57,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:50:58,160 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:50:59,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:00,163 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:01,164 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:02,165 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:02,167 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 10:51:07,006 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:51:07,007 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:51:07,008 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 10:51:07,390 WARN com.cloudera.cmf.event.publish.EventStorePublisherWithRetry: Failed to publish event: SimpleEvent{attributes={ROLE=[hdfs-DATANODE-b30a464b10a57fdd49ea734cd52a8291], HOSTS=[dmidlkprdls04.svr.luc.edu], ROLE_TYPE=[DATANODE], CATEGORY=[LOG_MESSAGE], EVENTCODE=[EV_LOG_EVENT], SERVICE=[hdfs], SERVICE_TYPE=[HDFS], LOG_LEVEL=[WARN], HOST_IDS=[5c33df90-d247-4c6d-b9e0-5908a423580a], SEVERITY=[IMPORTANT]}, content=Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret, timestamp=1751901648737} - 1 of 60 failure(s) in last 1818s
java.io.IOException: Error connecting to dmidlkprdls01.svr.luc.edu/192.168.158.1:7184
    at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.getChannel(NettyTransceiver.java:269)
    at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.<init>(NettyTransceiver.java:197)
    at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.<init>(NettyTransceiver.java:120)
    at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.checkSpecificRequestor(AvroEventStorePublishProxy.java:120)
    at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.publishEvent(AvroEventStorePublishProxy.java:204)
    at com.cloudera.cmf.event.publish.EventStorePublisherWithRetry$PublishEventTask.run(EventStorePublisherWithRetry.java:242)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)
Caused by: com.cloudera.cmf.event.shaded.io.netty.channel.AbstractChannel$AnnotatedNoRouteToHostException: No route to host: dmidlkprdls01.svr.luc.edu/192.168.158.1:7184
Caused by: java.net.NoRouteToHostException: No route to host
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
    at com.cloudera.cmf.event.shaded.io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:337)
    at com.cloudera.cmf.event.shaded.io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:339)
    at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:776)
    at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
    at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
    at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
    at com.cloudera.cmf.event.shaded.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
    at com.cloudera.cmf.event.shaded.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:51:08,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:09,170 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:10,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:11,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:12,174 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:13,175 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:14,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:15,177 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:16,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:17,180 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:18,181 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:19,183 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:20,184 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:21,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:22,186 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:23,188 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:24,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:25,190 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:26,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:27,193 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:28,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:29,195 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:30,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:31,198 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:32,199 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:33,200 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:34,202 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:35,203 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:36,204 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:37,205 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:38,206 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:39,208 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:40,209 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:41,210 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:42,212 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:43,213 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:44,214 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:45,215 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:46,217 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:47,218 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:48,219 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:49,220 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:50,222 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:51,223 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:52,224 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:53,226 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:54,227 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:55,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:56,229 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:57,230 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:51:57,232 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 10:52:03,234 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:52:04,235 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:52:05,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:52:06,238 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:06,991 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 10:52:06,994 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 10:52:06,995 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 10:52:07,239 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:08,240 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:09,241 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:10,242 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:11,243 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:12,245 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:13,246 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:14,247 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:15,248 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:16,250 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:17,251 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:18,252 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:19,253 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:20,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:21,255 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:22,256 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:23,258 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:24,259 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:25,260 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:26,262 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:27,263 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:28,264 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:29,265 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:30,267 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:31,268 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:32,269 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:33,270 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:34,271 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:35,273 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:36,274 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:37,275 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:38,277 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:39,278 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:40,279 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:41,280 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:42,282 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:43,283 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:44,284 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:45,285 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:46,287 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:47,288 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:48,289 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:49,290 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:50,291 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:51,293 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:52,294 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:52,296 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 10:52:58,298 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:52:59,299 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:53:00,300 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:53:01,301 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:53:02,303 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:53:03,304 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:53:04,305 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:53:05,306 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:53:06,308 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:53:07,003 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:53:07,004 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:53:07,005 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 10:53:07,309 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:53:08,310 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:53:09,311 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:53:29,333 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); maxRetries=45
2025-07-07 10:53:48,832 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:53:52,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:53:59,072 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:54:02,144 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:54:05,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:54:06,992 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:54:06,993 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:54:06,993 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 10:54:08,288 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:54:11,360 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:54:14,432 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:54:17,504 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:54:20,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:54:23,648 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:54:26,720 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:54:29,792 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:54:32,864 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:54:35,936 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:54:39,009 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:54:42,080 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:54:45,152 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:54:46,154 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:54:47,155 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:54:48,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:54:49,158 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:54:50,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:54:51,161 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:54:52,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:54:53,163 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:54:54,164 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:54:55,166 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:54:56,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:54:57,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:54:58,169 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:54:59,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:55:00,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:55:01,173 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:55:02,174 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:55:03,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:55:04,177 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:55:05,178 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:55:05,180 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 10:55:07,003 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at 
com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at 
org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 10:55:07,004 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 10:55:07,004 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 10:55:11,182 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:12,183 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:13,184 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:14,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:15,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:16,188 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:17,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:18,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:19,192 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:20,193 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:21,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:22,196 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:23,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:24,198 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:25,200 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:26,201 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:27,202 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:28,203 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:29,205 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:30,206 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:31,207 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:32,209 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:33,210 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:34,211 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:35,213 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:36,214 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:37,215 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:38,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:39,218 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:40,219 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:41,220 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:42,221 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:43,223 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:44,224 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:45,225 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:46,227 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:47,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:48,229 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:49,230 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:50,232 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:51,233 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:52,234 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:53,235 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:54,237 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:55,238 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:56,239 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:57,240 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:58,242 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:55:59,243 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:56:00,244 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:56:00,246 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 10:56:06,248 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:56:06,996 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:56:06,997 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:56:06,998 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 10:56:07,249 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:56:08,250 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:56:09,252 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:56:10,253 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:11,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:12,256 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:13,257 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:14,258 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:15,259 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:16,261 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:17,262 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:18,263 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:19,265 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:20,266 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:21,267 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:22,268 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:23,270 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:24,271 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:25,272 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:26,274 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:27,275 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:28,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:29,278 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:30,279 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:31,280 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:32,281 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:33,283 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:34,284 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:35,285 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:36,287 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:37,288 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:38,290 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:39,291 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:40,292 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:41,294 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:42,295 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:43,296 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:44,297 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:45,299 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:46,300 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:47,301 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:48,302 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:49,304 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:50,305 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:51,306 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:52,308 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:53,309 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:54,310 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:55,312 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:56:55,313 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 10:57:01,315 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:57:02,316 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:57:03,318 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:57:04,319 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:57:05,320 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:57:06,322 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:57:07,007 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at 
com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at 
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 10:57:07,008 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at 
com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at 
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 10:57:07,009 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at 
org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at 
    org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 10:57:07,323 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:08,324 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:09,326 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:10,327 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:11,328 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:12,330 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:13,331 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:14,332 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:15,334 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:16,335 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:17,336 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:18,337 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:19,339 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:20,340 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:21,341 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:22,343 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:23,344 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:24,345 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:25,347 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:26,348 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:27,349 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:28,350 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:29,352 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:30,353 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:31,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:32,356 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:33,357 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:34,358 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:35,359 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:36,361 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:37,362 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:38,364 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:39,365 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:40,367 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:41,368 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:42,369 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:43,370 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:44,372 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:45,373 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:46,374 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:47,375 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:48,377 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:49,378 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:50,379 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:50,381 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 10:57:56,383 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:57,384 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:58,385 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:57:59,387 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:58:00,388 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:58:01,389 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:58:02,391 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:58:03,392 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:58:04,393 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:58:05,395 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:58:06,396 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:58:06,991 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:58:06,992 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 10:58:06,993 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 10:58:07,397 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:58:08,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:58:09,400 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:58:10,401 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:58:11,403 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:58:12,404 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:58:13,405 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:58:14,407 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:58:15,408 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:58:16,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:58:17,411 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:58:18,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:58:19,413 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:58:20,415 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:58:21,416 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:58:22,417 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:58:23,419 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:58:24,420 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:58:25,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 10:58:26,423 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:27,424 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:28,425 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:29,427 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:30,428 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:31,429 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:32,431 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:33,432 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:34,433 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:35,435 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:36,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:37,438 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:38,439 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:39,441 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:40,442 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:41,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:42,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:43,446 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:44,447 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:45,449 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:45,450 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 10:58:51,453 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:52,455 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:53,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:54,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:55,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:56,460 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:57,461 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:58,463 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:58:59,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:00,465 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:01,466 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:02,468 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:03,469 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:04,470 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:05,472 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:06,473 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:07,006 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 10:59:07,007 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 10:59:07,009 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 10:59:07,474 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:08,476 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:09,477 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:10,478 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:11,480 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:12,481 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:13,482 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:14,483 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:15,485 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:16,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:17,487 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:18,489 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:19,490 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:20,491 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:21,493 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:22,494 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:23,495 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:24,496 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:25,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:26,499 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:27,500 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:28,502 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:29,503 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:30,504 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:31,506 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:32,507 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:33,508 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:34,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:35,511 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:36,512 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:37,514 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:38,515 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:39,517 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:40,518 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:40,521 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 10:59:46,523 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:47,524 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:48,525 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:49,527 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:50,528 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:51,529 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:52,531 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:53,532 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:54,533 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:55,535 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:56,536 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:57,537 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:58,539 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 10:59:59,540 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:00,541 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:01,543 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:02,544 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:03,545 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:04,547 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:05,548 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:06,550 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:06,993 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 11:00:06,994 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 11:00:06,994 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 11:00:07,551 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:08,552 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:09,553 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:10,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:11,556 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:12,557 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:13,559 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:14,560 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:15,561 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:16,563 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:17,564 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:18,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:19,567 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:20,568 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:21,569 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:22,571 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:23,572 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:24,573 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:25,575 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:26,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:27,577 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:28,578 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:29,580 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:30,581 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:31,582 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:32,584 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:33,585 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:34,586 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:35,588 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:35,590 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 11:00:41,592 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:42,593 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:43,595 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:44,596 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:45,597 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:46,599 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:47,600 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:48,601 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:49,603 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:50,604 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:51,605 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:52,607 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:53,608 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:54,609 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:55,611 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:56,612 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:57,613 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:58,615 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:00:59,616 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:00,617 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:01,619 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:02,620 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:03,621 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:04,622 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:05,624 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:06,625 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:06,993 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 11:01:06,994 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 11:01:06,995 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 11:01:07,627 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:08,628 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:09,629 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:10,631 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:11,632 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:12,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:13,635 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:14,636 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:15,637 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:16,639 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:17,640 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:18,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:19,643 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:20,644 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:21,646 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:22,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:23,648 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:24,650 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:25,651 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:26,652 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:27,654 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:28,655 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:29,656 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:30,658 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:30,660 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 11:01:36,662 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:37,664 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:38,665 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:39,666 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:40,668 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:41,669 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:42,670 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:43,672 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:44,673 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:45,674 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:46,676 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:47,677 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:48,678 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:49,680 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:50,681 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:51,682 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:52,684 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:53,685 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:54,686 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:55,687 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:56,689 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:57,690 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:58,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:01:59,693 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:02:00,694 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:02:01,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:02:02,697 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:02:03,698 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:02:04,699 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:02:05,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:02:06,702 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:02:07,013 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 11:02:07,014 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 11:02:07,015 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 11:02:07,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:02:08,705 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:02:09,706 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:02:10,707 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:02:11,709 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:02:12,710 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:02:13,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:14,713 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:15,714 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:16,715 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:17,717 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:18,718 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:19,719 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:20,721 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:21,722 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:22,723 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:23,725 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:24,726 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:25,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:25,730 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 11:02:31,732 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:32,733 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:33,735 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:34,736 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:35,737 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:36,739 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:37,740 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:38,741 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:39,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:40,744 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:41,745 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:42,747 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:43,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:44,749 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:45,751 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:46,752 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:47,753 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:48,755 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:49,756 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:50,757 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:51,759 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:52,760 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:53,761 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:54,763 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:55,764 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:56,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:57,767 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:58,768 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:02:59,769 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:00,771 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:01,772 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:02,773 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:03,775 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:04,776 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:05,777 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:06,779 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:06,994 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 11:03:06,996 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 11:03:06,998 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 11:03:07,780 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:08,781 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:09,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:10,784 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:11,785 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:12,787 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:13,788 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:14,789 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:15,791 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:16,792 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:17,793 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:18,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:19,796 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:20,797 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:20,800 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 11:03:26,802 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:27,803 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:28,804 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:29,805 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:30,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:31,808 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:32,809 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:33,811 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:34,812 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:35,813 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:36,815 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:37,816 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:38,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:39,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:40,820 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:41,821 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:42,823 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:43,824 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:44,825 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:45,827 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:46,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:47,829 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:48,831 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:49,832 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:50,833 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:51,835 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:52,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:03:53,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:03:54,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:03:55,840 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:03:56,841 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:03:57,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:03:58,844 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:03:59,845 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:04:00,847 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:04:01,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:04:02,849 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:04:03,851 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:04:04,852 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:04:05,853 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:04:06,855 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:07,006 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 11:04:07,007 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 11:04:07,008 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 11:04:07,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:08,857 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:09,859 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:10,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:11,861 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:12,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:13,864 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:14,865 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:15,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:15,869 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 11:04:21,870 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:22,872 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:23,873 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:24,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:25,876 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:26,877 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:27,878 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:28,880 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:29,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:30,882 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:31,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:32,885 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:33,886 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:34,888 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:35,889 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:36,891 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:37,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:38,893 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:39,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:40,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:41,897 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:42,898 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:43,900 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:44,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:45,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:46,904 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:47,905 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:48,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:49,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:50,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:51,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:52,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:53,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:54,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:55,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:56,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:57,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:58,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:04:59,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:00,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:05:01,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:05:02,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:05:03,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:05:04,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:05:05,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:05:06,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:06,991 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 11:05:06,992 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 11:05:06,994 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 11:05:07,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:08,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:09,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:10,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:10,936 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 11:05:16,938 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:17,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:18,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:19,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:20,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:21,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:22,946 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:23,948 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:24,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:25,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:26,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:27,953 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:28,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:29,955 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:30,957 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:31,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:32,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:33,960 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:34,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:35,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:36,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:37,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:38,968 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:39,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:40,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:41,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:42,973 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:43,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:44,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:45,977 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:46,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:47,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:48,981 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:49,982 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:50,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:51,985 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:52,987 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:53,988 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:54,990 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:55,991 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:56,992 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:57,993 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:58,995 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:05:59,996 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:00,997 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:01,999 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:03,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:04,001 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:05,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:06,004 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:06,005 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 11:06:07,010 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 11:06:07,012 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 11:06:07,013 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 11:06:12,007 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:13,009 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:14,010 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:15,011 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:16,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:17,014 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:18,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:19,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:20,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:21,019 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:22,020 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:23,022 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:24,023 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:25,024 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:26,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:27,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:28,028 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:29,029 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:30,031 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:31,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:32,033 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:33,034 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:34,036 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:35,037 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:36,038 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:37,039 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:38,041 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:39,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:40,043 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:41,045 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:42,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:43,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:44,049 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:45,050 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:46,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:47,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:48,054 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:49,055 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:50,056 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:51,057 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:52,059 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:53,060 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:54,061 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:55,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:56,063 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:57,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:58,066 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:06:59,067 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:00,068 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:01,070 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:01,071 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 11:07:06,995 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at 
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 11:07:06,996 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at 
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 11:07:06,997 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at 
org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 11:07:07,073 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:08,075 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:09,076 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:10,077 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:11,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:12,080 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:13,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:14,082 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:15,084 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:16,085 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:17,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:18,087 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:19,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:20,090 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:21,091 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:22,092 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:23,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:24,095 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:25,096 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:26,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:27,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:28,100 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:29,101 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:30,102 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:31,103 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:32,105 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:33,106 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:34,107 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:35,108 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:36,110 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:37,111 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:38,113 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:39,114 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:40,115 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:41,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:42,118 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:43,119 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:44,121 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:45,122 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:46,123 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:47,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:48,125 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:49,127 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:50,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:51,129 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:52,130 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:53,132 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:54,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:55,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:56,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:07:56,137 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 11:08:02,139 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:03,140 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:04,142 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:05,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:06,144 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:07,006 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at 
com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at 
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 11:08:07,007 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at 
com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at 
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 11:08:07,008 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at 
org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at 
org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 11:08:07,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:08,147 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:09,148 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:10,149 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:11,151 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:12,152 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:13,153 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:14,154 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:15,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:16,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:17,158 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:18,160 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:19,161 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:20,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:21,163 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:22,165 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:23,166 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:24,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:25,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:26,170 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:27,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:28,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:29,173 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:30,175 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:31,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:32,177 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:33,178 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:34,180 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:35,181 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:36,182 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:37,184 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:38,186 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:39,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:40,188 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:41,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:42,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:43,192 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:44,193 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:45,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:08:46,196 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:08:47,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:08:48,198 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:08:49,199 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:08:50,201 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:08:51,202 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:08:51,203 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 11:08:57,205 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:08:58,206 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:08:59,208 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:00,209 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:01,210 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:02,211 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:03,213 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:04,214 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:05,215 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:06,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:06,997 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 11:09:06,998 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 11:09:07,000 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 11:09:07,218 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:08,219 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:09,220 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:10,222 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:11,223 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:12,224 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:13,226 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:14,227 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:15,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:16,230 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:17,231 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:18,232 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:19,233 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:20,235 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:21,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:22,237 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:23,238 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:24,240 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:25,241 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:26,242 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:27,243 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:28,245 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:29,246 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:30,248 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:31,249 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:32,250 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:33,252 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:34,253 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:35,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:36,255 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:37,257 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:38,259 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:39,260 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:40,261 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:41,263 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:42,264 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:43,265 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:44,266 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:45,268 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:46,269 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:46,271 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 11:09:52,272 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:53,274 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:54,275 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:55,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:56,278 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:57,279 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:58,280 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:09:59,281 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:10:00,283 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:10:01,284 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:10:02,285 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:10:03,286 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:10:04,288 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:10:05,289 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:10:06,290 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:10:07,000 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 11:10:07,001 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 11:10:07,002 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 11:10:07,292 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:08,293 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:09,294 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:10,295 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:11,297 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:12,298 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:13,299 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:14,300 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:15,302 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:16,303 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:17,304 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:18,305 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:19,307 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:20,308 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:21,309 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:22,310 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:23,312 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:24,313 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:25,314 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:26,315 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:27,316 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:28,318 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:29,319 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:30,320 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:31,321 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:32,323 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:33,324 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:34,325 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:35,326 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:36,328 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:37,329 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:38,331 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:39,332 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:40,334 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:41,335 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:41,336 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 11:10:47,338 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:48,339 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:49,341 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:50,342 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:51,343 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:52,344 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:53,346 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:54,347 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:55,348 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:56,349 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:57,350 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:58,352 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:10:59,353 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:00,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:01,355 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:02,357 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:03,358 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:04,359 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:05,360 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:06,361 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:07,013 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 11:11:07,014 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 11:11:07,015 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 11:11:07,363 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:08,364 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:09,366 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:10,367 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:11,368 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:12,369 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:13,371 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:14,372 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:15,373 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:16,374 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:17,375 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:18,377 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:19,378 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:20,379 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:21,380 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:22,381 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:23,382 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:24,384 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:25,385 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:26,386 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:27,388 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:28,389 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:29,390 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:30,391 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:31,393 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:32,394 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:33,395 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:34,396 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:35,398 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:36,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:36,401 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 11:11:42,403 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:43,404 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:44,405 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:45,406 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:46,408 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:47,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:48,410 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:49,411 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:50,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:51,414 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:52,415 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:53,416 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:54,417 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:55,419 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:56,420 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:57,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:58,422 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:11:59,423 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:00,425 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:01,426 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:02,427 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:03,428 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:04,429 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:05,430 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:06,432 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:06,998 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 11:12:07,000 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 11:12:07,001 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 11:12:07,433 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:08,435 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:09,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:10,437 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:11,438 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:12,440 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:13,441 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:14,442 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:15,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:16,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:17,446 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:18,447 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:19,448 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:20,450 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:21,451 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:22,452 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:23,453 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:24,454 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:25,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:26,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:27,458 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:28,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:29,461 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:30,462 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:31,463 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:31,466 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 11:12:37,468 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:38,469 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:39,470 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:40,471 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:41,473 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:42,474 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:43,475 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:44,476 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:45,478 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:46,479 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:47,480 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:48,481 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:49,483 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:50,484 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:51,485 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:52,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:53,488 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:54,489 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:55,490 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:56,491 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:57,492 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:58,494 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:12:59,495 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:00,496 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:01,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:02,499 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:03,500 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:04,501 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:05,503 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:06,504 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:07,011 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 11:13:07,012 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 11:13:07,013 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 11:13:07,505 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:08,506 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:09,508 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:10,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:11,510 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:12,511 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:13,513 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:14,514 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:15,515 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:16,516 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:17,518 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:18,519 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:19,520 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:20,521 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:21,522 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:22,524 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:23,525 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:24,526 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:25,527 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:26,529 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:26,531 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 11:13:32,533 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:33,534 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:34,535 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:35,537 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:36,538 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:37,539 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:38,540 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:39,542 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:40,543 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:41,544 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:42,546 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:43,547 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:44,548 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:45,549 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:46,550 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:47,552 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:48,553 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:49,554 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:50,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:51,556 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:52,558 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:53,559 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:54,560 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:55,561 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:56,563 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:57,564 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:58,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:13:59,566 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:14:00,567 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:14:01,569 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:14:02,570 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:14:03,571 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:14:04,572 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:14:05,573 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:14:06,575 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:14:06,995 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 11:14:06,996 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 11:14:06,997 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 11:14:07,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:08,577 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:09,578 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:10,580 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:11,581 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:12,582 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:13,583 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:14,584 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:15,585 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:16,587 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:17,588 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:18,589 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:19,590 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:20,591 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:21,592 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:21,595 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 11:14:27,596 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:28,598 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:29,599 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:30,600 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:31,601 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:32,602 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:33,604 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:34,605 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:35,606 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:36,607 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:37,609 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:38,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:39,611 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:40,612 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:41,613 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:42,615 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:43,616 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:44,617 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:45,618 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:46,619 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:47,621 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:48,622 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:49,623 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:50,624 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:51,626 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:52,627 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:53,628 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:54,629 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:55,630 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:56,632 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:57,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:58,634 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:14:59,635 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:00,636 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:01,637 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:02,639 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:03,640 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:04,641 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:05,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:06,643 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:07,012 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 11:15:07,013 WARN
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 11:15:07,013 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 11:15:07,645 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:15:08,646 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:15:09,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:15:10,648 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:15:11,650 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:15:12,651 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:15:13,652 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:14,653 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:15,654 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:16,656 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:16,658 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 11:15:22,660 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:23,661 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:24,662 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:25,663 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:26,665 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:27,666 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:28,667 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:29,668 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:30,669 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:31,671 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:32,672 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:33,673 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:34,674 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:35,675 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:36,676 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:37,678 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:38,679 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:39,680 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:40,681 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:41,682 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:42,683 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:43,685 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:44,686 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:45,687 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:46,688 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:47,689 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:48,690 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:49,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:50,693 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:51,694 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:52,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:53,696 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:54,697 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:55,698 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:56,700 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:57,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:58,702 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:15:59,703 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:16:00,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:16:01,706 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:16:02,707 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:16:03,708 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:16:04,709 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:16:05,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:16:06,712 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:16:06,996 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 11:16:06,997 WARN
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 11:16:06,998 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 11:16:07,713 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:08,714 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:09,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:10,717 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:11,718 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:11,720 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 11:16:17,721 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:18,723 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:19,724 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:20,725 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:21,726 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:22,728 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:23,729 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:24,730 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:25,731 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:26,732 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:27,734 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:28,735 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:29,736 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:30,737 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:31,739 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:32,740 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:33,741 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:34,742 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:35,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:36,745 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:37,746 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:38,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:39,749 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:40,751 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:41,752 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:42,753 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:43,754 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:44,756 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:45,757 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:46,758 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:47,759 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:48,761 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:49,762 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:50,763 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:51,764 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:52,766 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:53,767 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:54,768 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:55,769 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:56,770 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:57,771 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:58,773 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:16:59,774 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:00,775 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:01,776 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:02,778 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:03,779 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:04,780 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:05,781 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:06,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:06,784 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 11:17:07,011 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at 
com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at 
org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 11:17:07,012 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 11:17:07,013 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at 
com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 11:17:12,787 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:13,789 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:14,790 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:15,791 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:16,792 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:17,794 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:18,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:19,796 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:20,797 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:21,799 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:22,800 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:23,801 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:24,802 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:25,804 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:26,805 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:27,806 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:28,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:29,808 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:30,810 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:31,811 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:32,812 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:33,814 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:34,815 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:35,816 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:36,817 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:37,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:38,820 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:39,821 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:40,823 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:41,824 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:42,825 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:43,826 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:44,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:45,829 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:46,830 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:47,831 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:48,832 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:49,834 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:50,835 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:51,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:52,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:53,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:54,840 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:55,841 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:56,842 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:57,844 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:58,845 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:17:59,846 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:18:00,847 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:18:01,849 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:18:01,851 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 11:18:07,004 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at 
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 11:18:07,005 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at 
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 11:18:07,006 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at 
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 11:18:07,853 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:08,855 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:09,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:10,857 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:11,859 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:12,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:13,861 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:14,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:15,864 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:16,865 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:17,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:18,867 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:19,869 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:20,870 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:21,871 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:22,873 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:23,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:24,875 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:25,877 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:26,878 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:27,879 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:28,880 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:29,882 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:30,883 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:31,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:32,886 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:33,887 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:34,888 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:35,889 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:36,891 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:37,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:38,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:39,895 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:40,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:41,898 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:42,899 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:43,900 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:44,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:45,903 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:46,904 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:47,905 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:48,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:49,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:50,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:51,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:52,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:53,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:54,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:55,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:56,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:18:56,918 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 11:19:02,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:19:03,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:19:04,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:19:05,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:19:06,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:19:07,017 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 11:19:07,018 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 11:19:07,020 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 45 more
2025-07-07 11:19:07,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:19:08,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:19:09,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:19:10,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:19:11,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:19:12,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:19:13,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:19:14,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:19:15,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:19:16,938 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:17,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:18,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:19,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:20,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:21,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:22,946 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:23,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:24,948 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:25,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:26,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:27,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:28,953 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:29,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:30,956 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:31,957 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:32,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:33,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:34,961 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:35,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:36,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:37,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:38,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:39,968 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:40,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:41,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:42,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:43,973 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:44,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:45,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:46,977 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:47,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:48,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:49,981 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:50,983 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:51,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:19:51,985 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 11:19:55,920 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: RECEIVED SIGNAL 15: SIGTERM 2025-07-07 11:19:55,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG: /************************************************************ SHUTDOWN_MSG: Shutting down DataNode at dmidlkprdls04.svr.luc.edu/192.168.158.4 ************************************************************/ 2025-07-07 11:55:35,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG: /************************************************************ STARTUP_MSG: Starting DataNode STARTUP_MSG: host = dmidlkprdls04.svr.luc.edu/192.168.158.4 STARTUP_MSG: args = [] STARTUP_MSG: version = 3.1.1.7.3.1.0-197 STARTUP_MSG: classpath = 
/var/run/cloudera-scm-agent/process/38-hdfs-DATANODE:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/aws-java-sdk-bundle-1.12.720.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-hdfs-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-plugin-classloader-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-yarn-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/azure-data-lake-store-sdk-2.3.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/had
oop/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-api-2.2.11.jar:/opt/cloude
ra/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jline-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr311-ap
i-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jul-to-slf4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/logredactor-2.0.16.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-buffer-4.1.100.Final.jar:/op
t/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-reload4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-api-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-kqueue-4.1.100.Final
.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/wildfly-openssl-2.1.4.ClouderaFinal.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.
1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper-jute.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//ozone-filesystem-hadoop3-1.3.0.7.3.1.0-197.j
ar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-thrift.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-scala_2.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-protobuf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-jackson.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-generator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-format-structures.jar:/opt/cloudera/parcels/CD
H-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-encoding.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-column.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/avro-1.11.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-logging-1.1.3.ja
r:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.6
0371244/lib/hadoop-hdfs/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jl
ine-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/json-simple-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/leveldbjni-cldr-1.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parc
els/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty
-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/c
loudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-jute-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3
.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//asm-5.0.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjweaver-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-storage-7.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//checker-compat-qual-2.5.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-slf4j-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/./
/flogger-system-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//google-extensions-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-keyvault-core-1.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//accessors-smart-2.4.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CD
H-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-c
lient-jobclient.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming-3.1.1.7.3.1.0-197.jar:/opt/c
loudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ojalgo-43.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//kafka-clients-2.8.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-core-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-abfs-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//forbiddenapis-3.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-intg-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-api-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//zstd-jni-1.4.9-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-i18n.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-s3-lib-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//javax.activation-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//bundle-2.23.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//json-smart-2.4.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-util-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-shell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-cloud-bindings.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-s3-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjrt-1.9.6.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/swagger-annotations-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/objenesis-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-client-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.activation-api-1.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-dataformat-yaml-2.9.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcutil-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcprov-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcpkix-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/HikariCP-java7-2.4.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/snakeyaml-2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jsonschema2pojo-core-1.0.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/joda-time-2.10.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jna-5.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-guice-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-module-jaxb-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-json-provider-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-base-2.12.7.jar:/opt/clou
dera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-servlet-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/fst-2.50.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/ehcache-3.3.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/dnsjava-2.1.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/codemodel-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.603712
44/lib/hadoop-yarn/.//hadoop-yarn-server-router.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-
yarn/.//hadoop-yarn-server-sharedcachemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager-1.0.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-3.1.1.7.3.1.0-197.jar:/opt/cloudera/cm/lib/plugins/event-publish-7.13.1-shaded.jar:/opt/cloudera/cm/lib/plugins/tt-instrumentation-7.13.1.jar STARTUP_MSG: build = git@github.infra.cloudera.com:CDH/hadoop.git -r 31a42fb39494f541ffae15c3c61185deeeacca86; compiled by 'jenkins' on 2024-12-04T01:09Z STARTUP_MSG: java = 1.8.0_432 ************************************************************/ 2025-07-07 11:55:35,293 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT] 2025-07-07 11:55:35,618 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d1/dfs/dn 2025-07-07 11:55:35,624 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d2/dfs/dn 2025-07-07 11:55:35,625 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d3/dfs/dn 2025-07-07 11:55:35,625 INFO 
org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d4/dfs/dn 2025-07-07 11:55:35,791 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties 2025-07-07 11:55:35,907 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s). 2025-07-07 11:55:35,907 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started 2025-07-07 11:55:36,347 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling 2025-07-07 11:55:36,372 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576 2025-07-07 11:55:36,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled. 2025-07-07 11:55:36,380 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is dmidlkprdls04.svr.luc.edu 2025-07-07 11:55:36,381 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. 
Disabling file IO profiling
2025-07-07 11:55:36,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 4294967296
2025-07-07 11:55:36,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /192.168.158.4:9866
2025-07-07 11:55:36,417 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-07 11:55:36,417 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-07 11:55:36,421 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-07 11:55:36,421 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-07 11:55:36,421 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Listening on UNIX domain socket: /var/run/hdfs-sockets/dn
2025-07-07 11:55:36,470 INFO org.eclipse.jetty.util.log: Logging initialized @2594ms to org.eclipse.jetty.util.log.Slf4jLog
2025-07-07 11:55:36,596 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-07 11:55:36,604 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined
2025-07-07 11:55:36,613 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2025-07-07 11:55:36,616 INFO org.apache.hadoop.security.HttpCrossOriginFilterInitializer: CORS filter not enabled. Please set hadoop.http.cross-origin.enabled to 'true' to enable it
2025-07-07 11:55:36,617 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context datanode
2025-07-07 11:55:36,617 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context logs
2025-07-07 11:55:36,617 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context static
2025-07-07 11:55:36,662 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 42203
2025-07-07 11:55:36,664 INFO org.eclipse.jetty.server.Server: jetty-9.4.54.v20240208; built: 2024-02-08T19:42:39.027Z; git: cef3fbd6d736a21e7d541a5db490381d95a2047d; jvm 1.8.0_432-b06
2025-07-07 11:55:36,716 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0
2025-07-07 11:55:36,717 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults
2025-07-07 11:55:36,720 INFO org.eclipse.jetty.server.session: node0 Scavenging every 660000ms
2025-07-07 11:55:36,752 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-07 11:55:36,758 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@6cea706c{logs,/logs,file:///var/log/hadoop-hdfs/,AVAILABLE}
2025-07-07 11:55:36,762 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@21ec5d87{static,/static,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/static/,AVAILABLE}
2025-07-07 11:55:36,914 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@4d157787{datanode,/,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode/,AVAILABLE}{file:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode}
2025-07-07 11:55:36,929 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@51f49060{HTTP/1.1, (http/1.1)}{localhost:42203}
2025-07-07 11:55:36,929 INFO org.eclipse.jetty.server.Server: Started @3054ms
2025-07-07 11:55:37,222 WARN com.cloudera.cmf.event.publish.EventStorePublisherWithRetry: Failed to publish event: SimpleEvent{attributes={ROLE=[hdfs-DATANODE-b30a464b10a57fdd49ea734cd52a8291], HOSTS=[dmidlkprdls04.svr.luc.edu], ROLE_TYPE=[DATANODE], CATEGORY=[LOG_MESSAGE], EVENTCODE=[EV_LOG_EVENT], SERVICE=[hdfs], SERVICE_TYPE=[HDFS], LOG_LEVEL=[WARN], HOST_IDS=[5c33df90-d247-4c6d-b9e0-5908a423580a], SEVERITY=[IMPORTANT]}, content=Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret, timestamp=1751907336596}
java.io.IOException: Error connecting to dmidlkprdls01.svr.luc.edu/192.168.158.1:7184
	at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.getChannel(NettyTransceiver.java:269)
	at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.<init>(NettyTransceiver.java:197)
	at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.<init>(NettyTransceiver.java:120)
	at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.checkSpecificRequestor(AvroEventStorePublishProxy.java:120)
	at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.publishEvent(AvroEventStorePublishProxy.java:204)
	at com.cloudera.cmf.event.publish.EventStorePublisherWithRetry$PublishEventTask.run(EventStorePublisherWithRetry.java:242)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)
Caused by: com.cloudera.cmf.event.shaded.io.netty.channel.AbstractChannel$AnnotatedNoRouteToHostException: No route to host: dmidlkprdls01.svr.luc.edu/192.168.158.1:7184
Caused by: java.net.NoRouteToHostException: No route to host
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
	at com.cloudera.cmf.event.shaded.io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:337)
	at com.cloudera.cmf.event.shaded.io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:339)
	at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:776)
	at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
	at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
	at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
	at com.cloudera.cmf.event.shaded.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
	at com.cloudera.cmf.event.shaded.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 11:55:37,298 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /192.168.158.4:9864
2025-07-07 11:55:37,309 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor
2025-07-07 11:55:37,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hdfs
2025-07-07 11:55:37,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
2025-07-07 11:55:37,372 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
2025-07-07 11:55:37,391 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867
2025-07-07 11:55:37,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /192.168.158.4:9867
2025-07-07 11:55:37,508 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null
2025-07-07 11:55:37,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices: <default>
2025-07-07 11:55:37,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool <registering> (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 starting to offer service
2025-07-07 11:55:37,545 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2025-07-07 11:55:37,545 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting
2025-07-07 11:55:38,655 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:55:39,657 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:55:40,659 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:55:41,661 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:55:42,663 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:55:43,664 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:55:44,666 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:55:45,668 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:55:46,669 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:55:47,671 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:55:48,673 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:55:49,674 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:55:50,676 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:55:51,678 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:55:52,679 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:55:53,681 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:55:54,506 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 11:55:54,513 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 11:55:54,516 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 45 more
2025-07-07 11:55:54,683 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:55:54,816 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:137)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 11:55:54,818 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:137) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 11:55:54,819 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception 
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:137) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 49 more 2025-07-07 11:55:55,685 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:55:56,686 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:55:57,688 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:55:58,690 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:55:59,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:00,693 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:01,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:02,696 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:03,699 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:04,700 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:05,702 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:06,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:07,706 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:08,707 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:09,709 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:10,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:11,713 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:12,714 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:13,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:14,718 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:15,719 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:16,721 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:17,722 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:18,724 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:19,725 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:20,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:21,729 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:22,730 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:23,732 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:24,733 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:25,736 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:26,738 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:27,739 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:27,743 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 11:56:33,745 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:34,747 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:35,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:36,750 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:37,752 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:38,753 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:39,755 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:40,757 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:41,758 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:42,760 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:43,762 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:44,763 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:45,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:46,767 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:47,768 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:48,770 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:49,771 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:50,773 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:51,775 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:56:51,829 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 11:56:51,830 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 11:56:51,832 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 11:56:52,776 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:56:53,778 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:56:54,779 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:56:55,781 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:56:56,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:56:57,784 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:56:58,786 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:56:59,787 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:00,789 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:01,791 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:02,792 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:03,794 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:04,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:05,797 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:06,799 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:07,800 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:08,802 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:09,803 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:10,805 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:11,806 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:12,808 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:13,809 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:14,811 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:15,813 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:16,814 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:17,816 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:18,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:19,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:20,821 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:21,823 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:22,825 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:22,827 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 11:57:28,830 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:29,831 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:30,833 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:31,834 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:32,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:33,838 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:34,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:35,841 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:36,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:37,844 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:38,846 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:39,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:40,850 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:41,851 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:42,853 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:43,855 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:44,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:45,858 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:46,859 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:47,861 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:48,863 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:49,864 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:50,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:57:51,811 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 11:57:51,813 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 11:57:51,815 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at 
org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 11:57:51,867 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:57:52,869 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:57:53,871 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:57:54,872 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:57:55,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:57:56,876 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:57:57,877 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:57:58,879 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:57:59,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:00,882 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:01,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:02,886 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:03,887 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:04,889 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:05,891 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:06,893 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:07,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:08,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:09,898 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:10,900 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:11,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:12,903 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:13,904 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:14,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:15,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:16,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:17,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:17,913 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 11:58:23,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:24,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:25,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:26,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:27,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:28,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:29,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:30,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:31,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:32,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:33,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:34,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:35,936 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:36,938 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:37,940 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:38,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:39,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:40,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:41,946 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:42,948 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:43,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:44,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:45,953 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:46,955 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:47,956 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:48,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:49,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:50,961 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:51,824 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 11:58:51,825 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at 
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 11:58:51,826 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at 
org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 11:58:51,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:52,964 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:53,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:54,968 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:55,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:56,971 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:57,973 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:58,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:58:59,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:00,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:01,979 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:02,981 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:03,983 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:04,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:05,986 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:06,988 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:07,989 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:08,991 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:09,992 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:10,994 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:11,996 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:12,997 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:13,000 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 11:59:19,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:20,004 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:21,005 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:22,007 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:23,008 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:24,010 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:25,012 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:26,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:27,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:28,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:29,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:30,020 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:31,022 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:32,023 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:33,025 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:34,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:35,029 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:36,030 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:37,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:38,034 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:39,035 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:40,037 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:41,039 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:42,041 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:43,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:44,044 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:45,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:46,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:47,049 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:48,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:49,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:50,054 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:51,055 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 11:59:51,812 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 11:59:51,814 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 11:59:51,815 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 11:59:52,057 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:59:53,059 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:59:54,060 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:59:55,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:59:56,063 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:59:57,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:59:58,066 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 11:59:59,068 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:00,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:01,071 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:02,072 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:03,074 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:04,075 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:05,077 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:06,079 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:07,080 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:08,082 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:08,084 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 12:00:14,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:15,088 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:16,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:17,091 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:18,092 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:19,094 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:20,095 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:21,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:22,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:23,100 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:24,101 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:25,103 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:26,104 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:27,105 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:28,107 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:29,108 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:30,110 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:31,111 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:32,113 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:33,114 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:34,115 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:35,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:36,118 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:37,120 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:38,121 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:39,123 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:40,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:41,126 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:42,127 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:43,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:44,130 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:45,131 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:46,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:47,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:48,136 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:49,137 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:50,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:51,140 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:51,824 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:00:51,826 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:00:51,827 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at 
org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more
2025-07-07 12:00:52,141 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:53,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:54,144 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:55,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:56,147 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:57,149 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:58,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:00:59,152 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:00,153 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:01,154 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:02,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:03,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:03,160 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 12:01:09,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:10,164 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:11,165 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:12,166 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:13,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:14,169 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:15,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:16,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:17,173 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:18,174 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:19,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:20,177 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:21,178 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:22,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:23,181 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:24,182 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:25,183 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:26,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:27,186 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:28,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:29,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:30,190 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:31,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:32,193 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:33,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:34,195 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:35,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:36,198 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:37,199 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:38,201 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:39,202 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:40,203 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:41,205 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:42,207 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:43,208 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:44,209 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:45,210 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:46,212 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:47,213 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:48,214 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:49,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:50,217 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:51,219 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:01:51,815 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:01:51,816 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at 
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:01:51,817 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at 
org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 12:01:52,220 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:01:53,221 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:01:54,223 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:01:55,224 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:01:56,225 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:01:57,226 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:01:58,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:01:58,230 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 12:02:04,232 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:05,233 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:06,234 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:07,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:08,237 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:09,239 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:10,240 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:11,241 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:12,243 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:13,244 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:14,245 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:15,247 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:16,248 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:17,249 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:18,251 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:19,252 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:20,253 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:21,255 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:22,256 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:23,257 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:24,259 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:25,260 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:26,261 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:27,263 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:28,264 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:29,265 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:30,267 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:31,268 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:32,270 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:33,271 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:34,272 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:35,273 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:36,275 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:37,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:38,277 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:39,279 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:40,280 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:41,281 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:42,283 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:43,284 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:44,285 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:45,286 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:46,288 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:47,289 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:48,290 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:49,291 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:50,293 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:51,294 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:02:51,828 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:02:51,829 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:02:51,830 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 12:02:52,295 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:02:53,297 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:02:53,298 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 12:02:59,300 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:00,301 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:01,303 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:02,304 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:03,305 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:04,307 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:05,308 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:06,309 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:07,310 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:08,312 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:09,313 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:10,314 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:11,315 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:12,317 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:13,318 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:14,319 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:15,321 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:16,322 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:17,323 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:18,325 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:19,326 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:20,327 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:21,328 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:22,330 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:23,332 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:24,333 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:25,334 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:26,336 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:27,337 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:28,338 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:29,340 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:30,341 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:31,342 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:32,343 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:33,345 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:34,346 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:35,347 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:36,348 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:37,350 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:38,351 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:39,353 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:40,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:41,355 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:42,356 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:43,358 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:44,359 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:45,360 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:46,362 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:47,363 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:48,364 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:03:48,366 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 12:03:51,811 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:03:51,812 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:03:51,814 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) 
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 12:03:54,368 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:03:55,369 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:03:56,371 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:03:57,372 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:03:58,373 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:03:59,374 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:00,376 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:01,377 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:02,378 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:03,380 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:04,381 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:05,382 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:06,383 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:07,385 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:08,386 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:09,388 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:10,389 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:11,390 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:12,392 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:13,393 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:14,394 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:15,395 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:16,397 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:17,398 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:18,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:19,400 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:20,402 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:21,403 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:22,404 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:23,405 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:24,407 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:25,408 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:26,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:27,411 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:28,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:29,413 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:30,415 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:31,416 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:32,417 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:33,419 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:34,420 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:35,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:36,423 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:37,424 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:38,425 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:39,427 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:40,428 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:41,429 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:42,430 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:43,432 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:43,433 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 12:04:49,435 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:50,437 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:51,438 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:51,826 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 12:04:51,828 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 12:04:51,829 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 12:04:52,439 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:53,440 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:54,442 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:55,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:56,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:57,446 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:58,447 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:04:59,448 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:00,450 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:01,451 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:02,452 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:03,453 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:04,455 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:05,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:06,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:07,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:08,460 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:09,462 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:10,463 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:11,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:12,466 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:13,467 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:14,468 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:15,469 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:16,471 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:17,472 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:18,473 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:19,475 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:20,476 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:21,477 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:22,478 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:23,480 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:24,481 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:25,483 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:26,484 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:27,485 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:28,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:29,488 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:30,489 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:31,490 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:32,491 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:33,493 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:34,494 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:35,495 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:36,496 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:37,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:38,499 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:38,501 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 12:05:44,503 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:45,504 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:46,505 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:47,507 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:48,508 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:49,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:50,510 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:51,512 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:05:51,817 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 12:05:51,818 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:05:51,820 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 12:05:52,513 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:05:53,514 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:05:54,516 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:05:55,517 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:05:56,518 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:05:57,520 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:05:58,521 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:05:59,522 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:00,523 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:01,525 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:02,526 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:03,527 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:04,528 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:05,530 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:06,531 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:07,532 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:08,533 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:09,535 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:10,536 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:11,538 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:12,539 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:13,540 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:14,541 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:15,543 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:16,544 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:17,545 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:18,546 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:19,548 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:20,549 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:21,550 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:22,552 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:23,554 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:24,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:25,556 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:26,557 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:27,558 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:28,560 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:29,561 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:30,562 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:31,563 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:32,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:33,566 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:33,567 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 12:06:39,569 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:40,571 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:41,572 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:42,573 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:43,574 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:44,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:45,577 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:46,578 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:47,580 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:48,581 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:49,582 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:50,584 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:51,585 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:06:51,827 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:06:51,828 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
java.lang.Thread.run(Thread.java:750) 2025-07-07 12:06:51,829 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 12:06:52,586 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:06:53,587 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:06:54,589 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:06:55,590 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:06:56,591 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:06:57,592 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:06:58,594 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:06:59,595 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:00,596 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:01,598 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:02,599 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:03,600 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:04,601 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:05,603 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:06,604 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:07,605 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:08,607 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:09,608 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:10,609 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:11,611 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:12,612 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:13,613 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:14,614 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:15,616 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:16,617 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:17,618 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:18,620 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:19,621 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:20,622 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:21,624 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:22,625 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:23,627 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:24,628 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:25,629 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:26,630 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:27,632 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:28,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:28,635 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 12:07:34,636 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:35,638 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:36,639 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:37,640 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:38,641 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:39,643 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:40,644 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:41,645 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:42,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:43,648 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:44,649 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:45,651 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:46,652 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:47,653 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:48,655 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:49,656 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:50,657 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:51,659 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:07:51,815 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 12:07:51,817 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:07:51,818 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
        at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        ... 50 more
2025-07-07 12:07:52,660 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:07:53,661 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:07:54,663 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:07:55,664 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:07:56,665 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:07:57,666 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:07:58,668 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:07:59,669 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:00,671 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:01,672 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:02,673 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:03,675 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:04,676 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:05,677 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:06,679 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:07,680 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:08,681 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:09,683 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:10,684 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:11,685 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:12,687 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:13,688 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:14,689 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:15,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:16,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:17,693 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:18,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:19,696 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:20,697 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:21,699 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:22,700 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:23,702 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:23,703 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 12:08:29,705 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:30,707 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:31,708 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:32,709 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:33,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:34,712 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:35,713 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:36,715 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:37,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:38,717 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:39,719 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:40,720 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:41,721 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:42,723 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:43,724 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:44,725 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:45,726 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:46,728 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:47,729 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:48,730 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:49,732 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:50,733 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:51,734 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:08:51,829 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:08:51,831 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
        at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:08:51,832 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 12:08:52,735 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:08:53,737 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:08:54,738 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:08:55,739 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:08:56,741 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:08:57,742 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:08:58,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:08:59,745 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:00,746 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:01,747 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:02,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:03,750 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:04,751 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:05,752 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:06,754 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:07,755 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:08,756 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:09,758 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:10,759 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:11,760 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:12,762 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:13,763 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:14,764 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:15,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:16,767 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:17,768 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:18,769 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:18,771 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 12:09:24,773 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:25,774 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:26,776 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:27,777 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:28,778 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:29,779 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:30,781 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:31,782 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:32,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:33,785 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:34,786 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:35,787 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:36,788 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:37,790 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:38,791 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:39,792 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:40,794 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:41,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:42,796 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:43,798 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:44,799 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:45,800 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:46,801 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:47,803 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:48,804 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:49,805 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:50,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:51,808 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:09:51,834 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 12:09:51,835 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:09:51,836 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 12:09:52,809 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:09:53,810 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:09:54,812 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:09:55,813 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:09:56,814 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:09:57,816 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:09:58,817 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:09:59,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:00,820 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:01,821 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:02,822 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:03,823 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:04,825 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:05,826 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:06,827 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:07,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:08,829 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:09,831 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:10,832 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:11,833 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:12,835 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:13,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:13,839 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 12:10:19,840 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:20,842 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:21,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:22,844 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:23,845 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:24,847 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:25,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:26,849 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:27,850 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:28,852 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:29,853 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:30,854 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:31,855 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:32,857 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:33,858 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:34,859 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:35,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:36,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:37,863 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:38,864 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:39,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:40,867 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:41,868 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:42,870 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:43,871 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:44,872 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:45,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:46,875 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:47,876 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:48,877 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:49,879 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:50,880 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:10:51,828 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:10:51,829 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:10:51,830 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 12:10:51,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:10:52,883 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:10:53,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:10:54,885 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:10:55,886 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:10:56,888 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:10:57,889 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:10:58,890 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:10:59,891 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:00,893 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:01,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:02,895 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:03,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:04,898 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:05,899 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:06,900 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:07,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:08,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:08,905 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 12:11:14,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:15,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:16,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:17,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:18,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:19,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:20,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:21,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:22,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:23,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:24,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:25,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:26,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:27,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:28,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:29,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:30,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:31,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:32,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:33,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:34,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:35,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:36,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:37,936 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:38,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:39,938 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:40,940 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:41,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:42,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:43,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:44,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:45,946 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:46,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:47,948 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:48,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:49,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:50,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:11:51,812 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:11:51,813 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:11:51,815 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 12:11:51,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:11:52,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:11:53,956 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:11:54,957 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:11:55,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:11:56,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:11:57,961 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:11:58,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:11:59,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:00,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:01,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:02,967 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:03,968 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:03,971 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 12:12:09,973 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:10,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:11,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:12,977 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:13,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:14,979 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:15,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:16,981 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:17,983 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:18,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:19,985 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:20,986 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:21,988 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:22,989 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:23,990 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:24,991 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:25,993 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:26,994 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:27,995 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:28,997 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:29,998 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:30,999 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:32,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:33,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:34,003 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:35,004 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:36,006 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:37,007 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:38,008 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:39,009 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:40,011 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:41,012 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:42,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:43,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:44,016 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:45,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:46,019 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:47,020 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:48,021 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:49,022 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:50,024 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:51,025 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:51,825 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:12:51,826 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:12:51,827 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 12:12:52,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:53,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:54,029 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:55,030 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:56,031 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:57,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:58,033 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:59,035 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:12:59,037 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 12:13:05,039 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:13:06,040 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:13:07,041 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:13:08,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:13:09,044 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:13:10,045 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:11,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:12,048 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:13,049 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:14,050 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:15,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:16,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:17,054 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:18,055 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:19,056 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:20,058 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:21,059 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:22,060 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:23,061 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:24,063 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:25,064 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:26,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:27,067 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:28,068 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:29,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:30,070 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:31,072 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:32,073 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:33,074 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:34,076 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:35,077 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:36,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:37,079 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:38,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:39,082 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:40,083 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:41,085 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:42,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:43,087 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:44,088 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:45,090 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:46,091 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:47,092 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:48,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:49,095 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:50,096 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:51,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:51,817 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:13:51,818 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:13:51,821 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 12:13:52,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:53,100 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:54,101 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:13:54,102 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 12:14:00,105 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:01,106 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:02,107 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:03,108 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:04,110 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:05,111 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:06,112 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:07,113 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:08,115 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:09,116 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:10,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:11,118 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:12,120 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:13,121 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:14,122 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:15,123 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:16,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:17,125 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:18,127 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:19,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:20,129 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:21,130 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:22,131 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:23,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:24,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:25,136 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:26,137 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:27,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:28,139 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:29,140 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:30,142 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:31,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:32,144 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:33,145 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:34,147 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:35,148 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:36,149 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:37,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:38,152 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:39,153 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:40,154 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:41,155 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:42,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:43,158 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:44,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:45,160 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:46,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:47,163 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:48,164 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:49,165 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:14:49,168 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 12:14:51,828 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at 
com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at 
org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:14:51,829 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 12:14:51,831 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at 
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 12:14:55,170 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:14:56,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:14:57,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:14:58,173 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:14:59,175 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:00,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:01,177 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:02,178 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:03,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:04,181 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:05,182 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:06,183 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:07,184 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:08,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:09,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:10,188 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:11,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:12,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:13,192 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:14,193 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:15,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:16,195 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:17,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:18,198 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:19,199 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:20,200 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:21,202 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:22,203 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:23,204 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:24,205 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:25,207 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:26,208 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:27,209 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:28,210 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:29,212 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:30,213 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:31,214 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:32,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:33,217 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:34,218 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:35,219 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:36,221 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:37,222 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:38,223 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:39,224 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:40,226 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:41,227 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:42,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:43,230 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:44,231 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:44,233 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 12:15:50,234 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:51,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:51,817 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:15:51,818 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:15:51,819 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 12:15:52,237 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:53,238 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:54,239 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:55,241 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:56,242 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:57,243 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:58,244 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:15:59,246 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:16:00,247 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:16:01,248 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:16:02,249 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:16:03,250 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:16:04,251 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:16:05,253 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:16:06,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:16:07,255 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:16:08,256 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:16:09,258 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:16:10,259 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:16:11,260 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:16:12,262 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:16:13,263 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:16:14,264 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:16:15,265 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:16:16,267 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:16:17,268 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:16:18,269 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:16:19,270 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:16:20,271 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:16:21,273 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:16:22,274 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:16:23,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:16:24,277 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:16:25,278 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:16:26,280 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:27,281 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:28,282 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:29,283 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:30,284 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:31,286 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:32,287 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:33,288 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:34,289 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:35,291 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:36,292 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:37,293 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:38,294 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:39,296 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:39,297 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 12:16:45,299 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:46,300 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:47,301 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:48,303 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:49,304 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:50,305 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:51,306 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:51,834 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:16:51,835 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:16:51,836 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 12:16:52,308 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:53,309 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:54,310 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:55,311 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:56,312 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:57,313 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:58,315 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:16:59,316 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:00,317 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:01,318 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:02,320 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:03,321 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:04,322 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:05,323 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:06,325 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:07,326 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:08,327 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:09,328 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:10,330 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:11,331 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:12,332 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:13,333 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:14,335 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:15,336 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:16,337 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:17,338 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:18,340 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:19,341 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:20,342 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:21,343 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:22,344 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:23,346 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:24,348 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:25,349 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:26,350 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:27,351 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:28,353 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:29,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:30,355 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:31,356 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:32,357 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:33,359 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:34,360 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:34,362 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 12:17:40,364 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:41,365 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:42,366 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:43,368 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:44,369 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:45,370 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:46,371 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:47,372 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:48,374 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:49,375 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:50,376 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:51,377 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:51,817 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:17:51,818 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:17:51,820 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 12:17:52,378 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:53,380 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:54,381 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:55,382 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:56,384 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:57,385 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:58,386 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:17:59,387 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:18:00,388 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:18:01,390 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:18:02,391 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:18:03,392 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:18:04,393 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:18:05,395 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:18:06,396 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:18:07,397 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:18:08,398 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:18:09,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:18:10,401 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:18:11,402 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:18:12,404 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:13,405 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:14,406 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:15,407 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:16,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:17,410 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:18,411 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:19,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:20,414 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:21,415 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:22,416 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:23,418 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:24,419 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:25,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:26,422 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:27,423 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:28,424 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:29,426 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:29,427 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 12:18:35,429 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:36,430 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:37,432 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:38,433 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:39,434 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:40,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:41,437 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:42,438 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:43,440 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:44,441 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:45,442 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:46,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:47,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:48,446 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:49,447 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:50,448 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:51,449 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:51,830 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:18:51,831 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:18:51,832 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 12:18:52,451 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:53,452 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:54,453 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:55,454 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:56,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:57,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:58,458 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:18:59,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:00,460 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:01,462 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:02,463 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:03,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:04,465 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:05,467 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:06,468 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:07,469 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:08,470 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:09,472 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:10,473 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:11,474 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:12,476 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:13,477 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:14,478 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:15,479 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:16,480 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:17,481 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:18,483 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:19,484 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:20,485 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:21,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:22,487 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:23,489 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:24,490 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:24,492 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 12:19:30,493 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:31,495 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:32,496 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:33,497 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:34,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:35,499 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:36,501 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:37,502 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:38,503 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:39,504 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:40,506 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:41,507 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:42,508 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:43,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:19:44,511 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:19:45,255 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: RECEIVED SIGNAL 15: SIGTERM 2025-07-07 12:19:45,259 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG: /************************************************************ SHUTDOWN_MSG: Shutting down DataNode at dmidlkprdls04.svr.luc.edu/192.168.158.4 ************************************************************/ 2025-07-07 12:39:29,489 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG: /************************************************************ STARTUP_MSG: Starting DataNode STARTUP_MSG: host = dmidlkprdls04.svr.luc.edu/192.168.158.4 STARTUP_MSG: args = [] STARTUP_MSG: version = 3.1.1.7.3.1.0-197 STARTUP_MSG: classpath = /var/run/cloudera-scm-agent/process/95-hdfs-DATANODE:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/aws-java-sdk-bundle-1.12.720.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-hdfs-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-plugin-classloader-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-yarn-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/azure-data-lake-store-sdk-2.3.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/l
ib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.
p0.60371244/lib/hadoop/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-servlet-9.
4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jline-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jul-to-slf4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CD
H-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/logredactor-2.0.16.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-reload4j-1.7.36.jar:/opt/cloudera
/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-api-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-proxy-4.1.100.Fi
nal.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/wildfly-openssl-2.1.4.ClouderaFinal.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper-jute.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0
.60371244/lib/hadoop/.//hadoop-annotations.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//ozone-filesystem-hadoop3-1.3.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parque
t-thrift.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-scala_2.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-protobuf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-jackson.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-generator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-format-structures.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-encoding.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-column.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/avro-1.11.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.6037
1244/lib/hadoop-hdfs/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.
3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-io-9.4.
54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jline-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/json-simple-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/li
b/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/leveldbjni-cldr-1.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-mqtt-4.1.1
00.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.6037
1244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-jute-3.8.1.7.3.1.0-197.jar:
/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop
-hdfs-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//asm-5.0.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjweaver-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-storage-7.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//checker-compat-qual-2.5.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-slf4j-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-system-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//google-extensions-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-keyvault-core-1.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//accessors-smart-2.4.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin-3.1.1.7.3.1.0-197.jar:/opt/cloudera/par
cels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-3.1.1.7.3.1.0-197.jar:/o
pt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack-3.1.1.7.3.1.0-197.jar:/opt/cloudera/pa
rcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ojalgo-43.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//kafka-clients-2.8.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-core-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-abfs-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//forbiddenapis-3.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-intg-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-api-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//zstd-jni-1.4.9-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-i18n.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-s3-lib-2.4.0.7.3.1.0-19
7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//javax.activation-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//bundle-2.23.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//json-smart-2.4.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-util-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-shell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-cloud-bindings.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-s3-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjrt-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/swagger-annotations-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/objenesis-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-client-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.activation-api-1.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-dataformat-yaml-2.9.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcutil-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcprov-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcpkix-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/HikariCP-java7-2.4.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop
-yarn/lib/snakeyaml-2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jsonschema2pojo-core-1.0.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/joda-time-2.10.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jna-5.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-guice-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-module-jaxb-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-json-provider-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-base-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-servlet-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/fst-2.50.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/ehcache-3.3.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/dnsjava-2.1.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/codemodel-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60
371244/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-appl
ications-unmanaged-am-launcher-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager-1.0.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-3.1.1.7.3.1.0-197.jar:/opt/cloudera/cm/lib/plugins/event-publish-7.13.1-shaded.jar:/opt/cloudera/cm/lib/plugins/tt-instrumentation-7.13.1.jar STARTUP_MSG: build = 
git@github.infra.cloudera.com:CDH/hadoop.git -r 31a42fb39494f541ffae15c3c61185deeeacca86; compiled by 'jenkins' on 2024-12-04T01:09Z STARTUP_MSG: java = 1.8.0_432 ************************************************************/ 2025-07-07 12:39:29,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT] 2025-07-07 12:39:29,951 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d1/dfs/dn 2025-07-07 12:39:29,958 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d2/dfs/dn 2025-07-07 12:39:29,958 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d3/dfs/dn 2025-07-07 12:39:29,958 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d4/dfs/dn 2025-07-07 12:39:30,121 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties 2025-07-07 12:39:30,230 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s). 2025-07-07 12:39:30,230 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started 2025-07-07 12:39:30,522 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling 2025-07-07 12:39:30,638 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576 2025-07-07 12:39:30,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled. 2025-07-07 12:39:30,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is dmidlkprdls04.svr.luc.edu 2025-07-07 12:39:30,650 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. 
Disabling file IO profiling 2025-07-07 12:39:30,658 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 4294967296 2025-07-07 12:39:30,691 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /192.168.158.4:9866 2025-07-07 12:39:30,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s 2025-07-07 12:39:30,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50 2025-07-07 12:39:30,699 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s 2025-07-07 12:39:30,699 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50 2025-07-07 12:39:30,699 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Listening on UNIX domain socket: /var/run/hdfs-sockets/dn 2025-07-07 12:39:30,756 INFO org.eclipse.jetty.util.log: Logging initialized @2451ms to org.eclipse.jetty.util.log.Slf4jLog 2025-07-07 12:39:30,885 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret 2025-07-07 12:39:30,893 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined 2025-07-07 12:39:30,904 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter) 2025-07-07 12:39:30,906 INFO org.apache.hadoop.security.HttpCrossOriginFilterInitializer: CORS filter not enabled. 
Please set hadoop.http.cross-origin.enabled to 'true' to enable it 2025-07-07 12:39:30,907 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context datanode 2025-07-07 12:39:30,908 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context logs 2025-07-07 12:39:30,908 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context static 2025-07-07 12:39:30,949 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 35247 2025-07-07 12:39:30,950 INFO org.eclipse.jetty.server.Server: jetty-9.4.54.v20240208; built: 2024-02-08T19:42:39.027Z; git: cef3fbd6d736a21e7d541a5db490381d95a2047d; jvm 1.8.0_432-b06 2025-07-07 12:39:31,005 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0 2025-07-07 12:39:31,006 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults 2025-07-07 12:39:31,011 INFO org.eclipse.jetty.server.session: node0 Scavenging every 600000ms 2025-07-07 12:39:31,040 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. 
Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret 2025-07-07 12:39:31,046 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@1b11ef33{logs,/logs,file:///var/log/hadoop-hdfs/,AVAILABLE} 2025-07-07 12:39:31,047 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@2f2bf0e2{static,/static,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/static/,AVAILABLE} 2025-07-07 12:39:31,177 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@6ebd78d1{datanode,/,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode/,AVAILABLE}{file:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode} 2025-07-07 12:39:31,192 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@7fb9f71f{HTTP/1.1, (http/1.1)}{localhost:35247} 2025-07-07 12:39:31,193 INFO org.eclipse.jetty.server.Server: Started @2888ms 2025-07-07 12:39:31,427 WARN com.cloudera.cmf.event.publish.EventStorePublisherWithRetry: Failed to publish event: SimpleEvent{attributes={ROLE=[hdfs-DATANODE-b30a464b10a57fdd49ea734cd52a8291], HOSTS=[dmidlkprdls04.svr.luc.edu], ROLE_TYPE=[DATANODE], CATEGORY=[LOG_MESSAGE], EVENTCODE=[EV_LOG_EVENT], SERVICE=[hdfs], SERVICE_TYPE=[HDFS], LOG_LEVEL=[WARN], HOST_IDS=[5c33df90-d247-4c6d-b9e0-5908a423580a], SEVERITY=[IMPORTANT]}, content=Unable to initialize FileSignerSecretProvider, falling back to use random secrets. 
Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret, timestamp=1751909970885} java.io.IOException: Error connecting to dmidlkprdls01.svr.luc.edu/192.168.158.1:7184 at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.getChannel(NettyTransceiver.java:269) at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.(NettyTransceiver.java:197) at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.(NettyTransceiver.java:120) at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.checkSpecificRequestor(AvroEventStorePublishProxy.java:120) at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.publishEvent(AvroEventStorePublishProxy.java:204) at com.cloudera.cmf.event.publish.EventStorePublisherWithRetry$PublishEventTask.run(EventStorePublisherWithRetry.java:242) at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) at java.util.concurrent.FutureTask.run(FutureTask.java:266) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:750) Caused by: com.cloudera.cmf.event.shaded.io.netty.channel.AbstractChannel$AnnotatedNoRouteToHostException: No route to host: dmidlkprdls01.svr.luc.edu/192.168.158.1:7184 Caused by: java.net.NoRouteToHostException: No route to host at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716) at com.cloudera.cmf.event.shaded.io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:337) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:339) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:776) at 
com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562) at com.cloudera.cmf.event.shaded.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997) at com.cloudera.cmf.event.shaded.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:39:31,549 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /192.168.158.4:9864 2025-07-07 12:39:31,557 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor 2025-07-07 12:39:31,557 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hdfs 2025-07-07 12:39:31,557 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup 2025-07-07 12:39:31,638 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler 2025-07-07 12:39:31,662 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867 2025-07-07 12:39:31,712 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /192.168.158.4:9867 2025-07-07 12:39:31,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null 2025-07-07 12:39:31,759 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices: 2025-07-07 12:39:31,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 starting to offer service 2025-07-07 12:39:31,842 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting 
2025-07-07 12:39:31,842 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting
2025-07-07 12:39:32,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:39:33,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:39:34,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:39:35,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:39:36,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:39:37,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:39:38,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:39:39,956 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:39:40,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:39:41,960 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:39:42,961 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:39:43,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:39:44,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:39:45,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:39:46,968 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:39:47,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:39:48,971 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:39:48,972 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:39:48,983 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:39:48,988 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 45 more
2025-07-07 12:39:49,276 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:137)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:39:49,278 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:137)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:39:49,279 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:137)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 49 more
2025-07-07 12:39:49,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:39:50,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:39:51,979 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:39:52,431 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at 
org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at 
org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:39:52,433 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:39:52,434 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at 
org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at 
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 12:39:52,981 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:39:53,983 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:39:54,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:39:55,986 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:39:56,988 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:39:57,989 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:39:58,991 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:39:59,993 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:40:00,994 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:40:01,997 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:40:02,999 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:40:04,001 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:40:05,003 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:40:06,005 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:40:07,007 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:40:08,008 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:40:09,010 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:40:10,012 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:40:11,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:40:12,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:40:13,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:40:14,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:40:15,020 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:40:16,022 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:40:17,024 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:40:18,025 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:40:19,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:40:20,030 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:40:21,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:40:22,034 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:40:22,038 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 12:40:28,040 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:29,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:30,044 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:31,045 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:32,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:33,049 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:34,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:35,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:36,054 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:37,056 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:38,058 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:39,059 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:40,061 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:41,063 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:42,064 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:43,066 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:44,068 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:45,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:46,071 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:47,073 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:48,074 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:49,076 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:50,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:51,079 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:52,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:52,416 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:40:52,418 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:40:52,419 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 12:40:53,083 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:54,084 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:55,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:56,088 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:57,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:58,091 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:40:59,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:00,094 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:01,096 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:02,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:03,100 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:04,101 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:05,103 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:06,105 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:07,107 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:08,108 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:09,110 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:10,112 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:11,114 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:12,115 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:13,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:14,119 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:15,121 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:16,122 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:17,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:17,126 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 12:41:23,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:24,130 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:25,132 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:26,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:27,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:28,137 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:29,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:30,140 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:31,142 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:32,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:33,145 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:34,147 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:35,149 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:36,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:37,152 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:38,154 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:39,155 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:40,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:41,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:42,160 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:43,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:44,163 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:45,165 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:46,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:47,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:48,170 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:49,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:50,173 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:51,175 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:52,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:41:52,432 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:41:52,434 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:41:52,435 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at 
org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at 
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 12:41:53,178 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:41:54,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:41:55,181 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:41:56,182 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:41:57,184 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:41:58,186 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:41:59,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:42:00,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:42:01,190 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:42:02,192 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:42:03,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:42:04,196 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:42:05,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:42:06,199 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:42:07,200 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:42:08,202 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:42:09,203 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:42:10,205 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:42:11,207 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:42:12,208 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:42:12,211 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 12:42:18,213 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:42:19,215 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:42:20,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:42:21,218 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:42:22,220 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:42:23,221 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:42:24,223 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:42:25,225 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:42:26,226 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:42:27,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:42:28,230 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:29,231 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:30,233 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:31,235 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:32,237 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:33,238 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:34,240 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:35,241 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:36,243 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:37,244 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:38,246 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:39,248 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:40,249 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:41,251 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:42,252 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:43,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:44,256 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:45,257 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:46,259 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:47,261 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:48,262 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:49,264 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:50,266 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:51,267 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:52,269 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:52,414 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:42:52,415 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:42:52,416 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 12:42:53,271 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:54,272 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:55,274 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:56,275 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:57,277 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:58,279 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:42:59,280 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:00,282 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:01,284 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:02,285 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:03,287 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:04,289 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:05,290 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:06,292 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:07,293 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:07,296 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 12:43:13,298 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:14,300 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:15,301 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:16,303 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:17,305 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:18,306 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:19,308 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:20,309 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:21,311 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:22,312 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:23,314 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:24,316 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:25,317 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:26,319 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:27,321 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:28,322 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:29,324 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:30,325 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:31,327 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:32,329 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:33,330 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:34,332 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:35,334 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:36,335 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:37,337 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:38,338 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:39,340 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:40,342 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:41,343 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:42,345 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:43,346 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:44,348 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:45,349 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:46,351 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:47,353 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:48,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:49,356 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:50,357 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:51,359 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:52,360 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:43:52,412 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:43:52,414 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:43:52,415 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at 
org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at 
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 12:43:53,362 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:43:54,363 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:43:55,365 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:43:56,366 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:43:57,368 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:43:58,369 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:43:59,371 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:44:00,372 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:44:01,374 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:44:02,375 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:44:02,378 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 12:44:08,380 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:44:09,381 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:44:10,383 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:44:11,384 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:44:12,386 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:44:13,387 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:44:14,389 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:44:15,390 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:44:16,392 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:44:17,393 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:44:18,395 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:44:19,396 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:44:20,398 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:44:21,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:44:22,401 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:44:23,402 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:44:24,404 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:44:25,405 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:44:26,406 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:44:27,408 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:44:28,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:29,411 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:30,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:31,414 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:32,415 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:33,417 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:34,418 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:35,420 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:36,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:37,423 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:38,424 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:39,426 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:40,427 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:41,429 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:42,430 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:43,431 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:44,433 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:45,434 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:46,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:47,437 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:48,439 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:49,440 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:50,441 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:51,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:52,434 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:44:52,435 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:44:52,436 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 12:44:52,444 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:53,446 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:54,447 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:55,449 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:56,450 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:57,452 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:44:57,454 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 12:45:03,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:04,458 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:05,460 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:06,461 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:07,462 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:08,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:09,465 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:10,466 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:11,468 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:12,469 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:13,470 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:14,472 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:15,473 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:16,474 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:17,476 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:18,477 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:19,478 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:20,480 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:21,481 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:22,482 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:23,484 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:24,485 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:25,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:26,488 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:27,489 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:28,491 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:29,492 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:30,493 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:31,495 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:32,496 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:33,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:34,499 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:35,501 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:36,502 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:37,503 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:38,505 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:39,506 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:40,507 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:41,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:42,510 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:43,512 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:44,513 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:45,515 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:46,516 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:47,517 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:48,519 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:49,520 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:50,521 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:51,523 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:45:52,418 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:45:52,419 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at 
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:45:52,420 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at 
org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 12:45:52,524 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:45:52,525 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 12:45:58,527 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:45:59,529 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:00,530 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:01,531 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:02,533 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:03,535 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:04,536 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:05,537 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:06,539 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:07,540 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:08,542 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:09,543 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:10,544 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:11,546 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:12,547 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:13,548 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:14,550 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:15,551 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:16,552 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:17,554 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:18,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:19,556 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:20,558 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:21,559 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:22,560 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:23,562 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:24,564 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:25,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:26,566 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:27,568 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:28,569 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:29,570 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:30,571 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:31,573 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:32,574 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:33,575 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:34,577 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:35,578 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:36,579 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:37,580 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:38,582 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:39,583 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:40,584 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:41,586 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:42,587 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:43,588 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:44,590 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:45,591 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:46,592 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:47,593 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:47,595 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 12:46:52,429 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at 
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at 
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:46:52,430 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at 
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at 
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:46:52,432 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at 
org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) 
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 12:46:53,597 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:54,598 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:55,599 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:56,601 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:57,602 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:58,603 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:46:59,604 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:00,606 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:01,607 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:02,608 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:03,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:04,611 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:05,612 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:06,614 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:07,615 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:08,616 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:09,617 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:10,619 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:11,620 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:12,621 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:13,623 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:14,624 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:15,625 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:16,626 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:17,628 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:18,629 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:19,630 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:20,632 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:21,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:22,634 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:23,636 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:24,637 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:25,638 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:26,640 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:27,641 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:28,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:29,643 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:30,645 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:31,646 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:32,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:33,649 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:34,650 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:35,651 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:36,653 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:37,654 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:38,655 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:39,657 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:40,658 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:41,659 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:42,660 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:42,662 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 12:47:48,664 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:49,665 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:50,667 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:47:51,668 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:47:52,417 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:47:52,418 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:47:52,419 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 12:47:52,669 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:47:53,670 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:47:54,672 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:47:55,673 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:47:56,674 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:47:57,676 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:47:58,677 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:47:59,678 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:00,679 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:01,681 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:02,682 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:03,683 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:04,685 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:05,686 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:06,687 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:07,689 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:08,690 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:09,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:10,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:11,694 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:12,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:13,696 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:14,697 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:15,699 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:16,700 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:17,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:18,703 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:19,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:20,705 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:21,707 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:22,708 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:23,710 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:24,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:25,712 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:26,714 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:27,715 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:28,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:29,718 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:30,719 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:31,720 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:32,722 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:33,723 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:34,724 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:35,725 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:36,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:37,728 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:37,730 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 12:48:43,732 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:44,733 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:45,734 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:46,736 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:47,737 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:48,738 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:49,740 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:50,741 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:48:51,742 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:48:52,430 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 12:48:52,431 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 12:48:52,433 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 12:48:52,744 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:48:53,745 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:48:54,746 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:48:55,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:48:56,749 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:48:57,750 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:48:58,752 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:48:59,753 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:00,754 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:01,756 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:02,757 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:03,758 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:04,760 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:05,761 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:06,762 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:07,764 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:08,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:09,766 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:10,767 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:11,769 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:12,770 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:13,771 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:14,772 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:15,774 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:16,775 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:17,776 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:18,778 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:19,779 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:20,780 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:21,782 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:22,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:23,785 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:24,786 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:25,788 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:26,789 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:27,790 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:28,791 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:29,793 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:30,794 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:31,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:49:32,797 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:49:32,799 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 12:49:38,801 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:49:39,802 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:49:40,803 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:49:41,805 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:49:42,806 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:49:43,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:49:44,809 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:49:45,810 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:49:46,811 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:49:47,812 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:49:48,814 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:49:49,815 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:49:50,816 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:49:51,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:49:52,419 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:49:52,420 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:49:52,421 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 12:49:52,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:49:53,820 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:49:54,822 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:49:55,823 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:49:56,824 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:49:57,826 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:49:58,827 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:49:59,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:00,830 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:01,831 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:02,832 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:03,834 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:04,835 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:05,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:06,838 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:07,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:08,840 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:09,842 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:10,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:11,844 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:12,845 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:13,847 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:14,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:15,849 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:16,851 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:17,852 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:18,853 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:19,854 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:20,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:21,857 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:22,858 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:23,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:24,861 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:25,863 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:26,864 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:27,865 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:27,867 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 12:50:33,869 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:34,870 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:35,872 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:36,873 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:37,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:38,876 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:39,877 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:40,878 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:41,879 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:42,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:43,882 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:44,883 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:45,885 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:46,886 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:47,887 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:48,888 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:49,890 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:50,891 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:50:51,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:50:52,419 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 12:50:52,420 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 12:50:52,422 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 12:50:52,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:50:53,895 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:50:54,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:50:55,898 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:50:56,899 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:50:57,900 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:50:58,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:50:59,903 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:00,904 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:01,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:02,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:03,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:04,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:05,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:06,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:07,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:08,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:09,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:10,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:11,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:12,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:13,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:14,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:15,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:16,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:17,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:18,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:19,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:20,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:21,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:22,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:22,935 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 12:51:28,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:29,938 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:30,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:31,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:32,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:33,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:34,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:35,946 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:36,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:51:37,948 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:51:38,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:51:39,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:51:40,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:51:41,953 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:51:42,955 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:51:43,956 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:51:44,957 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:51:45,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:51:46,960 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:51:47,961 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:51:48,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:51:49,964 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:51:50,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:51:51,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:51:52,432 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:51:52,433 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:51:52,434 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 12:51:52,967 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:51:53,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:51:54,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:51:55,971 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:51:56,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:51:57,973 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:51:58,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:51:59,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:00,977 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:01,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:02,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:03,981 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:04,982 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:05,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:06,985 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:07,986 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:08,987 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:09,989 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:10,990 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:11,991 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:12,992 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:13,994 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:14,995 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:15,996 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:16,997 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:17,998 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:18,001 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 12:52:24,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:25,004 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:26,005 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:27,006 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:28,007 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:29,009 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:30,010 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:31,011 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:32,012 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:33,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:34,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:35,016 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:36,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:37,019 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:38,020 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:39,021 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:40,022 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:41,023 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:42,025 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:43,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:44,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:45,028 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:46,029 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:47,030 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:48,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:49,033 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:50,034 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:51,035 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:52,037 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:52:52,437 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 12:52:52,438 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 12:52:52,440 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 12:52:53,038 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:52:54,039 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:52:55,040 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:52:56,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:52:57,043 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:52:58,044 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:52:59,045 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:00,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:01,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:02,049 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:03,050 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:04,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:05,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:06,054 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:07,055 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:08,056 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:09,057 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:10,058 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:11,060 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:12,061 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:13,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:13,064 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 12:53:19,066 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:20,067 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:21,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:22,070 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:23,071 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:24,073 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:25,074 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:26,075 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:27,076 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:28,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:29,079 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:30,080 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:31,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:32,083 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:33,084 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:34,085 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:35,087 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:36,088 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:37,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:38,090 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:39,092 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:40,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:41,094 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:42,096 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:43,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:44,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:45,099 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:46,101 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:47,102 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:48,103 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:49,105 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:50,106 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:51,107 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:53:52,108 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:53:52,431 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:53:52,432 WARN
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:53:52,432 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 12:53:53,110 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:53:54,111 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:53:55,112 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:53:56,114 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:53:57,115 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:53:58,116 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:53:59,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:00,119 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:01,120 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:02,121 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:03,123 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:04,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:05,125 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:06,127 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:07,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:08,129 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:08,132 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 12:54:14,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:15,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:16,136 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:17,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:18,139 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:19,140 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:20,141 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:21,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:22,144 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:23,145 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:24,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:25,148 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:26,149 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:27,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:28,151 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:29,153 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:30,154 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:31,155 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:32,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:33,158 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:34,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:35,161 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:36,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:37,163 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:38,164 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:39,166 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:40,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:41,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:42,169 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:43,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:44,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:45,173 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:46,174 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:47,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:48,177 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:49,178 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:50,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:51,181 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:52,182 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:54:52,417 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:54:52,418 WARN
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:54:52,419 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 12:54:53,183 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:54:54,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:54:55,186 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:54:56,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:54:57,188 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:54:58,190 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:54:59,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:00,192 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:01,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:02,195 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:03,196 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:03,198 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 12:55:09,200 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:10,202 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:11,203 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:12,204 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:13,206 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:14,207 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:15,208 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:16,209 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:17,211 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:18,212 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:19,213 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:20,215 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:21,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:22,217 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:23,219 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:24,220 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:25,221 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:26,222 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:27,224 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:28,225 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:29,226 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:30,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:31,229 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:32,230 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:33,232 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:34,233 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:35,234 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:36,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:37,237 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:38,238 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:39,240 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:40,241 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:41,242 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:42,243 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:43,245 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:44,246 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:45,247 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:46,249 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:47,250 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:48,251 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:49,253 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:50,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:51,255 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:52,257 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:52,420 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:55:52,421 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:55:52,422 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 12:55:53,258 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:54,259 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:55,260 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:56,262 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:57,263 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:58,264 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:55:58,266 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 12:56:04,269 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:05,270 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:06,271 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:07,272 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:08,274 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:09,275 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:10,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:11,278 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:12,279 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:13,280 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:14,282 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:15,283 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:16,284 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:17,285 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:18,287 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:19,288 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:20,289 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:21,291 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:22,292 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:23,293 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:24,294 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:25,295 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:26,297 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:27,298 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:28,299 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:29,301 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:30,302 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:31,303 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:32,304 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:33,306 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:34,307 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:35,308 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:36,310 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:37,311 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:38,312 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:39,314 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:40,315 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:41,316 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:42,318 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:43,319 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:44,320 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:45,321 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:46,323 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:47,324 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:48,325 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:49,327 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:50,328 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:51,329 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:52,330 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:56:52,432 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:56:52,433 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:56:52,435 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 12:56:53,332 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:56:53,333 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 12:56:59,335 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:00,336 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:01,337 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:02,339 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:03,340 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:04,342 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:05,343 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:06,344 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:07,345 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:08,347 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:09,348 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:10,349 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:11,350 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:12,352 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:13,353 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:14,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:15,356 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:16,357 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:17,358 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:18,359 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:19,360 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:20,362 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:21,363 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:22,364 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:23,366 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:24,367 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:25,369 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:26,370 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:27,371 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:28,373 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:29,374 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:30,375 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:31,376 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:32,377 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:33,379 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:34,380 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:35,382 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:36,383 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:37,384 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:38,385 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:39,387 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:40,388 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:41,389 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:42,391 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:43,392 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:44,393 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:45,394 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:46,396 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:47,397 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:48,398 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:57:48,400 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 12:57:52,419 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:57:52,421 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:57:52,422 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at 
org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 12:57:54,403 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:57:55,404 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:57:56,405 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:57:57,407 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:57:58,408 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:57:59,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:00,410 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:01,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:02,413 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:03,414 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:04,416 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:05,417 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:06,418 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:07,420 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:08,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:09,422 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:10,423 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:11,425 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:12,426 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:13,427 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:14,429 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:15,430 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:16,431 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:17,432 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:18,434 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:19,435 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:20,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:21,438 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:22,439 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:23,440 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:24,441 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:25,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:26,444 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:27,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:28,447 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:29,448 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:30,449 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:31,450 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:32,452 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:33,453 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:34,455 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:35,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:36,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:37,458 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:38,460 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:39,461 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:40,462 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:41,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:42,465 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:43,466 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:43,469 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 12:58:49,471 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:50,472 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:51,473 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:52,420 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:58:52,421 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:58:52,422 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 12:58:52,475 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:53,476 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:54,477 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:55,479 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:56,480 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:57,481 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:58,482 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:58:59,484 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:00,485 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:01,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:02,488 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:03,489 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:04,490 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:05,492 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:06,493 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:07,494 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:08,495 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:09,497 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:10,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:11,499 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:12,500 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:13,502 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:14,503 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:15,504 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:16,506 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:17,507 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:18,508 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:19,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:20,511 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:21,512 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:22,513 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:23,515 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:24,516 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:25,518 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:26,519 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:27,520 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:28,522 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:29,523 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:30,524 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:31,526 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:32,527 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:33,528 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:34,530 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:35,531 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:36,532 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:37,534 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:38,535 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:38,537 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 12:59:44,538 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:45,540 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:46,541 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:47,542 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:48,544 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:49,545 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:50,546 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:51,547 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 12:59:52,437 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 12:59:52,438 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 12:59:52,439 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 12:59:52,549 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:59:53,550 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:59:54,551 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:59:55,552 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:59:56,554 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:59:57,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:59:58,556 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 12:59:59,557 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:00,559 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:01,560 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:02,561 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:03,563 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:04,564 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:05,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:06,567 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:07,568 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:08,569 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:09,571 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:10,572 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:11,573 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:12,575 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:13,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:14,577 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:15,579 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:16,580 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:17,581 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:18,582 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:19,584 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:20,585 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:21,587 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:22,588 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:23,590 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:24,591 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:25,592 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:26,594 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:27,595 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:28,596 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:29,597 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:30,599 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:31,600 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:32,601 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:33,602 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:33,604 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:00:39,606 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:40,607 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:41,609 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:42,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:43,611 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:44,613 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:45,614 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:46,615 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:47,617 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:48,618 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:49,619 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:50,620 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:51,622 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:00:52,418 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:00:52,419 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:00:52,420 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 13:00:52,623 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:00:53,624 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:00:54,626 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:00:55,627 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:00:56,628 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:00:57,629 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:00:58,630 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:00:59,632 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:00,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:01,634 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:02,636 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:03,637 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:04,639 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:05,640 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:06,641 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:07,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:08,644 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:09,645 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:10,646 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:11,648 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:12,649 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:13,650 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:14,652 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:15,653 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:16,654 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:17,656 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:18,657 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:19,658 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:20,660 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:21,661 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:22,662 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:23,664 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:24,665 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:25,667 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:26,668 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:27,669 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:28,670 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:28,672 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 13:01:34,674 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:35,675 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:36,677 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:37,678 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:38,679 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:39,680 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:40,682 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:41,683 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:42,684 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:43,686 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:44,687 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:45,688 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:46,689 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:47,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:48,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:49,693 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:50,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:51,696 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:52,433 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:01:52,435 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at 
org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at 
org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:01:52,436 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at 
javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at 
org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at 
java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 45 more 2025-07-07 13:01:52,697 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:53,699 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:54,700 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:55,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:56,702 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:57,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:58,705 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:01:59,706 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:00,707 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:01,709 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:02,710 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:03,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:04,713 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:05,714 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:06,715 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:07,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:08,717 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:09,719 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:10,720 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:11,721 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:12,722 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:13,724 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:14,725 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:15,726 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:16,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:17,729 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:18,730 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:19,731 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:20,732 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:21,734 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:22,735 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:23,737 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:23,738 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 13:02:29,740 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:30,742 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:31,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:32,744 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:33,745 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:34,747 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:35,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:36,750 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:37,751 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:38,752 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:39,753 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:40,755 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:41,756 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:42,757 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:43,758 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:44,760 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:45,761 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:46,762 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:47,764 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:48,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:49,766 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:50,767 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:51,769 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:02:52,422 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at 
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at 
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:02:52,424 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at 
com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at 
org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:02:52,424 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 13:02:52,770 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:02:53,771 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:02:54,773 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:02:55,774 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:02:56,775 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:02:57,776 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:02:58,778 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:02:59,779 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:00,780 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:01,782 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:02,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:03,784 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:04,786 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:05,787 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:06,788 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:07,789 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:08,791 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:09,792 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:10,793 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:11,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:12,796 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:13,797 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:14,799 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:15,800 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:16,801 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:17,802 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:18,804 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:18,806 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:03:24,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:25,809 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:26,810 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:27,811 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:28,812 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:29,814 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:30,815 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:31,816 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:32,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:33,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:34,820 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:35,821 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:36,823 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:37,824 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:38,825 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:39,827 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:40,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:41,829 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:42,830 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:43,832 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:44,833 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:45,834 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:46,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:47,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:48,838 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:49,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:50,841 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:51,842 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:52,433 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:03:52,434 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:03:52,435 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 45 more
2025-07-07 13:03:52,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:53,845 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:54,846 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:55,847 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:56,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:57,849 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:58,851 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:03:59,852 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:04:00,853 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:04:01,855 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:04:02,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:04:03,857 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:04,859 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:05,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:06,861 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:07,863 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:08,864 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:09,865 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:10,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:11,868 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:12,869 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:13,870 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:13,872 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 13:04:19,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:20,875 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:21,877 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:22,878 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:23,879 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:24,880 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:25,882 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:26,883 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:27,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:28,886 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:29,887 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:30,888 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:31,889 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:32,890 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:33,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:34,893 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:35,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:36,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:37,897 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:38,898 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:39,900 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:40,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:41,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:42,903 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:43,905 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:44,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:45,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:46,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:47,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:48,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:49,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:50,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:51,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:04:52,420 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at 
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at 
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:04:52,421 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at 
com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at 
org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:04:52,422 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at 
org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 13:04:52,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:04:53,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:04:54,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:04:55,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:04:56,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:04:57,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:04:58,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:04:59,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:00,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:01,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:02,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:03,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:04,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:05,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:06,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:07,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:08,936 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:08,938 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:05:14,940 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:15,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:16,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:17,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:18,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:19,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:20,948 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:21,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:22,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:23,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:24,953 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:25,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:26,956 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:27,957 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:28,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:29,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:30,961 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:31,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:32,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:33,964 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:34,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:35,967 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:36,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:37,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:38,971 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:39,973 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:40,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:41,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:42,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:43,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:44,979 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:45,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:46,982 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:47,983 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:48,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:49,986 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:50,987 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:51,988 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:52,435 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:05:52,436 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:05:52,438 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 13:05:52,990 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:53,991 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:54,992 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:55,994 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:56,995 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:57,996 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:58,997 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:05:59,999 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:06:01,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:06:02,001 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:06:03,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:06:04,004 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:06:04,006 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:06:10,008 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:06:11,009 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:12,010 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:13,012 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:14,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:15,014 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:16,016 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:17,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:18,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:19,019 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:20,021 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:21,022 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:22,023 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:23,024 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:24,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:25,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:26,028 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:27,029 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:28,031 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:29,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:30,033 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:31,035 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:32,036 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:33,037 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:34,038 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:35,040 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:36,041 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:37,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:38,044 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:39,045 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:40,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:41,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:42,049 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:43,050 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:44,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:45,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:46,054 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:47,055 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:48,056 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:49,058 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:50,059 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:51,060 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:52,061 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:06:52,423 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:06:52,424 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:06:52,426 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
        at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
        at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        ... 50 more
2025-07-07 13:06:53,063 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:06:54,064 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:06:55,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:06:56,066 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:06:57,068 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:06:58,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:06:59,070 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:06:59,072 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:07:05,074 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:06,075 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:07,077 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:08,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:09,079 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:10,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:11,082 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:12,083 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:13,084 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:14,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:15,087 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:16,088 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:17,090 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:18,091 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:19,092 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:20,094 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:21,095 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:22,096 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:23,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:24,099 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:25,100 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:26,101 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:27,102 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:28,104 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:29,105 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:30,106 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:31,107 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:32,109 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:33,110 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:34,111 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:35,113 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:36,114 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:37,115 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:38,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:39,118 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:40,119 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:41,120 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:42,122 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:43,123 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:44,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:45,125 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:46,127 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:47,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:48,129 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:49,130 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:50,132 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:51,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:52,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:52,435 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
        at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:07:52,436 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
        at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
        at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:07:52,437 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
        at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
        at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        ... 50 more
2025-07-07 13:07:53,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:54,137 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:07:54,138 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:08:00,140 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:08:01,141 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:08:02,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:08:03,144 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:08:04,145 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:08:05,147 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:08:06,148 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:08:07,149 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:08:08,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:08:09,152 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:08:10,153 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:08:11,154 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:08:12,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:08:13,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:08:14,158 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:08:15,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:08:16,160 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:08:17,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:08:18,163 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:19,164 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:20,166 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:21,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:22,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:23,169 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:24,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:25,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:26,174 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:27,175 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:28,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:29,177 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:30,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:31,180 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:32,181 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:33,182 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:34,184 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:35,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:36,186 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:37,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:38,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:39,190 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:40,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:41,192 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:42,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:43,195 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:44,196 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:45,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:46,199 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:47,200 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:48,201 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:49,202 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:08:49,204 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 13:08:52,420 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at 
sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at 
org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at 
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:08:52,421 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:08:52,422 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at 
org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at 
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at 
java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 13:08:55,206 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:08:56,207 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:08:57,208 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:08:58,210 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:08:59,211 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:00,212 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:01,213 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:02,215 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:03,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:04,217 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:05,219 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:06,220 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:07,221 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:08,222 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:09,224 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:10,225 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:11,226 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:12,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:13,229 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:14,230 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:15,231 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:16,233 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:17,234 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:18,235 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:19,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:20,238 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:21,239 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:22,240 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:23,241 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:24,243 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:25,244 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:26,245 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:27,247 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:28,248 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:29,249 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:30,250 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:31,252 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:32,253 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:33,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:34,255 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:35,257 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:36,258 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:37,259 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:38,261 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:39,262 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:40,263 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:41,265 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:42,266 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:43,267 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:44,268 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:44,270 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:09:50,272 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:50,744 WARN com.cloudera.cmf.event.publish.EventStorePublisherWithRetry: Failed to publish event: SimpleEvent{attributes={ROLE=[hdfs-DATANODE-b30a464b10a57fdd49ea734cd52a8291], HOSTS=[dmidlkprdls04.svr.luc.edu], ROLE_TYPE=[DATANODE], CATEGORY=[LOG_MESSAGE], EVENTCODE=[EV_LOG_EVENT], SERVICE=[hdfs], SERVICE_TYPE=[HDFS], LOG_LEVEL=[WARN], HOST_IDS=[5c33df90-d247-4c6d-b9e0-5908a423580a], SEVERITY=[IMPORTANT]}, content=Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret, timestamp=1751909970885} - 1 of 60 failure(s) in last 1819s
java.io.IOException: Error connecting to dmidlkprdls01.svr.luc.edu/192.168.158.1:7184
    at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.getChannel(NettyTransceiver.java:269)
    at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.<init>(NettyTransceiver.java:197)
    at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.<init>(NettyTransceiver.java:120)
    at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.checkSpecificRequestor(AvroEventStorePublishProxy.java:120)
    at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.publishEvent(AvroEventStorePublishProxy.java:204)
    at com.cloudera.cmf.event.publish.EventStorePublisherWithRetry$PublishEventTask.run(EventStorePublisherWithRetry.java:242)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)
Caused by: com.cloudera.cmf.event.shaded.io.netty.channel.AbstractChannel$AnnotatedNoRouteToHostException: No route to host: dmidlkprdls01.svr.luc.edu/192.168.158.1:7184
Caused by: java.net.NoRouteToHostException: No route to host
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
    at com.cloudera.cmf.event.shaded.io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:337)
    at com.cloudera.cmf.event.shaded.io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:339)
    at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:776)
    at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
    at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
    at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
    at com.cloudera.cmf.event.shaded.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
    at com.cloudera.cmf.event.shaded.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:09:51,273 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:52,274 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:52,440 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:09:52,443 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:09:52,444 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 13:09:53,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:54,277 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:55,278 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:56,279 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:57,281 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:58,282 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:09:59,283 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:10:00,284 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:10:01,286 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:10:02,287 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:10:03,288 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:10:04,290 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:10:05,291 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:10:06,292 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:10:07,294 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:10:08,295 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:10:09,296 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:10:10,297 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:10:11,299 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:10:12,300 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:10:13,301 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:14,303 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:15,304 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:16,305 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:17,306 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:18,308 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:19,309 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:20,310 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:21,311 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:22,313 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:23,314 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:24,316 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:25,317 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:26,318 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:27,320 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:28,321 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:29,322 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:30,323 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:31,325 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:32,326 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:33,327 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:34,329 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:35,330 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:36,331 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:37,333 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:38,334 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:39,335 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:39,337 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 13:10:45,339 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:46,340 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:47,341 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:48,343 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:49,344 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:50,345 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:51,346 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:52,348 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:10:52,424 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:10:52,425 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:10:52,426 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 13:10:53,349 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:10:54,350 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:10:55,352 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:10:56,353 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:10:57,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:10:58,355 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:10:59,357 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:00,358 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:01,359 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:02,361 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:03,362 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:04,363 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:05,365 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:06,366 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:07,367 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:08,368 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:09,369 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:10,370 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:11,372 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:12,373 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:13,374 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:14,376 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:15,377 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:16,378 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:17,379 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:18,381 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:19,382 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:20,383 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:21,384 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:22,386 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:23,387 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:24,389 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:25,390 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:26,391 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:27,393 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:28,394 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:29,395 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:30,396 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:31,398 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:32,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:33,400 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:34,402 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:34,403 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:11:40,405 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:41,407 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:42,408 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:43,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:44,410 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:45,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:46,413 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:47,414 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:48,415 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:49,416 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:50,418 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:51,419 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:52,420 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:52,437 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:11:52,439 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:11:52,439 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 13:11:53,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:54,422 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:55,424 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:56,425 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:57,433 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:58,435 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:11:59,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:00,437 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:01,439 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:02,440 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:03,441 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:04,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:05,444 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:06,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:07,447 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:08,448 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:09,449 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:10,451 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:11,452 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:12,454 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:13,455 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:14,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:15,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:16,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:17,460 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:18,462 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:19,463 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:20,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:21,465 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:22,467 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:23,468 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:24,470 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:25,471 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:26,472 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:12:27,473 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:12:28,475 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:12:29,476 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:12:29,477 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 13:12:35,479 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:12:36,480 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:12:37,482 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:12:38,483 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:12:39,484 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:12:40,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:12:41,487 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:12:42,488 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:12:43,490 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:12:44,491 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:12:45,492 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:12:46,494 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:12:47,495 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:12:48,496 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:12:49,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:12:50,499 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:12:51,500 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:12:52,423 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:12:52,424 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:12:52,424 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 13:12:52,501 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:12:53,503 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:12:54,504 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:12:55,505 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:12:56,507 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:12:57,508 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:12:58,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:12:59,511 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:00,512 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:01,513 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:02,514 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:03,516 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:04,517 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:05,518 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:06,520 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:07,521 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:08,522 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:09,524 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:10,525 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:11,526 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:12,528 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:13,529 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:14,530 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:15,532 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:16,533 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:17,534 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:18,536 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:19,537 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:20,538 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:21,540 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:22,541 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:23,542 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:24,543 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:24,545 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 13:13:30,547 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:31,548 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:32,549 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:33,551 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:34,552 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:35,553 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:36,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:37,556 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:38,557 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:39,559 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:40,560 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:41,561 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:42,563 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:43,564 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:44,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:45,567 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:46,568 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:47,569 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:48,571 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:49,572 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:50,573 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:51,574 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:52,427 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:13:52,428 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:13:52,428 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 13:13:52,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:53,577 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:54,578 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:55,580 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:56,581 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:57,582 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:58,584 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:13:59,585 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:00,586 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:01,588 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:02,589 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:03,590 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:04,591 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:05,593 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:06,594 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:07,595 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:08,597 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:09,598 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:10,599 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:11,601 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:12,602 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:13,603 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:14,605 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:15,606 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:16,607 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:17,609 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:18,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:19,611 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:19,614 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 13:14:25,615 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:26,617 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:27,618 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:28,619 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:29,621 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:30,622 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:31,623 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:32,624 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:33,626 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:34,627 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:35,629 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:36,630 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:37,631 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:38,632 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:39,634 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:40,635 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:41,636 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:42,638 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:43,639 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:44,640 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:45,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:46,643 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:47,644 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:48,645 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:49,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:50,648 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:51,649 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:52,436 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:14:52,438 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:14:52,439 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 13:14:52,651 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:53,652 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:54,653 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:55,655 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:56,656 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:57,657 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:58,659 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:14:59,660 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:15:00,661 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:15:01,662 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:15:02,664 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:15:03,665 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:15:04,666 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:15:05,668 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:15:06,669 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:15:07,671 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:15:08,672 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:15:09,673 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:15:10,675 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:15:11,676 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:15:12,677 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:15:13,679 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:15:14,680 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:15:14,682 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 13:15:20,684 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:15:21,685 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:15:22,687 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:15:23,688 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:15:24,689 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:25,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:26,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:27,693 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:28,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:29,696 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:30,697 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:31,698 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:32,700 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:33,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:34,702 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:35,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:36,705 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:37,706 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:38,708 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:39,709 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:40,710 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:41,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:42,713 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:43,714 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:44,715 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:45,717 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:46,718 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:47,719 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:48,720 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:49,722 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:50,723 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:51,724 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:52,424 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:15:52,425 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:15:52,425 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 13:15:52,726 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:53,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:54,728 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:55,730 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:56,731 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:57,732 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:58,734 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:15:59,735 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:00,736 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:01,738 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:02,739 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:03,740 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:04,742 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:05,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:06,744 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:07,746 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:08,747 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:09,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:09,751 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:16:15,754 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:16,755 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:17,756 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:18,758 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:19,759 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:20,760 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:21,762 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:22,763 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:23,764 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:24,766 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:25,767 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:26,768 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:27,770 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:28,771 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:29,772 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:30,773 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:31,775 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:32,776 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:33,777 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:34,779 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:35,780 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:36,781 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:37,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:38,784 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:39,785 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:40,786 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:41,788 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:42,789 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:43,790 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:44,791 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:45,793 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:46,794 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:47,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:48,796 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:49,797 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:50,799 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:16:51,800 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:16:52,438 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:16:52,439 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:16:52,441 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 13:16:52,801 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:16:53,802 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:16:54,804 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:16:55,805 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:16:56,806 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:16:57,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:16:58,808 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:16:59,810 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:00,811 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:01,812 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:02,813 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:03,815 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:04,816 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:04,819 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 13:17:10,821 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:11,823 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:12,824 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:13,825 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:14,826 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:15,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:16,829 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:17,830 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:18,831 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:19,832 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:20,834 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:21,835 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:22,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:23,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:24,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:25,840 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:26,841 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:27,842 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:28,844 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:29,845 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:30,846 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:31,847 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:32,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:33,850 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:34,851 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:35,852 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:36,854 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:37,855 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:38,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:39,857 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:40,858 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:41,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:42,861 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:43,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:44,863 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:45,865 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:46,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:47,867 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:48,868 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:49,870 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:50,871 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:17:51,872 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:17:52,423 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
        at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:17:52,424 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
        at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
        at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:17:52,425 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
        at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
        at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        ... 50 more
2025-07-07 13:17:52,873 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:17:53,875 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:17:54,876 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:17:55,877 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:17:56,878 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:17:57,880 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:17:58,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:17:59,882 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:17:59,884 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:18:05,886 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:06,887 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:07,889 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:08,890 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:09,891 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:10,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:11,893 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:12,895 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:13,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:14,897 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:15,898 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:16,899 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:17,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:18,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:19,903 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:20,904 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:21,905 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:22,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:23,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:24,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:25,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:26,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:27,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:28,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:29,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:30,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:31,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:32,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:33,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:34,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:35,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:36,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:37,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:38,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:39,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:40,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:41,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:42,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:43,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:44,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:45,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:46,936 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:47,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:48,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:49,940 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:50,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:51,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:52,436 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
        at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:18:52,437 WARN
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:18:52,438 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more
2025-07-07 13:18:52,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:53,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:54,946 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:18:54,947 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:19:00,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:01,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:02,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:03,953 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:04,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:05,955 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:06,957 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:07,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:08,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:09,960 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:10,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:11,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:12,964 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:13,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:14,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:15,967 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:16,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:17,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:18,971 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:19,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:20,973 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:21,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:22,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:23,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:24,979 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:25,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:26,982 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:27,983 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:28,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:29,985 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:30,987 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:31,988 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:32,989 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:33,990 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:34,992 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:35,993 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:36,995 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:37,996 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:38,997 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:39,998 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:41,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:42,001 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:43,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:44,003 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:45,005 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:46,006 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:47,007 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:48,009 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:49,010 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:50,011 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:50,013 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:19:52,429 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:19:52,430 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:19:52,431 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 13:19:56,014 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:57,016 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:58,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:19:59,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:00,020 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:01,021 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:02,022 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:03,023 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:04,025 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:05,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:06,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:07,029 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:08,030 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:09,031 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:10,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:11,033 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:12,035 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:13,036 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:14,037 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:15,038 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:16,040 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:17,041 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:18,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:19,043 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:20,044 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:21,045 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:22,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:23,048 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:24,049 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:25,050 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:26,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:27,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:28,054 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:29,055 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:30,056 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:31,057 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:32,058 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:33,060 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:34,061 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:35,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:36,063 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:37,064 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:38,066 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:39,067 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:40,068 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:41,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:42,070 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:43,071 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:44,073 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:45,074 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:45,075 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:20:51,077 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:52,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:52,440 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:20:52,441 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:21:52,442 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more
2025-07-07 13:20:53,079 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:54,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:55,082 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:56,083 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:57,084 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:58,085 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:20:59,087 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:00,088 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:01,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:02,090 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:03,092 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:04,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:05,094 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:06,096 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:07,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:08,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:09,099 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:10,100 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:11,102 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:12,103 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:13,104 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:14,105 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:15,107 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:16,108 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:17,109 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:18,110 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:19,112 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:20,113 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:21,114 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:22,115 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:23,116 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:24,118 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:25,120 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:26,121 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:27,122 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:28,123 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:29,125 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:30,126 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:31,127 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:32,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:33,129 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:34,131 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:35,132 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:36,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:37,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:38,136 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:39,137 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:40,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:40,140 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:21:46,142 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:47,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:48,144 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:49,145 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:50,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:51,148 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:52,149 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:52,426 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:21:52,428 WARN
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:21:52,429 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more
2025-07-07 13:21:53,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:54,151 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:55,152 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:56,154 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:57,155 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:58,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:21:59,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:00,158 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:01,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:02,161 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:03,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:04,163 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:05,164 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:06,166 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:07,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:08,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:09,169 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:10,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:11,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:12,173 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:13,174 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:14,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:15,177 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:16,178 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:17,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:18,180 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:19,182 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:20,183 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:21,184 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:22,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:23,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:24,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:25,190 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:26,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:27,192 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:28,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:29,195 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:30,196 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:31,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:32,199 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:33,200 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:34,201 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:35,202 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:35,204 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:22:41,206 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:42,207 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:43,208 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:44,209 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:45,210 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:46,212 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:47,213 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:48,214 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:49,215 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:50,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:51,218 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:52,219 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:22:52,438 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:22:52,439 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:22:52,440 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 13:22:53,220 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:54,221 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:55,223 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:56,224 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:57,225 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:58,226 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:22:59,227 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:00,229 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:01,230 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:02,231 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:03,232 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:04,234 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:05,235 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:06,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:07,238 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:08,239 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:09,240 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:10,241 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:11,242 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:12,244 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:13,245 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:14,246 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:15,247 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:16,249 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:17,250 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:18,252 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:19,253 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:20,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:21,256 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:22,257 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:23,258 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:24,260 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:25,261 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:26,263 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:27,264 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:28,265 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:29,267 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:30,268 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:30,270 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:23:36,272 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:37,273 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:38,274 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:39,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:40,277 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:41,278 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:42,280 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:43,281 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:44,282 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:45,283 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:46,285 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:47,286 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:48,287 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:49,288 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:50,290 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:51,291 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:52,292 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:23:52,426 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:23:52,426 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:23:52,427 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 13:23:53,293 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:23:54,295 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:23:55,296 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:23:56,297 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:23:57,299 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:23:58,300 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:23:59,301 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:00,302 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:01,304 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:02,305 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:03,306 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:04,308 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:05,309 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:06,310 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:07,312 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:08,313 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:09,314 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:10,316 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:11,317 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:12,318 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:13,320 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:14,321 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:15,323 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:16,324 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:17,325 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:18,326 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:19,328 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:20,329 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:21,330 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:22,332 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:23,333 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:24,335 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:25,336 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:25,338 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 13:24:31,339 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:32,341 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:33,342 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:34,343 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:35,345 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:36,346 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:37,348 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:38,349 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:39,350 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:40,351 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:41,353 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:42,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:43,355 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:44,356 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:45,358 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:46,359 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:47,360 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:48,362 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:49,363 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:50,364 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:51,365 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:52,367 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:52,440 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:24:52,441 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:24:52,442 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 13:24:53,368 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:54,369 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:55,371 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:56,372 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:57,373 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:58,375 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:24:59,376 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:00,377 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:01,378 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:02,380 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:03,381 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:04,382 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:05,383 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:06,385 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:07,386 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:08,388 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:09,389 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:10,390 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:11,391 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:12,393 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:13,394 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:14,395 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:15,397 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:16,398 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:17,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:18,400 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:19,402 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:20,403 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:20,405 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 13:25:26,407 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:27,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:28,410 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:29,411 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:30,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:31,414 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:32,415 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:33,416 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:34,417 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:35,419 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:36,420 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:37,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:38,423 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:39,424 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:40,425 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:41,426 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:42,428 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:43,429 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:44,430 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:45,431 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:46,433 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:47,434 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:48,435 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:49,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:50,438 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:51,439 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:25:52,426 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:25:52,427 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:25:52,427 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 13:25:52,440 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:25:53,441 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:25:54,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:25:55,444 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:25:56,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:25:57,446 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:25:58,447 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:25:59,449 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:00,450 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:01,451 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:02,452 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:03,454 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:04,455 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:05,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:06,458 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:07,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:08,460 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:09,461 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:10,463 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:11,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:12,465 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:13,466 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:14,468 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:15,469 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:15,471 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:26:21,473 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:22,474 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:23,475 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:24,477 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:25,478 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:26,479 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:27,480 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:28,482 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:29,483 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:30,484 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:31,485 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:32,487 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:33,488 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:34,489 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:35,490 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:36,492 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:37,493 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:38,494 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:39,496 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:40,497 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:41,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:42,499 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:43,501 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:44,502 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:45,503 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:46,504 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:47,506 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:48,507 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:49,508 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:50,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:51,511 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:52,440 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:26:52,442 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:26:52,444 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 13:26:52,512 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:53,513 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:54,514 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:55,515 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:56,517 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:57,518 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:58,519 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:26:59,520 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:27:00,522 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:27:01,523 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:27:02,524 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:27:03,525 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:27:04,526 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:27:05,528 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:06,529 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:07,530 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:08,532 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:09,533 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:10,534 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:10,536 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 13:27:16,538 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:17,539 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:18,541 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:19,542 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:20,543 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:21,544 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:22,546 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:23,547 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:24,548 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:25,549 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:26,550 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:27,552 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:28,553 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:29,554 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:30,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:31,557 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:32,558 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:33,559 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:34,560 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:35,561 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:36,563 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:37,564 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:38,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:39,567 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:40,568 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:41,569 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:42,570 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:43,572 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:44,573 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:45,574 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:46,575 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:47,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:48,578 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:49,579 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:50,580 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:51,581 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:27:52,428 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:27:52,429 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:27:52,429 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 13:27:52,582 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:27:53,584 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:27:54,585 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:27:55,586 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:27:56,587 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:27:57,588 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:27:58,590 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:27:59,591 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:00,592 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:01,593 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:02,594 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:03,595 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:04,597 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:05,598 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:05,600 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:28:11,602 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:12,603 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:13,604 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:14,605 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:15,607 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:16,608 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:17,609 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:18,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:19,612 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:20,613 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:21,614 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:22,616 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:23,617 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:24,618 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:25,619 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:26,620 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:27,622 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:28,623 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:29,624 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:30,625 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:31,626 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:32,628 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:33,629 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:34,630 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:35,631 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:36,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:37,634 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:38,635 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:39,637 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:40,638 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:41,639 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:42,641 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:43,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:44,643 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:45,644 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:46,645 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:47,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:48,648 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:49,649 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:50,650 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:51,652 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:52,432 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:28:52,433 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:28:52,434 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 13:28:52,653 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:53,654 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:54,655 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:55,656 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:56,658 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:57,659 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:58,660 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:28:59,661 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:29:00,663 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:29:00,665 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:29:06,667 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:29:07,668 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:29:08,670 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:29:09,671 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:29:10,672 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:29:11,673 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:29:12,674 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:29:13,676 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:29:14,677 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:29:15,678 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:29:16,679 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:29:17,681 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:29:18,682 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:29:19,683 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:29:20,684 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:29:21,685 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:29:22,687 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:29:23,688 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:29:24,689 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:25,690 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:26,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:27,693 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:28,694 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:29,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:30,696 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:31,698 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:32,699 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:33,700 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:34,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:35,702 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:36,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:37,705 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:38,706 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:39,708 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:40,709 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:41,710 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:42,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:43,713 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:44,714 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:45,715 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:46,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:47,717 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:48,719 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:49,720 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:50,721 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:51,722 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:52,442 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:29:52,443 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:29:52,444 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 13:29:52,723 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:53,725 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:54,726 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:55,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:29:55,728 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 13:30:01,730 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:02,731 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:03,733 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:04,734 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:05,735 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:06,736 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:07,737 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:08,739 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:09,740 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:10,741 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:11,742 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:12,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:13,745 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:14,746 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:15,747 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:16,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:17,749 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:18,750 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:19,752 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:20,753 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:21,754 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:22,755 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:23,758 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:24,759 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:25,761 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:26,762 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:27,763 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:28,764 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:29,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:30,767 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:31,768 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:32,769 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:33,770 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:34,771 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:35,773 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:36,774 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:37,775 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:38,777 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:39,778 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:40,779 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:41,780 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:42,781 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:43,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:44,784 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:45,785 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:46,786 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:47,787 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:48,789 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:49,790 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:50,791 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:50,793 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 13:30:52,428 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at 
com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at 
org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:30:52,429 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 13:30:52,430 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at 
com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 13:30:56,794 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:57,796 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:58,797 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:30:59,798 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:31:00,799 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:31:01,800 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:31:02,802 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:31:03,803 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:31:04,804 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:31:05,805 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:31:06,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:31:07,808 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:31:08,809 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:09,811 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:10,812 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:11,813 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:12,814 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:13,815 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:14,817 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:15,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:16,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:17,820 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:18,821 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:19,822 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:20,824 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:21,825 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:22,826 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:23,827 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:24,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:25,830 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:26,831 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:27,832 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:28,833 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:29,835 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:30,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:31,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:32,838 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:33,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:34,840 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:35,842 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:36,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:37,844 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:38,846 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:39,847 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:40,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:41,849 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:42,851 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:43,852 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:44,853 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:45,854 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:45,856 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:31:51,858 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:52,443 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:31:52,444 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:31:52,445 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 13:31:52,859 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:53,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:54,861 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:55,863 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:56,864 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:57,865 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:58,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:31:59,867 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:00,868 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:01,870 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:02,871 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:03,872 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:04,873 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:05,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:06,876 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:07,877 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:08,878 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:09,879 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:10,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:11,882 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:12,883 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:13,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:14,886 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:15,887 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:16,888 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:17,889 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:18,891 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:19,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:20,893 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:21,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:22,895 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:23,897 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:24,898 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:25,900 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:26,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:27,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:28,903 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:29,904 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:30,905 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:31,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:32,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:33,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:34,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:35,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:36,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:32:37,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:32:38,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:32:39,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:32:40,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:32:40,919 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 13:32:46,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:32:47,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:32:48,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:32:49,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:32:50,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:32:51,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:32:52,433 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at 
com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at 
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:32:52,434 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at 
com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at 
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:32:52,435 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at 
org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at 
org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 13:32:52,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:32:53,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:32:54,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:32:55,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:32:56,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:32:57,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:32:58,936 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:32:59,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:33:00,938 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:33:01,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:33:02,940 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:33:03,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:33:04,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:33:05,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:33:06,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:33:07,946 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:33:08,948 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:33:09,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:33:10,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:11,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:12,953 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:13,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:14,955 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:15,956 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:16,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:17,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:18,960 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:19,961 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:20,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:21,964 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:22,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:23,967 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:24,968 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:25,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:26,971 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:27,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:28,973 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:29,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:30,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:31,977 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:32,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:33,979 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:34,981 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:35,982 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:35,984 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:33:41,985 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:42,987 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:43,988 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:44,989 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:45,990 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:46,992 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:47,993 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:48,994 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:49,995 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:50,997 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:51,998 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:52,443 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:33:52,444 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:33:52,445 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 13:33:52,999 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:54,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:55,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:56,003 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:57,004 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:58,005 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:33:59,007 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:00,008 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:01,009 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:02,010 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:03,011 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:04,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:05,014 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:06,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:07,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:08,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:09,019 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:10,020 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:11,021 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:12,023 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:13,024 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:14,025 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:15,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:16,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:17,029 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:18,030 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:19,031 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:20,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:21,033 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:22,034 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:23,036 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:24,037 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:25,039 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:26,040 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:27,041 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:28,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:29,043 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:30,044 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:31,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:31,047 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:34:37,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:38,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:39,053 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:40,054 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:41,056 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:42,057 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:43,058 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:44,059 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:34:45,061 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:34:46,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:34:47,063 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:34:48,064 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:34:49,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:34:50,067 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:34:51,068 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:34:52,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:34:52,428 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:34:52,429 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:34:52,429 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 13:34:53,070 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:34:54,072 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:34:55,073 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:34:56,074 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:34:57,075 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:34:58,076 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:34:59,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:00,079 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:01,080 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:02,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:03,082 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:04,084 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:05,085 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:06,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:07,087 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:08,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:09,090 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:10,091 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:11,092 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:12,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:13,094 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:14,096 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:15,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:16,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:17,099 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:18,101 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:19,102 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:20,103 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:21,104 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:22,105 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:23,107 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:24,108 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:25,109 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:26,111 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:26,114 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 13:35:32,115 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:33,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:34,118 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:35,119 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:36,120 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:37,122 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:38,123 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:39,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:40,126 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:41,127 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:42,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:43,129 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:44,131 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:45,132 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:46,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:47,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:48,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:49,137 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:50,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:51,139 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:52,140 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:52,444 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:35:52,445 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:35:52,446 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 13:35:53,141 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:54,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:55,144 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:56,145 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:57,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:58,147 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:35:59,149 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:00,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:01,151 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:02,152 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:03,153 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:04,155 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:05,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:06,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:07,158 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:08,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:09,161 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:10,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:11,163 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:12,164 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:13,166 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:14,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:15,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:16,169 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:17,170 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:18,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:19,173 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:20,174 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:21,175 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:21,177 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 13:36:27,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:28,180 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:29,181 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:30,183 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:31,184 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:32,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:33,186 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:34,188 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:35,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:36,190 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:37,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:38,192 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:36:39,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:36:40,195 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:36:41,196 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:36:42,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:36:43,199 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:36:44,200 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:36:45,201 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:36:46,202 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:36:47,203 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:36:48,205 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:36:49,206 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:36:50,207 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:36:51,208 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:36:52,209 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:36:52,430 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:36:52,431 WARN
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:36:52,434 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 13:36:53,211 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:36:54,212 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:36:55,213 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:36:56,214 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:36:57,215 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:36:58,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:36:59,218 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:00,219 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:01,220 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:02,221 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:03,222 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:04,224 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:05,225 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:06,226 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:07,227 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:08,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:09,230 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:10,231 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:11,232 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:12,233 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:13,235 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:14,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:15,237 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:16,238 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:16,240 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:37:22,242 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:23,243 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:24,245 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:25,246 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:26,247 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:27,248 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:28,249 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:29,251 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:30,252 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:31,253 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:32,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:33,255 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:34,257 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:35,258 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:36,259 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:37,260 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:38,262 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:39,263 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:40,264 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:41,265 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:42,266 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:43,267 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:44,269 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:45,270 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:37:46,271 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:37:47,272 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:37:48,274 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:37:49,275 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:37:50,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:37:51,277 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:37:52,278 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:37:52,444 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:37:52,445 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at 
org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at 
org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:37:52,445 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at 
javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at 
org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at 
java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 45 more 2025-07-07 13:37:53,280 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:37:54,281 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:37:55,282 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:37:56,283 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:37:57,284 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:37:58,286 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:37:59,287 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:00,288 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:01,289 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:02,290 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:03,292 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:04,293 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:05,294 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:06,295 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:07,297 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:08,298 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:09,299 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:10,300 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:11,302 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:11,304 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 13:38:17,306 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:18,307 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:19,308 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:20,309 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:21,311 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:22,312 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:23,313 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:24,315 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:25,316 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:26,317 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:27,318 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:28,319 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:29,321 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:30,322 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:31,323 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:32,324 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:33,325 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:34,327 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:35,328 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:36,329 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:37,330 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:38,332 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:39,333 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:40,334 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:41,335 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:42,337 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:43,338 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:38:44,339 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:38:45,340 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:38:46,342 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:38:47,343 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:38:48,344 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:38:49,345 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:38:50,347 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:38:51,348 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:38:52,349 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:38:52,430 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
        at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:38:52,431 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
        at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
        at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:38:52,432 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
        at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
        at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        ... 50 more
2025-07-07 13:38:53,350 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:38:54,352 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:38:55,353 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:38:56,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:38:57,355 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:38:58,357 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:38:59,358 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:00,359 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:01,360 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:02,361 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:03,363 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:04,364 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:05,365 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:06,366 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:06,369 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:39:12,371 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:13,372 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:14,373 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:15,374 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:16,376 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:17,377 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:18,378 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:19,379 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:20,380 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:21,382 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:22,383 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:23,384 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:24,385 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:25,387 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:26,388 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:27,389 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:28,390 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:29,391 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:30,393 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:31,394 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:32,395 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:33,396 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:34,398 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:35,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:36,400 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:37,402 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:38,403 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:39,404 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:40,405 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:41,407 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:42,408 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:43,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:44,410 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:45,411 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:46,413 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:47,414 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:48,415 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:49,416 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:50,418 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:51,419 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:52,420 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:39:52,445 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
        at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:39:52,446 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:39:52,447 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 13:39:53,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:39:54,422 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:39:55,423 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:39:56,425 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:39:57,426 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:39:58,427 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:39:59,429 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:00,430 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:01,431 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:01,433 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 13:40:07,435 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:08,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:09,064 WARN com.cloudera.cmf.event.publish.EventStorePublisherWithRetry: Failed to publish event: SimpleEvent{attributes={ROLE=[hdfs-DATANODE-b30a464b10a57fdd49ea734cd52a8291], HOSTS=[dmidlkprdls04.svr.luc.edu], ROLE_TYPE=[DATANODE], CATEGORY=[LOG_MESSAGE], EVENTCODE=[EV_LOG_EVENT], SERVICE=[hdfs], SERVICE_TYPE=[HDFS], LOG_LEVEL=[WARN], HOST_IDS=[5c33df90-d247-4c6d-b9e0-5908a423580a], SEVERITY=[IMPORTANT]}, content=Unable to initialize FileSignerSecretProvider, falling back to use random secrets. 
Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret, timestamp=1751909970885} - 1 of 60 failure(s) in last 1818s java.io.IOException: Error connecting to dmidlkprdls01.svr.luc.edu/192.168.158.1:7184 at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.getChannel(NettyTransceiver.java:269) at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.(NettyTransceiver.java:197) at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.(NettyTransceiver.java:120) at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.checkSpecificRequestor(AvroEventStorePublishProxy.java:120) at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.publishEvent(AvroEventStorePublishProxy.java:204) at com.cloudera.cmf.event.publish.EventStorePublisherWithRetry$PublishEventTask.run(EventStorePublisherWithRetry.java:242) at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) at java.util.concurrent.FutureTask.run(FutureTask.java:266) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:750) Caused by: com.cloudera.cmf.event.shaded.io.netty.channel.AbstractChannel$AnnotatedNoRouteToHostException: No route to host: dmidlkprdls01.svr.luc.edu/192.168.158.1:7184 Caused by: java.net.NoRouteToHostException: No route to host at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716) at com.cloudera.cmf.event.shaded.io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:337) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:339) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:776) at 
com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562) at com.cloudera.cmf.event.shaded.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997) at com.cloudera.cmf.event.shaded.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:40:09,438 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:10,439 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:11,440 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:12,441 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:13,442 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:14,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:15,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:16,446 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:17,447 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:18,448 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:19,449 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:20,451 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:21,452 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:22,453 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:23,454 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:24,455 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:25,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:26,458 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:27,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:28,460 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:29,461 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:30,463 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:31,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:32,465 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:33,466 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:34,467 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:35,469 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:36,470 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:37,471 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:38,473 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:39,474 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:40,475 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:41,476 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:42,478 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:43,479 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:44,480 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:45,481 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:46,482 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:47,484 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:48,485 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:40:49,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:40:50,487 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:40:51,488 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:40:52,449 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:40:52,450 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at
com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:40:52,452 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at
org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 13:40:52,490 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:40:53,491 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:40:54,492 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:40:55,493 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:40:56,494 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:40:56,496 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:41:02,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:03,499 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:04,500 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:05,501 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:06,502 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:07,504 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:08,505 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:09,506 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:10,508 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:11,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:12,510 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:13,511 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:14,513 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:15,514 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:16,515 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:17,516 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:18,517 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:19,519 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:20,520 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:21,521 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:22,522 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:23,524 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:24,525 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:25,527 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:26,528 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:27,529 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:28,530 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:29,531 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:30,533 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:31,534 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:32,535 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:33,536 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:34,537 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:35,539 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:36,540 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:37,542 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:38,543 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:39,544 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:40,546 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:41,547 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:42,548 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:43,549 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:44,550 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:45,552 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:46,553 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:47,554 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:48,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:49,557 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:50,558 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:51,559 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:41:51,561 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:41:52,445 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at
com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:41:52,446 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but 
service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at 
javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at 
org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:41:52,446 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at 
sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 45 more 2025-07-07 13:41:57,562 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:41:58,564 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:41:59,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:00,566 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:01,567 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:02,569 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:03,570 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:04,571 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:05,572 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:06,574 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:07,575 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:08,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:09,578 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:10,579 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:11,580 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:12,581 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:13,583 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:14,584 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:15,585 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:16,586 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:17,588 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:18,589 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:19,590 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:20,591 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:21,592 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:22,594 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:23,595 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:24,596 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:25,597 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:26,599 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:27,600 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:28,601 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:29,602 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:30,604 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:31,605 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:32,606 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:33,607 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:34,608 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:35,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:36,611 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:37,612 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:38,613 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:39,615 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:40,616 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:41,617 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:42,618 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:43,620 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:44,621 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:45,622 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:42:46,623 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:42:46,625 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:42:52,434 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:42:52,435 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:42:52,436 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 13:42:52,626 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:42:53,628 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:42:54,629 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:42:55,630 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:42:56,631 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:42:57,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:42:58,634 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:42:59,635 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:00,636 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:01,637 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:02,639 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:03,640 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:04,641 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:05,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:06,644 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:07,645 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:08,646 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:09,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:10,649 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:11,650 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:12,651 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:13,652 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:14,654 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:15,655 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:16,656 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:17,657 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:18,659 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:19,660 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:20,661 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:21,662 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:22,664 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:23,666 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:24,668 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:25,669 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:26,670 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:27,671 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:28,673 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:29,674 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:30,675 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:31,677 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:32,678 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:33,679 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:34,680 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:35,682 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:36,683 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:37,684 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:38,686 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:39,687 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:40,688 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:41,689 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:41,691 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:43:47,693 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:48,694 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:49,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:50,696 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:51,697 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:43:52,431 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:43:52,432 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:43:52,433 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 13:43:52,699 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:43:53,700 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:43:54,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:43:55,702 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:43:56,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:43:57,705 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:43:58,706 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:43:59,707 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:00,709 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:01,710 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:02,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:03,712 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:04,714 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:05,715 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:06,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:07,718 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:08,719 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:09,720 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:10,721 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:11,723 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:12,724 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:13,725 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:14,726 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:15,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:16,729 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:17,730 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:18,731 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:19,732 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:20,734 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:21,735 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:22,736 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:23,738 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:24,739 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:25,741 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:26,742 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:27,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:28,744 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:29,746 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:30,747 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:31,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:32,750 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:33,751 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:34,752 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:35,753 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:36,754 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:36,756 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 13:44:42,758 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:43,760 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:44,761 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:45,762 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:46,763 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:47,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:48,766 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:49,767 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:50,768 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:51,770 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:44:52,448 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:44:52,449 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:44:52,450 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 13:44:52,771 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:44:53,772 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:44:54,773 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:44:55,774 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:44:56,776 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:44:57,777 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:44:58,778 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:44:59,780 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:00,781 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:01,782 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:02,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:03,785 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:04,786 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:05,787 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:06,788 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:07,790 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:08,791 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:09,792 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:10,794 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:11,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:12,796 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:13,797 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:14,799 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:15,800 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:16,801 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:17,802 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:18,803 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:19,805 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:20,806 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:21,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:22,808 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:23,810 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:24,812 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:25,813 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:26,814 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:27,815 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:28,817 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:29,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:30,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:31,821 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:31,822 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:45:37,824 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:38,825 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:39,827 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:40,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:41,829 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:42,831 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:43,832 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:44,833 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:45,834 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:46,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:47,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:48,838 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:49,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:50,840 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:51,842 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:52,436 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:45:52,437 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:45:52,438 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 13:45:52,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:45:53,844 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:45:54,846 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:45:55,847 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:45:56,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:45:57,849 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:45:58,851 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:45:59,852 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:00,853 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:01,854 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:02,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:03,857 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:04,858 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:05,859 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:06,861 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:07,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:08,863 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:09,865 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:10,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:11,867 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:12,868 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:13,870 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:14,871 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:15,872 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:16,873 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:17,875 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:18,876 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:19,877 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:20,879 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:21,880 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:22,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:23,883 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:24,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:25,885 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:26,887 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:26,888 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:46:32,890 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:33,891 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:34,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:35,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:36,895 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:37,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:38,898 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:39,899 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:40,900 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:41,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:42,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:43,904 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:44,905 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:45,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:46,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:47,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:48,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:49,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:50,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:51,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:46:52,447 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:46:52,448 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:46:52,451 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 13:46:52,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:53,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:54,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:55,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:56,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:57,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:58,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:46:59,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:00,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:01,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:02,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:03,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:04,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:05,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:06,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:07,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:08,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:09,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:10,938 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:11,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:12,940 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:13,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:14,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:15,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:16,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:17,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:18,948 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:19,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:20,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:21,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:21,954 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:47:27,956 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:28,957 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:29,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:30,960 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:31,961 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:32,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:33,964 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:34,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:35,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:36,967 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:37,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:38,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:39,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:40,973 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:41,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:42,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:43,977 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:44,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:45,979 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:46,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:47,982 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:48,983 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:49,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:50,986 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:51,987 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:52,433 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:47:52,435 WARN
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:47:52,436 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
        at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
        at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        ... 50 more
2025-07-07 13:47:52,988 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:53,989 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:54,991 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:55,992 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:56,993 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:57,995 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:58,996 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:47:59,997 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:00,999 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:02,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:03,001 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:04,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:05,004 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:06,005 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:07,006 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:08,008 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:09,009 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:10,010 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:11,012 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:12,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:13,014 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:14,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:15,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:16,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:17,019 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:17,021 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:48:23,023 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:24,024 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:25,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:26,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:27,028 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:28,030 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:29,031 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:30,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:31,033 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:32,035 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:33,036 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:34,037 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:35,039 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:36,040 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:37,041 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:38,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:39,044 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:40,045 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:41,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:42,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:43,049 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:44,050 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:45,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:46,053 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:47,054 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:48,055 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:49,056 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:50,058 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:51,059 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:52,060 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:48:52,447 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
        at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:48:52,448 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
        at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
        at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:48:52,449 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
        at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
        at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 13:48:53,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:48:54,063 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:48:55,064 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:48:56,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:48:57,067 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:48:58,068 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:48:59,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:00,071 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:01,072 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:02,073 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:03,074 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:04,076 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:05,077 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:06,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:07,079 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:08,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:09,082 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:10,083 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:11,084 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:12,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:12,088 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 13:49:18,090 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:19,091 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:20,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:21,094 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:22,095 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:23,096 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:24,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:25,099 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:26,100 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:27,101 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:28,103 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:29,104 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:30,105 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:31,106 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:32,108 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:33,109 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:34,110 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:35,112 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:36,113 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:37,114 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:38,116 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:39,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:40,118 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:41,119 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:42,121 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:43,122 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:44,123 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:45,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:46,126 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:47,127 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:48,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:49,129 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:50,131 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:51,132 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:52,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:49:52,434 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:49:52,435 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:49:52,436 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more
2025-07-07 13:49:53,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:49:54,136 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:49:55,137 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:49:56,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:49:57,139 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:49:58,141 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:49:59,142 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:00,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:01,145 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:02,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:03,147 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:04,148 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:05,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:06,151 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:07,152 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:07,155 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:50:13,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:14,158 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:15,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:16,161 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:17,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:18,163 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:19,165 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:20,166 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:21,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:22,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:23,170 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:24,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:25,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:26,173 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:27,175 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:28,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:29,177 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:30,178 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:31,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:32,181 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:33,182 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:34,183 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:35,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:36,186 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:37,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:38,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:39,190 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:40,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:41,193 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:42,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:43,195 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:44,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:45,415 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:46,417 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:47,418 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:48,419 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:49,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:50,422 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:51,423 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:52,424 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:50:52,433 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:50:52,434 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:50:52,435 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 13:50:53,426 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:54,427 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:55,428 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:56,429 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:57,430 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:58,432 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:50:59,433 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:00,434 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:01,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:02,437 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:02,439 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:51:08,441 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:09,442 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:10,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:11,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:12,446 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:13,447 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:14,448 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:15,450 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:16,451 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:17,452 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:18,454 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:19,455 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:20,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:21,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:22,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:23,460 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:24,461 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:25,463 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:26,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:27,465 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:28,467 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:29,468 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:30,469 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:31,470 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:32,472 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:33,473 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:34,474 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:35,475 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:36,477 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:37,478 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:38,479 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:39,481 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:40,482 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:41,483 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:42,485 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:43,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:44,487 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:45,488 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:46,490 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:47,491 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:48,492 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:49,494 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:50,495 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:51,496 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:52,449 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:51:52,450 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:51:52,451 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 45 more
2025-07-07 13:51:52,497 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:53,499 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:54,500 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:55,501 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:56,502 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:57,504 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:51:57,506 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:52:03,508 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:52:04,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:52:05,510 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:52:06,512 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:52:07,513 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:52:08,514 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:52:09,516 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:52:10,517 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:52:11,518 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:52:12,519 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:52:13,521 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:52:14,522 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:52:15,523 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:52:16,524 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:52:17,526 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:52:18,527 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:52:19,528 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:52:20,530 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:52:21,531 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:52:22,532 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:23,533 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:24,535 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:25,536 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:26,537 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:27,538 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:28,540 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:29,541 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:30,542 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:31,543 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:32,545 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:33,546 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:34,547 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:35,549 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:36,550 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:37,551 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:38,553 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:39,554 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:40,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:41,556 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:42,558 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:43,559 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:44,560 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:45,561 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:46,563 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:47,564 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:48,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:49,567 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:50,568 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:51,569 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:52:52,435 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at 
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at 
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:52:52,436 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at 
com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at 
org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:52:52,437 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at 
org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at 
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 13:52:52,570 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:52:52,572 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:52:58,575 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:52:59,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:00,578 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:01,579 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:02,580 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:03,581 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:04,583 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:05,584 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:06,585 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:07,587 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:08,588 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:09,589 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:10,591 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:11,592 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:12,593 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:13,594 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:14,596 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:15,597 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:16,598 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:17,599 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:18,601 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:19,602 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:20,603 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:21,604 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:22,606 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:23,607 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:24,609 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:25,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:26,611 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:27,612 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:28,614 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:29,615 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:30,616 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:31,618 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:32,619 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:33,620 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:34,621 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:35,623 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:36,624 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:37,625 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:38,627 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:39,628 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:40,629 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:41,630 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:42,631 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:43,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:44,634 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:45,635 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:46,637 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:47,638 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:47,642 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:53:52,448 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:53:52,449 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:53:52,450 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 13:53:53,643 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:54,645 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:55,646 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:56,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:57,649 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:58,650 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:53:59,651 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:54:00,652 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:54:01,654 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:54:02,655 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:54:03,656 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:54:04,658 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:54:05,659 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:54:06,660 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:54:07,661 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:54:08,663 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:54:09,664 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:54:10,665 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:54:11,667 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:54:12,668 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:54:13,669 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:54:14,671 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:54:15,672 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:54:16,673 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:54:17,675 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:54:18,676 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:54:19,677 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:54:20,678 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:54:21,679 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:54:22,681 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:54:23,682 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:54:24,683 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:25,684 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:26,686 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:27,687 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:28,688 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:29,689 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:30,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:31,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:32,693 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:33,694 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:34,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:35,697 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:36,698 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:37,699 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:38,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:39,702 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:40,703 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:41,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:42,706 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:42,707 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 13:54:48,709 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:49,710 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:50,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:51,712 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:52,432 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at 
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at 
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:54:52,433 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at 
com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at 
org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:54:52,434 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at 
org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at 
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 13:54:52,714 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:53,715 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:54,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:55,717 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:56,719 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:54:57,720 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:54:58,721 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:54:59,722 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:00,724 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:01,725 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:02,726 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:03,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:04,729 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:05,730 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:06,731 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:07,732 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:08,734 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:09,735 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:10,736 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:11,738 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:12,739 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:13,740 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:14,742 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:15,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:16,744 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:17,745 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:18,747 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:19,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:20,749 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:21,750 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:22,752 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:23,754 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:24,755 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:25,756 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:26,758 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:27,759 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:28,760 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:29,762 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:30,763 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:31,764 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:32,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:33,767 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:34,768 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:35,769 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:36,770 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:37,772 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:37,773 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:55:43,775 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:44,777 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:45,778 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:46,779 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:47,780 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:48,782 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:49,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:50,784 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:51,785 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:52,451 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:55:52,452 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:55:52,453 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 13:55:52,787 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:53,788 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:54,789 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:55,790 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:56,792 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:57,793 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:58,794 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:55:59,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:56:00,797 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:56:01,798 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:56:02,799 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:56:03,801 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:56:04,802 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:56:05,803 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:56:06,804 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:56:07,806 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:56:08,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:56:09,808 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:56:10,809 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:56:11,811 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:56:12,812 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:56:13,813 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:56:14,815 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:56:15,816 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:56:16,817 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:56:17,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:56:18,820 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:56:19,821 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:56:20,822 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:56:21,823 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:56:22,825 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:56:23,827 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:56:24,829 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:56:25,830 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:56:26,831 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:27,833 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:28,834 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:29,835 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:30,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:31,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:32,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:32,840 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 13:56:38,842 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:39,844 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:40,845 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:41,846 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:42,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:43,849 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:44,850 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:45,851 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:46,853 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:47,854 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:48,855 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:49,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:50,858 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:51,859 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:52,437 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:56:52,438 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:56:52,440 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 13:56:52,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:53,861 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:54,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:55,864 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:56,865 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:57,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:58,867 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:56:59,869 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:00,870 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:01,871 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:02,873 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:03,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:04,875 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:05,876 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:06,878 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:07,879 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:08,880 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:09,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:10,883 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:11,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:12,885 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:13,886 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:14,888 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:15,889 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:16,890 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:17,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:18,893 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:19,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:20,895 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:21,897 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:22,898 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:23,900 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:24,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:25,903 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:26,904 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:27,905 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:27,907 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 13:57:33,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:34,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:35,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:36,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:37,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:38,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:39,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:40,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:41,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:42,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:43,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:44,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:45,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:46,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:47,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:48,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:49,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:50,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:51,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:52,450 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:57:52,451 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 13:57:52,452 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 13:57:52,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:53,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:54,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:55,936 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:56,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:57,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:58,940 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:57:59,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:58:00,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:58:01,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:58:02,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:58:03,946 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:58:04,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:58:05,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:58:06,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:58:07,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:58:08,953 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:58:09,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:58:10,955 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:58:11,956 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 13:58:12,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:13,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:14,960 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:15,961 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:16,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:17,964 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:18,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:19,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:20,967 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:21,968 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:22,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:22,972 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:58:28,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:29,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:30,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:31,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:32,979 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:33,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:34,981 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:35,982 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:36,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:37,985 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:38,986 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:39,988 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:40,989 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:41,990 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:42,991 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:43,993 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:44,994 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:45,995 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:46,996 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:47,997 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:48,999 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:50,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:51,001 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:52,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:52,436 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:58:52,437 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:58:52,437 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 13:58:53,003 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:54,004 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:55,005 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:56,007 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:57,008 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:58,009 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:58:59,010 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:00,011 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:01,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:02,014 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:03,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:04,016 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:05,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:06,019 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:07,020 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:08,021 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:09,022 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:10,024 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:11,025 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:12,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:13,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:14,028 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:15,030 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:16,031 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:17,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:18,033 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:18,035 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 13:59:24,037 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:25,039 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:26,040 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:27,041 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:28,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:29,043 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:30,045 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:31,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:32,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:33,048 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:34,050 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:35,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:36,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:37,053 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:38,054 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:39,056 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:40,057 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:41,058 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:42,060 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:43,061 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:44,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:45,063 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:46,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:47,066 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:48,067 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:49,068 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:50,070 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:51,071 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:52,072 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:52,451 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:59:52,452 WARN
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 13:59:52,452 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 13:59:53,073 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:54,074 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:55,076 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:56,077 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:57,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:58,079 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 13:59:59,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:00,082 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:01,083 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:02,084 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:03,085 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:04,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:05,088 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:06,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:07,090 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:08,091 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:09,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:10,094 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:11,095 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:12,096 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:13,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:13,100 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:00:19,102 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:20,103 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:21,104 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:22,105 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:23,106 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:24,108 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:25,109 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:00:26,110 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:00:27,111 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:00:28,112 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:00:29,114 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:00:30,115 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:00:31,116 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:00:32,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:33,118 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:34,120 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:35,121 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:36,122 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:37,123 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:38,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:39,126 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:40,127 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:41,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:42,129 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:43,130 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:44,132 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:45,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:46,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:47,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:48,136 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:49,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:50,139 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:51,140 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:52,141 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:52,437 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:00:52,438 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:00:52,439 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 14:00:53,142 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:54,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:55,145 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:56,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:57,147 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:58,148 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:00:59,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:00,151 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:01,152 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:02,153 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:03,154 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:04,155 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:05,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:06,158 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:07,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:08,160 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:08,162 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:01:14,164 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:15,165 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:16,166 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:17,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:18,169 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:19,170 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:20,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:21,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:22,174 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:23,175 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:24,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:25,177 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:26,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:27,180 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:28,181 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:29,182 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:30,183 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:31,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:32,186 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:33,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:34,188 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:35,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:36,190 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:37,192 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:38,193 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:39,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:40,196 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:41,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:42,198 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:43,199 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:44,200 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:45,201 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:46,202 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:47,203 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:48,205 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:49,206 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:50,207 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:51,208 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:52,209 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:52,459 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:01:52,460 WARN
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:01:52,461 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 14:01:53,210 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:54,211 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:55,213 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:56,213 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:57,215 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:58,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:01:59,217 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:00,218 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:01,219 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:02,221 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:03,222 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:03,224 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:02:09,226 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:10,227 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:11,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:12,230 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:13,231 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:14,232 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:15,233 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:16,234 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:17,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:18,237 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:19,238 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:20,239 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:21,240 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:22,242 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:23,243 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:24,244 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:25,245 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:26,246 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:27,248 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:28,249 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:29,250 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:30,251 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:31,252 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:32,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:33,255 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:34,256 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:35,257 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:36,258 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:37,260 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:38,261 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:39,262 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:40,264 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:41,265 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:42,266 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:43,267 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:44,268 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:45,270 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:46,271 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:47,272 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:48,273 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:49,274 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:50,275 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:51,277 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:52,278 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:02:52,444 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:02:52,445 WARN
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:02:52,446 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 14:02:53,279 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:02:54,280 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:02:55,281 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:02:56,283 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:02:57,284 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:02:58,285 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:02:58,287 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:03:04,289 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:05,290 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:06,292 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:07,293 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:08,294 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:09,296 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:10,297 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:11,298 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:12,299 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:13,301 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:14,302 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:15,303 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:16,304 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:17,306 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:18,307 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:19,308 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:20,310 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:21,311 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:22,312 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:23,313 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:24,315 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:25,316 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:26,317 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:27,318 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:28,320 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:29,321 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:30,322 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:31,324 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:32,325 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:33,326 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:34,327 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:35,328 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:36,330 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:37,331 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:38,332 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:39,334 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:40,335 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:41,336 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:42,338 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:43,339 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:44,340 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:45,341 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:46,342 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:47,344 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:48,345 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:49,346 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:50,347 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:51,349 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:52,350 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:52,455 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:03:52,456 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:03:52,457 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 14:03:53,351 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:03:53,352 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:03:59,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:00,355 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:01,357 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:02,358 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:03,359 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:04,360 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:05,361 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:06,363 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:07,364 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:08,365 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:09,367 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:10,368 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:11,369 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:12,370 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:13,371 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:14,373 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:15,374 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:16,375 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:17,376 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:18,378 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:19,379 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:20,380 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:21,381 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:22,382 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:23,383 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:24,385 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:25,386 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:26,388 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:27,389 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:28,390 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:29,391 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:30,392 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:31,394 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:32,395 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:33,396 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:34,397 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:35,398 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:36,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:37,401 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:38,402 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:39,403 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:40,405 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:41,406 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:42,407 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:43,408 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:44,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:45,411 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:46,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:47,413 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:48,414 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:48,416 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:04:52,439 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at 
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:04:52,439 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at 
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:04:52,440 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at 
org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 14:04:54,417 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:55,418 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:56,419 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:57,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:58,422 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:04:59,423 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:00,424 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:01,425 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:02,427 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:03,428 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:04,429 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:05,430 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:06,431 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:07,433 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:08,434 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:09,435 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:10,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:11,438 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:12,439 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:13,440 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:14,441 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:15,442 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:16,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:17,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:18,446 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:19,447 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:20,448 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:21,449 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:22,451 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:23,452 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:24,453 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:25,454 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:26,455 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:27,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:28,458 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:29,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:30,460 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:31,461 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:32,462 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:33,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:34,465 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:35,466 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:36,467 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:37,468 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:38,469 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:39,471 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:40,472 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:41,473 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:42,474 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:43,476 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:43,477 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:05:49,479 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:50,480 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:51,481 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:05:52,454 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:05:52,455 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:05:52,456 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 14:05:52,482 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:05:53,483 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:05:54,485 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:05:55,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:05:56,487 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:05:57,488 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:05:58,489 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:05:59,490 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:00,492 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:01,493 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:02,494 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:03,495 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:04,497 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:05,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:06,499 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:07,500 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:08,501 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:09,503 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:10,504 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:11,505 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:12,506 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:13,508 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:14,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:15,510 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:16,512 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:17,513 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:18,514 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:19,515 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:20,517 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:21,518 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:22,519 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:23,521 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:24,523 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:25,524 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:26,525 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:27,526 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:28,527 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:29,529 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:30,530 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:31,531 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:32,533 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:33,534 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:34,535 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:35,536 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:36,538 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:37,539 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:38,540 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:38,542 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:06:44,543 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:45,544 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:46,546 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:47,547 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:48,548 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:49,550 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:50,551 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:51,552 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:06:52,443 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:06:52,444 WARN
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:06:52,447 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 14:06:52,553 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:06:53,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:06:54,556 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:06:55,557 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:06:56,558 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:06:57,560 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:06:58,561 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:06:59,562 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:00,563 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:01,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:02,566 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:03,567 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:04,568 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:05,570 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:06,571 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:07,572 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:08,573 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:09,574 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:10,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:11,577 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:12,578 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:13,579 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:14,581 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:15,582 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:16,583 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:17,584 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:18,586 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:19,587 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:20,588 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:21,589 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:22,591 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:23,593 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:24,594 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:25,595 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:26,596 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:27,597 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:28,599 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:29,600 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:30,601 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:31,602 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:32,603 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:33,605 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:33,606 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:07:39,608 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:40,609 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:41,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:42,612 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:43,613 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:44,614 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:45,615 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:46,616 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:47,618 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:48,619 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:49,620 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:50,621 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:51,623 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:07:52,454 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:07:52,455 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at 
org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:07:52,456 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 45 more
2025-07-07 14:07:52,624 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:07:53,625 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:07:54,626 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:07:55,627 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:07:56,629 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:07:57,630 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:07:58,631 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:07:59,632 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:00,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:01,635 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:02,636 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:03,637 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:04,638 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:05,639 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:06,641 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:07,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:08,643 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:09,644 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:10,646 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:11,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:12,648 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:13,649 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:14,650 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:15,652 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:16,653 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:17,654 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:18,655 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:19,656 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:20,658 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:21,659 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:22,660 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:23,662 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:24,663 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:25,665 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:26,666 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:27,667 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:28,668 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:28,670 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:08:34,672 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:35,673 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:36,674 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:37,675 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:38,676 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:39,678 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:40,679 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:41,680 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:42,681 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:43,683 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:44,684 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:45,685 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:46,686 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:47,687 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:48,688 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:49,690 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:50,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:51,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:08:52,441 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:08:52,442 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:08:52,443 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 14:08:52,693 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:08:53,694 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:08:54,696 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:08:55,697 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:08:56,698 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:08:57,699 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:08:58,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:08:59,702 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:00,703 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:01,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:02,706 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:03,707 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:04,708 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:05,709 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:06,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:07,712 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:08,713 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:09,714 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:10,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:11,717 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:12,718 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:13,719 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:14,721 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:15,722 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:16,723 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:17,724 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:18,725 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:19,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:20,728 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:21,729 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:22,730 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:23,734 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:23,735 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:09:29,737 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:30,738 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:31,740 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:32,741 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:33,742 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:34,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:35,745 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:36,746 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:37,747 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:38,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:39,750 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:40,751 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:41,752 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:42,753 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:43,755 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:44,756 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:45,757 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:46,758 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:47,760 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:48,761 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:49,762 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:50,763 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:51,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:52,456 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:09:52,457 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:09:52,458 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 14:09:52,766 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:53,767 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:54,768 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:55,769 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:56,771 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:57,772 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:58,773 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:09:59,774 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:00,776 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:01,777 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:02,778 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:03,780 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:04,781 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:05,782 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:06,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:07,785 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:08,786 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:09,787 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:10,789 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:11,790 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:12,791 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:13,792 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:14,794 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:15,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:16,796 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:17,797 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:18,798 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:18,801 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:10:24,803 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:25,804 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:26,524 WARN com.cloudera.cmf.event.publish.EventStorePublisherWithRetry: Failed to publish event: SimpleEvent{attributes={ROLE=[hdfs-DATANODE-b30a464b10a57fdd49ea734cd52a8291], HOSTS=[dmidlkprdls04.svr.luc.edu], ROLE_TYPE=[DATANODE], CATEGORY=[LOG_MESSAGE], EVENTCODE=[EV_LOG_EVENT], SERVICE=[hdfs], SERVICE_TYPE=[HDFS], LOG_LEVEL=[WARN], HOST_IDS=[5c33df90-d247-4c6d-b9e0-5908a423580a], SEVERITY=[IMPORTANT]}, content=Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret, timestamp=1751909970885} - 1 of 60 failure(s) in last 1817s java.io.IOException: Error connecting to dmidlkprdls01.svr.luc.edu/192.168.158.1:7184 at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.getChannel(NettyTransceiver.java:269) at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.(NettyTransceiver.java:197) at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.(NettyTransceiver.java:120) at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.checkSpecificRequestor(AvroEventStorePublishProxy.java:120) at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.publishEvent(AvroEventStorePublishProxy.java:204) at com.cloudera.cmf.event.publish.EventStorePublisherWithRetry$PublishEventTask.run(EventStorePublisherWithRetry.java:242) at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) at java.util.concurrent.FutureTask.run(FutureTask.java:266) at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:750) Caused by: com.cloudera.cmf.event.shaded.io.netty.channel.AbstractChannel$AnnotatedNoRouteToHostException: No route to host: dmidlkprdls01.svr.luc.edu/192.168.158.1:7184 Caused by: java.net.NoRouteToHostException: No route to host at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716) at com.cloudera.cmf.event.shaded.io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:337) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:339) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:776) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562) at com.cloudera.cmf.event.shaded.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997) at com.cloudera.cmf.event.shaded.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:10:26,805 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:27,806 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:28,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:29,809 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:30,810 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:31,811 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:32,812 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:33,813 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:34,814 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:35,816 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:36,817 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:37,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:38,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:39,821 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:40,822 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:41,823 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:42,825 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:43,826 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:44,827 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:45,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:46,829 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:47,831 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:48,832 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:49,833 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:50,834 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:51,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:10:52,442 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at 
org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at 
org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:10:52,444 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:10:52,445 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 14:10:52,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:10:53,838 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:10:54,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:10:55,841 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:10:56,842 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:10:57,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:10:58,844 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:10:59,845 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:00,847 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:01,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:02,849 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:03,850 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:04,852 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:05,853 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:06,854 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:07,855 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:08,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:09,858 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:10,859 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:11,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:12,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:13,863 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:13,865 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:11:19,869 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:20,870 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:21,871 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:22,872 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:23,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:24,875 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:25,876 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:26,877 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:27,879 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:28,880 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:29,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:30,882 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:31,883 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:32,885 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:33,886 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:34,887 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:35,888 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:36,889 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:37,891 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:38,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:39,893 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:40,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:41,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:42,897 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:43,898 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:44,899 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:45,900 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:46,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:47,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:48,904 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:49,905 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:50,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:51,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:11:52,454 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:11:52,455 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:11:52,456 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at
org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at 
org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 14:11:52,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:11:53,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:11:54,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:11:55,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:11:56,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:11:57,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:11:58,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:11:59,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:00,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:01,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:02,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:03,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:04,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:05,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:06,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:07,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:08,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:08,931 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:12:14,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:15,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:16,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:17,936 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:18,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:19,938 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:20,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:21,940 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:22,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:23,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:24,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:25,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:26,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:27,948 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:28,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:29,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:30,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:31,953 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:32,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:33,955 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:34,956 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:35,957 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:36,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:37,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:38,961 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:39,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:40,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:41,964 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:42,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:43,967 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:44,968 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:45,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:46,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:47,971 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:48,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:49,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:50,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:51,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:52,442 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at 
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at 
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:12:52,443 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at 
com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at 
org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:12:52,444 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at 
org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at 
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 14:12:52,977 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:53,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:54,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:55,981 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:56,982 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:57,983 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:58,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:12:59,985 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:00,986 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:01,987 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:02,988 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:03,989 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:03,992 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:13:09,994 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:10,995 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:11,996 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:12,997 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:13,998 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:14,999 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:16,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:17,001 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:18,003 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:19,004 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:20,005 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:21,006 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:22,007 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:23,009 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:24,010 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:25,011 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:26,012 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:27,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:28,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:29,016 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:30,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:31,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:32,019 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:33,020 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:34,022 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:35,023 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:36,024 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:37,025 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:38,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:39,028 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:40,029 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:41,030 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:42,031 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:43,033 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:44,034 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:45,035 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:46,036 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:47,037 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:48,038 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:49,039 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:50,040 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:51,041 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:52,043 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:13:52,446 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:13:52,447 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:13:52,448 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 14:13:53,044 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:13:54,045 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:13:55,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:13:56,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:13:57,048 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:13:58,049 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:13:59,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:13:59,053 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:14:05,055 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:06,056 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:07,057 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:08,058 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:09,059 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:10,061 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:11,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:12,063 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:13,064 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:14,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:15,067 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:16,068 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:17,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:18,070 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:19,071 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:20,072 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:21,073 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:22,075 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:23,076 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:24,077 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:25,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:26,079 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:27,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:28,082 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:29,083 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:30,084 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:31,085 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:32,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:33,087 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:34,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:35,090 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:36,091 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:37,092 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:38,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:39,094 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:40,096 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:41,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:42,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:43,099 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:44,100 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:45,102 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:46,103 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:47,104 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:48,105 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:49,106 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:50,107 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:51,108 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:52,110 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:14:52,455 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:14:52,455 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:14:52,456 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 14:14:53,111 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:14:54,112 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:14:54,113 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:15:00,115 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:01,116 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:02,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:03,118 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:04,119 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:05,120 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:06,122 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:07,123 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:08,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:09,125 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:10,126 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:11,127 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:12,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:13,129 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:14,131 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:15,132 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:16,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:17,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:18,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:19,136 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:20,137 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:21,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:22,139 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:23,140 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:24,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:25,144 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:26,145 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:27,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:28,147 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:29,148 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:30,149 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:31,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:32,151 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:33,153 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:34,154 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:35,155 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:36,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:37,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:38,158 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:39,160 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:40,161 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:41,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:42,163 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:43,164 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:44,165 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:45,166 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:46,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:47,169 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:48,170 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:49,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:49,172 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:15:52,446 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at 
sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at 
org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at 
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:15:52,447 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:15:52,448 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at 
org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at 
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at 
java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 14:15:55,174 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:56,175 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:57,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:58,177 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:15:59,178 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:00,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:01,180 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:02,181 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:03,182 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:04,183 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:05,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:06,186 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:07,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:08,188 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:09,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:10,190 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:11,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:12,192 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:13,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:14,195 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:15,196 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:16,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:17,199 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:18,200 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:19,201 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:20,202 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:21,203 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:22,204 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:23,205 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:24,206 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:25,208 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:26,209 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:27,210 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:28,211 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:29,212 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:30,213 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:31,214 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:32,215 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:33,217 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:34,218 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:35,219 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:36,220 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:37,221 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:38,222 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:39,223 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:40,225 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:41,226 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:42,227 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:43,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:44,229 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:44,231 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:16:50,232 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:51,234 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:52,235 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:52,451 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:16:52,452 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:16:52,455 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 14:16:53,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:54,237 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:55,238 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:56,239 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:57,241 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:58,242 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:16:59,243 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:00,244 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:01,245 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:02,247 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:03,248 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:04,249 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:05,250 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:06,252 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:07,253 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:08,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:09,255 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:10,257 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:11,258 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:12,259 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:13,260 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:14,262 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:15,263 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:16,264 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:17,265 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:18,266 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:19,268 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:20,269 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:21,270 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:22,271 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:23,273 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:24,275 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:25,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:26,277 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:27,278 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:28,280 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:29,281 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:30,282 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:31,283 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:32,285 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:33,286 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:34,287 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:35,288 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:36,289 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:37,291 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:38,292 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:39,293 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:39,295 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:17:45,296 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:46,298 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:47,299 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:48,300 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:49,301 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:50,302 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:51,303 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:52,305 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:17:52,457 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:17:52,458 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:17:52,459 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 14:17:53,306 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:17:54,307 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:17:55,308 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:17:56,309 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:17:57,311 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:17:58,312 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:17:59,313 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:00,314 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:01,315 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:02,316 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:03,317 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:04,319 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:05,320 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:06,321 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:07,322 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:08,323 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:09,325 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:10,326 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:11,327 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:12,328 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:13,329 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:14,331 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:15,332 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:16,333 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:17,334 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:18,335 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:19,336 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:20,337 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:21,339 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:22,340 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:23,341 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:24,343 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:25,344 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:26,345 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:27,346 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:28,348 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:29,349 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:30,350 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:31,351 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:32,352 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:33,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:34,355 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:34,356 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:18:40,358 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:41,359 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:42,360 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:43,362 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:44,363 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:45,364 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:46,365 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:47,366 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:48,367 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:49,368 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:50,369 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:51,371 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:52,372 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:52,446 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:18:52,447 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:18:52,448 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 14:18:53,373 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:54,374 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:55,375 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:56,376 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:57,377 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:58,379 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:18:59,380 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:19:00,381 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:19:01,382 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:19:02,383 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:19:03,384 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:19:04,385 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:19:05,387 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:19:06,388 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:07,389 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:08,390 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:09,391 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:10,392 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:11,394 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:12,395 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:13,396 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:14,397 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:15,398 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:16,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:17,401 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:18,402 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:19,403 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:20,404 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:21,405 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:22,406 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:23,407 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:24,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:25,410 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:26,411 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:27,413 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:28,414 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:29,415 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:29,416 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:19:35,418 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:36,419 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:37,420 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:38,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:39,422 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:40,424 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:41,425 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:42,426 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:43,427 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:44,428 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:45,429 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:46,431 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:47,432 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:48,433 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:49,434 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:50,435 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:51,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:52,438 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:52,445 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:19:52,446 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:19:52,447 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 14:19:53,439 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:54,440 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:55,442 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:56,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:57,444 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:58,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:19:59,446 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:00,448 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:01,449 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:02,450 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:03,451 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:04,452 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:05,453 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:06,454 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:07,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:08,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:09,458 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:10,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:11,461 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:12,462 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:13,463 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:14,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:15,465 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:16,467 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:17,468 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:18,469 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:19,470 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:20,471 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:21,473 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:22,474 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:23,476 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:24,477 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:24,478 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:20:30,480 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:31,481 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:32,483 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:33,484 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:34,485 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:35,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:36,487 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:37,488 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:38,490 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:39,491 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:40,492 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:41,493 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:42,494 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:43,496 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:44,497 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:45,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:46,499 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:47,501 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:48,502 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:49,503 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:50,504 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:51,505 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:20:52,458 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:20:52,459 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:20:52,460 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 14:20:52,507 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:20:53,508 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:20:54,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:20:55,510 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:20:56,511 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:20:57,513 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:20:58,514 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:20:59,515 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:00,516 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:01,517 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:02,518 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:03,520 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:04,521 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:05,522 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:06,523 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:07,524 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:08,526 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:09,527 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:10,528 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:11,529 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:12,530 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:13,532 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:14,533 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:15,534 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:16,535 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:17,536 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:18,538 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:19,539 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:19,541 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:21:25,543 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:26,544 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:27,545 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:28,547 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:29,548 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:30,549 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:31,550 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:32,551 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:33,553 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:34,554 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:35,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:36,556 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:37,557 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:38,559 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:39,560 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:40,561 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:41,562 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:42,564 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:43,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:44,566 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:45,567 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:46,568 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:47,569 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:48,571 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:49,572 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:50,573 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:51,574 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:52,446 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:21:52,447 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:21:52,448 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 14:21:52,575 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:53,577 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:54,578 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:55,579 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:56,580 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:57,581 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:58,582 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:21:59,583 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:00,584 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:01,586 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:02,587 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:03,588 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:04,589 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:05,590 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:06,591 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:07,592 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:08,594 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:09,595 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:10,596 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:11,597 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:12,598 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:13,600 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:14,601 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:14,603 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:22:20,605 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:21,606 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:22,607 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:23,609 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:24,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:25,611 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:26,612 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:27,613 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:28,615 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:29,616 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:30,617 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:31,618 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:22:32,619 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:22:33,620 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:22:34,622 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:22:35,623 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:22:36,624 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:22:37,625 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:22:38,626 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:22:39,628 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:22:40,629 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:22:41,630 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:22:42,631 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:22:43,632 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:22:44,634 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:22:45,635 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:22:46,636 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:22:47,637 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:22:48,638 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:22:49,640 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:22:50,641 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:22:51,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:52,446 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:22:52,447 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:22:52,448 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 14:22:52,643 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:53,645 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:54,646 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:55,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:56,648 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:57,650 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:58,651 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:22:59,652 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:00,653 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:01,654 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:02,656 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:03,657 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:04,658 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:05,659 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:06,661 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:07,662 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:08,663 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:09,664 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:09,668 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:23:15,669 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:16,671 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:17,672 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:18,673 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:19,674 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:20,675 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:21,677 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:22,678 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:23,679 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:24,680 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:25,681 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:26,682 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:27,684 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:28,685 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:29,686 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:30,687 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:31,688 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:23:32,690 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:23:33,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:23:34,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:23:35,693 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:23:36,694 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:23:37,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:23:38,697 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:23:39,698 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:23:40,699 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:23:41,700 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:23:42,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:23:43,703 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:23:44,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:23:45,705 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:23:46,706 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:23:47,707 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:23:48,709 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:23:49,710 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:23:50,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:23:51,712 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:52,457 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:23:52,458 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:23:52,459 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 14:23:52,713 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:53,715 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:54,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:55,717 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:56,718 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:57,719 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:58,721 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:23:59,722 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:00,723 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:01,724 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:02,725 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:03,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:04,728 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:04,730 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:24:10,732 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:11,733 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:12,734 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:13,736 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:14,737 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:15,738 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:16,739 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:17,740 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:18,741 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:19,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:20,744 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:21,745 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:22,746 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:23,747 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:24,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:25,749 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:26,751 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:27,752 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:28,753 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:29,754 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:30,755 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:31,757 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:32,758 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:33,759 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:34,760 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:35,761 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:36,763 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:37,764 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:24:38,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:24:39,766 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:24:40,767 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:24:41,769 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:24:42,770 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:24:43,771 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:24:44,772 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:24:45,773 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:24:46,775 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:24:47,776 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:24:48,777 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:24:49,778 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:24:50,780 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:24:51,781 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:24:52,453 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:24:52,454 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:24:52,455 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 14:24:52,782 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:24:53,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:24:54,784 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:24:55,786 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:24:56,787 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:24:57,788 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:24:58,789 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:24:59,790 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:24:59,793 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:25:05,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:25:06,796 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:25:07,797 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:25:08,798 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:25:09,799 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:25:10,801 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:25:11,802 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:25:12,803 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:25:13,805 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:25:14,806 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:25:15,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:25:16,808 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:25:17,809 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:18,811 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:19,812 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:20,813 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:21,814 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:22,815 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:23,816 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:24,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:25,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:26,820 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:27,821 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:28,822 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:29,823 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:30,825 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:31,826 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:32,827 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:33,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:34,829 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:35,830 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:36,832 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:37,833 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:38,834 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:39,835 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:40,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:41,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:42,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:43,840 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:44,841 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:45,842 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:46,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:47,844 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:48,846 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:49,847 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:50,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:51,849 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:52,460 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:25:52,461 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:25:52,461 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 14:25:52,850 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:53,851 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:54,852 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:25:54,854 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:26:00,855 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:01,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:02,858 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:03,859 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:04,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:05,861 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:06,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:07,864 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:08,865 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:09,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:10,867 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:11,868 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:12,869 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:13,871 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:14,872 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:15,873 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:16,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:17,875 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:18,876 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:19,877 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:20,879 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:21,880 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:22,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:23,883 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:24,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:25,885 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:26,886 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:27,888 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:28,889 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:29,890 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:30,891 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:31,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:32,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:33,895 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:34,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:35,897 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:36,898 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:37,899 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:38,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:39,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:40,903 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:41,904 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:42,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:43,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:44,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:45,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:46,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:47,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:48,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:26:49,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:26:49,916 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:26:52,448 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at 
com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at 
org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:26:52,449 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 14:26:52,452 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at 
com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 14:26:55,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:26:56,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:26:57,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:26:58,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:26:59,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:00,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:01,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:02,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:03,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:04,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:05,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:06,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:07,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:08,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:09,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:10,936 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:11,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:12,938 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:13,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:14,940 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:15,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:16,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:17,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:18,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:19,946 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:20,948 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:21,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:22,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:23,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:24,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:25,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:26,955 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:27,956 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:27:28,957 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:27:29,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:27:30,960 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:27:31,961 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:27:32,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:27:33,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:27:34,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:27:35,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:27:36,967 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:27:37,968 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:27:38,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:27:39,971 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:27:40,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:27:41,973 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:27:42,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:27:43,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:27:44,977 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:27:44,979 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:27:50,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:27:51,981 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:27:52,463 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:27:52,464 WARN
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:27:52,465 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 14:27:52,983 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:27:53,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:27:54,985 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:27:55,986 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:27:56,987 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:27:57,988 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:27:58,990 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:27:59,991 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:00,992 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:01,993 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:02,994 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:03,996 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:04,997 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:05,998 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:06,999 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:08,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:09,007 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:10,008 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:11,012 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:12,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:13,014 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:14,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:15,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:16,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:17,019 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:18,020 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:19,021 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:20,022 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:21,023 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:22,024 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:23,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:24,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:25,028 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:26,030 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:27,031 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:28,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:29,033 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:30,034 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:31,035 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:32,036 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:33,037 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:34,038 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:35,040 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:36,041 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:37,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:38,043 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:39,044 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:40,045 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:40,047 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:28:46,048 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:47,049 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:48,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:49,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:50,053 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:51,054 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:52,055 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:28:52,450 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:28:52,452 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:28:52,452 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 14:28:53,056 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:28:54,058 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:28:55,059 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:28:56,060 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:28:57,061 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:28:58,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:28:59,063 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:00,064 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:01,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:02,066 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:03,068 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:04,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:05,070 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:06,071 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:07,072 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:08,073 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:09,075 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:10,076 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:11,077 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:12,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:13,079 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:14,080 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:15,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:16,082 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:17,084 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:18,085 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:19,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:20,087 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:21,088 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:22,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:23,090 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:24,092 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:25,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:26,095 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:27,096 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:28,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:29,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:30,099 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:31,100 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:32,101 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:33,103 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:34,104 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:35,105 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:35,106 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:29:41,111 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:42,112 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:43,113 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:44,114 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:45,115 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:29:46,116 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:29:47,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:29:48,118 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:29:49,120 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:29:50,121 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:29:51,122 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:29:52,123 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:29:52,463 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:29:52,464 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:29:52,465 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 45 more
2025-07-07 14:29:53,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:29:54,125 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:29:55,127 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:29:56,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:29:57,129 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:29:58,130 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:29:59,131 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:00,132 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:01,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:02,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:03,136 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:04,137 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:05,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:06,139 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:07,140 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:08,141 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:09,142 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:10,144 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:11,145 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:12,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:13,147 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:14,149 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:15,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:16,151 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:17,152 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:18,153 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:19,154 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:20,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:21,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:22,158 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:23,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:24,161 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:25,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:26,163 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:27,165 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:28,166 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:29,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:30,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:30,171 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:30:36,173 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:37,174 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:38,175 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:39,177 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:40,178 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:41,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:42,180 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:43,181 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:44,182 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:45,184 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:46,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:47,186 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:48,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:49,188 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:50,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:51,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:52,192 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:30:52,453 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:30:52,454 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at
com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at 
org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:30:52,455 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at 
org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at 
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 14:30:53,193 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:30:54,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:30:55,195 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:30:56,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:30:57,198 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:30:58,199 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:30:59,200 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:00,201 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:01,203 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:02,204 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:03,205 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:04,206 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:05,207 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:06,208 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:07,210 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:08,211 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:09,212 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:10,213 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:11,215 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:12,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:13,217 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:14,218 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:15,220 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:16,221 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:17,222 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:18,223 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:19,224 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:20,226 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:21,227 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:22,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:23,229 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:24,231 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:25,232 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:25,234 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:31:31,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:32,237 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:33,238 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:34,239 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:35,240 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:36,241 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:37,243 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:38,244 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:39,245 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:40,246 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:41,247 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:42,248 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:43,253 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:44,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:45,255 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:46,256 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:47,257 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:48,258 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:49,260 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:50,261 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:51,262 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:31:52,263 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:31:52,450 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:31:52,451 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:31:52,451 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 14:31:53,264 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:31:54,265 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:31:55,267 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:31:56,268 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:31:57,269 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:31:58,270 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:31:59,271 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:00,272 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:01,274 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:02,275 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:03,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:04,277 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:05,278 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:06,280 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:07,281 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:08,282 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:09,283 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:10,284 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:11,286 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:12,288 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:13,289 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:14,290 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:15,291 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:16,292 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:17,293 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:18,295 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:19,296 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:20,297 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:20,299 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:32:26,301 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:27,302 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:28,303 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:29,304 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:30,305 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:31,306 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:32,308 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:33,309 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:34,310 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:35,311 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:36,312 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:37,314 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:38,315 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:39,316 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:40,317 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:41,318 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:42,320 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:43,321 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:44,322 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:45,323 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:46,324 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:47,325 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:48,327 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:49,328 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:50,329 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:51,330 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:52,331 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:32:52,463 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:32:52,464 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:32:52,465 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 14:32:53,333 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:32:54,334 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:32:55,335 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:32:56,336 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:32:57,337 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:32:58,339 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:32:59,340 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:00,341 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:01,342 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:02,343 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:03,345 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:04,346 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:05,347 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:06,348 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:07,350 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:08,351 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:09,352 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:10,353 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:11,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:12,356 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:13,357 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:14,358 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:15,359 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:15,361 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:33:21,363 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:22,364 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:23,365 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:24,367 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:25,368 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:26,369 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:27,370 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:28,371 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:29,373 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:30,374 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:31,375 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:32,376 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:33,377 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:34,378 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:35,380 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:36,381 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:37,382 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:38,383 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:39,384 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:40,386 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:41,387 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:42,388 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:43,390 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:44,391 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:45,392 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:46,393 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:47,394 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:48,396 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:49,397 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:50,398 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:51,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:52,400 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:52,451 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:33:52,452 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:33:52,453 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 14:33:53,401 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:54,402 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:55,404 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:56,405 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:57,406 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:58,407 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:33:59,408 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:00,410 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:01,411 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:02,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:03,413 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:04,414 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:05,416 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:06,417 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:07,418 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:08,419 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:09,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:10,422 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:10,424 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:34:16,426 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:17,427 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:18,428 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:19,430 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:20,431 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:21,432 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:22,433 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:23,434 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:24,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:25,437 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:26,438 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:27,439 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:28,440 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:29,441 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:30,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:31,444 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:32,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:33,446 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:34,447 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:35,449 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:36,450 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:37,451 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:38,452 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:39,453 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:40,455 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:41,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:42,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:43,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:44,460 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:45,461 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:46,462 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:47,463 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:48,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:49,466 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:50,467 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:51,468 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:52,464 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:34:52,465 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at 
org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at 
org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:34:52,465 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at 
javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at 
org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at 
java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 45 more 2025-07-07 14:34:52,469 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:53,470 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:54,472 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:55,473 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:56,474 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:57,475 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:58,476 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:34:59,477 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:00,478 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:01,480 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:02,481 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:03,482 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:04,483 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:05,484 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:05,486 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:35:11,488 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:12,490 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:13,491 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:14,492 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:15,493 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:16,494 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:17,495 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:18,497 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:19,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:20,499 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:21,500 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:22,502 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:23,503 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:24,504 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:25,505 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:26,506 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:27,508 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:28,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:29,510 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:30,511 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:31,512 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:32,514 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:33,515 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:34,516 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:35,517 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:36,518 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:37,519 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:38,520 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:39,522 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:40,523 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:41,524 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:42,525 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:43,526 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:44,528 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:45,529 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:46,530 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:47,531 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:48,533 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:49,534 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:50,535 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:51,536 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:52,451 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at 
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at 
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:35:52,452 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at 
com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at 
org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:35:52,453 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at 
org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at 
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 14:35:52,537 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:53,539 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:54,540 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:55,541 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:56,542 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:57,543 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:58,545 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:35:59,546 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:00,547 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:00,551 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:36:06,553 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:07,554 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:08,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:09,556 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:10,557 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:11,559 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:12,560 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:13,561 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:14,562 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:15,564 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:16,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:17,566 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:18,567 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:19,568 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:20,570 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:21,571 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:22,572 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:23,573 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:24,575 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:25,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:26,577 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:27,578 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:28,579 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:29,581 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:30,582 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:31,583 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:32,584 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:33,585 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:34,586 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:35,588 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:36,589 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:37,590 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:38,591 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:39,592 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:40,593 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:41,595 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:42,596 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:43,597 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:44,598 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:45,599 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:46,600 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:47,602 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:48,603 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:49,604 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:50,605 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:51,606 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:36:52,460 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:36:52,461 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:36:52,464 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more
2025-07-07 14:36:52,607 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:36:53,608 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:36:54,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:36:55,611 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:36:55,612 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:37:01,614 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:02,615 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:03,616 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:04,618 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:05,619 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:06,620 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:07,621 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:08,622 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:09,623 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:10,625 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:11,626 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:12,627 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:13,628 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:14,630 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:15,631 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:16,632 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:17,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:18,634 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:19,636 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:20,637 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:21,638 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:22,639 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:23,641 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:24,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:25,644 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:26,645 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:27,646 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:28,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:29,648 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:30,650 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:31,651 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:32,652 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:33,653 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:34,655 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:35,656 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:36,657 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:37,658 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:38,659 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:39,660 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:40,662 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:41,663 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:42,664 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:43,666 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:44,667 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:45,668 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:46,669 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:47,670 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:48,671 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:49,673 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:50,674 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:37:50,675 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:37:52,465 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at 
com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:37:52,466 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at 
org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at 
org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:37:52,467 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at 
sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 45 more 2025-07-07 14:37:56,677 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:37:57,678 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:37:58,680 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:37:59,681 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:00,682 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:01,683 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:02,684 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:03,685 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:04,687 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:05,688 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:06,689 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:07,690 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:08,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:09,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:10,693 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:11,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:12,696 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:13,697 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:14,698 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:15,700 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:16,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:17,702 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:18,703 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:19,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:20,705 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:21,706 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:22,708 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:23,709 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:24,710 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:25,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:26,712 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:27,713 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:28,714 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:29,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:30,717 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:31,718 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:32,719 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:33,720 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:34,722 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:35,723 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:36,724 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:37,725 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:38,726 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:39,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:40,728 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:41,730 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:42,731 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:43,732 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:44,733 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:45,734 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:45,736 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:38:51,737 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:52,455 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:38:52,456 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:38:52,457 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 14:38:52,739 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:53,740 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:54,741 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:55,742 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:56,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:57,744 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:38:58,745 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:38:59,747 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:00,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:01,749 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:02,750 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:03,751 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:04,752 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:05,754 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:06,755 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:07,756 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:08,757 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:09,758 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:10,759 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:11,761 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:12,762 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:13,763 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:14,764 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:15,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:16,767 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:17,768 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:18,769 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:19,770 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:20,771 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:21,773 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:22,774 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:23,776 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:24,777 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:25,778 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:26,779 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:27,780 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:28,782 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:29,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:30,784 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:31,785 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:32,786 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:33,788 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:34,789 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:35,790 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:36,791 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:37,792 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:38,793 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:39,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:40,796 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:40,797 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:39:46,799 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:47,800 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:48,801 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:49,803 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:50,804 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:51,805 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:52,466 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:39:52,467 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:39:52,468 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 14:39:52,806 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:53,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:54,808 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:55,810 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:56,811 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:57,812 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:58,813 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:39:59,814 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:00,815 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:01,817 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:02,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:03,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:04,820 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:05,821 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:06,823 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:07,824 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:08,825 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:09,826 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:10,827 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:11,829 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:12,830 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:13,831 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:14,832 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:15,834 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:16,835 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:17,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:18,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:19,838 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:20,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:21,841 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:22,842 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:23,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:24,845 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:25,846 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:26,847 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:27,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:28,849 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:29,850 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:30,851 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:31,853 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:32,854 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:33,855 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:34,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:35,857 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:35,859 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:40:41,861 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:42,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:43,863 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:44,641 WARN com.cloudera.cmf.event.publish.EventStorePublisherWithRetry: Failed to publish event: SimpleEvent{attributes={ROLE=[hdfs-DATANODE-b30a464b10a57fdd49ea734cd52a8291], HOSTS=[dmidlkprdls04.svr.luc.edu], ROLE_TYPE=[DATANODE], CATEGORY=[LOG_MESSAGE], EVENTCODE=[EV_LOG_EVENT], SERVICE=[hdfs], SERVICE_TYPE=[HDFS], LOG_LEVEL=[WARN], HOST_IDS=[5c33df90-d247-4c6d-b9e0-5908a423580a], SEVERITY=[IMPORTANT]}, content=Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret, timestamp=1751909970885} - 1 of 60 failure(s) in last 1818s java.io.IOException: Error connecting to dmidlkprdls01.svr.luc.edu/192.168.158.1:7184 at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.getChannel(NettyTransceiver.java:269) at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.(NettyTransceiver.java:197) at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.(NettyTransceiver.java:120) at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.checkSpecificRequestor(AvroEventStorePublishProxy.java:120) at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.publishEvent(AvroEventStorePublishProxy.java:204) at com.cloudera.cmf.event.publish.EventStorePublisherWithRetry$PublishEventTask.run(EventStorePublisherWithRetry.java:242) at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) at java.util.concurrent.FutureTask.run(FutureTask.java:266) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:750) Caused by: 
com.cloudera.cmf.event.shaded.io.netty.channel.AbstractChannel$AnnotatedNoRouteToHostException: No route to host: dmidlkprdls01.svr.luc.edu/192.168.158.1:7184 Caused by: java.net.NoRouteToHostException: No route to host at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716) at com.cloudera.cmf.event.shaded.io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:337) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:339) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:776) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562) at com.cloudera.cmf.event.shaded.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997) at com.cloudera.cmf.event.shaded.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:40:44,864 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:45,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:46,867 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:47,868 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:48,869 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:49,870 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:50,872 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:51,873 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:40:52,454 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:40:52,455 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:40:52,456 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 14:40:52,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:40:53,875 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:40:54,876 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:40:55,877 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:40:56,878 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:40:57,880 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:40:58,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:40:59,882 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:00,883 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:01,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:02,886 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:03,887 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:04,888 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:05,889 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:06,890 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:07,891 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:08,893 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:09,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:10,895 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:11,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:12,897 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:13,898 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:14,900 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:15,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:16,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:17,903 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:18,904 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:19,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:20,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:21,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:22,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:23,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:24,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:25,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:26,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:27,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:28,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:29,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:30,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:30,920 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:41:36,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:37,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:38,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:39,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:40,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:41,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:42,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:43,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:44,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:45,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:46,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:47,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:48,936 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:49,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:50,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:51,940 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:52,469 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:41:52,470 WARN
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:41:52,471 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 14:41:52,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:53,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:54,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:55,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:56,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:57,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:41:58,948 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:41:59,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:00,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:01,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:02,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:03,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:04,955 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:05,956 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:06,957 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:07,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:08,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:09,961 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:10,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:11,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:12,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:13,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:14,967 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:15,968 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:16,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:17,971 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:18,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:19,973 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:20,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:21,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:22,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:23,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:24,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:25,981 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:25,982 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:42:31,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:32,985 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:33,986 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:34,987 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:35,989 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:36,990 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:37,991 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:38,992 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:39,993 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:40,994 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:41,996 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:42,997 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:43,998 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:44,999 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:46,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:47,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:48,003 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:49,004 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:50,005 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:51,006 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:52,008 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:52,457 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:42:52,458 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:42:52,459 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 14:42:53,009 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:54,010 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:55,011 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:56,012 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:57,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:58,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:42:59,016 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:00,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:01,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:02,019 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:03,021 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:04,022 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:05,023 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:06,024 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:07,025 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:08,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:09,028 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:10,029 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:11,030 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:12,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:13,033 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:14,034 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:15,035 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:16,036 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:17,037 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:18,039 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:19,040 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:20,041 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:21,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:21,044 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:43:27,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:28,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:29,048 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:30,050 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:31,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:32,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:33,053 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:34,054 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:35,055 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:36,057 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:37,058 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:38,059 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:39,060 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:40,061 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:41,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:42,064 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:43,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:44,066 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:45,067 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:46,068 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:47,070 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:48,071 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:49,072 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:50,073 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:51,074 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:52,075 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:52,461 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:43:52,461 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:43:52,462 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 14:43:53,077 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:54,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:55,079 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:56,080 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:57,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:58,082 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:43:59,084 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:00,085 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:01,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:02,087 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:03,088 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:04,090 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:05,091 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:06,092 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:07,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:08,094 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:09,096 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:10,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:11,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:12,099 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:13,101 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:14,102 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:15,103 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:16,104 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:16,106 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:44:22,108 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:23,109 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:24,110 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:25,112 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:26,113 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:27,114 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:28,115 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:29,116 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:30,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:31,119 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:32,120 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:33,121 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:34,122 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:35,123 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:36,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:37,125 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:38,127 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:39,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:40,129 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:41,130 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:42,131 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:43,132 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:44,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:45,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:46,136 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:47,137 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:48,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:49,139 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:50,140 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:51,142 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:52,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:52,489 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:44:52,490 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:44:52,491 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 14:44:53,144 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:54,145 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:55,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:56,147 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:57,148 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:58,149 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:44:59,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:00,151 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:01,153 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:02,154 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:03,155 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:04,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:05,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:06,158 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:07,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:08,160 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:09,161 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:10,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:11,164 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:11,166 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:45:17,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:18,169 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:19,170 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:20,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:21,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:22,173 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:23,175 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:24,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:25,177 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:26,178 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:27,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:28,180 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:29,181 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:30,182 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:31,183 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:32,184 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:45:33,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:45:34,186 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:45:35,188 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:45:36,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:45:37,190 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:45:38,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:45:39,192 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:45:40,193 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:45:41,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:45:42,195 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:45:43,196 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:45:44,198 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:45:45,199 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:45:46,200 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:45:47,201 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:45:48,202 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:45:49,203 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:45:50,204 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:45:51,205 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:45:52,206 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:52,457 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:45:52,458 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:45:52,459 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 14:45:53,207 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:54,208 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:55,210 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:56,211 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:57,212 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:58,213 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:45:59,214 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:00,215 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:01,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:02,217 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:03,218 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:04,219 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:05,220 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:06,222 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:06,224 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:46:12,226 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:13,227 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:14,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:15,229 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:16,230 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:17,231 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:18,232 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:19,234 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:20,235 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:21,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:22,237 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:23,238 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:24,239 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:25,240 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:26,241 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:27,243 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:28,244 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:29,245 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:30,246 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:31,247 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:32,248 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:33,250 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:34,251 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:35,252 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:36,253 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:37,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:38,255 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:46:39,256 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:46:40,258 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:46:41,259 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:46:42,260 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:46:43,261 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:46:44,262 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:46:45,264 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:46:46,265 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:46:47,266 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:46:48,267 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:46:49,268 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:46:50,269 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:46:51,270 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:46:52,271 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:46:52,468 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:46:52,469 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:46:52,472 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 14:46:53,272 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:46:54,273 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:46:55,275 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:46:56,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:46:57,277 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:46:58,278 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:46:59,279 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:47:00,280 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:47:01,281 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:47:01,283 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:47:07,285 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:47:08,286 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:47:09,287 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:47:10,288 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:47:11,289 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:47:12,291 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:47:13,292 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:47:14,293 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:47:15,294 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:47:16,295 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:47:17,296 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:47:18,298 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:19,299 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:20,300 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:21,301 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:22,302 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:23,303 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:24,304 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:25,305 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:26,306 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:27,307 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:28,309 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:29,310 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:30,311 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:31,312 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:32,313 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:33,314 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:34,315 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:35,316 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:36,317 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:37,318 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:38,320 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:39,321 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:40,322 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:41,323 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:42,324 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:43,325 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:44,326 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:45,328 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:46,329 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:47,330 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:48,331 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:49,332 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:50,333 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:51,334 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:52,336 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:52,457 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:47:52,458 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:47:52,459 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 14:47:53,337 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:54,338 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:55,339 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:56,340 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:47:56,341 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:48:02,345 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:03,346 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:04,347 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:05,348 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:06,349 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:07,350 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:08,352 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:09,353 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:10,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:11,355 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:12,356 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:13,357 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:14,358 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:15,360 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:16,361 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:17,362 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:18,363 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:19,364 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:20,365 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:21,366 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:22,367 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:23,369 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:24,370 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:25,371 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:26,372 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:27,373 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:28,375 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:29,376 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:30,377 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:31,378 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:32,379 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:33,380 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:34,381 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:35,382 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:36,383 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:37,385 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:38,386 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:39,387 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:40,388 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:41,389 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:42,390 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:43,391 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:44,393 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:45,394 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:46,395 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:47,396 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:48,397 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:49,398 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:50,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:48:51,400 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:48:51,404 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:48:52,457 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at 
com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at 
org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:48:52,457 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 14:48:52,458 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at 
com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 14:48:57,406 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:48:58,407 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:48:59,408 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:00,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:01,410 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:02,411 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:03,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:04,414 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:05,415 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:06,416 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:07,417 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:08,418 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:09,419 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:10,420 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:11,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:12,422 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:13,424 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:14,425 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:15,426 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:16,427 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:17,428 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:18,429 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:19,430 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:20,431 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:21,433 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:22,434 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:23,435 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:24,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:25,437 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:26,438 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:27,439 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:28,440 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:29,441 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:49:30,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:49:31,444 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:49:32,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:49:33,446 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:49:34,447 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:49:35,448 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:49:36,449 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:49:37,450 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:49:38,451 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:49:39,453 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:49:40,454 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:49:41,455 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:49:42,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:49:43,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:49:44,458 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:49:45,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:49:46,461 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:49:46,462 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:49:52,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:49:52,473 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:49:52,474 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:49:52,475 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 45 more
2025-07-07 14:49:53,466 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:49:54,467 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:49:55,468 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:49:56,470 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:49:57,471 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:49:58,472 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:49:59,473 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:00,474 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:01,475 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:02,476 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:03,478 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:04,479 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:05,480 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:06,481 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:07,482 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:08,483 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:09,484 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:10,485 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:11,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:12,488 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:13,489 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:14,490 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:15,491 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:16,492 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:17,493 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:18,494 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:19,495 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:20,497 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:21,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:22,499 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:23,500 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:24,501 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:25,503 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:26,504 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:27,505 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:28,506 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:29,507 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:30,508 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:31,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:32,510 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:33,511 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:34,513 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:35,514 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:36,514 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:37,516 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:38,517 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:39,518 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:40,519 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:41,520 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:41,521 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:50:47,523 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:48,524 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:49,525 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:50,526 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:51,527 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:50:52,462 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:50:52,463 WARN
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:50:52,464 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 14:50:52,528 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:50:53,529 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:50:54,530 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:50:55,531 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:50:56,533 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:50:57,534 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:50:58,535 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:50:59,536 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:00,537 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:01,538 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:02,539 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:03,540 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:04,541 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:05,543 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:06,544 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:07,545 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:08,546 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:09,547 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:10,548 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:11,549 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:12,551 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:13,552 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:14,553 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:15,554 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:16,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:17,556 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:18,557 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:19,558 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:20,560 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:21,561 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:22,562 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:23,564 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:24,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:25,566 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:26,567 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:27,568 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:28,569 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:29,570 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:30,572 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:31,573 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:32,574 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:33,575 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:34,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:35,577 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:36,578 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:36,580 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:51:42,582 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:43,583 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:44,584 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:45,585 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:46,586 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:47,587 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:48,588 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:49,589 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:50,591 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:51:51,592 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:51:52,462 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:51:52,463 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:51:52,464 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 14:51:52,593 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:51:53,594 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:51:54,595 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:51:55,596 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:51:56,597 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:51:57,598 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:51:58,599 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:51:59,600 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:00,601 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:01,602 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:02,603 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:03,604 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:04,605 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:05,607 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:06,608 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:07,609 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:08,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:09,611 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:10,612 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:11,613 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:12,615 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:13,616 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:14,617 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:15,618 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:16,619 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:17,620 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:18,621 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:19,622 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:20,624 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:21,625 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:22,626 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:23,628 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:24,629 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:25,630 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:26,631 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:27,632 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:28,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:29,634 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:30,635 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:31,637 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:31,638 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:52:37,640 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:38,641 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:39,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:40,643 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:41,644 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:42,645 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:43,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:44,648 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:45,649 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:46,650 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:47,651 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:48,652 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:49,653 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:50,654 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:51,655 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:52,472 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:52:52,473 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:52:52,474 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at 
javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at 
org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at 
java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 45 more
2025-07-07 14:52:52,656 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:53,658 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:54,658 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:55,660 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:56,661 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:57,662 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:58,663 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:52:59,664 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:00,665 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:01,666 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:02,667 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:03,668 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:04,670 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:05,671 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:06,672 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:07,673 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:08,674 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:09,675 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:10,676 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:11,677 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:12,679 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:13,680 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:14,681 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:15,682 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:16,683 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:17,684 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:18,685 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:19,686 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:20,687 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:21,688 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:22,690 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:23,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:24,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:25,694 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:26,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:26,696 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:53:32,698 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:33,699 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:34,700 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:35,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:36,702 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:37,703 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:38,705 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:39,706 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:40,707 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:41,708 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:42,709 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:43,710 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:44,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:45,713 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:46,714 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:47,715 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:48,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:49,717 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:50,718 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:51,719 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:53:52,466 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at 
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at 
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:53:52,467 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at 
com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at 
org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:53:52,468 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at 
org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at 
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 14:53:52,720 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:53:53,721 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:53:54,722 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:53:55,724 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:53:56,725 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:53:57,726 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:53:58,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:53:59,728 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:00,729 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:01,730 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:02,731 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:03,733 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:04,734 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:05,735 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:06,736 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:07,737 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:08,738 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:09,739 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:10,740 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:11,741 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:12,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:13,744 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:14,745 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:15,746 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:16,747 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:17,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:18,749 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:19,750 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:20,752 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:21,753 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:21,755 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:54:27,757 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:28,758 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:29,759 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:30,760 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:31,762 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:32,763 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:33,764 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:34,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:35,766 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:36,767 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:37,768 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:38,769 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:39,771 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:40,772 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:41,773 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:42,774 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:43,775 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:44,776 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:45,777 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:46,779 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:47,780 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:48,781 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:49,782 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:50,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:51,784 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:54:52,459 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:54:52,460 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:54:52,461 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 14:54:52,785 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:54:53,787 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:54:54,788 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:54:55,789 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:54:56,790 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:54:57,791 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:54:58,792 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:54:59,793 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:00,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:01,796 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:02,797 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:03,798 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:04,799 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:05,800 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:06,801 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:07,802 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:08,804 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:09,805 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:10,806 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:11,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:12,808 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:13,809 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:14,811 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:15,812 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:16,813 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:16,815 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:55:22,817 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:23,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:24,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:25,821 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:26,822 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:27,823 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:28,824 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:29,826 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:30,827 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:31,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:32,829 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:33,831 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:34,832 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:35,833 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:36,834 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:37,835 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:38,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:39,838 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:40,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:41,840 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:42,841 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:43,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:44,844 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:45,845 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:46,846 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:47,847 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:48,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:49,850 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:50,851 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:51,852 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:55:52,474 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:55:52,475 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:55:52,476 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 14:55:52,853 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:55:53,854 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:55:54,855 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:55:55,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:55:56,857 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:55:57,858 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:55:58,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:55:59,861 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:00,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:01,863 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:02,865 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:03,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:04,867 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:05,868 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:06,869 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:07,871 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:08,872 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:09,873 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:10,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:11,875 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:11,878 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:56:17,880 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:18,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:19,882 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:20,883 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:21,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:22,885 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:23,887 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:24,888 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:25,889 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:26,893 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:27,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:28,895 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:29,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:30,897 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:31,899 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:32,900 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:33,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:34,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:35,903 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:36,904 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:37,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:38,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:39,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:40,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:41,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:42,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:43,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:44,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:45,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:46,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:47,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:48,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:49,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:50,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:51,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:56:52,463 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:56:52,464 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:56:52,467 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 14:56:52,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:56:53,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:56:54,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:56:55,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:56:56,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:56:57,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:56:58,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:56:59,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:00,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:01,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:02,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:03,936 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:04,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:05,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:06,940 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:06,942 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:57:12,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:13,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:14,946 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:15,948 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:16,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:17,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:18,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:19,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:20,953 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:21,955 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:22,956 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:23,957 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:24,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:25,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:26,961 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:27,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:28,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:29,964 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:30,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:31,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:32,968 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:33,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:34,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:35,971 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:36,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:37,973 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:38,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:39,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:40,977 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:41,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:42,979 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:43,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:44,981 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:45,983 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:46,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:47,985 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:48,986 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:49,987 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:50,988 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:51,990 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:52,475 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:57:52,476 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:57:52,477 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 45 more
2025-07-07 14:57:52,991 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:53,992 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:54,993 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:55,994 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:57:56,996 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:57:57,997 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:57:58,998 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:57:59,999 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:01,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:02,001 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:02,003 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 14:58:08,005 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:09,006 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:10,008 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:11,009 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:12,010 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:13,011 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:14,012 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:15,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:16,014 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:17,016 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:18,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:19,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:20,019 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:21,020 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:22,021 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:23,022 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:24,024 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:25,025 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:26,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:27,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:28,028 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:29,029 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:30,030 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:31,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:32,033 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:33,034 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:34,035 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:35,036 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:36,037 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:37,039 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:38,040 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:39,041 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:40,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:41,043 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:42,044 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:43,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:44,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:45,048 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:46,049 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:47,050 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:48,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:49,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:50,053 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:51,055 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:52,056 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 14:58:52,461 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at 
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at 
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 14:58:52,462 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at 
com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at 
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:58:52,462 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 14:58:53,057 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:58:54,058 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:58:55,059 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:58:56,060 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:58:57,061 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:58:57,062 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:59:03,064 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:04,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:05,066 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:06,067 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:07,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:08,070 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:09,071 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:10,072 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:11,073 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:12,074 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:13,076 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:14,077 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:15,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:16,079 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:17,080 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:18,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:19,083 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:20,084 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:21,085 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:22,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:23,087 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:24,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:25,090 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:26,091 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:27,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:28,094 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:29,095 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:30,096 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:31,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:32,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:33,099 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:34,101 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:35,102 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:36,103 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:37,104 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:38,105 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:39,106 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:40,107 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:41,109 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:42,110 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:43,111 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:44,112 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:45,113 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:46,114 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:47,116 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:48,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:49,118 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:50,119 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:51,120 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:52,121 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:52,123 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 14:59:52,477 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:59:52,478 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 14:59:52,478 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 14:59:58,125 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 14:59:59,126 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:00:00,127 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:00:01,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:00:02,129 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:00:03,130 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:04,131 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:05,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:06,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:07,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:08,136 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:09,137 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:10,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:11,140 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:12,141 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:13,142 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:14,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:15,144 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:16,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:17,147 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:18,148 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:19,149 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:20,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:21,151 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:22,153 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:23,154 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:24,155 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:25,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:26,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:27,158 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:28,160 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:29,161 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:30,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:31,163 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:32,164 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:33,165 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:34,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:35,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:36,169 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:37,170 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:38,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:39,173 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:40,174 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:41,175 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:42,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:43,177 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:44,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:45,180 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:46,181 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:47,182 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:00:47,184 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:00:52,465 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at 
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:00:52,466 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at 
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:00:52,466 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at 
org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 15:00:53,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:00:54,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:00:55,188 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:00:56,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:00:57,190 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:00:58,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:00:59,192 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:00,193 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:01,195 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:02,196 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:03,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:04,198 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:05,199 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:06,200 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:07,201 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:08,202 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:09,204 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:10,205 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:11,206 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:12,207 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:13,208 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:14,210 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:15,211 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:16,212 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:17,213 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:18,214 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:19,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:20,217 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:21,218 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:22,219 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:23,220 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:24,222 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:25,223 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:26,224 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:27,226 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:28,227 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:29,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:30,229 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:31,230 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:32,232 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:33,233 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:34,234 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:35,235 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:36,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:37,238 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:38,239 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:39,240 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:40,241 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:41,242 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:42,244 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:42,245 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:01:48,247 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:49,248 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:50,249 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:51,250 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:52,251 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:52,464 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:01:52,465 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:01:52,465 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 15:01:53,253 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:54,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:55,255 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:56,256 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:57,257 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:58,258 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:01:59,259 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:02:00,261 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:02:01,262 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:02:02,263 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:02:03,264 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:02:04,265 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:05,267 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:06,268 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:07,269 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:08,270 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:09,271 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:10,272 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:11,274 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:12,275 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:13,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:14,277 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:15,279 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:16,280 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:17,281 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:18,282 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:19,283 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:20,285 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:21,286 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:22,287 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:23,288 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:24,290 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:25,291 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:26,292 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:27,294 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:28,295 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:29,296 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:30,297 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:31,299 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:32,300 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:33,301 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:34,302 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:35,303 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:36,305 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:37,306 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:37,307 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:02:43,309 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:44,310 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:45,311 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:46,313 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:47,314 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:48,315 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:49,316 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:50,317 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:51,319 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:52,320 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:02:52,476 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at 
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at 
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:02:52,477 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at 
com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at 
org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:02:52,477 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at 
org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 15:02:53,321 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:02:54,322 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:02:55,323 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:02:56,324 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:02:57,325 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:02:58,327 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:02:59,328 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:00,329 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:01,330 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:02,331 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:03,332 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:04,333 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:05,335 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:06,336 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:07,337 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:08,338 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:09,340 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:10,341 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:11,342 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:12,343 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:13,344 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:14,346 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:15,347 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:16,348 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:17,349 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:18,350 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:19,351 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:20,352 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:21,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:22,355 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:23,356 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:24,359 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:25,360 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:26,361 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:27,363 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:28,364 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:29,365 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:30,366 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:31,367 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:32,368 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:32,370 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:03:38,372 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:39,373 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:40,374 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:41,375 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:42,376 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:43,378 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:44,379 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:45,380 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:46,381 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:47,382 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:48,384 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:49,385 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:50,386 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:51,387 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:52,389 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:52,467 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:03:52,468 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:03:52,469 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 15:03:53,390 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:54,391 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:55,392 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:56,393 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:57,394 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:58,396 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:03:59,397 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:04:00,398 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:04:01,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:04:02,401 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:04:03,402 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:04:04,403 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:04:05,404 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:04:06,405 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:07,407 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:08,408 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:09,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:10,410 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:11,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:12,413 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:13,414 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:14,415 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:15,416 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:16,417 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:17,419 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:18,420 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:19,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:20,422 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:21,423 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:22,425 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:23,426 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:24,428 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:25,429 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:26,430 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:27,431 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:27,433 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:04:33,434 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:34,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:35,437 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:36,438 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:37,439 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:38,440 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:39,442 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:40,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:41,444 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:42,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:43,446 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:44,448 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:45,449 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:46,450 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:47,451 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:48,452 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:49,454 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:50,455 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:51,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:52,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:04:52,469 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:04:52,470 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:04:52,472 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 15:04:53,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:04:54,460 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:04:55,461 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:04:56,462 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:04:57,463 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:04:58,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:04:59,466 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:00,467 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:01,468 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:02,469 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:03,470 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:04,472 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:05,473 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:06,474 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:07,475 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:08,477 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:09,478 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:10,479 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:11,480 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:12,481 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:13,483 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:14,484 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:15,485 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:16,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:17,487 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:18,489 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:19,490 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:20,491 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:21,492 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:22,494 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:22,496 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:05:28,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:29,499 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:30,500 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:31,501 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:32,503 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:33,504 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:34,505 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:35,506 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:36,507 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:37,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:38,510 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:39,511 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:40,512 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:41,513 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:42,515 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:43,516 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:44,517 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:45,518 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:46,520 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:47,521 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:48,522 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:49,523 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:50,524 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:51,526 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:52,478 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:05:52,479 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:05:52,480 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 15:05:52,527 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:53,528 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:54,529 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:55,530 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:56,531 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:57,532 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:58,534 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:05:59,535 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:06:00,536 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:06:01,537 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:06:02,538 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:06:03,540 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:06:04,541 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:06:05,542 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:06:06,543 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:06:07,544 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:06:08,546 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:06:09,547 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:06:10,548 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:06:11,549 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:06:12,550 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:06:13,552 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:06:14,553 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:06:15,554 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:06:16,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:06:17,557 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:06:17,559 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:06:23,562 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:06:24,564 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:25,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:26,566 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:27,567 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:28,568 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:29,569 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:30,570 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:31,572 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:32,573 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:33,574 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:34,575 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:35,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:36,578 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:37,579 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:38,580 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:39,581 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:40,582 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:41,584 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:42,585 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:43,586 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:44,587 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:45,589 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:46,590 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:47,591 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:48,592 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:49,593 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:50,595 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:51,596 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:52,466 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:06:52,467 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:06:52,470 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 15:06:52,597 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:53,598 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:54,599 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:55,600 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:56,602 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:57,603 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:06:58,604 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:06:59,605 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:00,606 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:01,608 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:02,609 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:03,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:04,611 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:05,612 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:06,614 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:07,615 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:08,616 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:09,617 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:10,618 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:11,620 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:12,621 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:12,625 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:07:18,627 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:19,628 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:20,629 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:21,630 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:22,632 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:23,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:24,634 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:25,635 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:26,636 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:27,638 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:28,639 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:29,640 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:30,641 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:31,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:32,644 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:33,645 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:34,646 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:35,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:36,648 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:37,650 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:38,651 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:39,652 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:40,653 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:41,654 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:42,656 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:43,657 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:44,658 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:45,659 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:46,660 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:47,662 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:48,663 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:49,664 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:50,665 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:51,666 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:52,466 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:07:52,467 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:07:52,468 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 15:07:52,668 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:53,669 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:54,670 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:55,671 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:56,672 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:57,674 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:58,675 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:07:59,676 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:08:00,677 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:08:01,679 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:08:02,680 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:08:03,681 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:08:04,682 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:08:05,683 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:08:06,685 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:08:07,686 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:08:07,688 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:08:13,690 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:08:14,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:08:15,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:08:16,694 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:08:17,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:08:18,696 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:08:19,697 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:08:20,698 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:08:21,699 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:08:22,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:08:23,702 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:08:24,703 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:08:25,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:08:26,705 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:08:27,707 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:08:28,708 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:08:29,709 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:08:30,710 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:08:31,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:32,713 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:33,714 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:34,715 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:35,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:36,717 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:37,719 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:38,720 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:39,721 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:40,722 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:41,723 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:42,725 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:43,726 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:44,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:45,729 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:46,730 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:47,731 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:48,732 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:49,733 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:50,735 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:51,736 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:52,479 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:08:52,479 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:08:52,480 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 15:08:52,737 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:53,738 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:54,739 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:55,741 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:56,742 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:57,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:58,744 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:08:59,746 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:00,747 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:01,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:02,749 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:02,751 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:09:08,753 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:09,754 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:10,756 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:11,757 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:12,758 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:13,759 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:14,761 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:15,762 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:16,763 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:17,764 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:18,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:19,767 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:20,768 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:21,769 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:22,770 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:23,771 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:24,773 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:25,774 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:26,775 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:27,777 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:28,778 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:29,779 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:30,780 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:31,782 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:32,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:33,784 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:34,785 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:35,786 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:36,788 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:37,789 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:38,790 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:39,791 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:40,793 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:41,794 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:42,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:43,796 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:44,797 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:45,799 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:46,800 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:47,801 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:48,802 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:49,803 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:50,805 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:51,806 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:52,467 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:09:52,468 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:09:52,469 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 15:09:52,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:53,808 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:54,809 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:55,811 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:56,812 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:57,813 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:09:57,815 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:10:03,817 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:04,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:05,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:06,821 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:07,822 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:08,823 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:09,824 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:10,825 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:11,827 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:12,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:13,829 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:14,831 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:15,832 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:16,833 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:17,834 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:18,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:19,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:20,838 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:21,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:22,841 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:23,842 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:24,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:25,844 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:26,845 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:27,847 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:28,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:29,849 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:30,850 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:31,851 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:32,852 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:33,854 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:34,855 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:35,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:36,857 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:37,858 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:38,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:39,861 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:40,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:41,863 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:42,864 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:43,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:44,867 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:45,868 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:46,870 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:47,871 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:48,872 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:49,873 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:50,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:51,876 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:52,467 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:10:52,468 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:10:52,470 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 15:10:52,877 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:52,878 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:10:58,880 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:59,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:10:59,982 WARN com.cloudera.cmf.event.publish.EventStorePublisherWithRetry: Failed to publish event: SimpleEvent{attributes={ROLE=[hdfs-DATANODE-b30a464b10a57fdd49ea734cd52a8291], HOSTS=[dmidlkprdls04.svr.luc.edu], ROLE_TYPE=[DATANODE], CATEGORY=[LOG_MESSAGE], EVENTCODE=[EV_LOG_EVENT], SERVICE=[hdfs], SERVICE_TYPE=[HDFS], LOG_LEVEL=[WARN], HOST_IDS=[5c33df90-d247-4c6d-b9e0-5908a423580a], SEVERITY=[IMPORTANT]}, content=Unable to initialize FileSignerSecretProvider, falling back to use random secrets. 
Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret, timestamp=1751909970885} - 1 of 60 failure(s) in last 1815s java.io.IOException: Error connecting to dmidlkprdls01.svr.luc.edu/192.168.158.1:7184 at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.getChannel(NettyTransceiver.java:269) at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.(NettyTransceiver.java:197) at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.(NettyTransceiver.java:120) at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.checkSpecificRequestor(AvroEventStorePublishProxy.java:120) at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.publishEvent(AvroEventStorePublishProxy.java:204) at com.cloudera.cmf.event.publish.EventStorePublisherWithRetry$PublishEventTask.run(EventStorePublisherWithRetry.java:242) at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) at java.util.concurrent.FutureTask.run(FutureTask.java:266) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:750) Caused by: com.cloudera.cmf.event.shaded.io.netty.channel.AbstractChannel$AnnotatedNoRouteToHostException: No route to host: dmidlkprdls01.svr.luc.edu/192.168.158.1:7184 Caused by: java.net.NoRouteToHostException: No route to host at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716) at com.cloudera.cmf.event.shaded.io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:337) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:339) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:776) at 
com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562) at com.cloudera.cmf.event.shaded.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997) at com.cloudera.cmf.event.shaded.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:11:00,882 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:01,883 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:02,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:03,886 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:04,887 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:05,888 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:06,889 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:07,890 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:08,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:09,893 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:10,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:11,895 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:12,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:13,898 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:14,899 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:15,900 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:16,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:17,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:18,904 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:19,905 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:20,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:21,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:22,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:23,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:24,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:25,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:26,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:27,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:28,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:29,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:30,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:31,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:32,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:33,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:34,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:35,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:36,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:37,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:38,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:39,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:40,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:41,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:42,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:43,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:44,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:45,936 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:46,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:47,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:47,940 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:11:52,481 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at 
com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at 
org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:11:52,483 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 15:11:52,484 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at 
com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 15:11:53,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:54,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:55,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:56,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:57,946 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:58,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:11:59,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:00,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:01,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:02,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:03,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:04,955 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:05,956 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:06,957 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:07,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:08,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:09,960 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:10,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:11,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:12,964 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:13,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:14,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:15,967 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:16,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:17,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:18,971 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:19,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:20,973 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:21,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:22,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:23,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:24,977 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:25,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:26,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:27,981 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:28,982 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:29,983 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:30,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:31,985 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:32,986 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:12:33,987 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:12:34,988 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:12:35,989 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:12:36,991 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:12:37,992 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:12:38,993 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:12:39,994 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:12:40,995 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:12:41,997 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:12:42,998 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:12:42,999 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:12:49,001 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:12:50,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:12:51,003 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:12:52,004 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:12:52,469 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:12:52,470 WARN
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:12:52,470 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 15:12:53,006 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:12:54,007 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:12:55,008 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:12:56,009 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:12:57,010 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:12:58,011 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:12:59,012 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:00,014 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:01,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:02,016 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:03,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:04,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:05,019 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:06,020 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:07,021 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:08,023 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:09,024 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:10,025 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:11,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:12,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:13,028 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:14,030 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:15,031 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:16,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:17,033 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:18,034 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:19,035 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:20,036 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:21,038 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:22,039 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:23,040 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:24,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:25,043 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:26,044 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:27,045 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:28,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:29,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:30,049 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:31,050 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:32,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:33,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:34,053 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:35,054 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:36,055 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:37,056 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:38,058 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:38,059 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:13:44,061 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:45,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:46,063 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:13:47,064 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:13:48,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:13:49,066 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:13:50,067 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:13:51,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:13:52,070 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:52,468 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:13:52,469 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:13:52,470 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 15:13:53,071 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:54,072 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:55,073 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:56,074 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:57,075 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:58,076 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:13:59,077 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:00,079 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:01,080 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:02,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:03,082 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:04,083 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:05,085 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:06,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:07,087 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:08,088 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:09,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:10,091 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:11,092 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:12,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:13,094 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:14,095 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:15,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:16,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:17,099 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:18,100 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:19,102 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:20,103 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:21,104 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:22,105 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:23,106 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:24,108 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:25,109 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:26,110 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:27,111 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:28,113 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:29,114 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:30,115 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:31,116 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:32,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:33,119 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:33,120 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:14:39,122 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:40,123 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:41,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:42,125 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:43,126 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:44,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:45,129 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:46,130 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:47,131 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:48,132 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:49,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:50,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:51,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:52,137 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:52,483 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:14:52,484 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:14:52,484 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 15:14:53,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:54,139 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:55,140 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:56,141 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:57,142 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:58,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:14:59,145 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:00,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:01,147 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:02,148 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:03,149 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:04,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:05,151 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:06,153 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:07,154 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:08,155 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:09,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:10,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:11,158 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:12,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:13,161 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:14,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:15,163 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:16,164 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:17,165 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:18,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:19,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:20,169 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:21,170 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:22,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:23,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:24,174 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:25,175 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:26,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:27,178 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:28,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:28,180 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:15:34,182 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:35,183 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:36,184 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:37,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:38,186 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:39,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:40,188 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:41,190 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:42,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:43,192 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:44,193 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:45,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:46,195 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:47,196 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:48,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:49,199 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:50,200 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:51,201 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:52,202 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:15:52,472 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:15:52,473 WARN
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:15:52,474 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 15:15:53,203 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:15:54,204 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:15:55,205 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:15:56,206 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:15:57,208 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:15:58,209 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:15:59,210 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:00,211 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:01,212 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:02,213 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:03,214 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:04,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:05,217 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:06,218 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:07,219 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:08,220 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:09,221 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:10,222 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:11,224 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:12,225 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:13,226 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:14,227 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:15,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:16,230 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:17,231 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:18,232 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:19,233 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:20,234 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:21,235 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:22,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:23,238 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:23,241 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:16:29,243 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:30,244 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:31,245 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:32,246 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:33,247 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:34,248 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:35,250 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:36,251 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:37,252 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:38,253 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:39,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:40,255 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:41,256 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:42,257 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:43,259 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:44,260 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:45,261 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:46,262 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:47,263 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:48,265 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:49,266 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:50,267 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:51,268 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:16:52,269 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:16:52,473 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:16:52,474 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:16:52,477 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 15:16:53,271 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:16:54,272 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:16:55,273 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:16:56,274 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:16:57,275 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:16:58,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:16:59,277 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:00,278 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:01,279 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:02,280 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:03,281 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:04,283 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:05,284 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:06,285 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:07,286 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:08,287 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:09,288 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:10,289 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:11,290 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:12,292 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:13,293 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:14,294 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:15,295 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:16,296 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:17,297 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:18,298 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:18,300 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:17:24,302 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:25,303 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:26,304 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:27,305 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:28,306 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:29,307 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:30,308 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:31,309 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:32,310 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:33,311 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:34,312 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:35,314 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:36,315 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:37,316 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:38,317 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:39,318 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:40,319 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:41,320 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:42,321 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:43,322 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:44,323 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:45,324 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:46,325 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:47,326 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:48,327 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:49,329 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:50,329 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:51,330 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:52,332 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:17:52,485 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:17:52,486 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:17:52,487 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at 
javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at 
org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at 
java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 45 more 2025-07-07 15:17:53,333 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:17:54,334 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:17:55,335 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:17:56,336 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:17:57,337 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:17:58,338 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:17:59,339 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:00,340 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:01,341 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:02,342 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:03,343 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:04,345 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:05,346 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:06,347 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:07,348 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:08,349 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:09,350 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:10,351 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:11,352 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:12,353 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:13,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:13,357 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:18:19,359 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:20,360 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:21,361 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:22,362 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:23,363 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:24,364 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:25,365 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:26,366 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:27,368 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:28,369 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:29,370 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:30,371 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:31,372 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:32,374 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:33,375 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:34,376 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:35,377 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:36,378 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:37,379 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:38,380 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:39,381 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:40,382 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:41,384 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:42,385 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:43,386 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:44,387 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:45,388 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:46,389 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:47,390 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:48,391 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:49,392 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:50,394 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:51,395 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:52,396 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:18:52,472 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at 
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:18:52,473 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:18:52,474 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 15:18:53,397 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:18:54,398 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:18:55,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:18:56,400 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:18:57,401 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:18:58,402 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:18:59,404 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:00,405 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:01,406 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:02,407 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:03,408 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:04,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:05,410 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:06,411 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:07,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:08,413 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:08,415 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:19:14,417 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:15,418 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:16,419 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:17,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:18,422 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:19,423 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:20,424 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:21,425 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:22,426 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:23,427 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:24,428 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:25,429 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:26,430 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:27,431 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:28,433 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:29,434 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:30,435 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:31,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:32,437 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:33,438 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:34,439 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:35,440 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:36,441 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:37,442 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:38,444 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:39,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:40,446 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:41,447 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:42,448 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:43,449 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:44,450 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:45,451 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:46,452 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:47,453 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:48,455 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:49,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:50,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:51,458 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:52,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:19:52,477 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:19:52,478 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:19:52,479 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 15:19:53,460 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:19:54,461 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:19:55,463 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:19:56,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:19:57,465 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:19:58,466 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:19:59,467 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:00,468 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:01,469 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:02,470 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:03,471 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:03,473 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:20:09,475 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:10,476 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:11,477 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:12,478 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:13,479 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:14,480 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:15,482 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:16,483 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:17,484 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:18,485 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:19,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:20,487 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:21,488 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:22,489 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:23,490 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:24,492 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:25,493 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:26,494 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:27,495 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:28,496 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:29,497 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:30,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:31,499 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:32,500 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:33,502 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:34,503 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:35,504 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:36,505 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:37,506 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:38,507 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:39,508 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:40,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:41,510 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:42,511 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:43,512 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:44,514 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:45,515 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:46,516 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:47,517 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:48,518 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:49,519 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:50,520 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:51,522 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:52,482 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:20:52,483 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:20:52,484 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 15:20:52,523 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:53,524 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:54,525 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:55,526 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:56,527 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:57,529 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:58,530 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:20:58,532 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:21:04,534 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:05,535 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:06,536 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:07,537 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:08,538 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:09,539 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:10,541 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:11,542 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:12,543 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:13,544 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:14,545 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:15,546 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:16,548 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:17,549 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:18,550 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:19,551 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:20,552 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:21,553 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:22,554 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:23,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:24,556 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:25,558 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:26,559 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:27,560 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:28,561 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:29,562 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:30,563 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:31,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:32,566 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:33,567 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:34,568 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:35,569 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:36,570 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:37,571 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:38,572 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:39,573 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:40,575 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:41,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:42,577 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:43,578 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:44,579 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:45,580 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:46,581 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:47,583 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:48,584 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:49,585 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:50,586 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:51,587 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:21:52,472 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:21:52,473 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:21:52,474 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 15:21:52,588 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:21:53,589 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:21:53,590 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:21:59,592 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:00,593 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:01,594 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:02,595 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:03,596 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:04,598 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:05,599 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:06,600 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:07,601 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:08,602 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:09,603 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:10,604 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:11,605 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:12,607 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:13,608 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:14,609 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:15,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:16,611 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:17,612 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:18,613 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:19,614 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:20,615 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:21,617 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:22,618 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:23,620 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:24,621 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:25,622 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:26,623 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:27,624 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:28,625 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:29,626 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:30,627 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:31,629 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:32,630 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:33,631 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:34,632 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:35,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:36,634 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:37,635 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:38,636 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:39,637 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:40,639 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:41,640 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:42,641 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:43,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:44,643 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:45,644 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:46,646 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:47,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:48,648 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:22:48,649 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:22:52,484 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:22:52,485 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:22:52,486 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at 
java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 15:22:54,651 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:22:55,652 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:22:56,653 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:22:57,654 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:22:58,655 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:22:59,656 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:00,657 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:01,659 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:02,660 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:03,661 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:04,662 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:05,663 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:06,664 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:07,665 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:08,666 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:09,667 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:10,669 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:11,670 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:12,671 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:13,672 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:14,673 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:15,674 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:16,676 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:17,677 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:18,678 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:19,679 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:20,680 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:21,681 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:22,682 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:23,683 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:24,684 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:25,686 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:26,687 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:27,688 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:28,689 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:29,690 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:30,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:31,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:32,693 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:33,694 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:34,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:35,697 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:36,698 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:37,699 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:38,700 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:39,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:40,702 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:41,703 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:42,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:43,705 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:43,707 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:23:49,708 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:50,710 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:51,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:23:52,472 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:23:52,472 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:23:52,473 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 15:23:52,712 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:23:53,713 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:23:54,714 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:23:55,715 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:23:56,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:23:57,717 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:23:58,718 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:23:59,720 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:00,721 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:01,722 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:02,723 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:03,724 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:04,725 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:05,726 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:06,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:07,728 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:08,729 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:09,730 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:10,732 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:11,733 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:12,734 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:13,735 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:14,736 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:15,737 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:16,738 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:17,740 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:18,741 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:19,742 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:20,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:21,744 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:22,745 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:23,747 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:24,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:25,749 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:26,750 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:27,751 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:28,753 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:29,754 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:30,755 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:31,756 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:32,757 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:33,758 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:34,759 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:35,761 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:36,762 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:37,763 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:38,764 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:38,765 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:24:44,769 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:45,770 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:46,771 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:47,773 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:48,774 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:49,775 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:50,776 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:51,777 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:52,487 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:24:52,488 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:24:52,489 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 15:24:52,778 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:53,779 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:54,780 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:55,781 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:56,782 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:57,784 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:24:58,785 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:24:59,786 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:00,787 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:01,788 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:02,789 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:03,790 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:04,791 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:05,793 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:06,794 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:07,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:08,796 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:09,797 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:10,798 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:11,799 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:12,800 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:13,801 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:14,802 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:15,803 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:16,804 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:17,806 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:18,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:19,808 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:20,809 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:21,810 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:22,811 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:23,813 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:24,814 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:25,815 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:26,816 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:27,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:28,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:29,820 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:30,821 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:31,822 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:32,823 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:33,824 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:33,827 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:25:39,829 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:40,830 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:41,831 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:42,832 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:43,833 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:44,834 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:45,835 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:46,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:47,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:48,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:49,840 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:50,841 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:51,842 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:25:52,474 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:25:52,474 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:25:52,475 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 15:25:52,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:25:53,844 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:25:54,845 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:25:55,846 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:25:56,847 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:25:57,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:25:58,850 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:25:59,851 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:00,852 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:01,853 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:02,854 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:03,855 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:04,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:05,857 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:06,858 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:07,859 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:08,861 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:09,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:10,863 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:11,864 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:12,865 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:13,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:14,867 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:15,869 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:16,870 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:17,871 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:18,872 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:19,873 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:20,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:21,875 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:22,877 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:23,878 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:24,879 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:25,880 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:26,882 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:27,883 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:28,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:28,885 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:26:34,887 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:35,888 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:36,889 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:37,890 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:38,891 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:39,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:40,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:41,895 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:42,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:43,897 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:44,898 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:45,899 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:46,900 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:47,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:48,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:49,904 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:50,905 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:51,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:52,476 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:26:52,477 WARN
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:26:52,480 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 15:26:52,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:53,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:54,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:55,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:56,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:57,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:26:58,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:26:59,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:00,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:01,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:02,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:03,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:04,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:05,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:06,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:07,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:08,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:09,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:10,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:11,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:12,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:13,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:14,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:15,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:16,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:17,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:18,936 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:19,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:20,938 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:21,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:22,940 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:23,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:23,943 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:27:29,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:30,946 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:31,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:32,948 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:33,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:34,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:35,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:36,953 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:37,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:38,955 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:39,956 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:40,957 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:41,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:42,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:43,960 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:44,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:45,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:46,964 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:47,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:48,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:49,967 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:50,968 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:51,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:52,488 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:27:52,489 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at 
org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at 
org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:27:52,489 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at 
javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at 
org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at 
java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 45 more 2025-07-07 15:27:52,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:53,971 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:54,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:55,973 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:27:56,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:27:57,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:27:58,977 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:27:59,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:00,979 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:01,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:02,981 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:03,982 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:04,983 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:05,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:06,985 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:07,986 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:08,987 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:09,989 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:10,990 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:11,991 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:12,992 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:13,993 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:14,994 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:15,995 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:16,996 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:17,997 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:18,999 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:19,001 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:28:25,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:26,004 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:27,005 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:28,006 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:29,007 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:30,008 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:31,009 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:32,010 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:33,011 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:34,012 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:35,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:36,014 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:37,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:38,016 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:39,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:40,019 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:41,020 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:42,021 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:43,022 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:44,023 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:45,024 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:46,025 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:47,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:48,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:49,028 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:50,029 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:51,031 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:52,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:52,477 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:28:52,478 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:28:52,478 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 15:28:53,033 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:54,034 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:55,035 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:56,036 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:57,037 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:58,038 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:28:59,039 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:00,040 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:01,041 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:02,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:03,044 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:04,045 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:05,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:06,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:07,048 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:08,049 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:09,050 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:10,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:11,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:12,053 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:13,054 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:14,055 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:14,057 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:29:20,059 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:21,060 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:22,061 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:23,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:24,063 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:25,064 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:26,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:27,066 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:28,067 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:29,068 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:30,070 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:31,071 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:29:32,072 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:29:33,073 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:29:34,074 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:29:35,075 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:29:36,076 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:29:37,077 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:29:38,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:29:39,079 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:29:40,080 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:29:41,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:29:42,082 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:29:43,084 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:29:44,085 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:29:45,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:29:46,087 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:29:47,088 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:29:48,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:29:49,090 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:29:50,092 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:29:51,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:29:52,094 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:52,476 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:29:52,477 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:29:52,478 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 15:29:53,095 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:54,096 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:55,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:56,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:57,099 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:58,100 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:29:59,101 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:00,103 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:01,104 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:02,105 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:03,106 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:04,107 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:05,108 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:06,109 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:07,110 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:08,111 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:09,112 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:09,114 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:30:15,116 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:16,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:17,118 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:18,119 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:19,120 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:20,121 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:21,122 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:22,123 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:23,125 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:24,126 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:25,127 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:26,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:27,129 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:28,130 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:29,131 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:30,132 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:31,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:30:32,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:30:33,136 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:30:34,137 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:30:35,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:30:36,139 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:30:37,140 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:30:38,141 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:30:39,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:30:40,144 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:30:41,145 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:30:42,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:30:43,147 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:30:44,148 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:30:45,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:30:46,151 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:30:47,152 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:30:48,153 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:30:49,154 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:30:50,155 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:30:51,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:30:52,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:30:52,489 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:30:52,491 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:30:52,492 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 15:30:53,158 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:30:54,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:30:55,161 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:30:56,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:30:57,163 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:30:58,164 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:30:59,165 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:31:00,166 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:31:01,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:31:02,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:31:03,170 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:31:04,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:31:04,172 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:31:10,174 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:31:11,175 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:12,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:13,177 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:14,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:15,180 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:16,181 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:17,182 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:18,183 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:19,184 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:20,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:21,186 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:22,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:23,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:24,190 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:25,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:26,192 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:27,193 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:28,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:29,195 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:30,196 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:31,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:32,198 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:33,199 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:34,201 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:35,202 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:36,203 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:37,204 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:38,205 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:39,206 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:40,207 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:41,208 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:42,209 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:43,210 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:44,211 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:45,212 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:46,214 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:47,215 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:48,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:49,217 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:50,218 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:51,219 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:52,220 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:52,478 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:31:52,479 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:31:52,480 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 15:31:53,221 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:54,222 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:55,223 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:56,224 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:57,225 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:58,226 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:59,227 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:31:59,229 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:32:05,231 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:06,232 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:07,233 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:08,234 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:09,235 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:10,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:11,237 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:12,238 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:13,240 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:14,241 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:15,242 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:16,243 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:17,244 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:18,245 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:19,246 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:20,247 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:21,248 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:22,250 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:23,251 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:24,252 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:25,253 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:26,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:27,255 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:28,256 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:29,257 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:30,258 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:31,259 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:32,260 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:33,261 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:34,262 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:35,264 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:36,265 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:37,266 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:38,267 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:39,268 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:40,269 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:41,270 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:42,271 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:43,272 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:44,273 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:45,274 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:32:46,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:32:47,277 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:32:48,278 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:32:49,279 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:32:50,280 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:32:51,281 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:32:52,282 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:32:52,478 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:32:52,479 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:32:52,480 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 15:32:53,283 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:32:54,284 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:32:54,285 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:33:00,286 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:01,288 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:02,289 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:03,290 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:04,291 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:05,292 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:06,293 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:07,294 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:08,295 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:09,296 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:10,297 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:11,298 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:12,299 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:13,300 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:14,301 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:15,302 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:16,303 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:17,304 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:18,306 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:19,307 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:20,308 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:21,309 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:22,310 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:23,311 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:24,313 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:25,314 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:26,315 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:27,316 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:28,318 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:29,319 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:30,320 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:31,321 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:32,322 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:33,323 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:34,324 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:35,325 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:36,327 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:37,328 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:38,329 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:39,330 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:40,331 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:41,332 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:42,333 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:43,335 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:44,336 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:45,337 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:46,338 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:47,339 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:48,340 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:49,341 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:49,343 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:33:52,488 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at 
sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at 
org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at 
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:33:52,489 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 15:33:52,490 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at 
org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at 
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at 
java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 15:33:55,344 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:56,346 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:57,347 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:58,348 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:33:59,349 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:34:00,350 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:34:01,351 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:34:02,352 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:34:03,353 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:34:04,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:34:05,356 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:34:06,357 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:34:07,358 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:34:08,359 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:34:09,360 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:34:10,361 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:34:11,362 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:34:12,363 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:34:13,365 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:34:14,366 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:34:15,367 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:34:16,368 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:34:17,369 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:34:18,370 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:34:19,371 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:34:20,373 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:34:21,374 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:34:22,375 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:34:23,376 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:34:24,377 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:34:25,378 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:34:26,380 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:27,381 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:28,382 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:29,383 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:30,384 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:31,385 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:32,386 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:33,387 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:34,388 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:35,389 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:36,390 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:37,392 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:38,393 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:39,394 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:40,395 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:41,396 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:42,397 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:43,398 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:44,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:44,401 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:34:50,403 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:51,404 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:52,405 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:52,481 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:34:52,482 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:34:52,483 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 15:34:53,406 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:54,407 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:55,408 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:56,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:57,411 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:58,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:34:59,413 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:00,414 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:01,415 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:02,416 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:03,417 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:04,418 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:05,420 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:06,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:07,422 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:08,423 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:09,424 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:10,425 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:11,426 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:12,427 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:13,429 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:14,430 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:15,431 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:16,432 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:17,433 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:18,434 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:19,435 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:20,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:21,438 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:22,439 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:23,440 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:24,442 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:25,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:26,444 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:27,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:28,446 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:29,447 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:30,448 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:31,449 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:32,451 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:33,452 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:34,453 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:35,454 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:36,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:37,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:38,458 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:39,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:39,461 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:35:45,463 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:46,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:47,465 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:48,466 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:49,467 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:50,469 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:51,470 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:52,471 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:52,481 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:35:52,482 WARN
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor129.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:35:52,482 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor130.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 15:35:53,472 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:54,474 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:55,475 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:56,476 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:57,477 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:58,479 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:35:59,480 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:00,481 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:01,482 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:02,484 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:03,485 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:04,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:05,488 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:06,489 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:07,490 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:08,491 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:09,492 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:10,493 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:11,495 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:12,496 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:13,497 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:14,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:15,500 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:16,501 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:17,502 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:18,503 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:19,505 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:20,506 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:21,507 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:22,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:23,511 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:24,512 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:25,513 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:26,514 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:27,515 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:28,516 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:29,518 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:30,519 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:31,520 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:32,521 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:33,522 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:34,524 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:34,525 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:36:40,527 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:41,528 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:42,529 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:43,531 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:44,532 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:45,533 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:46,534 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:47,538 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:48,542 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:49,543 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:50,544 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:51,545 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:52,547 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:53,548 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:54,549 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:55,550 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:56,551 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:57,552 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:58,553 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:36:59,554 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:37:00,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:37:01,557 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:37:02,558 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:37:03,559 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:37:04,560 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:37:05,561 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:37:06,562 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:37:07,564 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:08,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:09,566 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:10,567 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:11,568 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:12,569 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:13,571 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:14,572 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:15,573 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:16,574 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:17,575 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:18,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:19,578 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:20,579 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:21,580 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:22,581 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:23,582 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:24,583 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:25,585 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:26,586 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:27,587 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:28,588 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:29,589 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:29,590 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:37:35,592 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:36,594 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:37,595 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:38,596 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:39,597 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:40,598 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:41,600 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:42,601 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:43,602 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:44,603 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:45,604 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:46,605 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:47,607 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:48,608 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:49,609 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:50,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:51,611 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:52,613 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:53,614 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:54,615 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:55,616 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:56,617 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:57,619 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:58,620 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:37:59,621 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:00,622 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:01,623 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:02,625 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:03,626 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:04,627 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:05,628 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:06,629 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:07,631 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:08,632 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:09,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:10,634 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:11,635 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:12,637 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:13,638 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:14,639 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:15,640 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:16,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:17,643 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:18,644 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:19,645 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:20,646 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:21,648 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:22,649 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:23,650 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:24,651 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:24,653 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:38:30,655 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:31,656 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:32,657 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:33,658 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:34,659 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:35,661 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:36,662 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:37,663 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:38,664 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:39,665 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:40,666 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:41,667 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:42,669 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:43,670 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:44,671 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:45,672 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:46,673 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:47,674 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:48,676 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:49,677 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:50,678 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:51,679 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:52,680 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:53,682 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:54,683 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:55,684 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:56,685 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:57,686 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:58,688 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:38:59,689 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:00,690 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:01,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:02,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:03,694 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:04,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:05,696 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:06,697 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:07,698 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:08,699 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:09,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:10,702 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:11,703 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:12,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:13,705 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:14,706 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:15,708 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:16,709 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:17,710 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:18,712 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:19,713 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:19,714 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:39:25,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:26,717 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:27,718 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:28,720 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:29,721 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:30,722 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:31,723 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:32,725 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:33,726 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:34,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:35,728 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:36,729 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:37,731 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:38,732 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:39,733 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:40,734 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:41,735 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:42,737 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:43,738 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:44,739 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:45,740 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:46,741 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:47,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:48,744 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:49,745 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:50,746 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:51,747 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:52,749 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:53,750 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:54,751 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:55,752 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:56,753 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:57,755 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:58,756 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:39:59,757 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:00,758 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:01,760 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:02,761 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:03,762 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:04,763 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:05,764 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:06,766 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:07,767 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:08,768 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:09,769 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:10,770 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:11,772 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:12,773 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:13,774 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:14,775 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:14,776 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:40:20,778 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:21,779 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:22,781 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:23,782 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:24,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:25,784 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:26,785 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:27,787 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:28,788 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:29,789 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:30,790 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:31,791 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:32,792 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:33,793 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:34,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:35,796 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:36,797 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:37,798 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:38,800 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:39,801 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:40,802 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:41,803 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:42,804 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:43,806 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:44,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:45,808 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:46,809 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:47,810 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:48,812 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:49,813 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:50,814 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:51,815 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:52,816 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:53,817 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:54,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:55,820 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:56,821 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:57,822 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:58,823 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:40:59,825 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:00,826 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:01,827 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:02,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:03,829 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:04,831 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:05,832 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:06,833 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:07,834 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:08,835 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:09,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:09,838 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:41:15,840 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:16,841 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:41:17,842 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:41:18,241 WARN com.cloudera.cmf.event.publish.EventStorePublisherWithRetry: Failed to publish event: SimpleEvent{attributes={ROLE=[hdfs-DATANODE-b30a464b10a57fdd49ea734cd52a8291], HOSTS=[dmidlkprdls04.svr.luc.edu], ROLE_TYPE=[DATANODE], CATEGORY=[LOG_MESSAGE], EVENTCODE=[EV_LOG_EVENT], SERVICE=[hdfs], SERVICE_TYPE=[HDFS], LOG_LEVEL=[WARN], HOST_IDS=[5c33df90-d247-4c6d-b9e0-5908a423580a], SEVERITY=[IMPORTANT]}, content=Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret, timestamp=1751909970885} - 1 of 60 failure(s) in last 1818s
java.io.IOException: Error connecting to dmidlkprdls01.svr.luc.edu/192.168.158.1:7184
	at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.getChannel(NettyTransceiver.java:269)
	at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.<init>(NettyTransceiver.java:197)
	at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.<init>(NettyTransceiver.java:120)
	at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.checkSpecificRequestor(AvroEventStorePublishProxy.java:120)
	at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.publishEvent(AvroEventStorePublishProxy.java:204)
	at com.cloudera.cmf.event.publish.EventStorePublisherWithRetry$PublishEventTask.run(EventStorePublisherWithRetry.java:242)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)
Caused by: com.cloudera.cmf.event.shaded.io.netty.channel.AbstractChannel$AnnotatedNoRouteToHostException: No route to host: dmidlkprdls01.svr.luc.edu/192.168.158.1:7184
Caused by: java.net.NoRouteToHostException: No route to host
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
	at com.cloudera.cmf.event.shaded.io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:337)
	at com.cloudera.cmf.event.shaded.io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:339)
	at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:776)
	at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
	at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
	at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
	at com.cloudera.cmf.event.shaded.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
	at com.cloudera.cmf.event.shaded.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 15:41:18,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:41:19,844 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:20,846 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:21,847 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:22,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:23,849 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:24,850 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:25,851 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:26,853 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:27,854 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:28,855 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:29,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:30,857 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:31,859 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:32,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:33,861 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:34,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:35,863 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:36,865 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:37,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:38,867 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:39,868 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:40,869 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:41,870 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:42,872 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:43,873 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:44,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:45,875 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:46,876 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:47,877 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:48,879 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:49,880 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:50,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:51,882 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:52,883 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:53,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:54,885 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:55,886 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:56,887 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:57,888 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:58,889 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:41:59,890 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:00,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:01,893 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:02,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:03,895 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:04,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:04,898 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:42:10,899 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:11,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:12,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:13,903 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:14,904 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:15,905 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:16,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:17,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:18,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:19,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:20,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:21,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:22,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:23,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:24,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:25,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:26,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:27,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:28,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:29,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:30,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:31,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:32,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:33,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:34,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:35,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:36,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:37,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:38,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:39,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:40,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:41,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:42,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:43,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:44,938 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:45,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:46,940 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:47,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:48,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:49,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:50,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:51,946 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:52,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:53,948 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:54,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:55,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:56,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:57,953 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:58,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:59,955 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:42:59,956 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:43:05,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:43:06,961 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:43:07,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:43:08,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:43:09,964 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:43:10,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:43:11,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:43:12,968 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:43:13,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:43:14,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:43:15,971 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:43:16,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:43:17,973 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:43:18,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:43:19,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:43:20,977 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:43:21,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:22,979 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:23,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:24,981 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:25,982 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:26,983 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:27,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:28,986 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:29,987 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:30,988 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:31,989 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:32,990 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:33,991 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:34,992 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:35,993 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:36,995 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:37,996 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:38,997 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:39,998 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:40,999 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:42,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:43,001 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:44,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:45,003 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:46,005 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:47,006 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:48,007 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:49,008 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:50,009 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:51,010 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:52,011 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:53,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:54,014 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:55,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:43:55,018 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:44:01,019 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:02,021 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:03,022 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:04,023 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:05,024 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:06,025 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:07,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:08,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:09,029 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:10,030 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:11,031 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:12,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:13,033 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:14,034 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:15,035 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:16,036 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:17,037 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:18,039 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:19,040 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:20,041 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:21,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:22,043 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:23,044 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:24,045 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:25,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:26,048 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:27,049 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:28,050 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:29,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:30,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:31,053 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:32,055 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:33,056 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:34,057 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:35,058 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:36,059 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:37,060 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:38,061 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:39,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:40,064 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:41,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:42,066 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:43,067 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:44,068 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:45,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:46,071 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:47,072 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:48,073 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:49,074 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:50,075 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:50,077 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:44:56,079 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:57,080 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:58,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:44:59,082 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:00,083 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:01,084 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:02,085 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:03,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:04,088 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:05,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:06,090 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:07,091 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:08,092 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:09,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:10,094 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:11,096 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:12,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:13,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:14,099 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:15,100 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:16,101 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:17,103 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:18,104 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:19,105 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:20,106 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:21,107 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:22,108 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:23,109 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:24,110 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:25,111 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:26,113 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:27,114 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:28,115 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:29,116 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:30,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:31,118 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:32,119 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:33,120 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:34,121 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:35,123 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:36,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:37,125 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:38,126 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:39,127 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:40,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:41,129 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:42,130 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:43,132 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:44,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:45,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:45,135 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:45:51,137 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:52,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:53,139 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:54,141 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:55,142 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:56,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:57,144 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:58,145 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:45:59,147 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:46:00,148 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:46:01,149 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:46:02,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:46:03,151 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:46:04,152 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:46:05,153 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:46:06,154 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:46:07,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:46:08,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:46:09,158 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:46:10,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:46:11,160 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:12,161 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:13,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:14,163 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:15,164 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:16,166 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:17,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:18,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:19,169 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:20,170 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:21,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:22,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:23,174 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:24,175 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:25,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:26,177 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:27,178 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:28,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:29,180 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:30,181 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:31,182 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:32,183 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:33,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:34,186 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:35,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:36,188 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:37,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:38,190 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:39,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:40,192 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:40,194 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:46:46,195 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:47,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:48,198 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:49,199 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:50,200 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:51,201 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:52,202 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:53,203 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:54,204 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:55,205 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:56,206 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:57,207 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:58,208 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:46:59,209 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:00,210 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:01,211 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:02,212 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:03,213 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:04,215 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:05,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:06,217 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:07,218 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:08,219 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:09,220 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:10,221 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:11,222 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:12,223 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:13,224 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:14,225 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:15,227 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:16,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:17,229 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:18,230 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:19,231 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:20,232 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:21,233 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:22,234 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:23,235 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:24,237 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:25,238 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:26,239 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:27,240 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:28,241 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:29,242 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:30,243 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:31,244 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:32,246 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:33,247 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:34,248 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:35,249 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:35,250 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:47:41,252 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:42,253 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:43,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:44,255 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:45,256 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:46,258 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:47,259 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:48,260 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:49,261 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:50,262 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:51,263 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:52,264 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:53,266 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:54,267 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:55,268 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:56,269 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:57,270 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:58,271 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:47:59,272 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:00,274 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:01,275 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:02,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:03,277 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:04,278 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:05,279 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:06,280 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:07,281 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:08,282 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:09,283 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:10,285 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:11,286 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:12,287 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:13,288 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:14,289 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:15,290 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:16,291 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:17,293 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:18,294 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:19,295 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:20,296 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:21,297 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:22,298 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:23,299 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:24,300 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:25,302 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:26,303 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:27,304 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:28,305 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:29,306 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:30,307 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:30,309 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:48:36,310 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:37,312 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:38,313 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:39,314 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:40,315 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:41,316 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:42,317 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:43,318 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:44,319 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:45,321 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:46,322 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:47,323 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:48,324 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:49,325 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:50,326 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:51,327 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:52,329 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:53,330 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:54,331 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:55,332 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:56,333 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:57,334 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:58,335 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:48:59,337 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:00,338 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:01,339 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:02,340 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:03,341 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:04,342 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:05,343 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:06,345 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:07,346 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:08,347 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:09,348 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:10,349 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:11,350 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:12,351 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:13,353 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:14,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:15,355 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:16,356 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:17,357 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:18,358 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:19,359 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:20,361 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:21,362 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:22,363 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:23,364 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:24,365 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:25,366 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:25,368 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:49:31,369 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:32,371 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:33,372 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:34,373 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:35,374 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:36,375 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:37,376 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:38,377 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:39,378 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:40,380 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:41,381 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:42,382 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:43,383 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:44,384 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:45,385 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:46,386 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:47,388 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:48,389 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:49,390 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:50,391 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:51,392 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:52,393 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:53,394 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:54,395 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:55,397 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:56,398 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:57,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:58,400 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:49:59,401 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:00,402 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:01,403 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:02,404 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:03,405 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:04,406 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:05,407 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:06,408 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:07,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:08,410 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:09,411 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:10,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:11,413 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:12,415 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:13,416 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:14,417 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:15,418 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:16,419 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:17,420 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:18,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:19,422 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:20,423 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:20,425 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:50:26,426 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:27,427 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:28,428 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:29,429 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:30,431 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:31,432 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:32,432 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:33,434 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:34,435 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:35,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:36,437 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:37,438 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:38,439 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:39,440 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:40,441 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:41,442 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:42,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:43,444 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:44,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:45,446 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:46,448 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:47,449 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:50:48,450 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:50:49,451 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:50:50,452 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:50:51,453 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:50:52,454 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:50:53,455 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:50:54,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:50:55,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:50:56,458 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:50:57,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:50:58,460 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:50:59,461 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:00,462 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:01,463 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:02,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:03,465 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:04,466 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:05,467 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:06,469 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:07,470 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:08,471 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:09,472 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:10,473 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:11,474 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:12,475 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:13,476 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:14,477 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:15,478 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:15,479 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:51:21,481 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:22,482 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:23,483 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:24,484 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:25,485 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:26,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:27,487 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:28,488 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:29,490 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:30,491 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:31,492 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:32,493 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:33,494 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:34,495 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:35,496 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:36,497 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:37,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:38,499 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:39,500 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:40,501 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:41,502 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:42,503 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:43,505 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:44,506 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:45,507 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:46,508 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:47,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:48,510 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:49,511 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:50,512 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:51,513 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:52,514 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:53,515 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:54,516 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:55,518 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:56,519 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:57,520 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:58,521 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:51:59,522 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:00,523 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:01,524 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:02,525 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:03,526 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:04,527 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:05,528 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:06,529 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:07,530 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:08,531 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:09,532 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:10,533 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:10,534 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:52:16,536 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:17,537 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:18,538 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:19,539 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:20,540 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:21,541 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:22,542 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:23,543 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:24,544 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:25,545 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:26,546 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:27,547 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:28,548 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:29,549 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:30,550 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:31,551 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:32,552 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:33,553 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:34,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:35,556 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:36,556 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:37,558 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:38,559 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:39,560 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:40,561 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:41,562 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:42,563 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:43,564 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:44,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:45,566 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:46,567 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:47,568 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:48,569 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:49,570 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:52:50,571 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:52:51,573 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:52:52,574 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:52:53,575 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:52:54,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:52:55,577 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:52:56,578 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:52:57,579 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:52:58,580 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:52:59,581 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:00,582 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:01,583 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:02,584 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:03,585 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:04,586 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:05,587 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:05,589 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:53:11,590 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:12,591 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:13,592 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:14,593 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:15,594 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:16,596 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:17,597 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:18,598 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:19,599 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:20,600 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:21,601 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:22,602 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:23,603 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:24,604 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:25,605 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:26,606 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:27,607 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:28,608 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:29,609 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:30,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:31,612 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:32,613 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:33,614 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:34,615 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:35,616 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:36,617 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:37,618 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:38,619 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:39,620 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:40,621 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:41,622 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:42,623 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:43,625 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:44,626 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:45,627 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:46,628 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:47,629 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:48,630 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:49,632 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:50,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:51,634 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:52,635 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:53,636 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:54,637 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:55,638 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:56,639 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:57,641 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:58,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:53:59,643 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:00,644 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:00,646 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:54:06,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:07,648 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:08,650 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:09,651 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:10,652 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:11,653 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:12,654 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:13,655 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:14,657 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:15,658 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:16,659 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:17,660 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:18,661 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:19,663 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:20,664 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:21,665 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:22,666 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:23,667 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:24,668 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:25,670 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:26,671 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:27,672 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:28,673 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:29,674 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:30,675 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:31,676 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:32,678 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:33,679 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:34,680 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:35,681 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:36,682 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:37,683 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:38,684 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:39,686 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:40,687 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:41,688 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:42,689 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:43,690 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:44,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:45,693 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:46,694 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:47,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:48,696 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:49,697 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:50,699 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:51,700 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:52,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:53,702 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:54,703 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:55,705 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:54:55,706 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:55:01,708 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:55:02,709 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:55:03,710 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:55:04,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:55:05,712 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:55:06,714 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:55:07,715 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:55:08,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:55:09,717 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:55:10,718 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:55:11,720 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:12,721 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:13,722 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:14,723 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:15,724 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:16,726 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:17,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:18,728 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:19,729 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:20,730 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:21,732 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:22,733 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:23,734 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:24,735 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:25,736 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:26,738 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:27,739 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:28,740 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:29,741 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:30,742 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:31,744 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:32,745 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:33,746 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:34,747 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:35,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:36,749 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:37,751 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:38,752 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:39,753 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:40,754 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:41,755 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:42,757 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:43,758 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:44,759 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:45,760 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:46,762 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:47,763 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:48,764 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:49,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:50,766 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:50,768 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:55:56,769 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:57,771 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:58,772 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:55:59,773 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:00,774 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:01,776 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:02,777 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:03,778 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:04,779 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:05,780 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:06,781 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:07,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:08,784 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:09,785 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:10,786 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:11,787 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:12,789 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:13,790 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:14,791 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:15,792 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:16,794 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:17,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:18,796 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:19,797 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:20,798 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:21,800 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:22,801 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:23,802 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:24,803 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:25,804 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:26,806 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:27,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:28,808 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:29,809 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:30,810 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:31,811 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:32,813 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:33,814 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:34,815 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:35,816 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:36,817 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:37,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:38,820 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:39,821 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:40,822 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:41,824 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:42,825 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:43,826 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:44,827 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:45,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:45,830 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 15:56:51,831 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:52,833 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:53,834 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:54,835 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:55,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:56,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:57,838 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:58,858 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:56:59,859 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:57:00,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:57:01,861 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:57:02,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:57:03,863 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:57:04,865 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:57:05,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:57:06,867 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:57:07,868 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:57:08,870 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:57:09,871 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:57:10,872 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:57:11,873 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:57:12,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 15:57:13,875 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:14,876 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:15,878 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:16,879 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:17,880 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:18,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:19,882 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:20,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:21,885 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:22,886 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:23,887 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:24,888 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:25,890 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:26,891 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:27,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:28,893 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:29,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:30,895 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:31,897 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:32,898 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:33,899 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:34,900 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:35,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:36,903 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:37,904 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:38,905 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:39,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:40,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:40,909 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:57:46,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:47,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:48,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:49,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:50,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:51,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:52,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:53,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:54,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:55,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:56,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:57,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:58,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:57:59,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:00,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:01,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:02,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:03,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:04,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:05,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:06,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:07,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:08,936 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:09,938 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:10,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:11,940 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:12,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:13,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:14,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:15,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:16,946 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:17,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:18,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:19,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:20,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:21,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:22,953 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:23,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:24,956 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:25,957 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:26,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:27,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:28,960 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:29,961 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:30,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:31,964 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:32,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:33,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:34,967 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:35,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:35,970 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:58:41,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:42,973 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:43,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:44,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:45,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:46,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:47,979 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:48,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:49,981 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:50,982 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:51,983 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:52,985 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:53,986 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:54,987 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:55,988 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:56,989 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:57,990 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:58,992 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:58:59,993 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:00,994 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:01,995 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:02,996 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:03,997 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:04,999 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:06,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:07,001 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:08,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:09,003 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:10,004 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:11,006 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:12,007 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:13,008 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:14,009 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:15,010 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:16,011 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:17,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:18,014 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:19,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:20,016 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:21,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:22,019 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:23,020 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:24,021 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:25,022 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:26,023 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:27,025 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:28,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:29,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:30,028 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:31,029 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:31,031 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 15:59:37,033 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:38,034 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:39,035 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:40,036 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:41,037 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:42,038 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:43,039 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:44,040 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:45,041 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:46,043 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:47,044 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:48,045 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:49,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:50,048 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:51,049 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:52,050 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:53,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:54,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:55,053 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:56,055 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:57,056 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:58,057 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 15:59:59,058 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:00:00,059 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:00:01,060 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:00:02,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:00:03,063 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:04,064 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:05,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:06,066 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:07,068 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:08,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:09,070 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:10,071 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:11,072 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:12,074 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:13,075 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:14,076 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:15,077 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:16,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:17,079 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:18,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:19,082 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:20,083 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:21,084 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:22,085 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:23,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:24,088 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:25,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:26,090 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:26,091 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:00:32,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:33,094 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:34,095 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:35,096 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:36,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:37,099 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:38,100 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:39,101 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:40,102 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:41,103 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:42,105 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:43,106 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:44,107 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:45,108 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:46,109 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:47,111 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:48,112 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:49,113 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:50,114 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:51,115 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:52,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:53,118 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:54,119 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:55,120 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:56,121 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:57,123 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:58,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:00:59,125 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:00,126 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:01,127 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:02,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:03,129 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:04,131 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:05,132 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:06,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:07,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:08,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:09,137 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:10,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:11,139 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:12,141 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:13,142 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:14,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:15,144 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:16,145 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:17,147 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:18,148 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:19,149 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:20,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:21,151 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:21,153 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:01:27,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:28,158 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:29,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:30,160 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:31,161 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:32,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:33,164 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:34,165 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:35,166 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:36,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:37,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:38,170 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:39,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:40,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:41,173 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:42,174 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:43,175 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:44,177 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:45,178 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:46,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:47,180 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:48,181 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:49,183 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:50,184 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:01:51,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:01:52,186 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
[... identical retry entries for attempts 26-48 omitted, one per second ...]
2025-07-07 16:02:16,214 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:02:16,217 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:02:22,219 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
[... identical retry entries for attempts 1-48 omitted, one per second ...]
2025-07-07 16:03:11,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:03:11,278 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:03:17,279 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
[... identical retry entries for attempts 1-48 omitted, one per second ...]
2025-07-07 16:04:06,339 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:04:06,340 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:04:12,342 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
[... identical retry entries for attempts 1-26 omitted, one per second ...]
2025-07-07 16:04:39,374 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:04:40,375 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:04:41,376 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:04:42,377 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:04:43,378 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:04:44,379 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:04:45,381 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:04:46,382 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:04:47,383 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:04:48,384 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:04:49,386 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:04:50,387 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:04:51,388 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:04:52,389 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:04:53,390 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:04:54,391 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:04:55,392 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:04:56,394 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:04:57,395 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:04:58,396 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:04:59,397 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:00,398 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:01,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:01,401 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:05:07,402 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:08,404 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:09,405 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:10,406 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:11,407 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:12,408 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:13,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:14,410 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:15,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:16,413 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:17,414 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:18,415 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:19,417 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:20,418 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:21,419 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:22,420 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:23,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:24,422 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:25,424 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:26,425 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:27,426 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:28,427 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:29,428 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:30,430 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:31,431 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:32,432 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:33,433 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:34,434 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:35,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:36,437 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:37,438 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:38,439 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:39,440 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:40,442 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:41,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:42,444 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:43,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:44,446 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:45,448 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:46,449 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:47,450 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:48,451 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:49,452 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:50,453 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:51,455 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:52,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:53,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:54,458 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:55,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:56,461 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:05:56,462 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:06:02,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:06:03,465 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:06:04,466 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:06:05,467 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:06:06,468 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:06:07,470 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:06:08,471 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:06:09,472 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:06:10,473 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:06:11,474 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:06:12,476 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:06:13,477 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:06:14,478 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:06:15,479 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:06:16,480 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:06:17,481 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:06:18,483 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:06:19,484 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:06:20,485 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:06:21,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:06:22,487 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:06:23,488 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:06:24,489 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:06:25,491 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:06:26,492 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:06:27,493 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:06:28,494 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:06:29,495 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:06:30,496 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:06:31,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:06:32,499 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:06:33,500 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:06:34,501 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:06:35,502 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:06:36,503 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:06:37,505 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:06:38,506 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:06:39,507 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:06:40,508 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:06:41,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:06:42,510 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:06:43,512 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:06:44,513 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:06:45,514 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:06:46,515 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:06:47,516 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:06:48,518 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:06:49,519 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:06:50,520 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:06:51,521 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:06:51,523 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:06:57,524 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:06:58,525 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:06:59,526 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:00,527 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:01,529 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:02,530 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:03,531 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:04,532 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:05,533 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:06,534 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:07,535 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:08,537 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:09,538 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:10,539 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:11,540 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:12,541 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:13,542 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:14,543 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:15,545 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:16,546 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:17,547 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:18,548 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:19,549 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:20,550 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:21,551 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:22,553 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:23,554 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:24,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:25,556 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:26,557 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:27,558 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:28,559 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:29,561 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:30,562 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:31,563 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:32,564 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:33,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:34,566 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:35,567 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:36,568 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:37,570 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:38,571 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:39,572 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:40,573 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:41,574 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:42,575 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:43,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:44,578 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:45,579 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:46,580 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:46,581 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:07:52,583 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:53,584 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:54,585 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:55,586 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:56,587 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:57,589 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:58,590 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:07:59,591 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:00,592 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:01,593 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:02,594 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:03,595 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:04,596 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:05,597 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:06,599 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:07,600 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:08,601 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:09,602 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:10,603 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:11,604 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:12,605 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:13,606 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:14,608 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:15,609 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:16,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:17,611 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:18,612 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:19,613 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:20,615 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:21,616 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:22,617 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:23,618 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:24,619 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:25,620 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:26,621 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:27,623 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:28,624 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:29,625 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:30,626 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:31,627 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:32,628 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:33,629 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:34,630 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:35,632 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:36,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:37,634 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:38,635 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:39,636 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:40,637 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:41,638 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:41,640 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:08:47,641 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:48,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:49,644 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:50,645 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:51,646 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:52,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:53,648 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:54,649 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:55,650 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:56,651 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:57,653 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:58,654 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:08:59,655 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:09:00,656 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:09:01,657 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:09:02,658 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:09:03,659 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:09:04,660 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:09:05,661 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:09:06,662 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:09:07,663 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:09:08,665 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:09:09,666 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:09:10,667 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:09:11,668 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:09:12,669 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:09:13,670 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:09:14,671 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:09:15,672 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:09:16,674 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:09:17,675 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:18,676 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:19,677 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:20,678 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:21,679 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:22,680 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:23,682 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:24,683 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:25,684 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:26,685 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:27,686 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:28,687 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:29,688 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:30,689 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:31,690 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:32,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:33,693 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:34,694 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:35,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:36,696 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:36,697 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:09:42,699 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:43,700 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:44,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:45,702 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:46,703 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:47,705 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:48,706 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:49,707 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:50,708 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:51,709 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:52,710 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:53,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:54,712 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:55,714 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:56,715 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:57,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:58,717 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:09:59,718 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:00,719 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:01,720 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:02,722 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:03,723 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:04,724 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:05,725 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:06,726 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:07,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:08,728 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:09,729 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:10,730 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:11,732 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:12,733 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:13,734 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:14,735 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:15,736 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:16,737 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:17,738 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:18,739 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:19,740 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:20,742 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:21,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:22,744 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:23,745 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:24,746 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:25,747 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:26,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:27,749 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:28,750 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:29,752 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:30,753 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:31,754 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:31,755 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:10:37,757 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:38,758 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:39,759 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:40,760 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:41,762 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:42,763 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:43,764 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:44,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:45,766 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:46,767 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:47,768 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:48,770 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:49,771 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:50,772 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:51,773 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:52,774 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:53,775 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:54,776 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:55,777 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:56,779 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:57,780 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:58,781 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:10:59,782 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:00,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:01,784 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:02,785 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:03,787 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:04,788 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:05,789 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:06,790 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:07,791 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:08,792 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:09,793 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:10,794 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:11,796 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:12,797 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:13,798 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:14,799 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:15,800 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:16,801 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:17,803 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:18,804 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:19,805 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:20,806 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:21,808 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:22,809 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:23,810 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:24,811 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:25,812 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:26,814 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:26,815 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:11:32,817 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:33,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:34,820 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:35,821 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:36,822 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:37,111 WARN com.cloudera.cmf.event.publish.EventStorePublisherWithRetry: Failed to publish event: SimpleEvent{attributes={ROLE=[hdfs-DATANODE-b30a464b10a57fdd49ea734cd52a8291], HOSTS=[dmidlkprdls04.svr.luc.edu], ROLE_TYPE=[DATANODE], CATEGORY=[LOG_MESSAGE], EVENTCODE=[EV_LOG_EVENT], SERVICE=[hdfs], SERVICE_TYPE=[HDFS], LOG_LEVEL=[WARN], HOST_IDS=[5c33df90-d247-4c6d-b9e0-5908a423580a], SEVERITY=[IMPORTANT]}, content=Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret, timestamp=1751909970885} - 1 of 60 failure(s) in last 1818s java.io.IOException: Error connecting to dmidlkprdls01.svr.luc.edu/192.168.158.1:7184 at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.getChannel(NettyTransceiver.java:269) at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.(NettyTransceiver.java:197) at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.(NettyTransceiver.java:120) at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.checkSpecificRequestor(AvroEventStorePublishProxy.java:120) at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.publishEvent(AvroEventStorePublishProxy.java:204) at com.cloudera.cmf.event.publish.EventStorePublisherWithRetry$PublishEventTask.run(EventStorePublisherWithRetry.java:242) at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) at java.util.concurrent.FutureTask.run(FutureTask.java:266) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:750) Caused by: 
com.cloudera.cmf.event.shaded.io.netty.channel.AbstractChannel$AnnotatedNoRouteToHostException: No route to host: dmidlkprdls01.svr.luc.edu/192.168.158.1:7184 Caused by: java.net.NoRouteToHostException: No route to host at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716) at com.cloudera.cmf.event.shaded.io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:337) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:339) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:776) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562) at com.cloudera.cmf.event.shaded.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997) at com.cloudera.cmf.event.shaded.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) at java.lang.Thread.run(Thread.java:750) 2025-07-07 16:11:37,823 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:38,824 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:39,825 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:40,827 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:41,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:42,829 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:43,830 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:44,831 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:45,832 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:46,834 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:47,835 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:48,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:49,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:50,838 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:51,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:52,840 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:53,842 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:54,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:55,844 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:56,845 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:57,846 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:58,847 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:11:59,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:00,850 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:01,851 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:02,852 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:03,853 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:04,854 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:05,855 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:06,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:07,858 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:08,859 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:09,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:10,861 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:11,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:12,864 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:13,865 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:14,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:15,867 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:16,868 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:17,869 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:18,870 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:19,871 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:20,873 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:21,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:21,875 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:12:27,877 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:28,878 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:29,879 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:30,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:31,882 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:32,883 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:33,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:34,885 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:35,887 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:36,888 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:37,889 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:38,890 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:39,891 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:40,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:41,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:42,895 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:43,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:44,897 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:45,898 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:46,900 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:47,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:48,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:49,903 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:50,904 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:51,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:52,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:53,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:54,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:55,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:56,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:57,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:58,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:12:59,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:13:00,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:13:01,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:13:02,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:13:03,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:13:04,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:13:05,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:13:06,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:13:07,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:13:08,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:13:09,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:13:10,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:13:11,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:13:12,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:13:13,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:13:14,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:13:15,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:13:16,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:13:16,936 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:13:22,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:13:23,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:13:24,940 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:13:25,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:13:26,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:13:27,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:28,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:29,946 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:30,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:31,948 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:32,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:33,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:34,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:35,953 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:36,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:37,955 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:38,956 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:39,957 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:40,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:41,960 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:42,961 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:43,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:44,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:45,964 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:46,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:47,967 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:48,968 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:49,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:50,971 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:51,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:52,973 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:53,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:54,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:55,977 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:56,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:57,979 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:58,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:13:59,981 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:00,983 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:01,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:02,985 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:03,986 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:04,987 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:05,988 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:06,989 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:07,991 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:08,992 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:09,993 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:10,994 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:11,995 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:11,997 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:14:17,999 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:19,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:20,001 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:21,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:22,003 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:23,005 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:24,006 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:25,007 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:26,008 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:27,010 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:28,011 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:29,012 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:30,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:31,014 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:32,016 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:33,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:34,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:35,019 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:36,021 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:37,022 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:38,023 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:39,024 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:40,025 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:41,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:42,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:43,029 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:44,030 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:45,031 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:46,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:47,033 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:48,034 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:49,036 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:50,037 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:51,038 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:52,039 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:53,041 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:54,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:55,043 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:56,044 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:57,045 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:58,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:14:59,048 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:00,049 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:01,050 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:02,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:03,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:04,053 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:05,055 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:06,056 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:07,057 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:07,058 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:15:13,060 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:14,061 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:15,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:16,064 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:17,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:18,066 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:19,067 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:20,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:21,070 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:22,071 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:23,072 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:24,073 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:25,074 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:26,075 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:27,077 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:28,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:29,079 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:30,080 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:31,082 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:32,083 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:33,084 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:34,085 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:35,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:36,088 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:37,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:38,090 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:39,091 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:40,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:41,094 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:42,095 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:43,096 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:15:44,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:15:45,099 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:15:46,100 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:15:47,101 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:15:48,102 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:15:49,103 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:15:50,104 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:15:51,106 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:15:52,107 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:15:53,108 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:15:54,109 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:15:55,110 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:15:56,111 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:15:57,112 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:15:58,114 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:15:59,115 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:00,116 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:01,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:02,118 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:02,120 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:16:08,121 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:09,123 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:10,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:11,125 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:12,126 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:13,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:14,129 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:15,130 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:16,131 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:17,132 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:18,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:19,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:20,136 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:21,137 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:22,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:23,140 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:24,141 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:25,142 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:26,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:27,145 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:28,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:29,147 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:30,148 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:31,149 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:32,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:33,152 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:34,153 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:35,154 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:36,155 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:37,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:38,158 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:39,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:40,160 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:41,161 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:42,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:43,163 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:44,165 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:45,166 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:46,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:47,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:48,169 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:49,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:50,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:51,173 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:52,174 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:53,175 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:54,177 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:55,178 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:56,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:57,180 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:16:57,181 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:17:03,183 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:04,184 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:05,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:06,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:07,188 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:08,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:09,190 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:10,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:11,192 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:12,193 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:13,195 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:14,196 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:15,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:16,198 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:17,199 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:18,201 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:19,202 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:20,203 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:21,204 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:22,205 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:23,207 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:24,208 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:25,209 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:26,210 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:27,211 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:28,212 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:29,214 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:30,215 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:31,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:32,217 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:33,218 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:34,219 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:35,220 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:36,222 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:37,223 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:38,224 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:39,225 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:40,226 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:41,227 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:42,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:43,229 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:44,230 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:45,232 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:17:46,233 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:17:47,234 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:17:48,235 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:17:49,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:17:50,237 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:17:51,238 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:17:52,239 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:17:52,241 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:17:58,242 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:17:59,243 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:00,244 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:01,245 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:02,247 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:03,248 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:04,249 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:05,250 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:06,251 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:07,252 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:08,253 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:09,255 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:10,256 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:11,257 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:12,258 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:13,259 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:14,260 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:15,261 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:16,262 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:17,264 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:18,265 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:19,266 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:20,267 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:21,269 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:22,270 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:23,271 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:24,272 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:25,273 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:26,274 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:27,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:28,277 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:29,278 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:30,279 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:31,280 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:32,281 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:33,283 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:34,284 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:35,285 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:36,286 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:37,287 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:38,288 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:39,289 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:40,291 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:41,292 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:42,293 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:43,294 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:44,295 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:45,297 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:46,298 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:47,299 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:47,301 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:18:53,302 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:54,303 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:55,305 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:56,306 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:57,307 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:58,308 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:18:59,309 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:00,311 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:01,312 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:02,313 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:03,314 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:04,315 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:05,317 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:06,318 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:07,319 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:08,320 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:09,321 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:10,323 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:11,324 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:12,325 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:13,326 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:14,327 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:15,329 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:16,330 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:17,331 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:18,332 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:19,334 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:20,335 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:21,336 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:22,337 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:23,338 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:24,340 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:25,341 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:26,342 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:27,343 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:28,344 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:29,346 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:30,347 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:31,348 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:32,349 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:33,350 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:34,351 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:35,353 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:36,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:37,355 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:38,356 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:39,357 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:40,358 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:41,360 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:42,361 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:42,362 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:19:48,366 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:49,367 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:50,368 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:51,369 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:52,371 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:53,372 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:54,373 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:55,374 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:56,375 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:57,377 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:58,378 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:19:59,379 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:00,380 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:01,381 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:02,383 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:03,384 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:04,385 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:05,386 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:06,387 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:07,389 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:08,390 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:09,391 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:10,392 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:11,393 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:12,395 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:13,396 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:14,397 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:15,398 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:16,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:17,401 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:18,402 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:19,403 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:20,404 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:21,405 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:22,406 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:23,408 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:24,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:25,410 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:26,411 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:27,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:28,413 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:29,414 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:30,415 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:31,416 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:32,417 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:33,418 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:34,420 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:20:35,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:20:36,422 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:20:37,423 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:20:37,426 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:20:43,428 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:20:44,429 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:20:45,430 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:20:46,431 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:20:47,432 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:20:48,433 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:20:49,434 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:20:50,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:20:51,437 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:20:52,438 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:20:53,439 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:20:54,440 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:20:55,441 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:20:56,442 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:20:57,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:20:58,444 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:20:59,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:00,447 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:01,448 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:02,449 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:03,450 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:04,451 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:05,452 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:06,454 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:07,455 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:08,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:09,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:10,458 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:11,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:12,460 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:13,462 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:14,463 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:15,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:16,465 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:17,466 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:18,467 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:19,468 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:20,469 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:21,471 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:22,472 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:23,473 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:24,474 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:25,475 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:26,476 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:27,477 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:28,479 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:29,480 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:30,481 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:31,482 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:32,483 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:32,484 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:21:38,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:39,487 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:40,488 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:41,490 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:42,491 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:43,492 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:44,493 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:45,494 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:46,495 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:47,497 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:48,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:49,499 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:50,500 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:51,501 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:52,502 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:53,503 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:54,504 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:55,506 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:56,507 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:57,508 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:58,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:21:59,510 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:22:00,511 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:22:01,512 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:22:02,513 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:22:03,514 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:22:04,515 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:22:05,516 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:22:06,517 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:22:07,519 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:22:08,520 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:22:09,521 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:22:10,522 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:22:11,523 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:22:12,524 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:22:13,525 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:22:14,527 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:22:15,528 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:22:16,529 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:22:17,530 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:22:18,531 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:22:19,533 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:22:20,534 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:22:21,535 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:22:22,536 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:22:23,537 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:24,538 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:25,539 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:26,540 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:27,542 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:27,543 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:22:33,544 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:34,546 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:35,547 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:36,548 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:37,549 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:38,550 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:39,551 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:40,552 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:41,554 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:42,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:43,556 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:44,557 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:45,558 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:46,559 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:47,560 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:48,562 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:49,563 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:50,564 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:51,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:52,566 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:53,568 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:54,569 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:55,570 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:56,571 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:57,572 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:58,573 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:22:59,574 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:00,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:01,577 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:02,578 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:03,579 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:04,580 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:05,581 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:06,582 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:07,583 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:08,585 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:09,586 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:10,587 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:11,588 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:12,589 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:13,590 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:14,591 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:15,592 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:16,593 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:17,595 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:18,596 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:19,597 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:20,598 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:21,599 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:22,600 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:22,601 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:23:28,603 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:29,604 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:30,605 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:31,606 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:32,608 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:33,609 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:34,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:35,611 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:36,612 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:37,613 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:38,614 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:39,616 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:40,616 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:41,618 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:42,619 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:43,620 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:44,621 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:45,622 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:46,623 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:47,624 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:48,625 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:49,626 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:50,627 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:51,628 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:52,629 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:53,630 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:54,631 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:55,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:56,634 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:57,635 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:58,636 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:23:59,637 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:00,638 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:01,639 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:02,640 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:03,641 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:04,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:05,643 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:06,645 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:07,646 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:08,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:09,648 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:10,649 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:11,650 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:12,651 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:13,652 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:14,653 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:15,654 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:16,656 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:17,657 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:17,658 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:24:23,660 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:24,661 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:25,662 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:26,663 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:27,664 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:28,665 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:29,666 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:30,667 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:31,668 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:32,669 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:33,670 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:34,671 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:35,672 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:36,673 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:37,674 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:38,675 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:39,676 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:40,677 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:41,679 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:42,680 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:43,681 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:44,682 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:45,683 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:46,684 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:47,685 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:48,686 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:49,687 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:50,688 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:51,689 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:52,690 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:53,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:54,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:55,693 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:56,694 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:57,696 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:58,697 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:24:59,698 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:25:00,699 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:25:01,700 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:25:02,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:25:03,702 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:25:04,703 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:25:05,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:25:06,705 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:25:07,706 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:25:08,707 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:25:09,708 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:25:10,709 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:25:11,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:25:12,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:12,713 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:25:18,715 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:19,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:20,717 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:21,718 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:22,719 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:23,720 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:24,721 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:25,722 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:26,723 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:27,724 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:28,725 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:29,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:30,728 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:31,729 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:32,730 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:33,731 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:34,732 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:35,733 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:36,734 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:37,735 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:38,736 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:39,737 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:40,739 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:41,739 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:42,741 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:43,742 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:44,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:45,744 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:46,745 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:47,746 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:48,747 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:49,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:50,750 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:51,751 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:52,752 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:53,753 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:54,754 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:55,755 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:56,756 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:57,757 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:58,758 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:25:59,759 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:00,761 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:01,762 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:02,763 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:03,764 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:04,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:05,766 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:06,767 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:07,768 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:07,770 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:26:13,771 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:14,772 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:15,773 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:16,775 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:17,776 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:18,777 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:19,778 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:20,779 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:21,780 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:22,781 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:23,782 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:24,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:25,784 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:26,786 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:27,787 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:28,788 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:29,789 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:30,790 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:31,791 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:32,792 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:33,793 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:34,794 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:35,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:36,796 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:37,797 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:38,799 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:39,800 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:40,801 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:41,802 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:42,803 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:43,804 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:44,805 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:45,806 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:46,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:47,808 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:48,809 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:49,810 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:50,812 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:51,813 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:52,814 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:53,815 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:54,816 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:55,817 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:56,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:57,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:58,820 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:26:59,821 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:27:00,822 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:01,823 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:02,825 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:02,826 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:27:08,827 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:09,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:10,829 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:11,830 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:12,832 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:13,833 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:14,834 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:15,835 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:16,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:17,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:18,838 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:19,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:20,840 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:21,841 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:22,842 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:23,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:24,845 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:25,846 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:26,847 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:27,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:28,849 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:29,850 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:30,851 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:31,852 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:32,853 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:33,854 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:34,855 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:35,857 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:36,858 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:37,859 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:38,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:39,861 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:40,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:41,863 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:42,864 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:43,865 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:44,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:45,867 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:46,869 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:47,870 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:48,871 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:49,872 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:50,873 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:51,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:52,875 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:53,876 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:54,877 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:55,878 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:56,880 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:57,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:27:57,882 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:28:03,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:04,885 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:05,886 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:06,887 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:07,887 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:08,889 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:09,890 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:10,891 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:11,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:12,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:13,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:14,895 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:15,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:16,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:17,898 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:18,899 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:19,900 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:20,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:21,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:22,903 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:23,904 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:24,905 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:25,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:26,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:27,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:28,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:29,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:30,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:31,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:32,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:33,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:34,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:35,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:36,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:37,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:38,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:39,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:40,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:41,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:42,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:43,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:44,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:45,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:46,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:47,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:48,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:49,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:50,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:51,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:52,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:52,936 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:28:58,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:28:59,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:00,940 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:01,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:02,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:03,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:04,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:05,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:06,946 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:07,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:08,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:09,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:10,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:11,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:12,953 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:13,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:14,955 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:15,956 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:16,957 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:17,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:18,960 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:19,961 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:20,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:21,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:22,964 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:23,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:24,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:25,967 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:26,968 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:27,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:28,971 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:29,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:30,973 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:31,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:32,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:33,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:34,977 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:35,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:36,979 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:37,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:38,981 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:39,982 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:40,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:41,985 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:42,986 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:43,987 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:44,988 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:45,989 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:46,990 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:47,991 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:47,993 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:29:53,995 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:29:54,996 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:29:55,997 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:29:56,998 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:29:57,999 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:29:59,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:00,001 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:01,003 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:02,004 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:03,005 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:04,006 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:05,007 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:06,008 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:07,009 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:08,011 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:09,012 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:10,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:11,014 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:12,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:13,016 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:14,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:15,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:16,020 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:17,021 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:18,022 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:19,023 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:20,024 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:21,025 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:22,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:23,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:24,029 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:25,030 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:26,031 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:27,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:28,033 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:29,034 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:30,035 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:31,036 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:32,038 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:33,039 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:34,040 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:35,041 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:36,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:37,043 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:38,044 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:39,045 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:40,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:41,048 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:42,049 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:43,050 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:43,051 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:30:49,053 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:50,054 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:51,055 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:52,056 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:53,057 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:54,059 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:55,060 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:56,061 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:57,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:58,063 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:30:59,064 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:00,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:01,067 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:02,068 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:03,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:04,070 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:05,071 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:06,072 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:07,073 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:08,074 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:09,075 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:10,077 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:11,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:12,079 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:13,080 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:14,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:15,082 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:16,083 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:17,084 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:18,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:19,087 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:20,088 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:21,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:22,090 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:23,092 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:24,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:25,094 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:26,095 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:27,096 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:28,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:29,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:30,099 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:31,100 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:32,101 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:33,103 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:34,104 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:35,105 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:36,106 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:37,107 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:31:38,108 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:31:38,109 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:31:44,111 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:31:45,112 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:31:46,113 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:31:47,115 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:31:48,116 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:31:49,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:31:50,118 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:31:51,119 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:31:52,120 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:31:53,121 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:31:54,123 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:31:55,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:31:56,125 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:31:57,126 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:31:58,127 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:31:59,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:00,129 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:01,130 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:02,131 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:03,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:04,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:05,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:06,136 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:07,137 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:08,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:09,139 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:10,141 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:11,142 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:12,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:13,144 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:14,145 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:15,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:16,147 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:17,148 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:18,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:19,151 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:20,152 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:21,153 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:22,154 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:23,155 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:24,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:25,158 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:26,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:27,160 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:28,161 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:29,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:30,163 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:31,165 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:32,165 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:33,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:33,168 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:32:39,170 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:40,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:41,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:42,173 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:43,174 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:44,175 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:45,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:46,178 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:47,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:48,180 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:49,181 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:50,182 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:51,183 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:52,184 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:53,186 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:54,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:55,188 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:56,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:57,190 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:58,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:32:59,193 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:00,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:01,195 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:02,196 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:03,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:04,198 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:05,199 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:06,200 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:07,201 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:08,202 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:09,204 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:10,205 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:11,206 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:12,207 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:13,208 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:14,209 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:15,210 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:16,211 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:17,213 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:18,214 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:19,215 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:20,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:21,217 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:22,218 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:23,219 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:24,220 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:25,221 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:26,223 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:27,224 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:28,225 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:28,226 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:33:34,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:35,229 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:36,230 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:37,231 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:38,232 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:39,233 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:40,234 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:41,235 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:42,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:43,237 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:44,238 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:45,240 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:46,241 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:47,242 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:48,243 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:49,244 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:50,245 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:51,246 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:52,247 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:53,249 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:54,250 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:55,251 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:56,252 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:57,253 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:58,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:33:59,255 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:00,257 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:01,258 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:02,259 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:03,260 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:04,261 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:05,262 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:06,263 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:07,265 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:08,266 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:09,267 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:10,268 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:11,269 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:12,270 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:13,271 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:14,272 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:15,274 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:16,275 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:17,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:18,277 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:19,278 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:20,279 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:21,281 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:22,282 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:23,283 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:23,284 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:34:29,286 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:30,287 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:31,288 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:34:32,289 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:34:33,290 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:34:34,292 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:34:35,293 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:34:36,294 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:34:37,295 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:34:38,296 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:34:39,297 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:34:40,298 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:34:41,300 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:34:42,301 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:34:43,302 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:34:44,303 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:34:45,304 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:34:46,305 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:34:47,306 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:34:48,307 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:34:49,308 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:34:50,310 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:34:51,311 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:34:52,312 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:34:53,313 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:34:54,314 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:34:55,315 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:34:56,316 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:34:57,318 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:34:58,319 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:34:59,320 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:00,321 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:01,322 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:02,323 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:03,324 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:04,326 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:05,327 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:06,328 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:07,329 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:08,330 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:09,331 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:10,332 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:11,334 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:12,335 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:13,336 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:14,337 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:15,338 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:16,339 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:17,340 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:18,341 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:18,343 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:35:24,345 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:25,346 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:26,347 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:27,348 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:28,350 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:29,351 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:30,352 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:31,353 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:32,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:33,355 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:34,357 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:35,358 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:36,359 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:37,360 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:38,361 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:39,362 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:40,363 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:41,365 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:42,366 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:43,367 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:44,368 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:45,369 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:46,371 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:47,372 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:48,373 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:49,374 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:50,375 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:51,376 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:52,378 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:53,379 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:54,380 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:55,381 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:56,382 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:57,383 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:58,384 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:35:59,386 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:36:00,387 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:36:01,388 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:36:02,389 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:36:03,390 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:36:04,391 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:36:05,392 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:36:06,394 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:36:07,395 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:36:08,396 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:36:09,397 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:36:10,398 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:36:11,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:36:12,400 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:36:13,402 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:36:13,403 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:36:19,405 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:36:20,406 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:21,407 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:22,408 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:23,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:24,410 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:25,411 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:26,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:27,414 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:28,415 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:29,416 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:30,417 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:31,418 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:32,419 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:33,420 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:34,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:35,423 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:36,424 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:37,425 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:38,426 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:39,427 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:40,428 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:41,429 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:42,431 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:43,432 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:44,433 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:45,434 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:46,435 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:47,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:48,437 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:49,439 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:50,440 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:51,441 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:52,442 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:53,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:54,444 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:55,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:56,447 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:57,448 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:58,449 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:36:59,450 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:00,451 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:01,452 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:02,453 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:03,454 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:04,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:05,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:06,458 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:07,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:08,460 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:08,462 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:37:14,463 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:15,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:16,465 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:17,466 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:18,467 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:19,469 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:20,470 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:21,471 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:22,472 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:23,473 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:24,474 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:25,475 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:26,476 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:27,478 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:28,479 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:29,480 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:30,481 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:31,482 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:32,483 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:33,484 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:34,485 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:35,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:36,488 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:37,489 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:38,490 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:39,491 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:40,492 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:41,493 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:42,494 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:43,495 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:44,497 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:45,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:46,499 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:47,500 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:48,501 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:49,502 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:50,503 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:51,505 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:52,506 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:53,507 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:54,508 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:55,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:56,510 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:57,511 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:58,513 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:37:59,514 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:00,515 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:01,516 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:02,517 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:03,518 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:03,519 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:38:09,523 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:10,524 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:11,525 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:12,526 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:13,527 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:14,528 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:15,529 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:16,530 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:17,531 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:18,532 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:19,534 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:20,535 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:21,536 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:22,537 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:23,538 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:24,539 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:25,540 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:26,542 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:27,543 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:28,544 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:29,545 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:30,546 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:31,547 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:32,548 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:33,549 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:34,551 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:35,552 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:36,553 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:37,554 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:38,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:39,556 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:40,557 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:41,558 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:42,560 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:43,561 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:44,562 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:45,563 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:46,564 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:47,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:48,566 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:49,568 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:50,569 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:51,570 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:52,571 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:53,572 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:54,573 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:55,574 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:56,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:57,577 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:58,578 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:38:58,581 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:39:04,582 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:39:05,583 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:39:06,584 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:39:07,585 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:39:08,587 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:39:09,588 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:10,589 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:11,590 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:12,591 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:13,592 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:14,593 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:15,594 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:16,595 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:17,596 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:18,598 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:19,599 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:20,600 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:21,601 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:22,602 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:23,603 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:24,604 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:25,605 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:26,606 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:27,608 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:28,609 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:29,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:30,611 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:31,612 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:32,613 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:33,614 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:34,615 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:35,616 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:36,618 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:37,619 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:38,620 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:39,621 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:40,622 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:41,623 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:42,624 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:43,625 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:44,626 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:45,628 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:46,629 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:47,630 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:48,631 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:49,632 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:50,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:51,635 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:52,636 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:53,637 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:39:53,638 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:39:59,639 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:00,641 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:01,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:02,643 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:03,644 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:04,645 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:05,646 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:06,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:07,648 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:08,650 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:09,651 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:10,652 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:11,653 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:12,654 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:13,655 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:14,656 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:15,657 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:16,659 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:17,660 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:18,661 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:19,662 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:20,664 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:21,665 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:22,666 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:23,667 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:24,668 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:25,670 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:26,671 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:27,672 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:28,673 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:29,674 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:30,675 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:31,676 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:32,678 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:33,679 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:34,680 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:35,681 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:36,682 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:37,683 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:38,684 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:39,685 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:40,686 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:41,688 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:42,689 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:43,690 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:44,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:45,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:46,693 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:47,694 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:48,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:48,697 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:40:54,698 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:55,700 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:56,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:40:57,702 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:40:58,703 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:40:59,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:00,705 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:01,706 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:02,708 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:03,709 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:04,710 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:05,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:06,712 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:07,713 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:08,714 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:09,715 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:10,717 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:11,718 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:12,719 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:13,720 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:14,721 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:15,722 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:16,723 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:17,725 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:18,726 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:19,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:20,728 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:21,729 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:22,730 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:23,732 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:24,733 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:25,734 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:26,735 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:27,736 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:28,737 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:29,739 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:30,740 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:31,741 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:32,742 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:33,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:34,744 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:35,745 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:36,747 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:37,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:38,749 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:39,750 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:40,751 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:41,752 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:42,754 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:43,755 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:43,756 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:41:49,758 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:50,759 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:51,760 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:52,762 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:53,763 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:53,970 WARN com.cloudera.cmf.event.publish.EventStorePublisherWithRetry: Failed to publish event: SimpleEvent{attributes={ROLE=[hdfs-DATANODE-b30a464b10a57fdd49ea734cd52a8291], HOSTS=[dmidlkprdls04.svr.luc.edu], ROLE_TYPE=[DATANODE], CATEGORY=[LOG_MESSAGE], EVENTCODE=[EV_LOG_EVENT], SERVICE=[hdfs], SERVICE_TYPE=[HDFS], LOG_LEVEL=[WARN], HOST_IDS=[5c33df90-d247-4c6d-b9e0-5908a423580a], SEVERITY=[IMPORTANT]}, content=Unable to initialize FileSignerSecretProvider, falling back to use random secrets.
Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret, timestamp=1751909970885} - 1 of 60 failure(s) in last 1816s
java.io.IOException: Error connecting to dmidlkprdls01.svr.luc.edu/192.168.158.1:7184
	at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.getChannel(NettyTransceiver.java:269)
	at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.<init>(NettyTransceiver.java:197)
	at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.<init>(NettyTransceiver.java:120)
	at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.checkSpecificRequestor(AvroEventStorePublishProxy.java:120)
	at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.publishEvent(AvroEventStorePublishProxy.java:204)
	at com.cloudera.cmf.event.publish.EventStorePublisherWithRetry$PublishEventTask.run(EventStorePublisherWithRetry.java:242)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)
Caused by: com.cloudera.cmf.event.shaded.io.netty.channel.AbstractChannel$AnnotatedNoRouteToHostException: No route to host: dmidlkprdls01.svr.luc.edu/192.168.158.1:7184
Caused by: java.net.NoRouteToHostException: No route to host
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
	at com.cloudera.cmf.event.shaded.io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:337)
	at com.cloudera.cmf.event.shaded.io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:339)
	at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:776)
	at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
	at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
	at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
	at com.cloudera.cmf.event.shaded.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
	at com.cloudera.cmf.event.shaded.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 16:41:54,764 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:55,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:56,766 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:57,767 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:58,768 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:41:59,769 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:00,771 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:01,772 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:02,773 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:03,774 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:04,775 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:05,776 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:06,777 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:07,778 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:08,780 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:09,781 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:10,782 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:11,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:12,784 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:13,785 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:14,786 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:15,788 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:16,789 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:17,790 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:18,791 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:19,792 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:20,793 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:21,794 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:22,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:23,797 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:24,798 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:25,799 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:26,800 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:27,801 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:28,802 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:29,803 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:30,804 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:31,805 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:32,806 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:33,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:34,808 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:35,810 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:36,811 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:37,812 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:38,813 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:38,814 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:42:44,816 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:45,817 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:46,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:47,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:48,820 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:49,821 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:50,822 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:51,824 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:52,825 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:53,826 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:54,827 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:55,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:56,829 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:57,830 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:58,831 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:42:59,832 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:00,833 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:01,834 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:02,835 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:03,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:04,838 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:05,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:06,840 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:07,841 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:08,842 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:09,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:10,844 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:11,845 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:12,846 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:13,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:14,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:15,849 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:16,850 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:17,851 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:18,852 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:19,854 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:20,855 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:21,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:22,857 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:23,858 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:24,859 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:25,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:26,861 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:27,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:43:28,863 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:43:29,864 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:43:30,865 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:43:31,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:43:32,867 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:43:33,868 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:43:33,869 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:43:39,871 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:43:40,872 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:43:41,873 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:43:42,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:43:43,875 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:43:44,876 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:43:45,877 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:43:46,878 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:43:47,879 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:43:48,880 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:43:49,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:43:50,882 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:43:51,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:43:52,885 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:43:53,886 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:43:54,887 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:43:55,888 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:43:56,889 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:43:57,890 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:43:58,891 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:43:59,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:00,893 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:01,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:02,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:03,895 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:04,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:05,898 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:06,899 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:07,900 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:08,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:09,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:10,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:11,903 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:12,904 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:13,905 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:14,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:15,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:16,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:17,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:18,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:19,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:20,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:21,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:22,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:23,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:24,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:25,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:26,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:27,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:28,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:28,921 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:44:34,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:35,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:36,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:37,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:38,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:39,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:40,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:41,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:42,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:43,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:44,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:45,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:46,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:47,936 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:48,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:49,938 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:50,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:51,940 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:52,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:53,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:54,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:55,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:56,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:57,946 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:58,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:44:59,948 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:00,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:01,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:02,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:03,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:04,953 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:05,955 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:06,955 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:07,956 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:08,957 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:09,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:10,960 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:11,961 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:12,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:13,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:14,964 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:15,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:16,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:17,967 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:18,968 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:19,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:20,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:21,971 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:22,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:23,973 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:23,975 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:45:29,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:30,977 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:31,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:32,979 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:33,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:34,981 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:35,982 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:36,983 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:37,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:38,985 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:39,986 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:40,987 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:41,988 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:42,989 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:43,990 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:44,991 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:45,992 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:46,993 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:47,994 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:48,995 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:49,997 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:50,998 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:51,999 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:53,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:54,001 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:55,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:56,003 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:57,004 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:58,005 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:45:59,006 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:00,007 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:01,008 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:02,009 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:03,010 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:04,011 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:05,012 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:06,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:07,014 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:08,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:09,016 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:10,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:11,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:12,019 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:13,020 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:14,021 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:15,022 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:16,024 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:17,025 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:18,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:19,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:19,028 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:46:25,029 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:26,030 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:27,031 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:28,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:29,034 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:30,035 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:31,036 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:32,037 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:33,038 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:34,039 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:35,040 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:36,041 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:37,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:38,043 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:39,044 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:40,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:41,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:42,048 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:43,049 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:44,050 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:45,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:46,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:47,053 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:48,054 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:49,055 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:50,057 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:51,058 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:52,059 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:53,060 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:54,061 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:55,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:56,063 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:57,064 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:58,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:46:59,066 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:00,067 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:01,068 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:02,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:03,070 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:04,071 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:05,072 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:06,073 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:07,074 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:08,075 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:09,076 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:10,077 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:11,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:12,080 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:13,080 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:14,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:14,083 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:47:20,085 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:21,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:22,087 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:23,088 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:24,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:25,090 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:26,091 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:27,091 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:28,092 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:29,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:30,094 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:31,095 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:32,096 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:33,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:34,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:35,099 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:36,100 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:37,102 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:47:38,103 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:47:39,104 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
[... 29 identical INFO retry entries, attempts 20-48, logged once per second from 16:47:40,105 to 16:48:08,134 ...]
2025-07-07 16:48:09,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:48:09,137 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:48:15,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
[... 48 identical INFO retry entries, attempts 1-48, logged once per second from 16:48:16,139 to 16:49:03,190 ...]
2025-07-07 16:49:04,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:49:04,192 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:49:10,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
[... 48 identical INFO retry entries, attempts 1-48, logged once per second from 16:49:11,195 to 16:49:58,247 ...]
2025-07-07 16:49:59,248 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:49:59,249 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:50:05,251 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
[... 21 identical INFO retry entries, attempts 1-21, logged once per second from 16:50:06,252 to 16:50:26,274 ...]
2025-07-07 16:50:27,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:28,277 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:29,278 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:30,279 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:31,280 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:32,281 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:33,282 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:34,283 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:35,285 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:36,286 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:37,287 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:38,288 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:39,289 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:40,290 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:41,291 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:42,292 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:43,294 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:44,295 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:45,296 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:46,297 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:47,298 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:48,299 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:49,300 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:50,302 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:51,303 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:52,304 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:53,305 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:54,306 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:50:54,307 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:51:00,309 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:01,310 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:02,311 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:03,313 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:04,314 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:05,315 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:06,316 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:07,317 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:08,318 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:09,319 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:10,320 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:11,321 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:12,323 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:13,324 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:14,325 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:15,326 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:16,327 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:17,328 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:18,330 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:19,331 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:20,332 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:21,333 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:22,334 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:23,335 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:24,337 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:25,338 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:26,339 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:27,340 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:28,341 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:29,342 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:30,343 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:31,345 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:32,346 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:33,347 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:34,348 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:35,349 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:36,350 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:37,351 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:38,352 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:39,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:40,355 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:41,356 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:42,357 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:43,358 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:44,359 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:45,360 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:46,361 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:47,362 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:48,364 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:49,365 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:49,366 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:51:55,368 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:56,369 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:57,370 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:58,371 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:51:59,372 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:52:00,373 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:52:01,374 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:52:02,375 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:52:03,376 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:52:04,377 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:52:05,378 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:52:06,380 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:52:07,381 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:52:08,382 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:52:09,383 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:52:10,384 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:52:11,385 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:52:12,386 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:52:13,387 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:52:14,388 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:52:15,389 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:16,391 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:17,392 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:18,393 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:19,394 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:20,395 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:21,396 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:22,398 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:23,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:24,400 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:25,401 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:26,402 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:27,403 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:28,404 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:29,406 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:30,407 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:31,408 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:32,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:33,410 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:34,411 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:35,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:36,414 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:37,415 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:38,416 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:39,417 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:40,418 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:41,419 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:42,420 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:43,422 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:44,423 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:44,424 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:52:50,426 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:51,427 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:52,428 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:53,429 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:54,430 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:55,431 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:56,432 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:57,433 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:58,434 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:52:59,435 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:00,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:01,437 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:02,439 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:03,440 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:04,441 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:05,442 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:06,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:07,444 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:08,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:09,446 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:10,447 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:11,449 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:12,450 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:13,451 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:14,452 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:15,453 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:16,454 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:17,455 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:18,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:19,458 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:20,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:21,460 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:22,461 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:23,462 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:24,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:25,465 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:26,466 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:27,467 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:28,468 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:29,469 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:30,470 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:31,471 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:32,472 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:33,474 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:34,475 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:35,476 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:36,477 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:37,478 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:38,479 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:39,480 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:39,481 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:53:45,483 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:46,484 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:47,485 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:48,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:49,487 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:50,488 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:51,489 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:52,490 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:53,491 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:54,492 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:55,494 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:56,495 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:57,496 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:58,497 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:53:59,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:00,500 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:01,501 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:02,502 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:03,503 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:04,504 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:05,505 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:06,506 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:07,507 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:08,508 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:09,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:10,511 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:11,512 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:12,513 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:13,514 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:14,515 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:15,516 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:16,517 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:17,518 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:18,519 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:19,520 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:20,521 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:21,522 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:22,523 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:23,525 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:24,526 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:25,527 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:26,528 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:27,529 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:28,530 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:29,531 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:30,532 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:31,533 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:32,535 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:33,536 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:34,537 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:34,538 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:54:40,540 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:41,541 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:42,542 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:43,543 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:44,544 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:45,545 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:46,547 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:47,548 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:48,549 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:49,550 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:50,551 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:51,552 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:52,554 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:53,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:54,556 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:55,557 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:56,558 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:57,560 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:58,561 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:54:59,562 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:55:00,563 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:55:01,564 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:55:02,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:55:03,567 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:55:04,568 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:05,569 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:06,570 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:07,571 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:08,572 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:09,573 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:10,575 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:11,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:12,577 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:13,578 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:14,579 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:15,580 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:16,581 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:17,582 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:18,583 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:19,585 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:20,586 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:21,587 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:22,588 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:23,589 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:24,591 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:25,592 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:26,593 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:27,594 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:28,595 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:29,597 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:29,598 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:55:35,599 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:36,601 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:37,602 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:38,603 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:39,604 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:40,605 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:41,607 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:42,608 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:43,609 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:44,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:45,611 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:46,612 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:47,613 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:48,614 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:49,616 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:50,617 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:51,618 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:52,619 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:53,620 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:54,622 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:55,623 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:56,624 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:57,625 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:58,626 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:55:59,627 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:00,629 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:01,630 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:02,631 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:03,632 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:04,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:05,634 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:06,635 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:07,636 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:08,638 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:09,639 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:10,640 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:11,641 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:12,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:13,643 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:14,645 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:15,646 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:16,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:17,648 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:18,649 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:19,650 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:20,652 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:21,653 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:22,654 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:23,655 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:24,656 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:24,657 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 16:56:30,660 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:31,662 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:32,663 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:33,664 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:34,665 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:35,666 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:36,667 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:37,668 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:38,669 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:39,670 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:40,672 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:41,673 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:42,674 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:43,675 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:44,676 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:45,677 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:46,678 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:47,679 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:48,680 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:49,681 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:50,682 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:51,684 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:56:52,685 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:56:53,686 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:56:54,687 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:56:55,688 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:56:56,689 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:56:57,690 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:56:58,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:56:59,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:00,694 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:01,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:02,696 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:03,697 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:04,698 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:05,699 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:06,700 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:07,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:08,702 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:09,703 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:10,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:11,705 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:12,707 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:13,708 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:14,709 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:15,710 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:16,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:17,712 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:18,713 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:19,715 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:19,717 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:57:25,719 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:26,720 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:27,721 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:28,722 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:29,723 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:30,724 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:31,725 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:32,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:33,728 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:34,729 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:35,730 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:36,731 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:37,732 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:38,733 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:39,734 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:40,735 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:41,737 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:42,738 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:43,739 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:44,740 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:45,741 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:46,742 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:47,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:48,744 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:49,745 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:50,747 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:51,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:52,749 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:53,750 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:54,751 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:55,752 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:56,753 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:57,755 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:58,756 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:57:59,757 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:00,758 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:01,759 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:02,760 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:03,761 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:04,762 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:05,763 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:06,764 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:07,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:08,767 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:09,768 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:10,769 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:11,770 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:12,771 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:13,772 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:14,773 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:14,775 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:58:20,777 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:21,778 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:22,779 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:23,780 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:24,781 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:25,782 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:26,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:27,784 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:28,785 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:29,786 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:30,788 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:31,789 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:32,790 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:33,791 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:34,792 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:35,793 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:36,794 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:37,796 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:38,797 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:39,798 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:40,799 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:41,800 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:42,801 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:43,802 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:44,803 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:45,805 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:46,806 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:47,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:48,808 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:49,809 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:50,810 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:51,812 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:52,813 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:53,814 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:54,815 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:55,816 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:56,817 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:57,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:58,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:58:59,820 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:00,822 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:01,823 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:02,824 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:03,824 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:04,825 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:05,827 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:06,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:07,829 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:08,830 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:09,831 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:09,832 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 16:59:15,834 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:16,835 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:17,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:18,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:19,838 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:20,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:21,840 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:22,841 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:23,842 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:24,844 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:25,845 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:26,846 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:27,847 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:28,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:29,849 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:30,850 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:31,851 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:32,852 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:33,853 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:34,854 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:35,855 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:36,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:37,858 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:38,859 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:39,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:40,861 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 16:59:41,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:59:42,863 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:59:43,864 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:59:44,865 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:59:45,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:59:46,867 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:59:47,868 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:59:48,869 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:59:49,870 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:59:50,871 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:59:51,872 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:59:52,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:59:53,875 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:59:54,876 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:59:55,877 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:59:56,878 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:59:57,879 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:59:58,880 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 16:59:59,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:00,882 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:01,883 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:02,885 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:03,886 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:04,887 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:04,888 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 17:00:10,889 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:11,890 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:12,891 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:13,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:14,893 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:15,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:16,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:17,897 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:18,898 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:19,899 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:20,900 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:21,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:22,903 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:23,904 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:24,905 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:25,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:26,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:27,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:28,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:29,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:30,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:31,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:32,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:33,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:34,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:35,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:36,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:37,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:38,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:39,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:40,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:41,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:42,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:43,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:44,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:45,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:46,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:47,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:48,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:49,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:50,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:51,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:52,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:53,936 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:54,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:55,938 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:56,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:57,940 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:58,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:59,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:00:59,944 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 17:01:05,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:06,946 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:07,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:08,953 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:09,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:10,955 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:11,956 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:12,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:13,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:14,960 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:15,961 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:16,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:17,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:18,964 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:19,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:20,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:21,967 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:22,968 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:23,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:24,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:25,971 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:26,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:27,973 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:28,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:29,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:30,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:31,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:32,979 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:33,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:34,981 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:35,982 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:36,983 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:37,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:38,985 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:39,986 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:40,987 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:41,988 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:42,989 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:43,990 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:44,991 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:45,992 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:46,993 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:47,994 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:48,995 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:49,996 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:50,998 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:51,999 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:53,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:54,001 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:55,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:01:55,003 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 17:02:01,004 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:02,005 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:03,006 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:04,007 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:05,009 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:06,010 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:07,011 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:08,012 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:09,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:10,014 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:11,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:12,016 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:13,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:14,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:15,019 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:16,020 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:17,021 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:18,022 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:19,023 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:20,024 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:21,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:22,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:23,028 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:24,029 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:25,030 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:26,031 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:27,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:28,033 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:29,034 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:30,035 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:31,036 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:32,037 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:33,038 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:34,039 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:35,041 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:36,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:37,043 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:38,044 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:39,045 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:40,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:41,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:42,048 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:43,049 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:44,050 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:45,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:46,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:47,053 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:48,054 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:49,055 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:50,056 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:50,058 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 17:02:56,059 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:57,060 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:58,061 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:02:59,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:00,063 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:01,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:02,066 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:03,067 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:04,068 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:05,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:06,070 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:07,071 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:08,072 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:09,073 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:10,074 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:11,075 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:12,076 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:13,077 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:14,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:15,080 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:16,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:17,082 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:18,083 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:19,084 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:20,085 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:21,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:22,087 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:23,088 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:24,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:25,090 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:26,092 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:27,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:28,094 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:29,095 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:30,096 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:31,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:32,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:33,099 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:34,100 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:35,101 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:36,102 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:37,103 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:38,104 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:39,106 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:40,107 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:41,108 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:42,109 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:43,110 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:44,111 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:45,112 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:03:45,113 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 17:03:51,115 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:03:52,116 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:03:53,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:03:54,118 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:03:55,119 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:03:56,120 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:03:57,121 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:03:58,122 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:03:59,123 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:00,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:01,126 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:02,127 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:03,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:04,129 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:05,130 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:06,131 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:07,132 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:08,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:09,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:10,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:11,136 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:12,137 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:13,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:14,140 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:15,141 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:16,142 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:17,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:18,144 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:19,145 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:20,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:21,147 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:22,148 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:23,149 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:24,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:25,151 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:26,153 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:27,154 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:28,155 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:29,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:30,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:31,158 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:32,160 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:33,161 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:34,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:35,163 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:36,164 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:37,165 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:38,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:39,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:40,169 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:40,170 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 17:04:46,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:47,173 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:48,174 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:49,175 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:50,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:51,178 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:52,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:53,180 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:54,181 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:55,182 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:56,183 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:57,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:58,186 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:04:59,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:00,188 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:01,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:02,190 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:03,192 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:04,193 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:05,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:06,195 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:07,196 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:08,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:09,198 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:10,200 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:11,201 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:12,202 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:13,203 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:14,204 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:15,205 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:16,206 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:17,208 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:18,209 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:19,210 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:20,211 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:21,212 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:22,214 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:23,215 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:24,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:25,217 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:26,218 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:27,219 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:28,220 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:29,222 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:30,223 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:31,224 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:32,225 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:33,226 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:34,227 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:35,229 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:35,230 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 17:05:41,231 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:42,233 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:43,234 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:44,235 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:45,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:46,237 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:47,238 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:48,239 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:49,240 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:50,242 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:51,243 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:52,244 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:53,245 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:54,246 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:55,247 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:56,248 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:57,249 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:58,251 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:05:59,252 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:00,253 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:01,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:02,255 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:03,256 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:04,257 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:05,258 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:06,260 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:07,261 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:08,262 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:09,263 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:10,264 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:11,266 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:12,267 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:13,268 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:14,269 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:15,270 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:16,271 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:17,273 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:18,274 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:19,275 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:20,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:21,277 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:22,279 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:23,280 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:24,281 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:25,282 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:26,283 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:27,285 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:28,286 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:29,287 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:30,288 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:30,290 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 17:06:36,291 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:37,292 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:38,294 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:39,295 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:06:40,296 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:06:41,297 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:06:42,298 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:06:43,299 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:06:44,301 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:06:45,302 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:06:46,303 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:06:47,304 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:06:48,305 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:06:49,307 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:06:50,308 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:06:51,309 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:06:52,310 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:06:53,311 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:06:54,313 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:06:55,314 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:06:56,315 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:06:57,316 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:06:58,317 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:06:59,318 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:00,320 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:01,321 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:02,322 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:03,323 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:04,324 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:05,326 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:06,327 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:07,328 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:08,329 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:09,330 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:10,332 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:11,333 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:12,334 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:13,335 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:14,337 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:15,338 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:16,339 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:17,340 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:18,341 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:19,343 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:20,344 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:21,345 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:22,346 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:23,348 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:24,349 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:25,350 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:25,352 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 17:07:31,353 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:32,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:33,356 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:34,357 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:35,358 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:36,359 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:37,360 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:38,362 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:39,363 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:40,364 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:41,365 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:42,366 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:43,367 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:44,369 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:45,370 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:46,371 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:47,372 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:48,373 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:49,375 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:50,376 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:51,377 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:52,378 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:53,379 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:54,381 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:55,382 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:56,383 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:57,384 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:58,386 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:07:59,387 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:08:00,388 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:08:01,389 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:08:02,390 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:08:03,392 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:08:04,393 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:08:05,394 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:08:06,395 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:08:07,397 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:08:08,398 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:08:09,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:08:10,400 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:08:11,401 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:08:12,403 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:08:13,404 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:08:14,405 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:08:15,406 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:08:16,407 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:08:17,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:08:18,410 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:08:19,411 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:08:20,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:08:20,414 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 17:08:26,415 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:08:27,417 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:08:28,418 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:29,419 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:30,420 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:31,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:32,423 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:33,424 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:34,425 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:35,426 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:36,427 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:37,429 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:38,430 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:39,431 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:40,432 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:41,433 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:42,435 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:43,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:44,437 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:45,438 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:46,439 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:47,441 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:48,442 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:49,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:50,444 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:51,446 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:52,447 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:53,448 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:54,449 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:55,450 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:56,451 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:57,453 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:58,454 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:08:59,455 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:00,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:01,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:02,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:03,460 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:04,461 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:05,462 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:06,463 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:07,465 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:08,466 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:09,467 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:10,468 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:11,469 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:12,471 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:13,472 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:14,473 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:15,474 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:15,475 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 17:09:21,477 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:22,478 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:23,480 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:24,481 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:25,482 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:26,483 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:27,484 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:28,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:29,487 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:30,488 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:31,489 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:32,490 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:33,492 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:34,493 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:35,494 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:36,495 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:37,496 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:38,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:39,499 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:40,500 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:41,501 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:42,503 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:43,504 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:44,505 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:45,506 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:46,507 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:47,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:48,510 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:49,511 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:50,512 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:51,514 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:52,515 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:53,516 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:54,517 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:55,518 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:56,519 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:57,521 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:58,522 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:09:59,523 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:00,524 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:01,526 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:02,527 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:03,528 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:04,529 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:05,530 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:06,532 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:07,533 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:08,534 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:09,535 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:10,537 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:10,538 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 17:10:16,540 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:17,541 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:18,542 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:19,543 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:20,544 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:21,546 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:22,547 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:23,548 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:24,549 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:25,551 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:26,552 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:27,553 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:28,554 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:29,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:30,557 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:31,558 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:32,559 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:33,560 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:34,562 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:35,563 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:36,564 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:37,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:38,566 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:39,568 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:40,569 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:41,570 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:42,571 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:43,573 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:44,574 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:45,575 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:46,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:47,577 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:48,579 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:49,580 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:50,581 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:51,582 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:52,584 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:53,585 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:54,586 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:55,587 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:56,588 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:57,589 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:58,590 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:10:59,592 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:11:00,593 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:11:01,594 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:11:02,595 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:11:03,596 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:11:04,598 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:11:05,599 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:11:05,600 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 17:11:11,602 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:11:12,603 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:11:13,604 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:11:14,605 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:11:15,606 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:11:16,608 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:11:17,609 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:18,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:19,611 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:20,612 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:21,614 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:22,615 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:23,616 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:24,617 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:25,619 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:26,620 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:27,621 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:28,622 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:29,623 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:30,625 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:31,626 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:32,627 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:33,628 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:34,629 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:35,631 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:36,632 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:37,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:38,634 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:39,636 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:40,637 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:41,638 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:42,639 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:43,641 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:44,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:45,643 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:46,644 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:47,645 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:48,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:49,648 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:50,649 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:51,651 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:52,652 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:53,653 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:54,654 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:55,655 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:56,656 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:57,657 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:58,658 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:11:59,659 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:12:00,660 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:12:00,662 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 17:12:06,663 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:12:07,664 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:12:08,666 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:12:09,667 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:12:10,668 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:11,265 WARN com.cloudera.cmf.event.publish.EventStorePublisherWithRetry: Failed to publish event: SimpleEvent{attributes={ROLE=[hdfs-DATANODE-b30a464b10a57fdd49ea734cd52a8291], HOSTS=[dmidlkprdls04.svr.luc.edu], ROLE_TYPE=[DATANODE], CATEGORY=[LOG_MESSAGE], EVENTCODE=[EV_LOG_EVENT], SERVICE=[hdfs], SERVICE_TYPE=[HDFS], LOG_LEVEL=[WARN], HOST_IDS=[5c33df90-d247-4c6d-b9e0-5908a423580a], SEVERITY=[IMPORTANT]}, content=Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret, timestamp=1751909970885} - 1 of 60 failure(s) in last 1817s
java.io.IOException: Error connecting to dmidlkprdls01.svr.luc.edu/192.168.158.1:7184
	at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.getChannel(NettyTransceiver.java:269)
	at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.<init>(NettyTransceiver.java:197)
	at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.<init>(NettyTransceiver.java:120)
	at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.checkSpecificRequestor(AvroEventStorePublishProxy.java:120)
	at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.publishEvent(AvroEventStorePublishProxy.java:204)
	at com.cloudera.cmf.event.publish.EventStorePublisherWithRetry$PublishEventTask.run(EventStorePublisherWithRetry.java:242)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)
Caused by: com.cloudera.cmf.event.shaded.io.netty.channel.AbstractChannel$AnnotatedNoRouteToHostException: No route to host: dmidlkprdls01.svr.luc.edu/192.168.158.1:7184
Caused by: java.net.NoRouteToHostException: No route to host
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
	at com.cloudera.cmf.event.shaded.io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:337)
	at com.cloudera.cmf.event.shaded.io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:339)
	at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:776)
	at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
	at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
	at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
	at com.cloudera.cmf.event.shaded.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
	at com.cloudera.cmf.event.shaded.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 17:12:11,669 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:12,670 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:13,671 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:14,673 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:15,674 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:16,675 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:17,676 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:18,677 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:19,679 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:20,680 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:21,681 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:22,683 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:23,684 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:24,685 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:25,686 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:26,687 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:27,688 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:28,689 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:29,690 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:30,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:31,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:32,694 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:33,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:34,696 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:35,697 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:36,698 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:37,699 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:38,700 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:39,702 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:40,703 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:41,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:42,705 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:43,706 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:44,707 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:45,708 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:46,710 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:47,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:48,712 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:49,713 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:50,714 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:51,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:52,717 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:53,718 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:54,720 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:55,721 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:12:55,722 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 17:13:01,724 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:13:02,725 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:13:03,726 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:13:04,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:13:05,728 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:13:06,730 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:13:07,731 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:13:08,732 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:13:09,733 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:13:10,734 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:13:11,735 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:13:12,737 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:13:13,738 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:13:14,739 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:13:15,740 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:13:16,741 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:13:17,742 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:13:18,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:13:19,745 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:13:20,746 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:13:21,747 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:22,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:23,749 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:24,751 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:25,752 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:26,753 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:27,754 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:28,755 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:29,757 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:30,758 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:31,759 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:32,760 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:33,761 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:34,763 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:35,764 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:36,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:37,766 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:38,767 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:39,769 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:40,770 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:41,771 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:42,772 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:43,773 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:44,774 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:45,775 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:46,777 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:47,778 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:48,779 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:49,780 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:50,782 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:50,783 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 17:13:56,785 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:57,786 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:58,787 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:13:59,788 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:00,789 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:01,790 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:02,792 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:03,793 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:04,794 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:05,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:06,796 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:07,798 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:08,799 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:09,800 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:10,801 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:11,802 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:12,804 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:13,805 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:14,806 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:15,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:16,808 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:17,810 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:18,811 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:19,812 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:20,813 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:21,814 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:22,816 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:23,817 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:24,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:25,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:26,821 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:27,822 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:28,823 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:29,824 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:30,826 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:31,827 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:32,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:33,829 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:34,831 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:35,832 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:36,833 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:37,834 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:38,835 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:39,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:40,838 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:41,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:42,840 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:43,841 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:44,842 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:45,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:45,844 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 17:14:51,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:52,849 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:53,850 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:54,851 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:55,853 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:56,854 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:57,855 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:58,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:14:59,857 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:00,858 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:01,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:02,861 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:03,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:04,863 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:05,864 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:06,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:07,867 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:08,868 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:09,869 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:10,871 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:11,872 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:12,873 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:13,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:14,875 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:15,876 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:16,877 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:17,878 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:18,879 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:19,880 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:20,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:21,882 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:22,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:23,885 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:24,886 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:25,887 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:26,888 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:27,889 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:28,891 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:29,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:30,893 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:31,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:32,895 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:33,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:34,897 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:35,899 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:36,900 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:37,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:38,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:39,903 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:40,904 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:40,907 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 17:15:46,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:47,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:48,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:49,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:50,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:51,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:52,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:53,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:54,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:55,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:56,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:57,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:58,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:15:59,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:00,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:01,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:02,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:03,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:04,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:05,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:06,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:07,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:08,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:09,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:10,936 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:11,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:12,938 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:13,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:14,940 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:15,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:16,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:17,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:18,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:19,946 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:20,948 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:21,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:22,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:23,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:24,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:25,953 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:26,955 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:27,956 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:28,957 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:29,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:30,960 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:31,961 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:32,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:33,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:34,964 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:35,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:35,966 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 17:16:41,968 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:42,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:43,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:44,971 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:45,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:46,973 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:47,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:48,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:49,977 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:50,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:51,979 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:52,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:53,982 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:54,983 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:55,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:56,985 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:57,986 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:58,987 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:16:59,988 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:17:00,989 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:17:01,991 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:17:02,992 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:17:03,993 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:17:04,994 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:17:05,995 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:17:06,996 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:17:07,997 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:17:08,998 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:17:10,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:17:11,001 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:17:12,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:17:13,003 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:17:14,004 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:17:15,005 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:17:16,006 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:17:17,007 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:17:18,008 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:17:19,009 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:17:20,010 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:17:21,012 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:17:22,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:17:23,014 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:17:24,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:17:25,016 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:17:26,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:27,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:28,019 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:29,020 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:30,022 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:31,023 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:31,024 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 17:17:37,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:38,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:39,028 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:40,029 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:41,030 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:42,031 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:43,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:44,034 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:45,035 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:46,036 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:47,037 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:48,038 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:49,039 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:50,040 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:51,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:52,043 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:53,044 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:54,045 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:55,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:56,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:57,048 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:58,050 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:17:59,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:00,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:01,053 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:02,054 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:03,055 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:04,056 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:05,057 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:06,059 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:07,060 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:08,061 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:09,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:10,063 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:11,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:12,066 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:13,067 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:14,068 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:15,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:16,070 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:17,072 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:18,073 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:19,074 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:20,075 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:21,076 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:22,077 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:23,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:24,080 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:25,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:26,082 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:26,083 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 17:18:32,085 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:33,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:34,087 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:35,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:36,090 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:37,091 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:38,092 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:39,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:40,094 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:41,095 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:42,096 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:43,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:44,099 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:45,100 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:46,101 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:47,102 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:48,103 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:49,104 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:50,105 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:51,106 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:52,107 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:53,109 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:54,110 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:55,111 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:56,112 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:57,113 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:58,114 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:18:59,115 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:00,116 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:01,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:02,118 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:03,119 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:04,121 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:05,122 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:06,123 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:07,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:08,125 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:09,126 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:10,127 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:11,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:12,129 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:13,130 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:14,131 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:15,132 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:16,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:17,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:18,136 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:19,137 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:20,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:21,139 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:21,140 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 17:19:27,141 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:28,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:29,151 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:30,152 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:31,153 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:32,154 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:33,155 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:34,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:35,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:36,158 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:37,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:38,160 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:39,161 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:40,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:41,164 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:42,165 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:43,166 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:44,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:45,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:46,169 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:47,170 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:48,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:49,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:50,173 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:51,174 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:52,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:53,177 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:54,178 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:55,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:56,180 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:57,181 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:58,182 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:19:59,183 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:20:00,184 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:20:01,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:20:02,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:20:03,188 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:20:04,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:20:05,190 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:20:06,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:20:07,192 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:20:08,193 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:20:09,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:20:10,195 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:20:11,196 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:20:12,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:20:13,198 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:20:14,200 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:20:15,201 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:16,202 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:16,203 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 17:20:22,205 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:23,206 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:24,207 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:25,208 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:26,209 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:27,210 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:28,211 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:29,212 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:30,213 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:31,214 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:32,215 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:33,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:34,217 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:35,218 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:36,219 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:37,221 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:38,222 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:39,223 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:40,224 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:41,225 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:42,226 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:43,227 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:44,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:45,229 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:46,230 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:47,231 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:48,232 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:49,233 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:50,234 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:51,235 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:52,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:53,237 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:54,239 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:55,240 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:56,241 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:57,242 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:58,243 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:20:59,244 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:00,245 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:01,246 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:02,247 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:03,248 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:04,249 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:05,250 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:06,252 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:07,253 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:08,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:09,255 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:10,256 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:11,257 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:11,258 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 17:21:17,260 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:18,261 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:19,262 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:20,263 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:21,264 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:22,266 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:23,267 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:24,268 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:25,269 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:26,270 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:27,271 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:28,272 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:29,273 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:30,274 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:31,275 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:32,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:33,277 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:34,278 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:35,279 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:36,281 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:37,282 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:38,283 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:39,284 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:40,285 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:41,286 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:42,287 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:43,288 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:44,289 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:45,290 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:46,291 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:47,292 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:48,293 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:49,294 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:50,295 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:51,296 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:52,298 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:53,299 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:54,300 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:55,301 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:56,302 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:57,303 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:58,304 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:21:59,305 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:22:00,306 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:22:01,307 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:22:02,308 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:22:03,309 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:04,310 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:05,311 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:06,312 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:06,314 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 17:22:12,315 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:13,316 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:14,317 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:15,318 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:16,319 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:17,320 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:18,321 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:19,322 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:20,323 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:21,325 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:22,326 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:23,327 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:24,328 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:25,329 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:26,330 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:27,331 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:28,332 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:29,333 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:30,334 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:31,335 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:32,337 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:33,338 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:34,339 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:35,340 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:36,341 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:37,342 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:38,343 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:39,344 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:40,345 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:41,346 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:42,348 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:43,349 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:44,350 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:45,351 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:46,352 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:47,353 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:48,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:49,355 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:50,356 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:51,357 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:52,358 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:53,360 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:54,361 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:55,362 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:56,363 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:57,364 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:58,365 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:22:59,366 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:00,367 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:01,368 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:01,369 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 17:23:07,371 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:08,372 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:09,373 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:10,374 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:11,375 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:12,377 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:13,378 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:14,379 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:15,380 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:16,381 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:17,382 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:18,383 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:19,385 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:20,386 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:21,387 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:22,388 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:23,390 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:24,391 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:25,392 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:26,393 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:27,394 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:28,395 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:29,397 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:30,398 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:31,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:32,400 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:33,401 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:34,403 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:35,404 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:36,405 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:37,406 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:38,407 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:39,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:40,410 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:41,411 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:42,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:43,413 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:44,414 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:45,415 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:46,416 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:47,418 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:48,419 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:49,420 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:50,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:51,422 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:52,424 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:53,425 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:54,426 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:55,427 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:56,428 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:23:56,430 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 17:24:02,432 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:03,433 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:04,434 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:05,435 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:06,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:07,437 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:08,439 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:09,440 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:10,441 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:11,442 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:12,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:13,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:14,446 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:15,447 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:16,448 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:17,449 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:18,450 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:19,452 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:20,453 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:21,454 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:22,455 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:23,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:24,458 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:25,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:26,460 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:27,461 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:28,463 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:29,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:30,465 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:31,466 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:32,467 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:33,468 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:34,470 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:35,471 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:36,472 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:37,473 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:38,474 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:39,476 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:40,477 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:41,478 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:42,479 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:43,480 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:44,481 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:45,483 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:46,484 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:47,485 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:48,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:49,487 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:50,489 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:51,490 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:24:51,491 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 17:24:57,493 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:24:58,494 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:24:59,495 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:00,496 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:01,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:02,499 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:03,500 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:04,501 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:05,502 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:06,504 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:07,505 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:08,506 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:09,507 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:10,508 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:11,510 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:12,511 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:13,512 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:14,513 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:15,514 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:16,516 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:17,517 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:18,518 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:19,519 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:20,520 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:21,522 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:22,523 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:23,524 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:24,525 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:25,527 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:26,528 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:27,529 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:28,530 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:29,531 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:30,533 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:31,534 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:32,535 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:33,536 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:34,537 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:35,539 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:36,540 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:37,541 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:38,542 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:39,544 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:40,545 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:41,546 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:42,547 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:43,548 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:44,549 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:45,551 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:46,552 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:46,553 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 17:25:52,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:53,556 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:54,557 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:55,558 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:56,560 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:57,561 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:58,562 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:25:59,563 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:00,564 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:01,566 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:02,567 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:03,568 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:04,569 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:05,570 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:06,572 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:07,573 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:08,574 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:09,575 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:10,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:11,578 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:12,579 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:13,580 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:14,581 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:15,583 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:16,584 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:17,585 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:18,586 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:19,587 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:20,589 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:21,590 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:22,591 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:23,592 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:24,593 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:25,595 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:26,596 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:27,597 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:28,598 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:29,599 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:30,600 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:31,602 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:32,603 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:33,604 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:34,605 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:35,606 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:36,608 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:37,609 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:38,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:39,611 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:26:40,612 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:26:41,613 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:26:41,615 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 17:26:47,616 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:26:48,618 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:26:49,619 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:26:50,620 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:26:51,621 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:26:52,622 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:26:53,624 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:26:54,625 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:26:55,626 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:26:56,627 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:26:57,628 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:26:58,629 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:26:59,631 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:00,632 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:01,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:02,634 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:03,635 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:04,637 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:05,638 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:06,639 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:07,640 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:08,641 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:09,643 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:10,644 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:11,645 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:12,646 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:13,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:14,648 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:15,650 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:16,651 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:17,652 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:18,653 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:19,654 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:20,655 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:21,656 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:22,657 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:23,658 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:24,659 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:25,661 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:26,662 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:27,663 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:28,664 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:29,665 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:30,666 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:31,667 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:32,668 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:33,669 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:34,671 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:35,672 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:36,673 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:36,674 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 17:27:42,675 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:43,677 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:44,678 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:45,679 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:46,680 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:47,681 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:48,682 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:49,683 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:50,684 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:51,685 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:52,686 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:53,687 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:54,689 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:55,690 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:56,690 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:57,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:58,693 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:27:59,694 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:00,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:01,696 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:02,697 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:03,698 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:04,699 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:05,700 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:06,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:07,702 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:08,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:09,705 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:10,706 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:11,707 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:12,708 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:13,709 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:14,710 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:15,712 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:16,713 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:17,714 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:18,715 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:19,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:20,717 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:21,718 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:22,720 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:23,721 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:24,722 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:25,723 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:26,724 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:27,725 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:28,726 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:29,728 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:30,729 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:31,730 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:31,731 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 17:28:37,733 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:38,734 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:39,735 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:40,736 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:41,737 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:42,738 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:43,739 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:44,740 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:45,741 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:46,742 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:47,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:48,744 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:49,745 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:50,747 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:51,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:52,749 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:53,750 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:54,751 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:55,752 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:56,753 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:57,754 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:58,756 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:28:59,757 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:00,758 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:01,759 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:02,760 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:03,761 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:04,762 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:05,763 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:06,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:07,766 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:08,767 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:09,768 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:10,769 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:11,770 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:12,771 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:13,772 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:14,774 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:15,775 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:16,776 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:17,777 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:18,778 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:19,779 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:20,780 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:21,781 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:22,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:23,784 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:24,785 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:25,786 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:26,787 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:26,789 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 17:29:32,790 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:33,791 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:29:34,792 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:29:35,794 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:29:36,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:29:37,796 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:29:38,797 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:29:39,798 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:29:40,799 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:29:41,801 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:29:42,802 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:29:43,803 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:29:44,804 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:29:45,805 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:29:46,806 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:29:47,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:29:48,808 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:29:49,809 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:29:50,810 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:29:51,811 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:29:52,813 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:29:53,814 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:29:54,815 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:29:55,816 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:29:56,817 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:29:57,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:29:58,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:29:59,821 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:00,822 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:01,823 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:02,824 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:03,825 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:04,826 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:05,827 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:06,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:07,829 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:08,831 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:09,832 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:10,833 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:11,834 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:12,835 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:13,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:14,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:15,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:16,840 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:17,841 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:18,842 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:19,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:20,844 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:21,845 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:21,847 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 17:30:27,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:28,849 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:29,850 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:30,852 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:31,853 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:32,854 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:33,855 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:34,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:35,857 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:36,858 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:37,859 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:38,861 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:39,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:40,863 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:41,864 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:42,865 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:43,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:44,867 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:45,868 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:46,869 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:47,871 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:48,872 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:49,873 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:50,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:51,875 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:52,876 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:53,877 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:54,878 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:55,879 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:56,880 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:57,882 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:58,883 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:30:59,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:31:00,885 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:31:01,886 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:31:02,887 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:31:03,888 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:31:04,890 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:31:05,891 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:31:06,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:31:07,893 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:31:08,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:31:09,895 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:31:10,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:31:11,897 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:31:12,898 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:31:13,899 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:31:14,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:31:15,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:31:16,903 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:31:16,904 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 17:31:22,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:23,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:24,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:25,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:26,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:27,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:28,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:29,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:30,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:31,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:32,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:33,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:34,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:35,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:36,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:37,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:38,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:39,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:40,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:41,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:42,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:43,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:44,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:45,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:46,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:47,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:48,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:49,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:50,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:51,938 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:52,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:53,940 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:54,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:55,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:56,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:57,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:58,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:31:59,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:00,948 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:01,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:02,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:03,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:04,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:05,953 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:06,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:07,955 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:08,956 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:09,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:10,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:11,960 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:11,961 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 17:32:17,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:18,964 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:19,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:20,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:21,967 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:22,968 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:23,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:24,971 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:25,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:26,973 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:27,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:28,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:29,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:30,977 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:31,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:32,979 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:33,981 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:34,982 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:35,983 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:36,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:37,985 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:38,986 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:39,987 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:40,988 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:41,989 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:42,990 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:43,992 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:44,993 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:45,994 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:46,995 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:47,996 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:48,997 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:49,998 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:51,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:52,001 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:53,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:54,004 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:55,005 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:56,006 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:57,007 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:58,008 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:32:59,009 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:00,010 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:01,011 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:02,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:03,014 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:04,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:05,016 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:06,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:07,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:07,019 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 17:33:13,023 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:14,024 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:15,025 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:16,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:17,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:18,028 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:19,030 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:20,031 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:21,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:22,033 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:23,035 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:24,036 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:25,037 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:26,038 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:27,039 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:28,040 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:29,041 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:30,043 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:31,044 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:32,045 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:33,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:34,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:35,048 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:36,049 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:37,050 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:38,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:39,053 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:40,054 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:41,055 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:42,056 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:43,058 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:44,059 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:45,060 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:46,061 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:47,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:48,063 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:49,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:50,066 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:51,067 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:52,068 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:53,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:54,071 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:55,072 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:56,073 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:57,074 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:58,075 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:33:59,076 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:34:00,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:34:01,079 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:34:02,080 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:34:02,083 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 17:34:08,085 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:34:09,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:34:10,087 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:34:11,088 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:34:12,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:13,091 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:14,092 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:15,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:16,094 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:17,095 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:18,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:19,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:20,099 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:21,100 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:22,101 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:23,102 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:24,104 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:25,105 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:26,106 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:27,107 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:28,108 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:29,109 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:30,111 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:31,112 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:32,113 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:33,114 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:34,115 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:35,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:36,118 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:37,119 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:38,120 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:39,121 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:40,122 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:41,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:42,125 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:43,126 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:44,127 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:45,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:46,129 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:47,131 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:48,132 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:49,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:50,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:51,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:52,136 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:53,137 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:54,139 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:55,140 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:56,141 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:57,142 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:34:57,144 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 17:35:03,145 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:04,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:05,148 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:06,149 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:07,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:08,151 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:09,152 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:10,153 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:11,155 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:12,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:13,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:14,158 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:15,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:16,160 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:17,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:18,163 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:19,164 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:20,165 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:21,166 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:22,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:23,169 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:24,170 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:25,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:26,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:27,173 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:28,174 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:29,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:30,177 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:31,178 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:32,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:33,180 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:34,181 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:35,182 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:36,184 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:37,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:38,186 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:39,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:40,188 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:41,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:42,190 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:43,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:44,192 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:45,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:46,195 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:47,196 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:48,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:49,199 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:50,200 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:51,201 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:52,202 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:52,204 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 17:35:58,205 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:35:59,207 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:36:00,208 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:01,209 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:02,210 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:03,211 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:04,213 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:05,214 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:06,215 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:07,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:08,217 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:09,218 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:10,220 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:11,221 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:12,222 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:13,223 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:14,224 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:15,225 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:16,226 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:17,227 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:18,229 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:19,230 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:20,231 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:21,232 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:22,233 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:23,235 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:24,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:25,237 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:26,238 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:27,239 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:28,240 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:29,242 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:30,243 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:31,244 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:32,245 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:33,246 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:34,247 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:35,248 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:36,249 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:37,250 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:38,251 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:39,252 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:40,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:41,255 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:42,256 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:43,257 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:44,258 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:45,259 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:46,260 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:47,261 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:47,263 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 17:36:53,265 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:54,266 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:55,267 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:56,268 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:57,269 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:58,270 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:36:59,271 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:00,272 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:01,273 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:02,274 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:03,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:04,277 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:05,278 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:06,279 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:07,280 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:08,281 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:09,282 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:10,284 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:11,285 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:12,286 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:13,287 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:14,288 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:15,289 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:16,290 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:17,291 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:18,293 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:19,294 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:20,295 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:21,296 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:22,297 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:23,298 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:24,300 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:25,301 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:26,302 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:27,303 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:28,304 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:29,305 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:30,306 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:31,308 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:32,309 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:33,310 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:34,311 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:35,312 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:36,313 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:37,314 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:38,316 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:39,317 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:40,318 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:41,319 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:42,320 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:42,322 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 17:37:48,323 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:49,324 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:50,326 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:51,327 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:52,328 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:53,329 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:54,330 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:55,332 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:56,333 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:57,334 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:58,335 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:37:59,336 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:00,337 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:01,338 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:02,339 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:03,341 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:04,342 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:05,343 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:06,344 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:07,345 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:08,346 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:09,347 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:10,348 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:11,349 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:12,351 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:13,352 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:14,353 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:15,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:16,355 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:17,356 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:18,357 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:19,358 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:20,360 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:21,361 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:22,362 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:23,363 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:24,364 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:25,366 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:26,367 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:27,368 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:28,369 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:29,370 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:30,371 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:31,372 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:32,373 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:33,374 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:34,376 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:35,377 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:36,378 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:37,379 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:37,380 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 17:38:43,382 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:44,383 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:45,384 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:46,386 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:47,387 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:48,388 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:38:49,389 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:38:50,390 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:38:51,391 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:38:52,392 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:38:53,394 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:38:54,395 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:38:55,396 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:38:56,397 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:38:57,398 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:38:58,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:38:59,400 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:00,402 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:01,403 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:02,404 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:03,405 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:04,406 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:05,407 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:06,408 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:07,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:08,411 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:09,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:10,413 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:11,414 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:12,415 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:13,416 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:14,417 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:15,419 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:16,420 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:17,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:18,422 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:19,423 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:20,424 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:21,425 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:22,426 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:23,428 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:24,429 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:25,430 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:26,431 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:27,432 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:28,433 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:29,434 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:30,435 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:31,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:32,437 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:32,439 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 17:39:38,440 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:39,442 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:40,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:41,444 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:42,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:43,446 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:44,447 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:45,448 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:46,449 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:47,450 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:48,451 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:49,453 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:50,454 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:51,455 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:52,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:53,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:54,458 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:55,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:56,461 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:57,462 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:58,463 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:39:59,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:00,465 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:01,466 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:02,467 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:03,468 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:04,469 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:05,470 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:06,471 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:07,472 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:08,474 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:09,475 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:10,476 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:11,477 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:12,478 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:13,479 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:14,480 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:15,481 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:16,482 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:17,484 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:18,485 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:19,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:20,487 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:21,488 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:22,489 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:23,490 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:24,491 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:25,493 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:26,494 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:27,495 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:27,496 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 17:40:33,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:34,499 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:35,500 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:36,501 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:37,503 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:38,503 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:39,505 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:40,506 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:41,507 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:42,508 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:43,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:44,510 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:45,512 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:46,513 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:47,514 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:48,515 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:49,516 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:50,517 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:51,518 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:52,519 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:53,520 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:54,521 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:55,522 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:56,523 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:57,525 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:58,526 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:40:59,527 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:00,528 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:01,529 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:02,530 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:03,531 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:04,532 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:05,533 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:06,534 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:07,535 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:08,536 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:09,537 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:10,539 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:11,540 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:12,541 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:13,542 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:14,543 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:15,544 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:16,545 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:17,547 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:18,548 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:19,549 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:20,550 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:21,551 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:22,552 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:22,554 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 17:41:28,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:29,556 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:30,558 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:31,559 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:32,560 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:33,561 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:34,562 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:35,563 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:36,564 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:37,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:38,566 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:39,568 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:40,569 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:41,570 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:42,571 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:43,572 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:44,573 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:45,574 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:46,575 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:47,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:48,577 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:49,579 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:50,580 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:51,581 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:52,582 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:53,583 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:54,584 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:55,586 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:56,587 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:57,588 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:58,589 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:41:59,590 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:00,591 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:01,593 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:02,594 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:03,595 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:04,596 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:05,597 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:06,598 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:07,599 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:08,600 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:09,601 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:10,602 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:11,603 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:12,604 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:13,605 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:14,606 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:15,607 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:16,608 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:17,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:17,611 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 17:42:23,613 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:24,614 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:25,615 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:42:26,616 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:42:27,617 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:42:28,500 WARN com.cloudera.cmf.event.publish.EventStorePublisherWithRetry: Failed to publish event: SimpleEvent{attributes={ROLE=[hdfs-DATANODE-b30a464b10a57fdd49ea734cd52a8291], HOSTS=[dmidlkprdls04.svr.luc.edu], ROLE_TYPE=[DATANODE], CATEGORY=[LOG_MESSAGE], EVENTCODE=[EV_LOG_EVENT], SERVICE=[hdfs], SERVICE_TYPE=[HDFS], LOG_LEVEL=[WARN], HOST_IDS=[5c33df90-d247-4c6d-b9e0-5908a423580a], SEVERITY=[IMPORTANT]}, content=Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret, timestamp=1751909970885} - 1 of 60 failure(s) in last 1817s
java.io.IOException: Error connecting to dmidlkprdls01.svr.luc.edu/192.168.158.1:7184
	at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.getChannel(NettyTransceiver.java:269)
	at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.<init>(NettyTransceiver.java:197)
	at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.<init>(NettyTransceiver.java:120)
	at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.checkSpecificRequestor(AvroEventStorePublishProxy.java:120)
	at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.publishEvent(AvroEventStorePublishProxy.java:204)
	at com.cloudera.cmf.event.publish.EventStorePublisherWithRetry$PublishEventTask.run(EventStorePublisherWithRetry.java:242)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)
Caused by: com.cloudera.cmf.event.shaded.io.netty.channel.AbstractChannel$AnnotatedNoRouteToHostException: No route to host: dmidlkprdls01.svr.luc.edu/192.168.158.1:7184
Caused by: java.net.NoRouteToHostException: No route to host
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
	at com.cloudera.cmf.event.shaded.io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:337)
	at com.cloudera.cmf.event.shaded.io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:339)
	at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:776)
	at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
	at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
	at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
	at com.cloudera.cmf.event.shaded.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
	at com.cloudera.cmf.event.shaded.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 17:42:28,618 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:42:29,619 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:42:30,621 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:42:31,621 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:42:32,623 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 17:42:33,624 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:34,625 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:35,626 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:36,627 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:37,628 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:38,629 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:39,631 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:40,632 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:41,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:42,634 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:43,635 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:44,637 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:45,638 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:46,639 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:47,640 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:48,641 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:49,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:50,644 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:51,645 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:52,646 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:53,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:54,648 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:55,649 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:56,650 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:57,651 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:58,652 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:42:59,654 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:00,655 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:01,656 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:02,657 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:03,658 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:04,659 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:05,660 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:06,661 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:07,663 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:08,664 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:09,665 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:10,666 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:11,667 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:12,668 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:12,670 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 17:43:18,671 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:19,672 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:20,673 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:21,675 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:22,676 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:23,677 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:24,678 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:25,679 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:26,681 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:27,682 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:28,683 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:29,684 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:30,685 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:31,686 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:32,687 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:33,688 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:34,689 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:35,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:36,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:37,693 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:38,694 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:39,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:40,696 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:41,698 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:42,699 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:43,700 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:44,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:45,703 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:46,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:47,705 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:48,706 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:49,707 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:50,708 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:51,710 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:52,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:53,712 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:54,713 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 17:43:55,082 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: RECEIVED SIGNAL 15: SIGTERM 2025-07-07 17:43:55,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG: /************************************************************ SHUTDOWN_MSG: Shutting down DataNode at dmidlkprdls04.svr.luc.edu/192.168.158.4 ************************************************************/ 2025-07-07 17:43:55,091 ERROR org.apache.hadoop.conf.Configuration: error parsing conf core-default.xml java.io.FileNotFoundException: /opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/jars/hadoop-common-3.1.1.7.3.1.0-197.jar (No such file or directory) at java.util.zip.ZipFile.open(Native Method) at java.util.zip.ZipFile.<init>(ZipFile.java:231) at java.util.zip.ZipFile.<init>(ZipFile.java:157) at java.util.jar.JarFile.<init>(JarFile.java:169) at java.util.jar.JarFile.<init>(JarFile.java:106) at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93) at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69) at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99) at sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122) at sun.net.www.protocol.jar.JarURLConnection.getInputStream(JarURLConnection.java:152) at org.apache.hadoop.conf.Configuration.parse(Configuration.java:3023) at org.apache.hadoop.conf.Configuration.getStreamReader(Configuration.java:3119) at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3077) at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:3050) at org.apache.hadoop.conf.Configuration.loadProps(Configuration.java:2928) at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2910) at org.apache.hadoop.conf.Configuration.get(Configuration.java:1264) at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1876) at 
org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1853) at org.apache.hadoop.util.ShutdownHookManager.getShutdownTimeout(ShutdownHookManager.java:183) at org.apache.hadoop.util.ShutdownHookManager.shutdownExecutor(ShutdownHookManager.java:145) at org.apache.hadoop.util.ShutdownHookManager.access$300(ShutdownHookManager.java:65) at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:102) 2025-07-07 18:06:40,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG: /************************************************************ STARTUP_MSG: Starting DataNode STARTUP_MSG: host = dmidlkprdls04.svr.luc.edu/192.168.158.4 STARTUP_MSG: args = [] STARTUP_MSG: version = 3.1.1.7.3.1.0-197 STARTUP_MSG: classpath = /var/run/cloudera-scm-agent/process/55-hdfs-DATANODE:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/aws-java-sdk-bundle-1.12.720.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-hdfs-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-plugin-classloader-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-yarn-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/azure-data-lake-store-sdk-2.3.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop
/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3
.1.p0.60371244/lib/hadoop/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/je
tty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jline-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jul-to-slf4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CD
H-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/logredactor-2.0.16.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-reload4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-api-1.7.36.jar:/opt/cloudera
/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler
-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/wildfly-openssl-2.1.4.ClouderaFinal.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper-jute.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p
0.60371244/lib/hadoop/.//hadoop-auth.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//ozone-filesystem-hadoop3-1.3.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-thrift.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-s
cala_2.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-protobuf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-jackson.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-generator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-format-structures.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-encoding.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-column.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/avro-1.11.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.
p0.60371244/lib/hadoop-hdfs/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j2objc-annotations-2.8.jar:/opt/cloudera/par
cels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib
/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jline-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/json-simple-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoo
p-hdfs/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/leveldbjni-cldr-1.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/net
ty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/c
loudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-jute-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-
udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.6037
1244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//asm-5.0.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjweaver-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-storage-7.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//checker-compat-qual-2.5.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-slf4j-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-system-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//google-extensions-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-keyvault-core-1.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//accessors-smart-2.4.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/clou
dera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapredu
ce-client-hs-plugins-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cl
oudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ojalgo-43.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//kafka-clients-2.8.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-core-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-abfs-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//forbiddenapis-3.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-intg-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-api-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//zstd-jni-1.4.9-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-i18n.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-s3-lib-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//javax.
activation-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//bundle-2.23.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//json-smart-2.4.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-util-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-shell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-cloud-bindings.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-s3-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjrt-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/swagger-annotations-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/objenesis-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-client-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.activation-api-1.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-dataformat-yaml-2.9.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcutil-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcprov-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcpkix-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/HikariCP-java7-2.4.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/snakeyaml-2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hado
op-yarn/lib/jsonschema2pojo-core-1.0.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/joda-time-2.10.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jna-5.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-guice-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-module-jaxb-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-json-provider-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-base-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-servlet-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/fst-2.50.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/ehcache-3.3.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/dnsjava-2.1.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/codemodel-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1
.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7
.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager-1.0.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-3.1.1.7.3.1.0-197.jar:/opt/cloudera/cm/lib/plugins/event-publish-7.13.1-shaded.jar:/opt/cloudera/cm/lib/plugins/tt-instrumentation-7.13.1.jar STARTUP_MSG: build = git@github.infra.cloudera.com:CDH/hadoop.git -r 31a42fb39494f541ffae15c3c61185deeeacca86; compiled by 'jenkins' 
on 2024-12-04T01:09Z
STARTUP_MSG: java = 1.8.0_432
************************************************************/
2025-07-07 18:06:40,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2025-07-07 18:06:40,677 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d1/dfs/dn
2025-07-07 18:06:40,683 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d2/dfs/dn
2025-07-07 18:06:40,683 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d3/dfs/dn
2025-07-07 18:06:40,684 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d4/dfs/dn
2025-07-07 18:06:40,839 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties
2025-07-07 18:06:40,942 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2025-07-07 18:06:40,943 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2025-07-07 18:06:41,332 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-07 18:06:41,360 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576
2025-07-07 18:06:41,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled.
2025-07-07 18:06:41,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is dmidlkprdls04.svr.luc.edu
2025-07-07 18:06:41,370 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-07 18:06:41,377 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 4294967296
2025-07-07 18:06:41,407 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /192.168.158.4:9866
2025-07-07 18:06:41,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-07 18:06:41,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-07 18:06:41,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-07 18:06:41,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-07 18:06:41,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Listening on UNIX domain socket: /var/run/hdfs-sockets/dn
2025-07-07 18:06:41,466 INFO org.eclipse.jetty.util.log: Logging initialized @2489ms to org.eclipse.jetty.util.log.Slf4jLog
2025-07-07 18:06:41,595 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-07 18:06:41,605 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined
2025-07-07 18:06:41,614 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2025-07-07 18:06:41,616 INFO org.apache.hadoop.security.HttpCrossOriginFilterInitializer: CORS filter not enabled. Please set hadoop.http.cross-origin.enabled to 'true' to enable it
2025-07-07 18:06:41,617 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context datanode
2025-07-07 18:06:41,617 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context logs
2025-07-07 18:06:41,618 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context static
2025-07-07 18:06:41,661 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 39489
2025-07-07 18:06:41,664 INFO org.eclipse.jetty.server.Server: jetty-9.4.54.v20240208; built: 2024-02-08T19:42:39.027Z; git: cef3fbd6d736a21e7d541a5db490381d95a2047d; jvm 1.8.0_432-b06
2025-07-07 18:06:41,705 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0
2025-07-07 18:06:41,706 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults
2025-07-07 18:06:41,708 INFO org.eclipse.jetty.server.session: node0 Scavenging every 600000ms
2025-07-07 18:06:41,743 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-07 18:06:41,749 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@1b11ef33{logs,/logs,file:///var/log/hadoop-hdfs/,AVAILABLE}
2025-07-07 18:06:41,750 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@2f2bf0e2{static,/static,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/static/,AVAILABLE}
2025-07-07 18:06:41,866 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@6ebd78d1{datanode,/,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode/,AVAILABLE}{file:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode}
2025-07-07 18:06:41,893 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@7fb9f71f{HTTP/1.1, (http/1.1)}{localhost:39489}
2025-07-07 18:06:41,893 INFO org.eclipse.jetty.server.Server: Started @2916ms
2025-07-07 18:06:42,251 WARN com.cloudera.cmf.event.publish.EventStorePublisherWithRetry: Failed to publish event: SimpleEvent{attributes={ROLE=[hdfs-DATANODE-b30a464b10a57fdd49ea734cd52a8291], HOSTS=[dmidlkprdls04.svr.luc.edu], ROLE_TYPE=[DATANODE], CATEGORY=[LOG_MESSAGE], EVENTCODE=[EV_LOG_EVENT], SERVICE=[hdfs], SERVICE_TYPE=[HDFS], LOG_LEVEL=[WARN], HOST_IDS=[5c33df90-d247-4c6d-b9e0-5908a423580a], SEVERITY=[IMPORTANT]}, content=Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret, timestamp=1751929601595}
java.io.IOException: Error connecting to dmidlkprdls01.svr.luc.edu/192.168.158.1:7184
    at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.getChannel(NettyTransceiver.java:269)
    at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.<init>(NettyTransceiver.java:197)
    at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.<init>(NettyTransceiver.java:120)
    at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.checkSpecificRequestor(AvroEventStorePublishProxy.java:120)
    at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.publishEvent(AvroEventStorePublishProxy.java:204)
    at com.cloudera.cmf.event.publish.EventStorePublisherWithRetry$PublishEventTask.run(EventStorePublisherWithRetry.java:242)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)
Caused by: com.cloudera.cmf.event.shaded.io.netty.channel.AbstractChannel$AnnotatedNoRouteToHostException: No route to host: dmidlkprdls01.svr.luc.edu/192.168.158.1:7184
Caused by: java.net.NoRouteToHostException: No route to host
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
    at com.cloudera.cmf.event.shaded.io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:337)
    at com.cloudera.cmf.event.shaded.io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:339)
    at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:776)
    at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
    at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
    at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
    at com.cloudera.cmf.event.shaded.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
    at com.cloudera.cmf.event.shaded.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:06:42,276 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /192.168.158.4:9864
2025-07-07 18:06:42,288 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor
2025-07-07 18:06:42,289 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hdfs
2025-07-07 18:06:42,289 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
2025-07-07 18:06:42,355 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
2025-07-07 18:06:42,373 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867
2025-07-07 18:06:42,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /192.168.158.4:9867
2025-07-07 18:06:42,456 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null
2025-07-07 18:06:42,466 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices:
2025-07-07 18:06:42,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool <registering> (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 starting to offer service
2025-07-07 18:06:42,490 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2025-07-07 18:06:42,490 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting
2025-07-07 18:06:43,606 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:06:44,608 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:06:45,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:06:46,612 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:06:47,614 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:06:48,615 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:06:49,617 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:06:50,619 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:06:51,621 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:06:52,622 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:06:53,624 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:06:54,626 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:06:55,628 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:06:56,630 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:06:57,631 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:06:58,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:06:59,615 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:06:59,622 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:06:59,624 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 45 more
2025-07-07 18:06:59,635 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:06:59,944 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:137)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:06:59,945 WARN org.apache.hadoop.hdfs.server.datanode.DataNode:
Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:137) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:06:59,947 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception 
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:137) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 49 more 2025-07-07 18:07:00,637 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:01,639 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:02,641 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:03,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:04,644 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:05,646 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:06,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:07,649 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:08,651 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:09,653 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:10,655 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:11,657 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:12,658 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:13,660 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:14,662 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:15,664 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:16,666 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:17,668 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:18,670 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:19,672 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:19,860 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 18:07:19,862 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 18:07:19,863 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 18:07:20,673 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:21,675 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:22,677 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:23,678 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:24,680 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:25,682 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:26,683 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:27,685 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:28,687 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:29,689 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:30,690 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:31,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:32,694 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:32,698 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 18:07:38,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:39,703 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:40,705 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:41,706 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:42,708 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:43,710 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:44,712 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:45,714 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:46,715 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:47,717 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:48,719 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:49,720 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:50,722 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:51,724 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:52,726 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:53,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:54,729 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:55,731 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:56,733 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:57,734 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:58,736 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:07:59,738 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:00,740 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:01,741 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:02,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:03,745 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:04,746 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:05,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:06,750 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:07,752 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:08,753 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:09,755 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:10,757 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:11,758 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:12,760 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:13,762 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:14,764 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:15,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:16,767 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:17,769 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:18,770 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:19,772 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:19,843 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 18:08:19,844 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 18:08:19,845 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 18:08:20,774 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:21,776 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:22,777 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:23,779 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:24,781 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:25,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:26,785 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:27,786 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:27,789 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 18:08:33,791 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:34,793 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:35,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:36,796 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:37,798 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:38,800 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:39,801 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:40,803 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:41,805 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:42,806 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:43,808 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:44,810 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:45,812 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:46,814 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:47,815 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:48,817 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:49,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:50,820 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:51,822 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:52,824 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:53,826 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:54,827 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:55,829 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:56,831 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:57,832 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:58,834 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:08:59,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:09:00,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:09:01,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:09:02,840 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:09:03,842 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:09:04,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:09:05,845 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:09:06,847 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:09:07,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:09:08,850 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:09:09,852 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:09:10,853 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:09:11,855 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:09:12,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:09:13,858 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:09:14,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:09:15,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:09:16,864 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:09:17,865 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:09:18,867 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:09:19,842 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 18:09:19,843 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 18:09:19,845 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 18:09:19,868 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:20,871 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:21,873 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:22,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:22,876 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:09:28,878 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:29,880 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:30,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:31,883 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:32,885 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:33,886 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:34,888 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:35,890 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:36,891 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:37,893 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:38,895 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:39,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:40,898 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:41,899 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:42,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:43,903 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:44,905 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:45,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:46,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:47,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:48,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:49,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:50,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:51,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:52,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:53,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:54,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:55,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:56,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:57,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:58,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:09:59,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:10:00,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:10:01,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:10:02,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:10:03,936 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:10:04,938 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:10:05,940 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:10:06,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:10:07,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:10:08,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:10:09,946 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:10:10,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:10:11,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:10:12,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:10:13,953 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:10:14,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:10:15,956 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:10:16,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:10:17,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:10:17,961 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:10:19,854 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:10:19,856 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:10:19,857 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at
org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 18:10:23,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:24,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:25,967 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:26,968 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:27,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:28,971 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:29,973 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:30,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:31,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:32,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:33,979 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:34,981 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:35,983 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:36,985 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:37,986 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:38,988 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:39,990 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:40,991 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:41,993 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:42,995 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:43,996 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:44,999 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:46,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:47,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:48,004 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:49,005 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:50,007 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:51,009 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:52,010 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:53,012 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:54,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:55,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:56,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:57,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:58,020 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:10:59,021 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:11:00,023 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:11:01,025 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:11:02,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:11:03,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:11:04,029 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:11:05,030 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:11:06,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:11:07,033 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:11:08,035 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:11:09,036 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:11:10,038 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:11:11,039 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:11:12,041 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:11:13,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:11:13,044 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 18:11:19,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:11:19,841 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at 
java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at 
org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) 
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:11:19,842 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at 
java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at 
org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) 
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:11:19,843 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at 
org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 18:11:20,048 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:21,050 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:22,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:23,053 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:24,054 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:25,056 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:26,057 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:27,059 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:28,060 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:29,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:30,063 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:31,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:32,066 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:33,068 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:34,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:35,071 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:36,072 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:37,074 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:38,075 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:39,077 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:40,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:41,080 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:42,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:43,083 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:44,084 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:45,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:46,087 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:47,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:48,090 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:49,092 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:50,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:51,095 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:52,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:53,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:54,100 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:55,101 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:56,102 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:57,104 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:58,105 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:11:59,107 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:12:00,108 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:12:01,110 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:12:02,111 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:12:03,112 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:12:04,114 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:12:05,115 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:12:06,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:12:07,118 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:12:08,120 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:12:08,122 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:12:14,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:12:15,126 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:12:16,127 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:12:17,129 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:12:18,130 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:12:19,132 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:12:19,861 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:12:19,862 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:12:19,863 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 18:12:20,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:12:21,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:12:22,136 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:12:23,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:12:24,139 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:12:25,140 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:12:26,142 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:12:27,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:12:28,145 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:12:29,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:30,148 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:31,149 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:32,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:33,152 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:34,153 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:35,154 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:36,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:37,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:38,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:39,160 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:40,161 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:41,163 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:42,164 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:43,166 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:44,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:45,169 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:46,170 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:47,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:48,173 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:49,174 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:50,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:51,178 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:52,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:53,180 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:54,182 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:55,184 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:56,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:57,186 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:58,188 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:12:59,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:13:00,190 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:13:01,192 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:13:02,193 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:13:03,195 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:13:03,196 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 18:13:09,199 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:13:10,200 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:13:11,201 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:13:12,203 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:13:13,204 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:13:14,206 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:13:15,207 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:13:16,209 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:13:17,210 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:13:18,212 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:13:19,213 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:13:19,841 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:13:19,842 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:13:19,843 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at 
org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 18:13:20,214 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:21,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:22,217 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:23,218 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:24,220 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:25,221 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:26,222 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:27,224 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:28,225 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:29,227 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:30,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:31,229 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:32,231 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:33,232 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:34,233 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:35,235 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:36,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:37,238 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:38,239 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:39,240 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:40,241 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:41,243 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:42,244 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:43,245 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:44,247 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:45,248 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:46,249 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:47,251 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:48,252 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:49,253 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:50,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:51,256 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:52,257 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:53,258 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:54,260 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:55,261 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:56,262 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:57,264 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:58,265 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:13:58,266 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:14:04,268 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:14:05,270 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:14:06,271 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:14:07,272 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:14:08,273 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:14:09,275 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:14:10,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:14:11,277 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:14:12,278 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:14:13,280 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:14:14,281 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:14:15,282 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:14:16,284 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:14:17,285 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:14:18,286 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:14:19,287 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:14:19,843 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:14:19,844 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:14:19,846 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 18:14:20,289 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:14:21,290 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:14:22,291 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:14:23,292 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:14:24,294 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:14:44,315 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); maxRetries=45
2025-07-07 18:14:49,396 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:14:52,468 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:14:55,540 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:14:58,612 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:15:01,684 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:15:04,756 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:15:07,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:15:10,900 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:15:13,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:15:17,044 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:15:19,856 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 18:15:19,857 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 18:15:19,858 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 18:15:20,116 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:15:23,188 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:15:26,260 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:15:29,332 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:15:32,404 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:15:35,476 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:15:38,548 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:15:41,620 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:15:44,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:15:47,764 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:15:50,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:15:53,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:15:56,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:16:00,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:16:01,053 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:16:02,055 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:16:03,056 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:16:04,057 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:16:05,059 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:16:05,060 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 18:16:11,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:16:12,064 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:16:13,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:16:14,067 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:16:15,068 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:16:16,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:16:17,071 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:16:18,072 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:16:19,073 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:16:19,842 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at 
com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at 
org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:16:19,843 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at 
com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at 
org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:16:19,844 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at 
org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more
2025-07-07 18:16:20,075 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:21,076 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:22,077 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:23,079 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:24,080 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:25,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:26,082 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:27,084 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:28,085 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:29,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:30,088 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:31,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:32,090 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:33,092 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:34,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:35,094 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:36,095 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:37,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:38,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:39,099 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:40,101 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:41,102 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:42,103 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:43,105 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:44,106 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:45,107 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:46,109 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:47,110 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:48,111 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:49,113 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:50,114 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:51,116 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:52,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:53,119 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:54,120 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:55,121 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:56,123 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:57,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:58,125 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:16:59,126 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:17:00,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:17:00,129 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:17:06,131 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:17:07,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:17:08,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:17:09,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:17:10,137 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:17:11,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:17:12,139 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:17:13,141 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:17:14,142 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:17:15,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:17:16,145 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:17:17,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:17:18,148 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:19,149 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:19,855 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at 
com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at 
org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:17:19,857 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at 
com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at 
org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:17:19,858 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at 
org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 18:17:20,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:21,152 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:22,153 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:23,155 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:24,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:25,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:26,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:27,160 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:28,161 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:29,163 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:30,164 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:31,165 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:32,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:33,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:34,169 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:35,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:36,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:37,173 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:38,175 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:39,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:40,177 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:41,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:42,180 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:43,181 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:44,183 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:45,184 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:46,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:47,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:48,188 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:49,190 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:50,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:51,193 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:52,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:53,196 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:54,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:55,198 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:17:55,200 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 18:18:01,202 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:18:02,203 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:18:03,205 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:18:04,206 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:18:05,207 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:18:06,209 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:18:07,210 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:18:08,211 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:18:09,213 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:18:10,214 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:18:11,215 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:18:12,217 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:18:13,218 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:18:14,220 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:18:15,221 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:18:16,222 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:18:17,224 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:18:18,225 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:19,227 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:19,843 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:18:19,844 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:18:19,845 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 18:18:20,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:21,230 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:22,231 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:23,232 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:24,234 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:25,235 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:26,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:27,238 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:28,239 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:29,241 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:30,242 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:31,243 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:32,245 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:33,246 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:34,247 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:35,249 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:36,250 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:37,251 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:38,253 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:39,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:40,255 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:41,257 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:42,258 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:43,260 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:44,261 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:45,262 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:46,263 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:47,265 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:48,266 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:49,268 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:50,269 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:50,271 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:18:56,273 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:57,274 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:58,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:18:59,277 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:19:00,279 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:19:01,280 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:19:02,281 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:19:03,283 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:19:04,284 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:19:05,285 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:19:06,287 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:19:07,288 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:19:08,289 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:19:09,291 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:19:10,292 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:19:11,293 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:19:12,294 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:19:13,296 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:19:14,297 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:19:15,298 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:19:16,300 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:19:17,301 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:19:18,302 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:19:19,304 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:19:19,856 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:19:19,857 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at 
com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at 
org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:19:19,858 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at 
org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 18:19:20,305 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:21,306 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:22,308 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:23,309 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:24,310 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:25,312 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:26,313 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:27,314 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:28,316 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:29,317 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:30,318 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:31,320 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:32,321 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:33,322 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:34,324 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:35,325 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:36,326 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:37,328 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:38,329 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:39,330 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:40,332 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:41,333 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:42,334 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:43,336 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:44,337 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:45,338 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:45,340 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 18:19:51,342 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:52,344 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:53,345 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:54,346 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:55,348 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:56,349 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:57,350 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:58,352 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:19:59,353 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:20:00,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:20:01,356 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:20:02,357 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:20:03,358 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:20:04,360 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:20:05,361 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:20:06,362 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:20:07,364 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:20:08,365 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:20:09,366 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:20:10,368 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:20:11,369 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:20:12,370 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:20:13,372 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:20:14,373 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:20:15,374 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:20:16,376 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:20:17,377 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:20:18,379 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:19,380 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:19,862 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:20:19,863 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:20:19,865 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 18:20:20,381 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:21,383 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:22,384 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:23,386 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:24,387 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:25,388 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:26,390 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:27,391 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:28,392 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:29,394 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:30,395 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:31,396 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:32,397 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:33,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:34,400 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:35,401 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:36,403 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:37,404 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:38,405 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:39,407 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:40,408 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:40,410 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:20:46,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:47,414 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:48,415 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:49,416 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:50,418 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:51,419 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:52,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:53,422 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:54,423 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:55,425 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:56,426 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:57,427 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:58,429 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:20:59,430 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:21:00,432 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:21:01,433 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:21:02,434 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:21:03,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:21:04,437 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:21:05,438 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:21:06,440 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:21:07,441 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:21:08,442 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:21:09,444 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:21:10,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:21:11,446 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:21:12,448 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:21:13,449 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:21:14,450 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:21:15,452 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:21:16,453 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:21:17,454 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:21:18,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:21:19,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:21:19,855 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:21:19,856 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at 
com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at 
org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:21:19,857 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at 
org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at 
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 18:21:20,458 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:21,460 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:22,461 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:23,462 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:24,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:25,465 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:26,467 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:27,468 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:28,469 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:29,471 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:30,472 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:31,473 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:32,475 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:33,476 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:34,477 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:35,479 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:35,481 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 18:21:41,483 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:42,484 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:43,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:44,487 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:45,488 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:46,490 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:47,491 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:48,493 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:49,494 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:50,495 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:51,497 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:52,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:53,499 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:54,501 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:55,502 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:56,503 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:57,505 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:58,506 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:21:59,507 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:22:00,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:22:01,510 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:22:02,511 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:22:03,513 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:22:04,514 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:22:05,515 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:22:06,517 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:22:07,518 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:22:08,520 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:22:09,521 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:22:10,522 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:22:11,524 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:22:12,525 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:22:13,526 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:22:14,528 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:22:15,529 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:22:16,530 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:22:17,531 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:22:18,533 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:22:19,534 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:19,842 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:22:19,843 WARN
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:22:19,843 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 18:22:20,535 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:21,537 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:22,538 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:23,540 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:24,541 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:25,542 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:26,544 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:27,545 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:28,546 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:29,548 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:30,549 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:30,552 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:22:36,554 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:37,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:38,556 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:39,558 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:40,559 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:41,560 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:42,562 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:43,563 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:44,564 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:45,566 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:46,567 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:47,568 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:48,570 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:49,571 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:50,572 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:51,574 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:52,575 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:53,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:54,578 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:55,579 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:56,580 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:57,582 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:58,583 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:22:59,584 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:23:00,585 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:23:01,587 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:23:02,588 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:23:03,590 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:23:04,591 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:23:05,592 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:23:06,593 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:23:07,595 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:23:08,596 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:23:09,597 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:23:10,599 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:23:11,600 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:23:12,601 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:23:13,603 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:23:14,604 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:23:15,605 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:23:16,606 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:23:17,608 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:23:18,609 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:23:19,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:23:19,858 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:23:19,859 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:23:19,860 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 18:23:20,612 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:21,613 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:22,614 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:23,616 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:24,617 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:25,618 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:25,621 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 18:23:31,622 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:32,624 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:33,625 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:34,626 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:35,628 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:36,629 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:37,630 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:38,631 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:39,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:40,634 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:41,636 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:42,637 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:43,638 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:44,639 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:45,641 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:46,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:47,643 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:48,644 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:49,646 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:50,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:51,649 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:52,650 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:53,651 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:54,653 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:55,654 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:56,655 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:57,657 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:58,658 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:23:59,659 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:24:00,661 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:24:01,662 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:24:02,663 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:24:03,665 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:24:04,666 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:24:05,667 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:24:06,668 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:24:07,670 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:24:08,671 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:24:09,672 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:24:10,674 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:24:11,675 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:24:12,676 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:24:13,678 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:24:14,679 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:24:15,680 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:24:16,682 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:24:17,683 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:24:18,684 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:24:19,686 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:24:19,845 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:24:19,846 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:24:19,848 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 18:24:20,687 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:20,689 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:24:26,690 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:27,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:28,693 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:29,694 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:30,696 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:31,697 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:32,698 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:33,700 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:34,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:35,702 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:36,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:37,705 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:38,706 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:39,707 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:40,709 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:41,710 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:42,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:43,713 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:44,714 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:45,715 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:46,717 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:47,718 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:48,719 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:49,721 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:50,722 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:51,724 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:52,725 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:53,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:54,728 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:55,729 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:56,731 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:57,732 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:58,733 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:24:59,735 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:00,736 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:01,737 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:02,739 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:03,740 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:04,741 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:05,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:06,744 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:07,746 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:08,747 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:09,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:10,749 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:11,751 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:12,752 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:13,754 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:14,755 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:15,756 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:15,758 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:25:19,859 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:25:19,860 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:25:19,861 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at 
org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 18:25:21,760 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:22,762 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:23,763 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:24,764 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:25,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:26,767 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:27,768 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:28,769 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:29,771 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:30,772 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:31,773 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:32,775 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:33,776 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:34,777 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:35,779 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:36,780 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:37,781 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:38,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:39,784 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:40,786 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:41,787 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:42,788 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:43,790 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:44,791 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:45,792 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:46,794 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:47,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:48,796 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:49,798 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:50,799 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:51,801 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:52,802 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:53,803 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:54,805 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:55,806 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:56,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:57,809 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:58,810 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:25:59,811 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:26:00,813 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:26:01,814 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:26:02,815 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:26:03,817 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:26:04,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:26:05,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:26:06,821 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:26:07,822 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:26:08,823 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:26:09,825 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:26:10,826 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:26:10,828 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:26:16,831 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:26:17,832 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:26:18,833 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:26:19,835 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:26:19,848 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:26:19,849 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:26:19,850 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at
org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at 
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 18:26:20,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:21,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:22,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:23,840 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:24,841 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:25,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:26,844 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:27,846 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:28,847 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:29,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:30,850 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:31,851 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:32,852 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:33,853 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:34,855 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:35,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:36,857 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:37,859 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:38,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:39,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:40,863 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:41,864 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:42,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:43,867 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:44,868 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:45,870 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:46,871 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:47,873 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:48,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:49,875 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:50,877 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:51,879 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:52,880 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:53,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:54,883 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:55,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:56,885 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:57,887 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:58,888 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:26:59,889 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:27:00,891 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:27:01,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:27:02,893 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:27:03,895 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:27:04,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:27:05,897 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:27:05,900 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 18:27:11,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:27:12,903 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:27:13,904 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:27:14,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:27:15,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:27:16,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:27:17,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:27:18,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:27:19,858 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:27:19,859 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:27:19,860 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 18:27:19,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:20,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:21,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:22,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:23,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:24,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:25,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:26,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:27,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:28,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:29,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:30,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:31,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:32,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:33,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:34,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:35,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:36,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:37,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:38,938 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:39,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:40,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:41,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:42,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:43,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:44,946 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:45,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:46,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:47,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:48,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:49,953 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:50,955 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:51,957 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:52,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:53,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:54,961 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:55,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:56,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:57,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:58,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:27:59,967 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:28:00,968 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:28:00,970 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:28:06,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:28:07,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:28:08,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:28:09,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:28:10,977 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:28:11,979 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:28:12,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:28:13,982 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:28:14,983 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:28:15,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:28:16,985 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:28:17,987 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:28:18,988 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:28:19,845 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:28:19,846 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:28:19,847 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 18:28:19,990 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:20,991 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:21,992 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:22,994 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:23,995 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:24,996 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:25,998 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:26,999 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:28,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:29,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:30,003 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:31,005 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:32,006 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:33,007 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:34,009 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:35,010 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:36,011 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:37,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:38,014 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:39,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:40,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:41,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:42,019 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:43,021 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:44,022 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:45,023 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:46,024 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:47,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:48,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:49,028 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:50,030 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:51,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:52,033 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:53,035 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:54,036 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:55,037 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:56,038 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:28:56,040 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 18:29:02,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:29:03,043 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:29:04,045 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:29:05,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:29:06,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:29:07,049 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:29:08,050 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:29:09,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:29:10,053 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:29:11,054 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:29:12,056 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:29:13,057 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:29:14,058 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:29:15,060 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:29:16,061 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:29:17,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:29:18,064 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:29:19,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:29:19,847 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:29:19,848 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:29:19,849 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 18:29:20,066 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:21,068 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:22,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:23,071 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:24,072 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:25,073 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:26,075 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:27,076 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:28,077 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:29,079 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:30,080 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:31,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:32,083 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:33,084 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:34,085 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:35,087 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:36,088 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:37,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:38,091 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:39,092 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:40,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:41,095 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:42,096 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:43,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:44,099 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:45,100 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:46,101 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:47,103 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:48,104 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:49,106 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:50,107 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:51,109 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:51,111 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:29:57,112 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:58,114 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:29:59,115 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:00,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:01,118 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:02,119 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:03,120 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:04,122 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:05,123 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:06,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:07,126 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:08,127 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:09,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:10,130 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:11,131 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:12,132 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:13,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:14,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:15,136 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:16,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:17,139 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:18,140 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:19,142 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:19,863 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:30:19,864 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:30:19,865 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 18:30:20,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:21,144 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:22,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:23,147 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:24,149 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:25,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:26,151 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:27,153 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:28,154 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:29,155 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:30,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:31,158 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:32,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:30:33,160 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:30:34,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:30:35,163 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:30:36,164 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:30:37,166 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:30:38,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:30:39,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:30:40,170 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:30:41,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:30:42,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:30:43,174 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:30:44,175 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:30:45,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:30:46,178 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:30:46,180 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 18:30:52,182 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:30:53,183 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:30:54,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:30:55,186 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:30:56,188 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:30:57,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:30:58,190 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:30:59,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:00,193 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:01,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:02,195 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:03,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:04,198 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:05,199 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:06,201 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:07,202 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:08,203 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:09,205 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:10,206 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:11,207 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:12,209 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:13,210 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:14,211 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:15,212 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:16,214 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:17,215 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:18,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:19,217 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:19,847 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:31:19,848 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:31:19,850 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 18:31:20,219 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:21,220 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:22,222 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:23,223 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:24,224 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:25,226 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:26,227 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:27,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:28,230 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:29,231 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:30,232 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:31,234 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:32,235 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:33,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:34,237 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:35,239 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:36,240 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:37,241 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:38,243 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:39,244 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:40,245 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:41,247 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:41,249 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 18:31:47,251 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:48,252 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:49,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:50,255 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:51,256 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:52,258 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:53,259 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:54,260 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:55,262 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:56,263 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:57,264 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:58,265 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:31:59,266 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:00,268 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:01,269 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:02,270 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:03,271 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:04,273 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:05,274 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:06,275 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:07,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:08,278 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:09,279 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:10,280 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:11,282 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:12,283 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:13,284 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:14,285 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:15,287 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:16,288 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:17,289 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:18,290 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:19,292 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:19,863 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:32:19,864 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:32:19,865 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 18:32:20,293 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:21,294 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:22,296 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:23,297 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:24,298 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:25,299 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:26,301 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:27,302 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:28,303 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:29,305 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:30,306 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:31,307 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:32,308 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:33,310 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:34,311 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:35,312 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:36,313 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:36,315 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 18:32:42,317 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:43,318 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:44,320 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:45,321 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:46,322 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:47,323 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:48,325 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:49,326 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:50,327 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:51,329 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:52,330 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:53,331 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:54,333 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:55,334 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:56,335 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:57,336 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:58,338 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:32:59,339 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:33:00,340 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:33:01,342 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:33:02,343 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:33:03,344 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:33:04,345 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:33:05,347 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:33:06,348 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:33:07,349 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:33:08,351 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:33:09,352 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:33:10,353 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:33:11,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:33:12,355 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:33:13,357 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:33:14,358 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:33:15,359 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:33:16,361 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:33:17,362 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:33:18,363 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:33:19,364 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:33:19,850 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:33:19,851 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:33:19,852 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        ... 50 more
2025-07-07 18:33:20,366 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:21,367 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:22,368 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:23,370 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:24,371 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:25,372 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:26,374 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:27,375 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:28,376 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:29,378 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:30,379 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:31,380 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:31,382 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:33:37,384 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:38,385 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:39,387 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:40,388 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:41,389 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:42,391 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:43,392 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:44,393 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:45,395 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:46,396 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:47,397 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:48,398 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:49,400 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:50,401 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:51,402 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:52,404 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:53,405 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:54,406 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:55,408 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:56,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:57,410 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:58,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:33:59,413 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:00,414 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:01,416 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:02,417 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:03,418 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:04,419 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:05,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:06,422 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:07,423 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:08,425 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:09,426 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:10,427 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:11,428 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:12,430 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:13,431 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:14,432 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:15,433 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:16,435 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:17,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:18,437 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:19,439 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:19,860 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
        at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:34:19,861 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
        at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
        at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:34:19,864 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
        at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
        at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        ... 50 more
2025-07-07 18:34:20,440 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:21,441 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:22,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:23,444 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:24,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:25,447 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:26,448 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:26,450 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:34:32,452 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:33,454 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:34,455 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:35,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:36,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:37,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:38,460 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:39,461 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:40,462 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:41,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:42,465 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:43,466 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:44,467 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:45,469 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:46,470 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:47,471 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:48,473 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:49,474 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:50,475 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:51,476 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:52,478 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:53,479 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:54,481 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:55,482 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:56,483 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:57,485 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:58,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:34:59,487 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:00,488 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:01,490 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:02,491 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:03,492 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:04,494 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:05,495 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:06,496 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:07,497 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:08,499 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:09,500 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:10,501 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:11,503 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:12,504 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:13,505 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:14,507 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:15,508 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:16,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:17,511 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:18,512 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:19,513 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:19,849 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:35:19,850 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:35:19,851 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 18:35:20,515 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:21,516 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:21,517 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 18:35:27,519 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:28,521 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:29,522 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:30,523 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:31,525 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:32,526 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:33,527 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:34,529 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:35,530 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:36,531 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:37,533 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:35:38,534 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:35:39,535 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:35:40,537 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:35:41,538 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:35:42,539 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:35:43,541 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:35:44,542 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:35:45,543 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:35:46,544 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:35:47,546 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:35:48,547 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:35:49,548 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:35:50,550 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:35:51,551 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:35:52,553 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:35:53,554 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:35:54,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:35:55,557 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:35:56,558 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:35:57,559 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:35:58,561 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:35:59,562 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:00,563 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:01,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:02,566 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:03,567 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:04,569 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:05,570 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:06,571 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:07,573 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:08,574 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:09,575 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:10,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:11,578 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:12,579 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:13,580 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:14,581 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:15,583 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:16,584 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:16,586 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:36:19,853 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:36:19,854 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:36:19,855 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 45 more
2025-07-07 18:36:22,588 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:23,589 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:24,591 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:25,592 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:26,593 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:27,595 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:28,596 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:29,597 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:30,599 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:31,600 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:32,601 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:33,602 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:34,604 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:35,605 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:36,606 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:37,608 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:38,609 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:39,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:40,611 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:41,613 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:42,614 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:43,615 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:44,617 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:45,618 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:46,619 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:47,620 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:48,622 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:49,623 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:50,624 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:51,626 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:52,627 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:53,628 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:54,630 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:55,631 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:56,632 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:57,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:58,635 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:36:59,636 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:37:00,637 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:37:01,629 WARN com.cloudera.cmf.event.publish.EventStorePublisherWithRetry: Failed to publish event: SimpleEvent{attributes={ROLE=[hdfs-DATANODE-b30a464b10a57fdd49ea734cd52a8291], HOSTS=[dmidlkprdls04.svr.luc.edu], ROLE_TYPE=[DATANODE], CATEGORY=[LOG_MESSAGE], EVENTCODE=[EV_LOG_EVENT], SERVICE=[hdfs], SERVICE_TYPE=[HDFS], LOG_LEVEL=[WARN], HOST_IDS=[5c33df90-d247-4c6d-b9e0-5908a423580a], SEVERITY=[IMPORTANT]}, content=Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret, timestamp=1751929601595} - 1 of 60 failure(s) in last 1819s
java.io.IOException: Error connecting to dmidlkprdls01.svr.luc.edu/192.168.158.1:7184
	at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.getChannel(NettyTransceiver.java:269)
	at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.<init>(NettyTransceiver.java:197)
	at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.<init>(NettyTransceiver.java:120)
	at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.checkSpecificRequestor(AvroEventStorePublishProxy.java:120)
	at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.publishEvent(AvroEventStorePublishProxy.java:204)
	at com.cloudera.cmf.event.publish.EventStorePublisherWithRetry$PublishEventTask.run(EventStorePublisherWithRetry.java:242)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)
Caused by: com.cloudera.cmf.event.shaded.io.netty.channel.AbstractChannel$AnnotatedNoRouteToHostException: No route to host: dmidlkprdls01.svr.luc.edu/192.168.158.1:7184
Caused by: java.net.NoRouteToHostException: No route to host
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
	at com.cloudera.cmf.event.shaded.io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:337)
	at com.cloudera.cmf.event.shaded.io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:339)
	at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:776)
	at
com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650) at com.cloudera.cmf.event.shaded.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562) at com.cloudera.cmf.event.shaded.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997) at com.cloudera.cmf.event.shaded.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:37:01,639 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:02,640 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:03,641 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:04,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:05,643 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:06,645 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:07,646 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:08,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:09,649 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:10,650 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:11,651 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:11,653 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 18:37:17,655 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:18,656 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:19,657 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:19,867 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at 
com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at 
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:37:19,868 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at 
com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at 
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:37:19,869 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at 
org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at 
org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 18:37:20,659 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:21,660 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:22,661 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:23,663 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:24,664 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:25,666 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:26,667 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:27,668 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:28,669 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:29,671 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:30,672 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:31,673 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:32,674 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:33,676 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:34,677 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:35,678 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:36,679 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:37,681 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:37:38,682 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:37:39,683 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:37:40,685 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:37:41,686 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:37:42,687 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:37:43,688 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:37:44,690 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:37:45,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:37:46,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:37:47,693 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:37:48,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:37:49,696 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:37:50,697 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:37:51,699 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:37:52,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:37:53,702 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:37:54,703 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:37:55,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:37:56,706 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:37:57,707 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:37:58,708 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:37:59,710 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:00,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:01,712 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:02,713 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:03,715 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:04,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:05,717 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:06,719 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:06,720 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:38:12,722 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:13,723 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:14,724 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:15,726 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:16,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:17,728 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:18,729 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:19,731 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:19,853 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:38:19,854 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:38:19,854 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 18:38:20,732 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:21,733 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:22,735 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:23,736 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:24,738 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:25,739 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:26,740 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:27,742 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:28,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:29,744 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:30,746 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:31,747 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:32,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:33,749 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:34,751 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:35,752 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:36,753 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:37,755 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:38,756 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:39,757 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:40,759 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:41,760 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:42,761 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:43,763 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:44,764 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:45,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:46,766 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:47,768 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:48,769 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:49,770 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:50,771 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:51,773 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:52,775 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:53,776 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:54,778 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:55,779 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:56,780 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:57,781 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:58,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:38:59,784 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:39:00,785 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:39:01,787 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:39:01,788 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:39:07,790 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:39:08,792 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:39:09,793 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:39:10,794 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:39:11,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:39:12,797 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:13,798 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:14,800 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:15,801 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:16,802 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:17,803 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:18,805 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:19,806 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:19,851 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:39:19,852 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:39:19,853 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 18:39:20,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:21,808 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:22,810 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:23,811 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:24,813 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:25,814 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:26,815 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:27,816 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:28,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:29,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:30,820 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:31,821 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:32,823 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:33,824 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:34,826 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:35,827 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:36,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:37,830 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:38,831 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:39,832 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:40,834 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:41,835 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:42,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:43,838 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:44,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:45,840 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:46,842 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:47,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:48,844 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:49,845 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:50,847 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:51,849 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:52,850 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:53,852 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:54,853 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:55,854 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:56,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:39:56,857 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 18:40:02,859 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:03,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:04,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:05,863 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:06,864 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:07,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:08,867 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:09,868 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:10,870 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:11,871 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:12,872 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:13,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:14,875 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:15,876 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:16,878 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:17,879 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:18,880 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:19,864 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:40:19,865 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at 
org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at 
org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:40:19,866 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at 
javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at 
org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at 
java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 45 more 2025-07-07 18:40:19,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:20,883 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:21,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:22,885 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:23,887 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:24,888 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:25,889 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:26,890 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:27,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:28,893 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:29,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:30,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:31,897 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:32,898 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:33,900 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:34,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:35,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:36,904 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:37,905 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:38,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:39,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:40,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:41,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:42,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:43,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:44,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:45,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:46,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:47,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:48,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:49,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:50,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:51,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:51,925 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 18:40:57,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:58,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:40:59,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:00,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:01,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:02,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:03,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:04,936 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:05,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:06,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:07,940 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:08,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:09,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:10,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:11,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:12,946 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:13,948 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:14,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:15,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:16,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:17,953 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:18,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:19,852 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at 
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at 
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:41:19,853 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at 
com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at 
org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:41:19,854 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at 
org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at 
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 18:41:19,955 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:20,957 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:21,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:22,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:23,961 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:24,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:25,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:26,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:27,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:28,967 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:29,968 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:30,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:31,971 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:32,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:33,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:34,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:35,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:36,977 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:37,979 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:38,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:39,981 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:40,983 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:41,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:42,985 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:43,986 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:44,988 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:45,989 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:46,990 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:46,993 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 18:41:52,995 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:53,996 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:54,997 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:55,999 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:57,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:41:58,001 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:41:59,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:00,004 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:01,005 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:02,006 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:03,008 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:04,009 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:05,010 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:06,012 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:07,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:08,014 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:09,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:10,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:11,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:12,019 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:13,021 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:14,022 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:15,023 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:16,025 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:17,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:18,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:19,029 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:19,865 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:42:19,866 WARN
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:42:19,867 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 18:42:20,030 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:21,031 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:22,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:23,034 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:24,035 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:25,036 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:26,037 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:27,039 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:28,040 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:29,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:30,043 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:31,044 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:32,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:33,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:34,048 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:35,049 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:36,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:37,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:38,053 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:39,054 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:40,056 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:41,057 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:42,058 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:42,060 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:42:48,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:49,064 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:50,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:51,066 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:52,067 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:53,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:54,070 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:55,072 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:56,073 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:57,074 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:58,076 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:42:59,077 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:43:00,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:43:01,080 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:43:02,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:43:03,082 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:43:04,084 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:43:05,085 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:43:06,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:43:07,087 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:43:08,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:43:09,090 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:43:10,091 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:43:11,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:43:12,094 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:43:13,095 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:43:14,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:43:15,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:43:16,099 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:43:17,115 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:43:18,116 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:43:19,118 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:19,854 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:43:19,855 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:43:19,856 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 18:43:20,119 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:21,120 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:22,122 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:23,123 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:24,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:25,126 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:26,127 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:27,129 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:28,130 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:29,131 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:30,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:31,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:32,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:33,136 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:34,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:35,139 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:36,140 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:37,142 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:37,144 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 18:43:43,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:44,147 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:45,148 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:46,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:47,151 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:48,152 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:49,153 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:50,155 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:51,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:52,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:53,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:54,160 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:55,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:56,163 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:57,164 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:58,166 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:43:59,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:44:00,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:44:01,170 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:44:02,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:44:03,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:44:04,174 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:44:05,175 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:44:06,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:07,178 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:08,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:09,180 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:10,181 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:11,183 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:12,184 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:13,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:14,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:15,188 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:16,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:17,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:18,192 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:19,193 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:19,854 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:44:19,855 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:44:19,858 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 18:44:20,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:21,196 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:22,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:23,198 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:24,199 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:25,201 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:26,202 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:27,203 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:28,204 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:29,206 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:30,207 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:31,208 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:32,210 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:32,212 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:44:38,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:39,217 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:40,218 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:41,220 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:42,221 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:43,222 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:44,224 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:45,225 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:46,226 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:47,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:48,229 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:49,230 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:50,231 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:51,233 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:52,234 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:53,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:54,237 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:55,238 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:56,240 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:57,241 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:58,243 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:44:59,244 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:45:00,245 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:45:01,246 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:45:02,248 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:45:03,249 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:45:04,250 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:45:05,252 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:45:06,253 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:45:07,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:45:08,256 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:45:09,257 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:45:10,258 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:45:11,260 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:45:12,261 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:45:13,262 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:45:14,264 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:45:15,265 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:45:16,266 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:45:17,267 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:45:18,269 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:45:19,270 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:45:19,866 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:45:19,867 WARN
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:45:19,868 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 18:45:20,271 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:21,272 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:22,274 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:23,275 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:24,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:25,278 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:26,279 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:27,280 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:27,283 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 18:45:33,285 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:34,286 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:35,287 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:36,289 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:37,290 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:38,291 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:39,292 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:40,293 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:41,295 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:42,296 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:43,297 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:44,298 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:45,300 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:46,301 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:47,302 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:48,303 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:49,305 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:50,306 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:51,307 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:52,308 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:53,310 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:54,311 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:55,312 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:56,313 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:57,315 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:58,316 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:45:59,317 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:46:00,319 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:46:01,320 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:46:02,321 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:46:03,322 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:46:04,323 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:46:05,325 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:46:06,326 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:46:07,327 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:46:08,329 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:46:09,330 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:46:10,331 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:46:11,332 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:46:12,334 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:46:13,335 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:46:14,337 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:46:15,338 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:46:16,339 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:46:17,340 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:46:18,341 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:46:19,343 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:19,852 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
        at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:46:19,853 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
        at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
        at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:46:19,854 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
        at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
        at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        ... 50 more
2025-07-07 18:46:20,344 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:21,345 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:22,346 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:22,348 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:46:28,350 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:29,351 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:30,352 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:31,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:32,355 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:33,356 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:34,357 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:35,359 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:36,360 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:37,361 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:38,362 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:39,364 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:40,365 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:41,366 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:42,368 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:43,369 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:44,370 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:45,371 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:46,373 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:47,374 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:48,375 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:49,377 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:50,378 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:51,380 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:52,381 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:53,382 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:54,384 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:55,385 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:56,386 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:57,387 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:58,389 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:46:59,390 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:47:00,391 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:47:01,392 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:47:02,394 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:47:03,395 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:47:04,396 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:47:05,397 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:47:06,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:47:07,400 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:47:08,401 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:47:09,402 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:47:10,404 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:47:11,405 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:47:12,406 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:47:13,408 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:47:14,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:47:15,410 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:47:16,411 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:47:17,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:47:17,414 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:47:19,858 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
        at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:47:19,859 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
        at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
        at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at 
java.lang.Thread.run(Thread.java:750) 2025-07-07 18:47:19,860 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at 
com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 18:47:23,416 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:24,417 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:25,419 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:26,420 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:27,421 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:28,422 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:29,424 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:30,425 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:31,426 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:32,427 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:33,429 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:34,430 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:35,431 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:36,432 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:37,434 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:38,435 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:39,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:40,437 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:41,438 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:42,440 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:43,441 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:44,442 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:45,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:46,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:47,446 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:48,447 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:49,448 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:50,450 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:51,451 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:52,452 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:53,454 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:54,455 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:55,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:56,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:57,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:58,460 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:47:59,461 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:00,462 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:01,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:02,465 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:03,466 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:04,467 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:05,468 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:06,470 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:07,471 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:08,472 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:09,473 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:10,475 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:11,476 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:12,477 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:12,479 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 18:48:18,481 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:19,482 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:19,872 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:48:19,873 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at 
org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at 
org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:48:19,874 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at 
javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at 
org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at 
java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 45 more 2025-07-07 18:48:20,484 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:21,485 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:22,486 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:23,488 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:24,489 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:25,490 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:26,492 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:27,493 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:28,494 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:29,495 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:30,497 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:31,498 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:32,499 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:33,501 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:34,502 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:35,503 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:36,505 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:37,506 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:38,507 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:39,509 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:40,510 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:41,511 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:42,512 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:43,514 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:44,515 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:45,516 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:46,518 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:47,519 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:48,520 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:49,521 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:50,523 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:51,525 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:52,526 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:53,528 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:54,529 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:55,530 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:56,532 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:57,533 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:58,534 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:48:59,535 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:49:00,537 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:49:01,538 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:49:02,539 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:49:03,541 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:49:04,542 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:49:05,543 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:49:06,544 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:49:07,546 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:49:07,547 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 18:49:13,549 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:49:14,551 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:49:15,552 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:49:16,553 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:49:17,555 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:49:18,556 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:49:19,557 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:49:19,859 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at 
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at 
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:49:19,860 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
        at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
        at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:49:19,861 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
        at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
        at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        ... 50 more
2025-07-07 18:49:20,559 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:21,560 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:22,561 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:23,563 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:24,564 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:25,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:26,566 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:27,568 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:28,569 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:29,570 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:30,572 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:31,573 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:32,574 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:33,576 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:34,577 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:35,578 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:36,580 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:37,581 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:38,582 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:39,583 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:40,585 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:41,586 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:42,587 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:43,588 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:44,590 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:45,591 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:46,592 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:47,594 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:48,595 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:49,596 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:50,598 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:51,599 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:52,601 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:53,602 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:54,603 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:55,605 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:56,606 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:57,607 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:58,608 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:49:59,610 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:50:00,611 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:50:01,612 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:50:02,614 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:50:02,615 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:50:08,617 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:50:09,618 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:50:10,620 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:50:11,622 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:50:12,624 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:50:13,625 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:50:14,626 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:50:15,628 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:50:16,629 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:50:17,630 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:50:18,631 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:50:19,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:50:19,870 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
        at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:50:19,870 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
        at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
        at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
        at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
        at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
        at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
        at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
        at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
        at org.eclipse.jetty.server.Server.handle(Server.java:516)
        at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
        at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
        at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
        at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:50:19,871 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
        at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
        at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
        at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
        at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
        at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
        at
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 18:50:20,634 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:21,635 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:22,637 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:23,638 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:24,639 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:25,641 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:26,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:27,643 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:28,644 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:29,645 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:30,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:31,648 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:32,649 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:33,651 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:34,652 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:35,653 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:36,655 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:37,656 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:38,657 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:39,658 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:40,660 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:41,661 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:42,662 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:43,663 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:44,665 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:45,666 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:46,667 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:47,669 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:48,670 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:49,671 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:50,673 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:51,674 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:52,676 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:53,677 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:54,678 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:55,680 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:56,681 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:57,682 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:50:57,684 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 18:51:03,686 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:51:04,687 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:51:05,688 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:51:06,689 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:51:07,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:51:08,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:51:09,693 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:51:10,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:51:11,696 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:51:12,697 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:51:13,699 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:51:14,700 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:51:15,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:51:16,703 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:51:17,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:51:18,705 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:51:19,706 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:51:19,858 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:51:19,859 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:51:19,860 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 18:51:20,708 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:21,709 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:22,710 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:23,712 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:24,713 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:25,714 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:26,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:27,717 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:28,718 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:29,720 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:30,721 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:31,722 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:32,723 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:33,725 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:34,726 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:35,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:36,729 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:37,730 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:38,731 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:39,732 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:40,734 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:41,735 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:42,736 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:43,737 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:44,739 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:45,740 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:46,741 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:47,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:48,744 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:49,745 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:50,746 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:51,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:52,750 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:52,751 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:51:58,753 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:51:59,754 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:00,755 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:01,756 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:02,758 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:03,759 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:04,760 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:05,761 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:06,763 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:07,764 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:08,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:09,766 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:10,768 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:11,769 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:12,770 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:13,771 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:14,773 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:15,774 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:16,775 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:17,777 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:18,778 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:19,779 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:19,856 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:52:19,857 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:52:19,858 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 18:52:20,780 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:52:21,782 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:52:22,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:52:23,784 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:52:24,786 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:52:25,787 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:52:26,788 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:27,789 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:28,791 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:29,792 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:30,793 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:31,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:32,796 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:33,797 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:34,798 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:35,799 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:36,800 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:37,802 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:38,803 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:39,804 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:40,805 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:41,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:42,808 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:43,809 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:44,811 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:45,812 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:46,813 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:47,814 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:47,817 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:52:53,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:54,820 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:55,821 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:56,822 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:57,824 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:58,825 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:52:59,826 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:00,827 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:01,829 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:02,830 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:03,831 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:04,832 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:05,833 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:06,834 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:07,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:08,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:09,838 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:10,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:11,840 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:12,841 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:13,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:14,844 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:15,845 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:16,846 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:17,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:18,849 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:19,850 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:19,868 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:53:19,869 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:53:19,870 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 18:53:20,851 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:21,853 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:22,854 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:23,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:24,857 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:25,858 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:26,859 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:27,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:28,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:29,863 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:30,864 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:31,865 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:32,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:33,868 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:34,869 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:35,870 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:36,871 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:37,873 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:38,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:39,875 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:40,876 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:41,877 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:42,878 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:42,881 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:53:48,883 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:49,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:50,885 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:51,887 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:52,888 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:53,889 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:54,890 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:55,891 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:56,893 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:57,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:58,895 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:53:59,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:00,897 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:01,899 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:02,900 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:03,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:04,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:05,903 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:06,905 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:07,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:08,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:09,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:10,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:11,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:12,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:13,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:14,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:15,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:16,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:17,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:18,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:19,856 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:54:19,858 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:54:19,860 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo
threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 18:54:19,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:20,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:21,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:22,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:23,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:24,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:25,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:26,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:27,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:28,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:29,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:30,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:31,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:54:32,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:54:33,938 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:54:34,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:54:35,940 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:54:36,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:54:37,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:54:37,945 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 18:54:43,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:54:44,948 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:54:45,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:54:46,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:54:47,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:54:48,953 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:54:49,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:54:50,955 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:54:51,957 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:54:52,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:54:53,960 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:54:54,961 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:54:55,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:54:56,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:54:57,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:54:58,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:54:59,967 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:55:00,968 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:55:01,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:55:02,971 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:55:03,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:55:04,973 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:55:05,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:55:06,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:55:07,977 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:55:08,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:55:09,979 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:55:10,981 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:55:11,982 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:55:12,983 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:55:13,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:55:14,986 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:55:15,987 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:55:16,988 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:55:17,989 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:55:18,990 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:55:19,860 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:55:19,862 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:55:19,863 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 18:55:19,991 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:20,993 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:21,994 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:22,995 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:23,997 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:24,998 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:25,999 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:27,001 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:28,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:29,003 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:30,004 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:31,005 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:32,006 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:33,008 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:33,010 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:55:39,012 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:40,013 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:41,014 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:42,016 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:43,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:44,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:45,019 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:46,020 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:47,022 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:48,023 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:49,024 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:50,025 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:51,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:52,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:53,029 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:54,030 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:55,031 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:56,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:57,034 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:58,035 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:55:59,036 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:00,037 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:01,039 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:02,040 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:03,041 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:04,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:05,043 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:06,045 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:07,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:08,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:09,048 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:10,049 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:11,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:12,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:13,053 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:14,054 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:15,055 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:16,057 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:17,058 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:18,059 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:19,060 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:19,877 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:56:19,878 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:56:19,879 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 18:56:20,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:21,063 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:22,064 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:23,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:24,067 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:25,068 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:26,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:27,070 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:28,072 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:28,074 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:56:34,076 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:35,077 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:36,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:37,079 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:38,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:39,082 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:40,083 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:41,084 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:42,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:43,087 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:44,088 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:45,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:46,091 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:47,092 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:48,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:49,094 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:50,096 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:51,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:56:52,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:56:53,099 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:56:54,101 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:56:55,102 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:56:56,103 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:56:57,104 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:56:58,106 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:56:59,107 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:57:00,108 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:57:01,109 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:57:02,110 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:57:03,112 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:57:04,113 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:57:05,114 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:57:06,115 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:57:07,116 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:57:08,118 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:57:09,119 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:57:10,120 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:57:11,122 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:57:12,123 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:57:13,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:57:14,125 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:57:15,126 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:57:16,127 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:57:17,129 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:57:18,130 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:57:19,131 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:57:19,859 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at 
org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:57:19,860 WARN 
org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:57:19,861 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo 
threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 18:57:20,132 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:57:21,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:57:22,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:57:23,136 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:57:23,137 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 18:57:29,139 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:57:30,140 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:57:31,141 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:32,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:33,144 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:34,145 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:35,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:36,147 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:37,149 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:38,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:39,151 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:40,152 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:41,153 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:42,155 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:43,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:44,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:45,158 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:46,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:47,160 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:48,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:49,163 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:50,164 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:51,166 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:52,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:53,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:54,170 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:55,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:56,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:57,173 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:58,175 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:57:59,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:00,177 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:01,178 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:02,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:03,181 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:04,182 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:05,183 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:06,184 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:07,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:08,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:09,188 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:10,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:11,190 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:12,192 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:13,193 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:14,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:15,195 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:16,197 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:17,198 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:18,199 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:18,201 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 18:58:19,858 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:58:19,859 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-07 18:58:19,860 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-07 18:58:24,203 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:25,204 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:26,205 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:27,207 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:28,208 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:29,209 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:30,210 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:31,211 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:32,213 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:33,214 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:34,215 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:35,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:36,217 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:37,219 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:38,220 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:39,221 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:40,222 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:41,224 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:42,225 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:43,226 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:44,227 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:45,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:46,230 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:47,231 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:48,232 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:49,234 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:50,235 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:51,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:52,237 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:53,239 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:54,240 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:55,241 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:56,242 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:57,244 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:58,245 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:58:59,246 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:59:00,247 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:59:01,249 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:59:02,250 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:59:03,251 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:59:04,252 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:05,254 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:06,255 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:07,256 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:08,257 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:09,259 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:10,260 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:11,261 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:12,262 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:13,264 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:13,265 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 18:59:19,267 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:19,861 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at 
sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at 
org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at 
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:59:19,862 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 18:59:19,863 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at 
org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at 
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at 
java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 18:59:20,268 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:21,269 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:22,271 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:23,272 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:24,273 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:25,274 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:26,276 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:27,277 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:28,278 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:29,279 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:30,281 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:31,282 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:32,283 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:33,284 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:34,285 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:35,287 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:36,288 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 18:59:37,289 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:59:38,290 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:59:39,291 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:59:40,293 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:59:41,294 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:59:42,295 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:59:43,296 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:59:44,297 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:59:45,299 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:59:46,300 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:59:47,301 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:59:48,302 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:59:49,304 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:59:50,305 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:59:51,307 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:59:52,308 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:59:53,309 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:59:54,311 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:59:55,312 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:59:56,313 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:59:57,314 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:59:58,316 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 18:59:59,317 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:00,318 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:01,319 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:02,320 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:03,322 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:04,323 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:05,324 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:06,325 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:07,326 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:08,328 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:08,329 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 19:00:14,331 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:15,332 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:16,333 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:17,334 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:18,336 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:19,337 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:19,873 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 19:00:19,874 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 19:00:19,875 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-07 19:00:20,338 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:21,339 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:22,340 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:23,342 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:24,343 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:25,344 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:26,346 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:27,347 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:28,348 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:29,349 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:30,350 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:31,352 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:32,353 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:33,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:34,355 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:35,357 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:36,358 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:37,359 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:38,360 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:39,362 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:40,363 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:41,364 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:42,366 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:43,367 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:44,368 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:45,369 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:46,370 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:47,372 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:48,373 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:49,374 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:50,375 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:51,377 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:52,378 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:53,380 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:54,381 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:55,382 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:56,384 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:57,385 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:58,386 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:00:59,387 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:01:00,389 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:01:01,390 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:01:02,391 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:01:03,392 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:01:03,394 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 19:01:09,396 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:01:10,397 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-07 19:01:11,398 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 19:01:12,400 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 19:01:13,401 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 19:01:14,402 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 19:01:15,403 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 19:01:16,405 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 19:01:17,406 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 19:01:18,407 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 19:01:19,408 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 19:01:19,860 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.GeneratedMethodAccessor131.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at 
com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at 
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at 
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 19:01:19,861 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.GeneratedMethodAccessor133.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at 
com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at 
org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-07 19:01:19,862 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at 
org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at 
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.GeneratedMethodAccessor134.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at 
com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 50 more 2025-07-07 19:01:20,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 19:01:21,411 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 19:01:22,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 19:01:23,413 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 19:01:24,415 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 19:01:25,416 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 19:01:26,417 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 19:01:27,418 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 19:01:28,420 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-07 19:01:28,523 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode during handshakeBlock pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-07 19:01:28,526 INFO org.apache.hadoop.hdfs.server.common.Storage: Using 4 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=4, dataDirs=4) 2025-07-07 19:01:28,535 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d1/dfs/dn/in_use.lock acquired by nodename 125810@dmidlkprdls04.svr.luc.edu 2025-07-07 19:01:28,539 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d1/dfs/dn java.io.IOException: Incompatible clusterIDs in /hdfs/d1/dfs/dn: namenode clusterID = cluster13; datanode clusterID = cluster72 at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407) at 
org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387) at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559) at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753) at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865) at java.lang.Thread.run(Thread.java:750) 2025-07-07 19:01:28,540 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d2/dfs/dn/in_use.lock acquired by nodename 125810@dmidlkprdls04.svr.luc.edu 2025-07-07 19:01:28,541 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d2/dfs/dn java.io.IOException: Incompatible clusterIDs in /hdfs/d2/dfs/dn: namenode clusterID = cluster13; datanode clusterID = cluster72 at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407) at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387) at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559) at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753) at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305) at 
org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 19:01:28,542 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d3/dfs/dn/in_use.lock acquired by nodename 125810@dmidlkprdls04.svr.luc.edu
2025-07-07 19:01:28,543 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d3/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d3/dfs/dn: namenode clusterID = cluster13; datanode clusterID = cluster72
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 19:01:28,545 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d4/dfs/dn/in_use.lock acquired by nodename 125810@dmidlkprdls04.svr.luc.edu
2025-07-07 19:01:28,545 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d4/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d4/dfs/dn: namenode clusterID = cluster13; datanode clusterID = cluster72
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 19:01:28,548 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Initialization failed for Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Exiting.
java.io.IOException: All specified directories have failed to load.
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:560)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-07 19:01:28,549 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Ending block pool service for: Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-07 19:01:28,551 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Removed Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12)
2025-07-07 19:01:30,552 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Exiting Datanode
2025-07-07 19:01:30,558 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG: 
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at dmidlkprdls04.svr.luc.edu/192.168.158.4
************************************************************/
2025-07-09 17:38:23,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG: 
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG: host = dmidlkprdls04.svr.luc.edu/192.168.158.4
STARTUP_MSG: args = []
STARTUP_MSG: version = 3.1.1.7.3.1.0-197
STARTUP_MSG: classpath = 
/var/run/cloudera-scm-agent/process/146-hdfs-DATANODE:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/aws-java-sdk-bundle-1.12.720.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-hdfs-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-plugin-classloader-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-yarn-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/azure-data-lake-store-sdk-2.3.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/ha
doop/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-api-2.2.11.jar:/opt/cloud
era/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jline-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr311-a
pi-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jul-to-slf4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/logredactor-2.0.16.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-buffer-4.1.100.Final.jar:/o
pt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-reload4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-api-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-kqueue-4.1.100.Fina
l.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/wildfly-openssl-2.1.4.ClouderaFinal.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3
.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper-jute.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//ozone-filesystem-hadoop3-1.3.0.7.3.1.0-197.
jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-thrift.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-scala_2.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-protobuf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-jackson.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-generator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-format-structures.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-encoding.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-column.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/avro-1.11.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-logging-1.1.3.j
ar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.
60371244/lib/hadoop-hdfs/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j
line-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/json-simple-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/leveldbjni-cldr-1.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/par
cels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nett
y-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-jute-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.
3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//asm-5.0.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjweaver-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-storage-7.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//checker-compat-qual-2.5.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-slf4j-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.
//flogger-system-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//google-extensions-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-keyvault-core-1.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//accessors-smart-2.4.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-
client-jobclient.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming-3.1.1.7.3.1.0-197.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ojalgo-43.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//kafka-clients-2.8.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-core-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-abfs-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//forbiddenapis-3.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-intg-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-api-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//zstd-jni-1.4.9-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-i18n.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-s3-lib-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//javax.activation-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//bundle-2.23.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//json-smart-2.4.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-util-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-shell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-cloud-bindings.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-s3-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjrt-1.9.6.jar:/opt/cloudera/parcels/
CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/swagger-annotations-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/objenesis-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-client-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.activation-api-1.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-dataformat-yaml-2.9.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcutil-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcprov-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcpkix-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/HikariCP-java7-2.4.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/snakeyaml-2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jsonschema2pojo-core-1.0.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/joda-time-2.10.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jna-5.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-guice-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-module-jaxb-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-json-provider-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-base-2.12.7.jar:/opt/clo
udera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-servlet-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/fst-2.50.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/ehcache-3.3.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/dnsjava-2.1.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/codemodel-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371
244/lib/hadoop-yarn/.//hadoop-yarn-server-router.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop
-yarn/.//hadoop-yarn-server-sharedcachemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager-1.0.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-3.1.1.7.3.1.0-197.jar:/opt/cloudera/cm/lib/plugins/event-publish-7.13.1-shaded.jar:/opt/cloudera/cm/lib/plugins/tt-instrumentation-7.13.1.jar
STARTUP_MSG: build = git@github.infra.cloudera.com:CDH/hadoop.git -r 31a42fb39494f541ffae15c3c61185deeeacca86; compiled by 'jenkins' on 2024-12-04T01:09Z
STARTUP_MSG: java = 1.8.0_432
************************************************************/
2025-07-09 17:38:23,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2025-07-09 17:38:23,745 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d1/dfs/dn
2025-07-09 17:38:23,751 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d2/dfs/dn
2025-07-09 17:38:23,752 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d3/dfs/dn
2025-07-09 17:38:23,752 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d4/dfs/dn
2025-07-09 17:38:23,917 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties
2025-07-09 17:38:24,037 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2025-07-09 17:38:24,038 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2025-07-09 17:38:24,489 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-09 17:38:24,517 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576
2025-07-09 17:38:24,524 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled.
2025-07-09 17:38:24,526 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is dmidlkprdls04.svr.luc.edu
2025-07-09 17:38:24,526 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-09 17:38:24,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 4294967296
2025-07-09 17:38:24,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /192.168.158.4:9866
2025-07-09 17:38:24,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-09 17:38:24,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-09 17:38:24,569 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-09 17:38:24,569 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-09 17:38:24,569 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Listening on UNIX domain socket: /var/run/hdfs-sockets/dn
2025-07-09 17:38:24,624 INFO org.eclipse.jetty.util.log: Logging initialized @2632ms to org.eclipse.jetty.util.log.Slf4jLog
2025-07-09 17:38:24,759 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-09 17:38:24,768 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined
2025-07-09 17:38:24,778 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2025-07-09 17:38:24,780 INFO org.apache.hadoop.security.HttpCrossOriginFilterInitializer: CORS filter not enabled. Please set hadoop.http.cross-origin.enabled to 'true' to enable it
2025-07-09 17:38:24,781 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context datanode
2025-07-09 17:38:24,781 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context static
2025-07-09 17:38:24,781 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context logs
2025-07-09 17:38:24,821 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 46307
2025-07-09 17:38:24,823 INFO org.eclipse.jetty.server.Server: jetty-9.4.54.v20240208; built: 2024-02-08T19:42:39.027Z; git: cef3fbd6d736a21e7d541a5db490381d95a2047d; jvm 1.8.0_432-b06
2025-07-09 17:38:24,862 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0
2025-07-09 17:38:24,862 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults
2025-07-09 17:38:24,864 INFO org.eclipse.jetty.server.session: node0 Scavenging every 600000ms
2025-07-09 17:38:24,892 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-09 17:38:24,898 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@302fec27{logs,/logs,file:///var/log/hadoop-hdfs/,AVAILABLE}
2025-07-09 17:38:24,900 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@1b11ef33{static,/static,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/static/,AVAILABLE}
2025-07-09 17:38:25,052 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@1cfd1875{datanode,/,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode/,AVAILABLE}{file:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode}
2025-07-09 17:38:25,074 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@73511076{HTTP/1.1, (http/1.1)}{localhost:46307}
2025-07-09 17:38:25,075 INFO org.eclipse.jetty.server.Server: Started @3083ms
2025-07-09 17:38:25,396 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /192.168.158.4:9864
2025-07-09 17:38:25,407 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor
2025-07-09 17:38:25,408 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hdfs
2025-07-09 17:38:25,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
2025-07-09 17:38:25,483 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
2025-07-09 17:38:25,506 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867
2025-07-09 17:38:25,557 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /192.168.158.4:9867
2025-07-09 17:38:25,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null
2025-07-09 17:38:25,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices:
2025-07-09 17:38:25,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 starting to offer service
2025-07-09 17:38:25,629 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2025-07-09 17:38:25,629 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting
2025-07-09 17:38:26,817 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-09 17:38:27,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-09 17:38:27,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode during handshakeBlock pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-09 17:38:27,967 INFO org.apache.hadoop.hdfs.server.common.Storage: Using 4 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=4, dataDirs=4)
2025-07-09 17:38:27,975 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d1/dfs/dn/in_use.lock acquired by nodename 26523@dmidlkprdls04.svr.luc.edu
2025-07-09 17:38:27,978 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d1/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d1/dfs/dn: namenode clusterID = cluster38; datanode clusterID = cluster72
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-09 17:38:27,983 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d2/dfs/dn/in_use.lock acquired by nodename 26523@dmidlkprdls04.svr.luc.edu
2025-07-09 17:38:27,984 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d2/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d2/dfs/dn: namenode clusterID = cluster38; datanode clusterID = cluster72
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-09 17:38:27,985 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d3/dfs/dn/in_use.lock acquired by nodename 26523@dmidlkprdls04.svr.luc.edu
2025-07-09 17:38:27,986 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d3/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d3/dfs/dn: namenode clusterID = cluster38; datanode clusterID = cluster72
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-09 17:38:27,987 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d4/dfs/dn/in_use.lock acquired by nodename 26523@dmidlkprdls04.svr.luc.edu
2025-07-09 17:38:27,988 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d4/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d4/dfs/dn: namenode clusterID = cluster38; datanode clusterID = cluster72
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-09 17:38:27,992 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Initialization failed for Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Exiting.
java.io.IOException: All specified directories have failed to load.
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:560)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-09 17:38:27,992 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Ending block pool service for: Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-09 17:38:27,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Removed Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12)
2025-07-09 17:38:29,996 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Exiting Datanode
2025-07-09 17:38:30,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at dmidlkprdls04.svr.luc.edu/192.168.158.4
************************************************************/
2025-07-09 17:38:34,369 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG: host = dmidlkprdls04.svr.luc.edu/192.168.158.4
STARTUP_MSG: args = []
STARTUP_MSG: version = 3.1.1.7.3.1.0-197
STARTUP_MSG: classpath =
/var/run/cloudera-scm-agent/process/146-hdfs-DATANODE:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/aws-java-sdk-bundle-1.12.720.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-hdfs-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-plugin-classloader-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-yarn-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/azure-data-lake-store-sdk-2.3.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/ha
doop/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-api-2.2.11.jar:/opt/cloud
era/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jline-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr311-a
pi-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jul-to-slf4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/logredactor-2.0.16.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-buffer-4.1.100.Final.jar:/o
pt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-reload4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-api-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-kqueue-4.1.100.Fina
l.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/wildfly-openssl-2.1.4.ClouderaFinal.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3
.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper-jute.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//ozone-filesystem-hadoop3-1.3.0.7.3.1.0-197.
jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-thrift.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-scala_2.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-protobuf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-jackson.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-generator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-format-structures.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-encoding.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-column.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/avro-1.11.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-logging-1.1.3.j
ar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.
60371244/lib/hadoop-hdfs/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j
line-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/json-simple-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/leveldbjni-cldr-1.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/par
cels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nett
y-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-jute-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.
3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//asm-5.0.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjweaver-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-storage-7.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//checker-compat-qual-2.5.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-slf4j-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.
//flogger-system-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//google-extensions-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-keyvault-core-1.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//accessors-smart-2.4.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-
client-jobclient.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming-3.1.1.7.3.1.0-197.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ojalgo-43.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//kafka-clients-2.8.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-core-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-abfs-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//forbiddenapis-3.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-intg-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-api-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//zstd-jni-1.4.9-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-i18n.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-s3-lib-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//javax.activation-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//bundle-2.23.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//json-smart-2.4.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-util-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-shell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-cloud-bindings.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-s3-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjrt-1.9.6.jar:/opt/cloudera/parcels/
CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/swagger-annotations-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/objenesis-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-client-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.activation-api-1.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-dataformat-yaml-2.9.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcutil-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcprov-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcpkix-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/HikariCP-java7-2.4.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/snakeyaml-2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jsonschema2pojo-core-1.0.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/joda-time-2.10.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jna-5.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-guice-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-module-jaxb-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-json-provider-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-base-2.12.7.jar:/opt/clo
udera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-servlet-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/fst-2.50.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/ehcache-3.3.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/dnsjava-2.1.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/codemodel-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371
244/lib/hadoop-yarn/.//hadoop-yarn-server-router.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop
-yarn/.//hadoop-yarn-server-sharedcachemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager-1.0.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-3.1.1.7.3.1.0-197.jar:/opt/cloudera/cm/lib/plugins/event-publish-7.13.1-shaded.jar:/opt/cloudera/cm/lib/plugins/tt-instrumentation-7.13.1.jar
STARTUP_MSG: build = git@github.infra.cloudera.com:CDH/hadoop.git -r 31a42fb39494f541ffae15c3c61185deeeacca86; compiled by 'jenkins' on 2024-12-04T01:09Z
STARTUP_MSG: java = 1.8.0_432
************************************************************/
2025-07-09 17:38:34,461 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2025-07-09 17:38:34,826 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d1/dfs/dn
2025-07-09 17:38:34,832 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d2/dfs/dn
2025-07-09 17:38:34,833 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d3/dfs/dn
2025-07-09 17:38:34,833 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d4/dfs/dn
2025-07-09 17:38:34,998 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties
2025-07-09 17:38:35,108 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2025-07-09 17:38:35,109 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2025-07-09 17:38:35,398 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-09 17:38:35,424 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576
2025-07-09 17:38:35,431 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled.
2025-07-09 17:38:35,432 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is dmidlkprdls04.svr.luc.edu
2025-07-09 17:38:35,433 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-09 17:38:35,548 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 4294967296
2025-07-09 17:38:35,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /192.168.158.4:9866
2025-07-09 17:38:35,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-09 17:38:35,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-09 17:38:35,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-09 17:38:35,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-09 17:38:35,599 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Listening on UNIX domain socket: /var/run/hdfs-sockets/dn
2025-07-09 17:38:35,659 INFO org.eclipse.jetty.util.log: Logging initialized @2454ms to org.eclipse.jetty.util.log.Slf4jLog
2025-07-09 17:38:35,797 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-09 17:38:35,806 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined
2025-07-09 17:38:35,818 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2025-07-09 17:38:35,821 INFO org.apache.hadoop.security.HttpCrossOriginFilterInitializer: CORS filter not enabled. Please set hadoop.http.cross-origin.enabled to 'true' to enable it
2025-07-09 17:38:35,822 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context datanode
2025-07-09 17:38:35,822 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context logs
2025-07-09 17:38:35,823 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context static
2025-07-09 17:38:35,866 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 41277
2025-07-09 17:38:35,867 INFO org.eclipse.jetty.server.Server: jetty-9.4.54.v20240208; built: 2024-02-08T19:42:39.027Z; git: cef3fbd6d736a21e7d541a5db490381d95a2047d; jvm 1.8.0_432-b06
2025-07-09 17:38:35,924 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0
2025-07-09 17:38:35,924 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults
2025-07-09 17:38:35,927 INFO org.eclipse.jetty.server.session: node0 Scavenging every 660000ms
2025-07-09 17:38:35,956 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-09 17:38:35,962 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@48c40605{logs,/logs,file:///var/log/hadoop-hdfs/,AVAILABLE}
2025-07-09 17:38:35,964 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@6cea706c{static,/static,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/static/,AVAILABLE}
2025-07-09 17:38:36,081 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@2c444798{datanode,/,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode/,AVAILABLE}{file:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode}
2025-07-09 17:38:36,098 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@532721fd{HTTP/1.1, (http/1.1)}{localhost:41277}
2025-07-09 17:38:36,098 INFO org.eclipse.jetty.server.Server: Started @2893ms
2025-07-09 17:38:36,397 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /192.168.158.4:9864
2025-07-09 17:38:36,408 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor
2025-07-09 17:38:36,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hdfs
2025-07-09 17:38:36,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
2025-07-09 17:38:36,472 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
2025-07-09 17:38:36,493 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867
2025-07-09 17:38:36,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /192.168.158.4:9867
2025-07-09 17:38:36,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null
2025-07-09 17:38:36,577 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices:
2025-07-09 17:38:36,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 starting to offer service
2025-07-09 17:38:36,596 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2025-07-09 17:38:36,596 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting
2025-07-09 17:38:36,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode during handshakeBlock pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-09 17:38:36,908 INFO org.apache.hadoop.hdfs.server.common.Storage: Using 4 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=4, dataDirs=4)
2025-07-09 17:38:36,922 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d1/dfs/dn/in_use.lock acquired by nodename 27073@dmidlkprdls04.svr.luc.edu
2025-07-09 17:38:36,927 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d1/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d1/dfs/dn: namenode clusterID = cluster38; datanode clusterID = cluster72
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-09 17:38:36,935 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d2/dfs/dn/in_use.lock acquired by nodename 27073@dmidlkprdls04.svr.luc.edu
2025-07-09 17:38:36,936 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d2/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d2/dfs/dn: namenode clusterID = cluster38; datanode clusterID = cluster72
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-09 17:38:36,937 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d3/dfs/dn/in_use.lock acquired by nodename 27073@dmidlkprdls04.svr.luc.edu
2025-07-09 17:38:36,938 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d3/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d3/dfs/dn: namenode clusterID = cluster38; datanode clusterID = cluster72
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-09 17:38:36,939 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d4/dfs/dn/in_use.lock acquired by nodename 27073@dmidlkprdls04.svr.luc.edu
2025-07-09 17:38:36,939 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d4/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d4/dfs/dn: namenode clusterID = cluster38; datanode clusterID = cluster72
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-09 17:38:36,943 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Initialization failed for Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Exiting.
java.io.IOException: All specified directories have failed to load.
	at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:560)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
	at java.lang.Thread.run(Thread.java:750)
2025-07-09 17:38:36,944 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Ending block pool service for: Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-09 17:38:36,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Removed Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12)
2025-07-09 17:38:38,946 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Exiting Datanode
2025-07-09 17:38:38,953 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at dmidlkprdls04.svr.luc.edu/192.168.158.4
************************************************************/
2025-07-09 17:38:44,423 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG: host = dmidlkprdls04.svr.luc.edu/192.168.158.4
STARTUP_MSG: args = []
STARTUP_MSG: version = 3.1.1.7.3.1.0-197
STARTUP_MSG: classpath = 
/var/run/cloudera-scm-agent/process/146-hdfs-DATANODE:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/aws-java-sdk-bundle-1.12.720.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-hdfs-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-plugin-classloader-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-yarn-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/azure-data-lake-store-sdk-2.3.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/ha
doop/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-api-2.2.11.jar:/opt/cloud
era/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jline-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr311-a
pi-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jul-to-slf4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/logredactor-2.0.16.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-buffer-4.1.100.Final.jar:/o
pt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-reload4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-api-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-kqueue-4.1.100.Fina
l.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/wildfly-openssl-2.1.4.ClouderaFinal.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3
.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper-jute.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//ozone-filesystem-hadoop3-1.3.0.7.3.1.0-197.
jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-thrift.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-scala_2.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-protobuf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-jackson.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-generator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-format-structures.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-encoding.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-column.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/avro-1.11.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-logging-1.1.3.j
ar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.
60371244/lib/hadoop-hdfs/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j
line-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/json-simple-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/leveldbjni-cldr-1.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/par
cels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nett
y-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-jute-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.
3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//asm-5.0.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjweaver-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-storage-7.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//checker-compat-qual-2.5.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-slf4j-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.
//flogger-system-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//google-extensions-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-keyvault-core-1.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//accessors-smart-2.4.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-
client-jobclient.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming-3.1.1.7.3.1.0-197.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ojalgo-43.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//kafka-clients-2.8.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-core-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-abfs-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//forbiddenapis-3.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-intg-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-api-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//zstd-jni-1.4.9-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-i18n.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-s3-lib-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//javax.activation-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//bundle-2.23.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//json-smart-2.4.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-util-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-shell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-cloud-bindings.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-s3-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjrt-1.9.6.jar:/opt/cloudera/parcels/
CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/swagger-annotations-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/objenesis-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-client-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.activation-api-1.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-dataformat-yaml-2.9.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcutil-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcprov-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcpkix-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/HikariCP-java7-2.4.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/snakeyaml-2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jsonschema2pojo-core-1.0.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/joda-time-2.10.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jna-5.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-guice-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-module-jaxb-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-json-provider-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-base-2.12.7.jar:/opt/clo
udera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-servlet-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/fst-2.50.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/ehcache-3.3.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/dnsjava-2.1.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/codemodel-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371
244/lib/hadoop-yarn/.//hadoop-yarn-server-router.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop
-yarn/.//hadoop-yarn-server-sharedcachemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager-1.0.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-3.1.1.7.3.1.0-197.jar:/opt/cloudera/cm/lib/plugins/event-publish-7.13.1-shaded.jar:/opt/cloudera/cm/lib/plugins/tt-instrumentation-7.13.1.jar STARTUP_MSG: build = git@github.infra.cloudera.com:CDH/hadoop.git -r 31a42fb39494f541ffae15c3c61185deeeacca86; compiled by 'jenkins' on 2024-12-04T01:09Z STARTUP_MSG: java = 1.8.0_432 ************************************************************/ 2025-07-09 17:38:44,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT] 2025-07-09 17:38:44,862 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d1/dfs/dn 2025-07-09 17:38:44,868 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d2/dfs/dn 2025-07-09 17:38:44,869 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d3/dfs/dn 2025-07-09 17:38:44,869 INFO 
org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d4/dfs/dn 2025-07-09 17:38:45,032 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties 2025-07-09 17:38:45,149 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s). 2025-07-09 17:38:45,149 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started 2025-07-09 17:38:45,554 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling 2025-07-09 17:38:45,580 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576 2025-07-09 17:38:45,587 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled. 2025-07-09 17:38:45,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is dmidlkprdls04.svr.luc.edu 2025-07-09 17:38:45,589 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. 
Disabling file IO profiling 2025-07-09 17:38:45,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 4294967296 2025-07-09 17:38:45,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /192.168.158.4:9866 2025-07-09 17:38:45,624 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s 2025-07-09 17:38:45,624 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50 2025-07-09 17:38:45,628 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s 2025-07-09 17:38:45,628 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50 2025-07-09 17:38:45,628 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Listening on UNIX domain socket: /var/run/hdfs-sockets/dn 2025-07-09 17:38:45,678 INFO org.eclipse.jetty.util.log: Logging initialized @2533ms to org.eclipse.jetty.util.log.Slf4jLog 2025-07-09 17:38:45,800 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret 2025-07-09 17:38:45,809 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined 2025-07-09 17:38:45,819 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter) 2025-07-09 17:38:45,822 INFO org.apache.hadoop.security.HttpCrossOriginFilterInitializer: CORS filter not enabled. 
Please set hadoop.http.cross-origin.enabled to 'true' to enable it 2025-07-09 17:38:45,824 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context datanode 2025-07-09 17:38:45,824 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context logs 2025-07-09 17:38:45,824 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context static 2025-07-09 17:38:45,876 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 34061 2025-07-09 17:38:45,878 INFO org.eclipse.jetty.server.Server: jetty-9.4.54.v20240208; built: 2024-02-08T19:42:39.027Z; git: cef3fbd6d736a21e7d541a5db490381d95a2047d; jvm 1.8.0_432-b06 2025-07-09 17:38:45,934 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0 2025-07-09 17:38:45,935 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults 2025-07-09 17:38:45,938 INFO org.eclipse.jetty.server.session: node0 Scavenging every 660000ms 2025-07-09 17:38:45,966 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. 
Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret 2025-07-09 17:38:45,972 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@1b11ef33{logs,/logs,file:///var/log/hadoop-hdfs/,AVAILABLE} 2025-07-09 17:38:45,973 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@2f2bf0e2{static,/static,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/static/,AVAILABLE} 2025-07-09 17:38:46,084 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@6ebd78d1{datanode,/,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode/,AVAILABLE}{file:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode} 2025-07-09 17:38:46,097 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@7fb9f71f{HTTP/1.1, (http/1.1)}{localhost:34061} 2025-07-09 17:38:46,098 INFO org.eclipse.jetty.server.Server: Started @2953ms 2025-07-09 17:38:46,407 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /192.168.158.4:9864 2025-07-09 17:38:46,418 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor 2025-07-09 17:38:46,419 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hdfs 2025-07-09 17:38:46,420 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup 2025-07-09 17:38:46,493 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler 2025-07-09 17:38:46,516 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867 2025-07-09 17:38:46,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /192.168.158.4:9867 2025-07-09 17:38:46,603 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null 2025-07-09 17:38:46,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices: 2025-07-09 17:38:46,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 starting to offer service 2025-07-09 17:38:46,636 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting 2025-07-09 17:38:46,636 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting 2025-07-09 17:38:46,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode during handshakeBlock pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-09 17:38:46,966 INFO org.apache.hadoop.hdfs.server.common.Storage: Using 4 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=4, dataDirs=4) 2025-07-09 17:38:46,974 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d1/dfs/dn/in_use.lock acquired by nodename 27585@dmidlkprdls04.svr.luc.edu 2025-07-09 17:38:46,978 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d1/dfs/dn java.io.IOException: Incompatible clusterIDs in /hdfs/d1/dfs/dn: namenode clusterID = cluster38; datanode clusterID = cluster72 at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407) at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387) at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559) at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753) at 
org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865) at java.lang.Thread.run(Thread.java:750) 2025-07-09 17:38:46,983 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d2/dfs/dn/in_use.lock acquired by nodename 27585@dmidlkprdls04.svr.luc.edu 2025-07-09 17:38:46,984 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d2/dfs/dn java.io.IOException: Incompatible clusterIDs in /hdfs/d2/dfs/dn: namenode clusterID = cluster38; datanode clusterID = cluster72 at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407) at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387) at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559) at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753) at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865) at java.lang.Thread.run(Thread.java:750) 2025-07-09 17:38:46,985 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d3/dfs/dn/in_use.lock acquired by nodename 27585@dmidlkprdls04.svr.luc.edu 
2025-07-09 17:38:46,985 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d3/dfs/dn java.io.IOException: Incompatible clusterIDs in /hdfs/d3/dfs/dn: namenode clusterID = cluster38; datanode clusterID = cluster72 at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407) at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387) at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559) at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753) at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865) at java.lang.Thread.run(Thread.java:750) 2025-07-09 17:38:46,986 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d4/dfs/dn/in_use.lock acquired by nodename 27585@dmidlkprdls04.svr.luc.edu 2025-07-09 17:38:46,987 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d4/dfs/dn java.io.IOException: Incompatible clusterIDs in /hdfs/d4/dfs/dn: namenode clusterID = cluster38; datanode clusterID = cluster72 at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407) at 
org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387) at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559) at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753) at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865) at java.lang.Thread.run(Thread.java:750) 2025-07-09 17:38:46,990 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Initialization failed for Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Exiting. java.io.IOException: All specified directories have failed to load. 
at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:560) at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753) at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865) at java.lang.Thread.run(Thread.java:750) 2025-07-09 17:38:46,990 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Ending block pool service for: Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-09 17:38:46,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Removed Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) 2025-07-09 17:38:48,993 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Exiting Datanode 2025-07-09 17:38:49,000 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG: /************************************************************ SHUTDOWN_MSG: Shutting down DataNode at dmidlkprdls04.svr.luc.edu/192.168.158.4 ************************************************************/ 2025-07-09 17:38:56,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG: /************************************************************ STARTUP_MSG: Starting DataNode STARTUP_MSG: host = dmidlkprdls04.svr.luc.edu/192.168.158.4 STARTUP_MSG: args = [] STARTUP_MSG: version = 3.1.1.7.3.1.0-197 STARTUP_MSG: classpath = 
/var/run/cloudera-scm-agent/process/146-hdfs-DATANODE:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/aws-java-sdk-bundle-1.12.720.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-hdfs-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-plugin-classloader-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-yarn-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/azure-data-lake-store-sdk-2.3.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/ha
doop/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-api-2.2.11.jar:/opt/cloud
era/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jline-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr311-a
pi-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jul-to-slf4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/logredactor-2.0.16.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-buffer-4.1.100.Final.jar:/o
pt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-reload4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-api-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-kqueue-4.1.100.Fina
l.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/wildfly-openssl-2.1.4.ClouderaFinal.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3
.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper-jute.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//ozone-filesystem-hadoop3-1.3.0.7.3.1.0-197.
jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-thrift.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-scala_2.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-protobuf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-jackson.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-generator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-format-structures.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-encoding.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-column.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/avro-1.11.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-logging-1.1.3.j
ar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.
60371244/lib/hadoop-hdfs/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j
line-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/json-simple-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/leveldbjni-cldr-1.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/par
cels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nett
y-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-jute-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.
3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//asm-5.0.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjweaver-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-storage-7.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//checker-compat-qual-2.5.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-slf4j-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.
//flogger-system-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//google-extensions-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-keyvault-core-1.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//accessors-smart-2.4.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-
client-jobclient.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming-3.1.1.7.3.1.0-197.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ojalgo-43.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//kafka-clients-2.8.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-core-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-abfs-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//forbiddenapis-3.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-intg-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-api-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//zstd-jni-1.4.9-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-i18n.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-s3-lib-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//javax.activation-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//bundle-2.23.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//json-smart-2.4.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-util-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-shell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-cloud-bindings.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-s3-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjrt-1.9.6.jar:/opt/cloudera/parcels/
CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/swagger-annotations-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/objenesis-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-client-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.activation-api-1.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-dataformat-yaml-2.9.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcutil-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcprov-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcpkix-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/HikariCP-java7-2.4.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/snakeyaml-2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jsonschema2pojo-core-1.0.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/joda-time-2.10.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jna-5.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-guice-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-module-jaxb-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-json-provider-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-base-2.12.7.jar:/opt/clo
udera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-servlet-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/fst-2.50.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/ehcache-3.3.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/dnsjava-2.1.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/codemodel-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371
244/lib/hadoop-yarn/.//hadoop-yarn-server-router.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop
-yarn/.//hadoop-yarn-server-sharedcachemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager-1.0.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-3.1.1.7.3.1.0-197.jar:/opt/cloudera/cm/lib/plugins/event-publish-7.13.1-shaded.jar:/opt/cloudera/cm/lib/plugins/tt-instrumentation-7.13.1.jar
STARTUP_MSG: build = git@github.infra.cloudera.com:CDH/hadoop.git -r 31a42fb39494f541ffae15c3c61185deeeacca86; compiled by 'jenkins' on 2024-12-04T01:09Z
STARTUP_MSG: java = 1.8.0_432
************************************************************/
2025-07-09 17:38:56,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2025-07-09 17:38:56,522 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d1/dfs/dn
2025-07-09 17:38:56,529 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d2/dfs/dn
2025-07-09 17:38:56,529 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d3/dfs/dn
2025-07-09 17:38:56,530 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d4/dfs/dn
2025-07-09 17:38:56,691 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties
2025-07-09 17:38:56,794 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2025-07-09 17:38:56,794 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2025-07-09 17:38:57,086 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-09 17:38:57,110 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576
2025-07-09 17:38:57,117 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled.
2025-07-09 17:38:57,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is dmidlkprdls04.svr.luc.edu
2025-07-09 17:38:57,119 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-09 17:38:57,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 4294967296
2025-07-09 17:38:57,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /192.168.158.4:9866
2025-07-09 17:38:57,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-09 17:38:57,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-09 17:38:57,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-09 17:38:57,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-09 17:38:57,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Listening on UNIX domain socket: /var/run/hdfs-sockets/dn
2025-07-09 17:38:57,342 INFO org.eclipse.jetty.util.log: Logging initialized @2445ms to org.eclipse.jetty.util.log.Slf4jLog
2025-07-09 17:38:57,475 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-09 17:38:57,484 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined
2025-07-09 17:38:57,494 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2025-07-09 17:38:57,497 INFO org.apache.hadoop.security.HttpCrossOriginFilterInitializer: CORS filter not enabled. Please set hadoop.http.cross-origin.enabled to 'true' to enable it
2025-07-09 17:38:57,498 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context datanode
2025-07-09 17:38:57,498 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context logs
2025-07-09 17:38:57,498 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context static
2025-07-09 17:38:57,542 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 42209
2025-07-09 17:38:57,544 INFO org.eclipse.jetty.server.Server: jetty-9.4.54.v20240208; built: 2024-02-08T19:42:39.027Z; git: cef3fbd6d736a21e7d541a5db490381d95a2047d; jvm 1.8.0_432-b06
2025-07-09 17:38:57,600 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0
2025-07-09 17:38:57,600 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults
2025-07-09 17:38:57,603 INFO org.eclipse.jetty.server.session: node0 Scavenging every 600000ms
2025-07-09 17:38:57,632 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-09 17:38:57,636 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@1b11ef33{logs,/logs,file:///var/log/hadoop-hdfs/,AVAILABLE}
2025-07-09 17:38:57,638 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@2f2bf0e2{static,/static,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/static/,AVAILABLE}
2025-07-09 17:38:57,746 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@6ebd78d1{datanode,/,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode/,AVAILABLE}{file:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode}
2025-07-09 17:38:57,759 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@7fb9f71f{HTTP/1.1, (http/1.1)}{localhost:42209}
2025-07-09 17:38:57,759 INFO org.eclipse.jetty.server.Server: Started @2863ms
2025-07-09 17:38:58,044 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /192.168.158.4:9864
2025-07-09 17:38:58,054 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor
2025-07-09 17:38:58,055 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hdfs
2025-07-09 17:38:58,055 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
2025-07-09 17:38:58,125 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
2025-07-09 17:38:58,146 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867
2025-07-09 17:38:58,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /192.168.158.4:9867
2025-07-09 17:38:58,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null
2025-07-09 17:38:58,242 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices:
2025-07-09 17:38:58,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 starting to offer service
2025-07-09 17:38:58,267 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2025-07-09 17:38:58,267 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting
2025-07-09 17:38:58,546 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode during handshakeBlock pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-09 17:38:58,551 INFO org.apache.hadoop.hdfs.server.common.Storage: Using 4 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=4, dataDirs=4)
2025-07-09 17:38:58,558 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d1/dfs/dn/in_use.lock acquired by nodename 28154@dmidlkprdls04.svr.luc.edu
2025-07-09 17:38:58,562 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d1/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d1/dfs/dn: namenode clusterID = cluster38; datanode clusterID = cluster72
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-09 17:38:58,568 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d2/dfs/dn/in_use.lock acquired by nodename 28154@dmidlkprdls04.svr.luc.edu
2025-07-09 17:38:58,568 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d2/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d2/dfs/dn: namenode clusterID = cluster38; datanode clusterID = cluster72
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-09 17:38:58,569 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d3/dfs/dn/in_use.lock acquired by nodename 28154@dmidlkprdls04.svr.luc.edu
2025-07-09 17:38:58,570 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d3/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d3/dfs/dn: namenode clusterID = cluster38; datanode clusterID = cluster72
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-09 17:38:58,571 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d4/dfs/dn/in_use.lock acquired by nodename 28154@dmidlkprdls04.svr.luc.edu
2025-07-09 17:38:58,571 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/hdfs/d4/dfs/dn
java.io.IOException: Incompatible clusterIDs in /hdfs/d4/dfs/dn: namenode clusterID = cluster38; datanode clusterID = cluster72
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387)
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-09 17:38:58,575 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Initialization failed for Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Exiting.
java.io.IOException: All specified directories have failed to load.
    at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:560)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1753)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1689)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:394)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:305)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:865)
    at java.lang.Thread.run(Thread.java:750)
2025-07-09 17:38:58,576 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Ending block pool service for: Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-09 17:38:58,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Removed Block pool (Datanode Uuid 92aa408f-509e-4bad-98e0-6e83f0a34a12)
2025-07-09 17:39:00,579 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Exiting Datanode
2025-07-09 17:39:00,586 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at dmidlkprdls04.svr.luc.edu/192.168.158.4
************************************************************/
2025-07-09 17:45:05,228 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG: host = dmidlkprdls04.svr.luc.edu/192.168.158.4
STARTUP_MSG: args = []
STARTUP_MSG: version = 3.1.1.7.3.1.0-197
STARTUP_MSG: classpath =
/var/run/cloudera-scm-agent/process/153-hdfs-DATANODE:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/aws-java-sdk-bundle-1.12.720.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-hdfs-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-plugin-classloader-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-yarn-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/azure-data-lake-store-sdk-2.3.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/ha
doop/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-api-2.2.11.jar:/opt/cloud
era/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jline-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr311-a
pi-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jul-to-slf4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/logredactor-2.0.16.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-buffer-4.1.100.Final.jar:/o
pt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-reload4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-api-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-kqueue-4.1.100.Fina
l.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/wildfly-openssl-2.1.4.ClouderaFinal.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3
.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper-jute.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//ozone-filesystem-hadoop3-1.3.0.7.3.1.0-197.
jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-thrift.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-scala_2.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-protobuf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-jackson.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-generator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-format-structures.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-encoding.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-column.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/avro-1.11.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-logging-1.1.3.j
ar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.
60371244/lib/hadoop-hdfs/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j
line-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/json-simple-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/leveldbjni-cldr-1.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/par
cels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nett
y-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-jute-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.
3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//asm-5.0.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjweaver-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-storage-7.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//checker-compat-qual-2.5.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-slf4j-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.
//flogger-system-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//google-extensions-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-keyvault-core-1.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//accessors-smart-2.4.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-
client-jobclient.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming-3.1.1.7.3.1.0-197.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ojalgo-43.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//kafka-clients-2.8.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-core-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-abfs-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//forbiddenapis-3.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-intg-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-api-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//zstd-jni-1.4.9-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-i18n.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-s3-lib-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//javax.activation-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//bundle-2.23.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//json-smart-2.4.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-util-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-shell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-cloud-bindings.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-s3-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjrt-1.9.6.jar:/opt/cloudera/parcels/
CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/swagger-annotations-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/objenesis-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-client-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.activation-api-1.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-dataformat-yaml-2.9.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcutil-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcprov-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcpkix-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/HikariCP-java7-2.4.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/snakeyaml-2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jsonschema2pojo-core-1.0.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/joda-time-2.10.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jna-5.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-guice-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-module-jaxb-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-json-provider-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-base-2.12.7.jar:/opt/clo
udera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-servlet-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/fst-2.50.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/ehcache-3.3.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/dnsjava-2.1.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/codemodel-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371
244/lib/hadoop-yarn/.//hadoop-yarn-server-router.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop
-yarn/.//hadoop-yarn-server-sharedcachemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager-1.0.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-3.1.1.7.3.1.0-197.jar:/opt/cloudera/cm/lib/plugins/event-publish-7.13.1-shaded.jar:/opt/cloudera/cm/lib/plugins/tt-instrumentation-7.13.1.jar STARTUP_MSG: build = git@github.infra.cloudera.com:CDH/hadoop.git -r 31a42fb39494f541ffae15c3c61185deeeacca86; compiled by 'jenkins' on 2024-12-04T01:09Z STARTUP_MSG: java = 1.8.0_432 ************************************************************/ 2025-07-09 17:45:05,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT] 2025-07-09 17:45:05,672 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d1/dfs/dn 2025-07-09 17:45:05,678 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d2/dfs/dn 2025-07-09 17:45:05,679 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d3/dfs/dn 2025-07-09 17:45:05,679 INFO 
org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d4/dfs/dn 2025-07-09 17:45:05,840 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties 2025-07-09 17:45:05,943 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s). 2025-07-09 17:45:05,943 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started 2025-07-09 17:45:06,216 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling 2025-07-09 17:45:06,238 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576 2025-07-09 17:45:06,244 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled. 2025-07-09 17:45:06,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is dmidlkprdls04.svr.luc.edu 2025-07-09 17:45:06,246 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. 
Disabling file IO profiling 2025-07-09 17:45:06,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 4294967296 2025-07-09 17:45:06,376 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /192.168.158.4:9866 2025-07-09 17:45:06,380 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s 2025-07-09 17:45:06,380 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50 2025-07-09 17:45:06,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s 2025-07-09 17:45:06,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50 2025-07-09 17:45:06,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Listening on UNIX domain socket: /var/run/hdfs-sockets/dn 2025-07-09 17:45:06,454 INFO org.eclipse.jetty.util.log: Logging initialized @2418ms to org.eclipse.jetty.util.log.Slf4jLog 2025-07-09 17:45:06,599 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret 2025-07-09 17:45:06,609 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined 2025-07-09 17:45:06,618 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter) 2025-07-09 17:45:06,621 INFO org.apache.hadoop.security.HttpCrossOriginFilterInitializer: CORS filter not enabled. 
Please set hadoop.http.cross-origin.enabled to 'true' to enable it 2025-07-09 17:45:06,623 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context datanode 2025-07-09 17:45:06,623 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context logs 2025-07-09 17:45:06,624 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context static 2025-07-09 17:45:06,664 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 39965 2025-07-09 17:45:06,666 INFO org.eclipse.jetty.server.Server: jetty-9.4.54.v20240208; built: 2024-02-08T19:42:39.027Z; git: cef3fbd6d736a21e7d541a5db490381d95a2047d; jvm 1.8.0_432-b06 2025-07-09 17:45:06,722 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0 2025-07-09 17:45:06,723 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults 2025-07-09 17:45:06,727 INFO org.eclipse.jetty.server.session: node0 Scavenging every 660000ms 2025-07-09 17:45:06,756 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. 
Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret 2025-07-09 17:45:06,761 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@1b11ef33{logs,/logs,file:///var/log/hadoop-hdfs/,AVAILABLE} 2025-07-09 17:45:06,763 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@2f2bf0e2{static,/static,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/static/,AVAILABLE} 2025-07-09 17:45:06,878 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@6ebd78d1{datanode,/,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode/,AVAILABLE}{file:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode} 2025-07-09 17:45:06,890 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@7fb9f71f{HTTP/1.1, (http/1.1)}{localhost:39965} 2025-07-09 17:45:06,890 INFO org.eclipse.jetty.server.Server: Started @2855ms 2025-07-09 17:45:07,168 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /192.168.158.4:9864 2025-07-09 17:45:07,177 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor 2025-07-09 17:45:07,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hdfs 2025-07-09 17:45:07,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup 2025-07-09 17:45:07,241 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler 2025-07-09 17:45:07,262 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867 2025-07-09 17:45:07,315 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /192.168.158.4:9867 2025-07-09 17:45:07,351 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null 2025-07-09 17:45:07,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices: 2025-07-09 17:45:07,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 starting to offer service 2025-07-09 17:45:07,383 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting 2025-07-09 17:45:07,383 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting 2025-07-09 17:45:07,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode during handshakeBlock pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-09 17:45:07,680 INFO org.apache.hadoop.hdfs.server.common.Storage: Using 4 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=4, dataDirs=4) 2025-07-09 17:45:07,689 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d1/dfs/dn/in_use.lock acquired by nodename 30097@dmidlkprdls04.svr.luc.edu 2025-07-09 17:45:07,691 INFO org.apache.hadoop.hdfs.server.common.Storage: Storage directory with location [DISK]file:/hdfs/d1/dfs/dn is not formatted for namespace 1873149390. Formatting... 2025-07-09 17:45:07,693 INFO org.apache.hadoop.hdfs.server.common.Storage: Generated new storageID DS-ce66e326-7944-4aba-923e-338e673949ad for directory /hdfs/d1/dfs/dn 2025-07-09 17:45:07,702 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d2/dfs/dn/in_use.lock acquired by nodename 30097@dmidlkprdls04.svr.luc.edu 2025-07-09 17:45:07,702 INFO org.apache.hadoop.hdfs.server.common.Storage: Storage directory with location [DISK]file:/hdfs/d2/dfs/dn is not formatted for namespace 1873149390. Formatting... 
2025-07-09 17:45:07,703 INFO org.apache.hadoop.hdfs.server.common.Storage: Generated new storageID DS-2c424dce-012f-41dd-b422-62baf2440179 for directory /hdfs/d2/dfs/dn 2025-07-09 17:45:07,705 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d3/dfs/dn/in_use.lock acquired by nodename 30097@dmidlkprdls04.svr.luc.edu 2025-07-09 17:45:07,706 INFO org.apache.hadoop.hdfs.server.common.Storage: Storage directory with location [DISK]file:/hdfs/d3/dfs/dn is not formatted for namespace 1873149390. Formatting... 2025-07-09 17:45:07,706 INFO org.apache.hadoop.hdfs.server.common.Storage: Generated new storageID DS-1e0d1de4-58e1-476f-9d43-f8e6a357b221 for directory /hdfs/d3/dfs/dn 2025-07-09 17:45:07,709 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d4/dfs/dn/in_use.lock acquired by nodename 30097@dmidlkprdls04.svr.luc.edu 2025-07-09 17:45:07,710 INFO org.apache.hadoop.hdfs.server.common.Storage: Storage directory with location [DISK]file:/hdfs/d4/dfs/dn is not formatted for namespace 1873149390. Formatting... 2025-07-09 17:45:07,710 INFO org.apache.hadoop.hdfs.server.common.Storage: Generated new storageID DS-b54a869d-4580-409c-a867-264df6e81d91 for directory /hdfs/d4/dfs/dn 2025-07-09 17:45:07,743 INFO org.apache.hadoop.hdfs.server.common.Storage: Analyzing storage directories for bpid BP-1290524523-192.168.158.1-1752100694127 2025-07-09 17:45:07,744 INFO org.apache.hadoop.hdfs.server.common.Storage: Locking is disabled for /hdfs/d1/dfs/dn/current/BP-1290524523-192.168.158.1-1752100694127 2025-07-09 17:45:07,745 INFO org.apache.hadoop.hdfs.server.common.Storage: Block pool storage directory for location [DISK]file:/hdfs/d1/dfs/dn and block pool id BP-1290524523-192.168.158.1-1752100694127 is not formatted. Formatting ... 
2025-07-09 17:45:07,745 INFO org.apache.hadoop.hdfs.server.common.Storage: Formatting block pool BP-1290524523-192.168.158.1-1752100694127 directory /hdfs/d1/dfs/dn/current/BP-1290524523-192.168.158.1-1752100694127/current 2025-07-09 17:45:07,770 INFO org.apache.hadoop.hdfs.server.common.Storage: Analyzing storage directories for bpid BP-1290524523-192.168.158.1-1752100694127 2025-07-09 17:45:07,770 INFO org.apache.hadoop.hdfs.server.common.Storage: Locking is disabled for /hdfs/d2/dfs/dn/current/BP-1290524523-192.168.158.1-1752100694127 2025-07-09 17:45:07,770 INFO org.apache.hadoop.hdfs.server.common.Storage: Block pool storage directory for location [DISK]file:/hdfs/d2/dfs/dn and block pool id BP-1290524523-192.168.158.1-1752100694127 is not formatted. Formatting ... 2025-07-09 17:45:07,770 INFO org.apache.hadoop.hdfs.server.common.Storage: Formatting block pool BP-1290524523-192.168.158.1-1752100694127 directory /hdfs/d2/dfs/dn/current/BP-1290524523-192.168.158.1-1752100694127/current 2025-07-09 17:45:07,793 INFO org.apache.hadoop.hdfs.server.common.Storage: Analyzing storage directories for bpid BP-1290524523-192.168.158.1-1752100694127 2025-07-09 17:45:07,794 INFO org.apache.hadoop.hdfs.server.common.Storage: Locking is disabled for /hdfs/d3/dfs/dn/current/BP-1290524523-192.168.158.1-1752100694127 2025-07-09 17:45:07,794 INFO org.apache.hadoop.hdfs.server.common.Storage: Block pool storage directory for location [DISK]file:/hdfs/d3/dfs/dn and block pool id BP-1290524523-192.168.158.1-1752100694127 is not formatted. Formatting ... 
2025-07-09 17:45:07,794 INFO org.apache.hadoop.hdfs.server.common.Storage: Formatting block pool BP-1290524523-192.168.158.1-1752100694127 directory /hdfs/d3/dfs/dn/current/BP-1290524523-192.168.158.1-1752100694127/current 2025-07-09 17:45:07,814 INFO org.apache.hadoop.hdfs.server.common.Storage: Analyzing storage directories for bpid BP-1290524523-192.168.158.1-1752100694127 2025-07-09 17:45:07,815 INFO org.apache.hadoop.hdfs.server.common.Storage: Locking is disabled for /hdfs/d4/dfs/dn/current/BP-1290524523-192.168.158.1-1752100694127 2025-07-09 17:45:07,815 INFO org.apache.hadoop.hdfs.server.common.Storage: Block pool storage directory for location [DISK]file:/hdfs/d4/dfs/dn and block pool id BP-1290524523-192.168.158.1-1752100694127 is not formatted. Formatting ... 2025-07-09 17:45:07,815 INFO org.apache.hadoop.hdfs.server.common.Storage: Formatting block pool BP-1290524523-192.168.158.1-1752100694127 directory /hdfs/d4/dfs/dn/current/BP-1290524523-192.168.158.1-1752100694127/current 2025-07-09 17:45:07,817 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Setting up storage: nsid=1873149390;bpid=BP-1290524523-192.168.158.1-1752100694127;lv=-57;nsInfo=lv=-64;cid=cluster38;nsid=1873149390;c=1752100694127;bpid=BP-1290524523-192.168.158.1-1752100694127;dnuuid=null 2025-07-09 17:45:07,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Generated and persisted new Datanode UUID 16d8ffe4-1b83-4b60-bef2-bc2c9affa424 2025-07-09 17:45:07,836 INFO org.apache.hadoop.conf.Configuration.deprecation: No unit for dfs.datanode.lock-reporting-threshold-ms(300) assuming MILLISECONDS 2025-07-09 17:45:07,839 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: The datanode lock is a read write lock 2025-07-09 17:45:07,891 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added new volume: DS-ce66e326-7944-4aba-923e-338e673949ad 2025-07-09 17:45:07,891 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: 
Added volume - [DISK]file:/hdfs/d1/dfs/dn, StorageType: DISK 2025-07-09 17:45:07,893 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added new volume: DS-2c424dce-012f-41dd-b422-62baf2440179 2025-07-09 17:45:07,893 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added volume - [DISK]file:/hdfs/d2/dfs/dn, StorageType: DISK 2025-07-09 17:45:07,894 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added new volume: DS-1e0d1de4-58e1-476f-9d43-f8e6a357b221 2025-07-09 17:45:07,894 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added volume - [DISK]file:/hdfs/d3/dfs/dn, StorageType: DISK 2025-07-09 17:45:07,897 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added new volume: DS-b54a869d-4580-409c-a867-264df6e81d91 2025-07-09 17:45:07,897 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added volume - [DISK]file:/hdfs/d4/dfs/dn, StorageType: DISK 2025-07-09 17:45:07,904 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Registered FSDatasetState MBean 2025-07-09 17:45:07,910 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding block pool BP-1290524523-192.168.158.1-1752100694127 2025-07-09 17:45:07,911 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Scanning block pool BP-1290524523-192.168.158.1-1752100694127 on volume /hdfs/d1/dfs/dn... 2025-07-09 17:45:07,911 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Scanning block pool BP-1290524523-192.168.158.1-1752100694127 on volume /hdfs/d2/dfs/dn... 2025-07-09 17:45:07,912 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Scanning block pool BP-1290524523-192.168.158.1-1752100694127 on volume /hdfs/d4/dfs/dn... 
2025-07-09 17:45:07,912 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Scanning block pool BP-1290524523-192.168.158.1-1752100694127 on volume /hdfs/d3/dfs/dn... 2025-07-09 17:45:07,983 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time taken to scan block pool BP-1290524523-192.168.158.1-1752100694127 on /hdfs/d1/dfs/dn: 71ms 2025-07-09 17:45:07,985 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time taken to scan block pool BP-1290524523-192.168.158.1-1752100694127 on /hdfs/d3/dfs/dn: 72ms 2025-07-09 17:45:07,986 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time taken to scan block pool BP-1290524523-192.168.158.1-1752100694127 on /hdfs/d2/dfs/dn: 74ms 2025-07-09 17:45:07,986 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time taken to scan block pool BP-1290524523-192.168.158.1-1752100694127 on /hdfs/d4/dfs/dn: 74ms 2025-07-09 17:45:07,986 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Total time to scan all replicas for block pool BP-1290524523-192.168.158.1-1752100694127: 76ms 2025-07-09 17:45:07,989 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding replicas to map for block pool BP-1290524523-192.168.158.1-1752100694127 on volume /hdfs/d1/dfs/dn... 2025-07-09 17:45:07,989 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.BlockPoolSlice: Replica Cache file: /hdfs/d1/dfs/dn/current/BP-1290524523-192.168.158.1-1752100694127/current/replicas doesn't exist 2025-07-09 17:45:07,989 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding replicas to map for block pool BP-1290524523-192.168.158.1-1752100694127 on volume /hdfs/d2/dfs/dn... 2025-07-09 17:45:07,989 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding replicas to map for block pool BP-1290524523-192.168.158.1-1752100694127 on volume /hdfs/d3/dfs/dn... 
2025-07-09 17:45:07,990 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.BlockPoolSlice: Replica Cache file: /hdfs/d3/dfs/dn/current/BP-1290524523-192.168.158.1-1752100694127/current/replicas doesn't exist 2025-07-09 17:45:07,990 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.BlockPoolSlice: Replica Cache file: /hdfs/d2/dfs/dn/current/BP-1290524523-192.168.158.1-1752100694127/current/replicas doesn't exist 2025-07-09 17:45:07,990 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding replicas to map for block pool BP-1290524523-192.168.158.1-1752100694127 on volume /hdfs/d4/dfs/dn... 2025-07-09 17:45:07,991 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.BlockPoolSlice: Replica Cache file: /hdfs/d4/dfs/dn/current/BP-1290524523-192.168.158.1-1752100694127/current/replicas doesn't exist 2025-07-09 17:45:07,992 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time to add replicas to map for block pool BP-1290524523-192.168.158.1-1752100694127 on volume /hdfs/d4/dfs/dn: 1ms 2025-07-09 17:45:07,992 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time to add replicas to map for block pool BP-1290524523-192.168.158.1-1752100694127 on volume /hdfs/d3/dfs/dn: 3ms 2025-07-09 17:45:07,993 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time to add replicas to map for block pool BP-1290524523-192.168.158.1-1752100694127 on volume /hdfs/d1/dfs/dn: 4ms 2025-07-09 17:45:07,994 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time to add replicas to map for block pool BP-1290524523-192.168.158.1-1752100694127 on volume /hdfs/d2/dfs/dn: 4ms 2025-07-09 17:45:07,994 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Total time to add all replicas to map for block pool BP-1290524523-192.168.158.1-1752100694127: 6ms 2025-07-09 17:45:07,995 INFO 
org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for /hdfs/d1/dfs/dn 2025-07-09 17:45:08,011 INFO org.apache.hadoop.hdfs.server.datanode.checker.DatasetVolumeChecker: Scheduled health check for volume /hdfs/d1/dfs/dn 2025-07-09 17:45:08,014 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for /hdfs/d2/dfs/dn 2025-07-09 17:45:08,014 INFO org.apache.hadoop.hdfs.server.datanode.checker.DatasetVolumeChecker: Scheduled health check for volume /hdfs/d2/dfs/dn 2025-07-09 17:45:08,014 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for /hdfs/d3/dfs/dn 2025-07-09 17:45:08,015 INFO org.apache.hadoop.hdfs.server.datanode.checker.DatasetVolumeChecker: Scheduled health check for volume /hdfs/d3/dfs/dn 2025-07-09 17:45:08,015 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for /hdfs/d4/dfs/dn 2025-07-09 17:45:08,015 INFO org.apache.hadoop.hdfs.server.datanode.checker.DatasetVolumeChecker: Scheduled health check for volume /hdfs/d4/dfs/dn 2025-07-09 17:45:08,020 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: Now scanning bpid BP-1290524523-192.168.158.1-1752100694127 on volume /hdfs/d2/dfs/dn 2025-07-09 17:45:08,020 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: Now scanning bpid BP-1290524523-192.168.158.1-1752100694127 on volume /hdfs/d1/dfs/dn 2025-07-09 17:45:08,020 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: Now scanning bpid BP-1290524523-192.168.158.1-1752100694127 on volume /hdfs/d4/dfs/dn 2025-07-09 17:45:08,020 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: Now scanning bpid BP-1290524523-192.168.158.1-1752100694127 on volume /hdfs/d3/dfs/dn 2025-07-09 17:45:08,027 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d3/dfs/dn, DS-1e0d1de4-58e1-476f-9d43-f8e6a357b221): finished scanning block pool 
BP-1290524523-192.168.158.1-1752100694127 2025-07-09 17:45:08,027 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d2/dfs/dn, DS-2c424dce-012f-41dd-b422-62baf2440179): finished scanning block pool BP-1290524523-192.168.158.1-1752100694127 2025-07-09 17:45:08,027 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d1/dfs/dn, DS-ce66e326-7944-4aba-923e-338e673949ad): finished scanning block pool BP-1290524523-192.168.158.1-1752100694127 2025-07-09 17:45:08,027 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d4/dfs/dn, DS-b54a869d-4580-409c-a867-264df6e81d91): finished scanning block pool BP-1290524523-192.168.158.1-1752100694127 2025-07-09 17:45:08,037 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: Periodic Directory Tree Verification scan starting at 7/9/25 9:45 PM with interval of 21600000ms 2025-07-09 17:45:08,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool BP-1290524523-192.168.158.1-1752100694127 (Datanode Uuid 16d8ffe4-1b83-4b60-bef2-bc2c9affa424) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 beginning handshake with NN 2025-07-09 17:45:08,051 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d4/dfs/dn, DS-b54a869d-4580-409c-a867-264df6e81d91): no suitable block pools found to scan. Waiting 1814399968 ms. 2025-07-09 17:45:08,051 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d1/dfs/dn, DS-ce66e326-7944-4aba-923e-338e673949ad): no suitable block pools found to scan. Waiting 1814399968 ms. 2025-07-09 17:45:08,051 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d3/dfs/dn, DS-1e0d1de4-58e1-476f-9d43-f8e6a357b221): no suitable block pools found to scan. Waiting 1814399968 ms. 
2025-07-09 17:45:08,051 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d2/dfs/dn, DS-2c424dce-012f-41dd-b422-62baf2440179): no suitable block pools found to scan. Waiting 1814399968 ms. 2025-07-09 17:45:08,100 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool Block pool BP-1290524523-192.168.158.1-1752100694127 (Datanode Uuid 16d8ffe4-1b83-4b60-bef2-bc2c9affa424) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 successfully registered with NN 2025-07-09 17:45:08,102 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: For namenode dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000 2025-07-09 17:45:08,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting IBR Task Handler. 2025-07-09 17:45:08,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x5c4d10c1e37de2b6, containing 4 storage report(s), of which we sent 4. The reports had 0 total blocks and used 1 RPC(s). This took 8 msec to generate and 105 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5. 
2025-07-09 17:45:08,337 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1290524523-192.168.158.1-1752100694127
2025-07-09 17:48:02,861 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: RECEIVED SIGNAL 15: SIGTERM
2025-07-09 17:48:02,867 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at dmidlkprdls04.svr.luc.edu/192.168.158.4
************************************************************/
2025-07-09 17:59:15,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG: host = dmidlkprdls04.svr.luc.edu/192.168.158.4
STARTUP_MSG: args = []
STARTUP_MSG: version = 3.1.1.7.3.1.0-197
STARTUP_MSG: classpath = /var/run/cloudera-scm-agent/process/186-hdfs-DATANODE:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/aws-java-sdk-bundle-1.12.720.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-hdfs-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-plugin-classloader-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-yarn-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/azure-data-lake-store-sdk-2.3.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-
1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-annotations-2.12.7.jar:/opt/clo
udera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371
244/lib/hadoop/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jline-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jul-to-slf4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-config
-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/logredactor-2.0.16.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4
j-reload4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-api-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop
/lib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/wildfly-openssl-2.1.4.ClouderaFinal.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper-jute.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-shaded.jar:/opt/cloud
era/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//ozone-filesystem-hadoop3-1.3.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3
.1.p0.60371244/lib/hadoop/.//parquet-thrift.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-scala_2.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-protobuf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-jackson.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-generator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-format-structures.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-encoding.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-column.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/avro-1.11.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/opt/cloudera/p
arcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpcore-4.4.
13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.6037124
4/lib/hadoop-hdfs/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jline-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/json-simple-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7
.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/leveldbjni-cldr-1.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/had
oop-hdfs/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/p
arcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zo
okeeper-jute-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0
.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//asm-5.0.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjweaver-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-storage-7.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//checker-compat-qual-2.5.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-slf4j-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-system-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//google-extensions-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-keyvault-core-1.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//accessors-smart-2.4.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin-3.1.
1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce
-client-hs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack-3.1
.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ojalgo-43.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//kafka-clients-2.8.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-core-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-abfs-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//forbiddenapis-3.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-intg-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-api-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//zstd-jni-1.4.9-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-i18n.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/./
/ranger-raz-s3-lib-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//javax.activation-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//bundle-2.23.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//json-smart-2.4.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-util-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-shell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-cloud-bindings.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-s3-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjrt-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/swagger-annotations-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/objenesis-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-client-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.activation-api-1.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-dataformat-yaml-2.9.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcutil-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcprov-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcpkix-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/HikariCP-java7-2.4.12.jar:/opt/cloudera/parcels/CDH-7.3.
1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/snakeyaml-2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jsonschema2pojo-core-1.0.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/joda-time-2.10.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jna-5.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-guice-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-module-jaxb-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-json-provider-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-base-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-servlet-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/fst-2.50.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/ehcache-3.3.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/dnsjava-2.1.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/codemodel-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera
/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/
lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager-1.0.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-3.1.1.7.3.1.0-197.jar:/opt/cloudera/cm/lib/plugins/event-publish-7.13.1-shaded.jar:/opt/cloudera/cm/lib/plugins/tt-instrumentation-7.13.1.jar 
STARTUP_MSG: build = git@github.infra.cloudera.com:CDH/hadoop.git -r 31a42fb39494f541ffae15c3c61185deeeacca86; compiled by 'jenkins' on 2024-12-04T01:09Z STARTUP_MSG: java = 1.8.0_432 ************************************************************/ 2025-07-09 17:59:15,242 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT] 2025-07-09 17:59:15,620 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d1/dfs/dn 2025-07-09 17:59:15,626 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d2/dfs/dn 2025-07-09 17:59:15,627 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d3/dfs/dn 2025-07-09 17:59:15,627 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d4/dfs/dn 2025-07-09 17:59:15,785 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties 2025-07-09 17:59:15,884 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s). 2025-07-09 17:59:15,884 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started 2025-07-09 17:59:16,292 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling 2025-07-09 17:59:16,317 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576 2025-07-09 17:59:16,324 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled. 
2025-07-09 17:59:16,326 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is dmidlkprdls04.svr.luc.edu 2025-07-09 17:59:16,326 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling 2025-07-09 17:59:16,332 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 4294967296 2025-07-09 17:59:16,361 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /192.168.158.4:9866 2025-07-09 17:59:16,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s 2025-07-09 17:59:16,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50 2025-07-09 17:59:16,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s 2025-07-09 17:59:16,369 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50 2025-07-09 17:59:16,369 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Listening on UNIX domain socket: /var/run/hdfs-sockets/dn 2025-07-09 17:59:16,421 INFO org.eclipse.jetty.util.log: Logging initialized @2510ms to org.eclipse.jetty.util.log.Slf4jLog 2025-07-09 17:59:16,545 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret 2025-07-09 17:59:16,554 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined 2025-07-09 17:59:16,565 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter) 2025-07-09 17:59:16,567 INFO org.apache.hadoop.security.HttpCrossOriginFilterInitializer: CORS filter not enabled. 
Please set hadoop.http.cross-origin.enabled to 'true' to enable it 2025-07-09 17:59:16,568 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context datanode 2025-07-09 17:59:16,569 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context logs 2025-07-09 17:59:16,569 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context static 2025-07-09 17:59:16,619 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 44537 2025-07-09 17:59:16,621 INFO org.eclipse.jetty.server.Server: jetty-9.4.54.v20240208; built: 2024-02-08T19:42:39.027Z; git: cef3fbd6d736a21e7d541a5db490381d95a2047d; jvm 1.8.0_432-b06 2025-07-09 17:59:16,678 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0 2025-07-09 17:59:16,679 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults 2025-07-09 17:59:16,682 INFO org.eclipse.jetty.server.session: node0 Scavenging every 660000ms 2025-07-09 17:59:16,711 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. 
Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret 2025-07-09 17:59:16,717 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@1b11ef33{logs,/logs,file:///var/log/hadoop-hdfs/,AVAILABLE} 2025-07-09 17:59:16,718 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@2f2bf0e2{static,/static,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/static/,AVAILABLE} 2025-07-09 17:59:16,847 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@6ebd78d1{datanode,/,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode/,AVAILABLE}{file:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode} 2025-07-09 17:59:16,860 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@7fb9f71f{HTTP/1.1, (http/1.1)}{localhost:44537} 2025-07-09 17:59:16,861 INFO org.eclipse.jetty.server.Server: Started @2950ms 2025-07-09 17:59:17,169 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /192.168.158.4:9864 2025-07-09 17:59:17,179 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor 2025-07-09 17:59:17,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hdfs 2025-07-09 17:59:17,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup 2025-07-09 17:59:17,253 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler 2025-07-09 17:59:17,276 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867 2025-07-09 17:59:17,330 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /192.168.158.4:9867 2025-07-09 17:59:17,371 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null 2025-07-09 17:59:17,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices: 2025-07-09 17:59:17,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 starting to offer service 2025-07-09 17:59:17,410 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting 2025-07-09 17:59:17,410 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting 2025-07-09 17:59:18,613 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-09 17:59:19,615 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-09 17:59:19,776 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-09 17:59:24,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode during handshakeBlock pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-09 17:59:24,906 INFO org.apache.hadoop.hdfs.server.common.Storage: Using 4 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=4, dataDirs=4) 2025-07-09 17:59:24,913 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d1/dfs/dn/in_use.lock acquired by nodename 34049@dmidlkprdls04.svr.luc.edu 2025-07-09 17:59:24,914 INFO org.apache.hadoop.hdfs.server.common.Storage: Storage directory with location [DISK]file:/hdfs/d1/dfs/dn is not formatted for namespace 
2068539957. Formatting... 2025-07-09 17:59:24,916 INFO org.apache.hadoop.hdfs.server.common.Storage: Generated new storageID DS-c6caf9b4-0cd0-462e-a7af-39538ffb6d0e for directory /hdfs/d1/dfs/dn 2025-07-09 17:59:24,923 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d2/dfs/dn/in_use.lock acquired by nodename 34049@dmidlkprdls04.svr.luc.edu 2025-07-09 17:59:24,924 INFO org.apache.hadoop.hdfs.server.common.Storage: Storage directory with location [DISK]file:/hdfs/d2/dfs/dn is not formatted for namespace 2068539957. Formatting... 2025-07-09 17:59:24,925 INFO org.apache.hadoop.hdfs.server.common.Storage: Generated new storageID DS-ab1b4344-d9fe-4401-915a-b02983ca3944 for directory /hdfs/d2/dfs/dn 2025-07-09 17:59:24,927 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d3/dfs/dn/in_use.lock acquired by nodename 34049@dmidlkprdls04.svr.luc.edu 2025-07-09 17:59:24,927 INFO org.apache.hadoop.hdfs.server.common.Storage: Storage directory with location [DISK]file:/hdfs/d3/dfs/dn is not formatted for namespace 2068539957. Formatting... 2025-07-09 17:59:24,928 INFO org.apache.hadoop.hdfs.server.common.Storage: Generated new storageID DS-f02a6d6f-472c-481a-aa41-d58991ac764f for directory /hdfs/d3/dfs/dn 2025-07-09 17:59:24,930 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d4/dfs/dn/in_use.lock acquired by nodename 34049@dmidlkprdls04.svr.luc.edu 2025-07-09 17:59:24,930 INFO org.apache.hadoop.hdfs.server.common.Storage: Storage directory with location [DISK]file:/hdfs/d4/dfs/dn is not formatted for namespace 2068539957. Formatting... 
2025-07-09 17:59:24,931 INFO org.apache.hadoop.hdfs.server.common.Storage: Generated new storageID DS-e9eccc83-296b-4afa-bee5-915188e0d9a5 for directory /hdfs/d4/dfs/dn 2025-07-09 17:59:24,962 INFO org.apache.hadoop.hdfs.server.common.Storage: Analyzing storage directories for bpid BP-1059995147-192.168.158.1-1752101929360 2025-07-09 17:59:24,962 INFO org.apache.hadoop.hdfs.server.common.Storage: Locking is disabled for /hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360 2025-07-09 17:59:24,963 INFO org.apache.hadoop.hdfs.server.common.Storage: Block pool storage directory for location [DISK]file:/hdfs/d1/dfs/dn and block pool id BP-1059995147-192.168.158.1-1752101929360 is not formatted. Formatting ... 2025-07-09 17:59:24,963 INFO org.apache.hadoop.hdfs.server.common.Storage: Formatting block pool BP-1059995147-192.168.158.1-1752101929360 directory /hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current 2025-07-09 17:59:24,988 INFO org.apache.hadoop.hdfs.server.common.Storage: Analyzing storage directories for bpid BP-1059995147-192.168.158.1-1752101929360 2025-07-09 17:59:24,988 INFO org.apache.hadoop.hdfs.server.common.Storage: Locking is disabled for /hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360 2025-07-09 17:59:24,989 INFO org.apache.hadoop.hdfs.server.common.Storage: Block pool storage directory for location [DISK]file:/hdfs/d2/dfs/dn and block pool id BP-1059995147-192.168.158.1-1752101929360 is not formatted. Formatting ... 
2025-07-09 17:59:24,989 INFO org.apache.hadoop.hdfs.server.common.Storage: Formatting block pool BP-1059995147-192.168.158.1-1752101929360 directory /hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current 2025-07-09 17:59:25,011 INFO org.apache.hadoop.hdfs.server.common.Storage: Analyzing storage directories for bpid BP-1059995147-192.168.158.1-1752101929360 2025-07-09 17:59:25,012 INFO org.apache.hadoop.hdfs.server.common.Storage: Locking is disabled for /hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360 2025-07-09 17:59:25,012 INFO org.apache.hadoop.hdfs.server.common.Storage: Block pool storage directory for location [DISK]file:/hdfs/d3/dfs/dn and block pool id BP-1059995147-192.168.158.1-1752101929360 is not formatted. Formatting ... 2025-07-09 17:59:25,012 INFO org.apache.hadoop.hdfs.server.common.Storage: Formatting block pool BP-1059995147-192.168.158.1-1752101929360 directory /hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current 2025-07-09 17:59:25,034 INFO org.apache.hadoop.hdfs.server.common.Storage: Analyzing storage directories for bpid BP-1059995147-192.168.158.1-1752101929360 2025-07-09 17:59:25,035 INFO org.apache.hadoop.hdfs.server.common.Storage: Locking is disabled for /hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360 2025-07-09 17:59:25,035 INFO org.apache.hadoop.hdfs.server.common.Storage: Block pool storage directory for location [DISK]file:/hdfs/d4/dfs/dn and block pool id BP-1059995147-192.168.158.1-1752101929360 is not formatted. Formatting ... 
2025-07-09 17:59:25,035 INFO org.apache.hadoop.hdfs.server.common.Storage: Formatting block pool BP-1059995147-192.168.158.1-1752101929360 directory /hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current 2025-07-09 17:59:25,037 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Setting up storage: nsid=2068539957;bpid=BP-1059995147-192.168.158.1-1752101929360;lv=-57;nsInfo=lv=-64;cid=cluster59;nsid=2068539957;c=1752101929360;bpid=BP-1059995147-192.168.158.1-1752101929360;dnuuid=null 2025-07-09 17:59:25,039 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Generated and persisted new Datanode UUID be50c32a-aa23-4b9d-aa7f-05816b6e5f1a 2025-07-09 17:59:25,055 INFO org.apache.hadoop.conf.Configuration.deprecation: No unit for dfs.datanode.lock-reporting-threshold-ms(300) assuming MILLISECONDS 2025-07-09 17:59:25,058 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: The datanode lock is a read write lock 2025-07-09 17:59:25,105 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added new volume: DS-c6caf9b4-0cd0-462e-a7af-39538ffb6d0e 2025-07-09 17:59:25,105 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added volume - [DISK]file:/hdfs/d1/dfs/dn, StorageType: DISK 2025-07-09 17:59:25,107 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added new volume: DS-ab1b4344-d9fe-4401-915a-b02983ca3944 2025-07-09 17:59:25,107 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added volume - [DISK]file:/hdfs/d2/dfs/dn, StorageType: DISK 2025-07-09 17:59:25,109 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added new volume: DS-f02a6d6f-472c-481a-aa41-d58991ac764f 2025-07-09 17:59:25,109 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added volume - [DISK]file:/hdfs/d3/dfs/dn, StorageType: DISK 2025-07-09 17:59:25,111 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added new volume: DS-e9eccc83-296b-4afa-bee5-915188e0d9a5 2025-07-09 17:59:25,111 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added volume - [DISK]file:/hdfs/d4/dfs/dn, StorageType: DISK 2025-07-09 17:59:25,120 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Registered FSDatasetState MBean 2025-07-09 17:59:25,131 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding block pool BP-1059995147-192.168.158.1-1752101929360 2025-07-09 17:59:25,133 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Scanning block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d1/dfs/dn... 2025-07-09 17:59:25,133 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Scanning block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d2/dfs/dn... 2025-07-09 17:59:25,134 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Scanning block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d4/dfs/dn... 2025-07-09 17:59:25,133 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Scanning block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d3/dfs/dn... 
2025-07-09 17:59:25,193 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time taken to scan block pool BP-1059995147-192.168.158.1-1752101929360 on /hdfs/d2/dfs/dn: 58ms
2025-07-09 17:59:25,213 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time taken to scan block pool BP-1059995147-192.168.158.1-1752101929360 on /hdfs/d3/dfs/dn: 77ms
2025-07-09 17:59:25,215 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time taken to scan block pool BP-1059995147-192.168.158.1-1752101929360 on /hdfs/d1/dfs/dn: 81ms
2025-07-09 17:59:25,215 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time taken to scan block pool BP-1059995147-192.168.158.1-1752101929360 on /hdfs/d4/dfs/dn: 79ms
2025-07-09 17:59:25,215 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Total time to scan all replicas for block pool BP-1059995147-192.168.158.1-1752101929360: 84ms
2025-07-09 17:59:25,218 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding replicas to map for block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d1/dfs/dn...
2025-07-09 17:59:25,218 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding replicas to map for block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d2/dfs/dn...
2025-07-09 17:59:25,219 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.BlockPoolSlice: Replica Cache file: /hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/replicas doesn't exist
2025-07-09 17:59:25,219 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding replicas to map for block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d3/dfs/dn...
2025-07-09 17:59:25,220 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.BlockPoolSlice: Replica Cache file: /hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/replicas doesn't exist
2025-07-09 17:59:25,219 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.BlockPoolSlice: Replica Cache file: /hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/replicas doesn't exist
2025-07-09 17:59:25,219 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding replicas to map for block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d4/dfs/dn...
2025-07-09 17:59:25,221 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.BlockPoolSlice: Replica Cache file: /hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/replicas doesn't exist
2025-07-09 17:59:25,223 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time to add replicas to map for block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d4/dfs/dn: 2ms
2025-07-09 17:59:25,224 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time to add replicas to map for block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d2/dfs/dn: 6ms
2025-07-09 17:59:25,225 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time to add replicas to map for block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d1/dfs/dn: 6ms
2025-07-09 17:59:25,225 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time to add replicas to map for block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d3/dfs/dn: 6ms
2025-07-09 17:59:25,226 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Total time to add all replicas to map for block pool BP-1059995147-192.168.158.1-1752101929360: 9ms
2025-07-09 17:59:25,227 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for /hdfs/d1/dfs/dn
2025-07-09 17:59:25,240 INFO org.apache.hadoop.hdfs.server.datanode.checker.DatasetVolumeChecker: Scheduled health check for volume /hdfs/d1/dfs/dn
2025-07-09 17:59:25,242 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for /hdfs/d2/dfs/dn
2025-07-09 17:59:25,242 INFO org.apache.hadoop.hdfs.server.datanode.checker.DatasetVolumeChecker: Scheduled health check for volume /hdfs/d2/dfs/dn
2025-07-09 17:59:25,243 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for /hdfs/d3/dfs/dn
2025-07-09 17:59:25,243 INFO org.apache.hadoop.hdfs.server.datanode.checker.DatasetVolumeChecker: Scheduled health check for volume /hdfs/d3/dfs/dn
2025-07-09 17:59:25,243 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for /hdfs/d4/dfs/dn
2025-07-09 17:59:25,244 INFO org.apache.hadoop.hdfs.server.datanode.checker.DatasetVolumeChecker: Scheduled health check for volume /hdfs/d4/dfs/dn
2025-07-09 17:59:25,248 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: Now scanning bpid BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d2/dfs/dn
2025-07-09 17:59:25,248 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: Now scanning bpid BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d3/dfs/dn
2025-07-09 17:59:25,248 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: Now scanning bpid BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d4/dfs/dn
2025-07-09 17:59:25,248 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: Now scanning bpid BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d1/dfs/dn
2025-07-09 17:59:25,256 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d2/dfs/dn, DS-ab1b4344-d9fe-4401-915a-b02983ca3944): finished scanning block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-09 17:59:25,256 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d3/dfs/dn, DS-f02a6d6f-472c-481a-aa41-d58991ac764f): finished scanning block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-09 17:59:25,256 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d4/dfs/dn, DS-e9eccc83-296b-4afa-bee5-915188e0d9a5): finished scanning block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-09 17:59:25,256 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d1/dfs/dn, DS-c6caf9b4-0cd0-462e-a7af-39538ffb6d0e): finished scanning block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-09 17:59:25,264 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: Periodic Directory Tree Verification scan starting at 7/9/25 11:36 PM with interval of 21600000ms
2025-07-09 17:59:25,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool BP-1059995147-192.168.158.1-1752101929360 (Datanode Uuid be50c32a-aa23-4b9d-aa7f-05816b6e5f1a) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 beginning handshake with NN
2025-07-09 17:59:25,288 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d2/dfs/dn, DS-ab1b4344-d9fe-4401-915a-b02983ca3944): no suitable block pools found to scan. Waiting 1814399960 ms.
2025-07-09 17:59:25,288 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d1/dfs/dn, DS-c6caf9b4-0cd0-462e-a7af-39538ffb6d0e): no suitable block pools found to scan. Waiting 1814399960 ms.
2025-07-09 17:59:25,288 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d3/dfs/dn, DS-f02a6d6f-472c-481a-aa41-d58991ac764f): no suitable block pools found to scan. Waiting 1814399960 ms.
2025-07-09 17:59:25,288 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d4/dfs/dn, DS-e9eccc83-296b-4afa-bee5-915188e0d9a5): no suitable block pools found to scan. Waiting 1814399960 ms.
2025-07-09 17:59:25,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool BP-1059995147-192.168.158.1-1752101929360 (Datanode Uuid be50c32a-aa23-4b9d-aa7f-05816b6e5f1a) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 successfully registered with NN
2025-07-09 17:59:25,377 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: For namenode dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
2025-07-09 17:59:25,377 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting IBR Task Handler.
2025-07-09 17:59:25,641 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f25, containing 4 storage report(s), of which we sent 4. The reports had 0 total blocks and used 1 RPC(s). This took 7 msec to generate and 121 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-09 17:59:25,642 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-09 18:08:17,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073741970_1146 src: /192.168.158.1:48916 dest: /192.168.158.4:9866
2025-07-09 18:08:18,022 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DataNode{data=FSDataset{dirpath='[/hdfs/d1/dfs/dn, /hdfs/d2/dfs/dn, /hdfs/d3/dfs/dn, /hdfs/d4/dfs/dn]'}, localName='dmidlkprdls04.svr.luc.edu:9866', datanodeUuid='be50c32a-aa23-4b9d-aa7f-05816b6e5f1a', xmitsInProgress=0}:Exception transfering block BP-1059995147-192.168.158.1-1752101929360:blk_1073741970_1146 to mirror 192.168.158.9:9866
java.net.NoRouteToHostException: No route to host
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:205)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:586)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:550)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:794)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:178)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:112)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:291)
	at java.lang.Thread.run(Thread.java:750)
2025-07-09 18:08:18,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: opWriteBlock BP-1059995147-192.168.158.1-1752101929360:blk_1073741970_1146 received exception java.net.NoRouteToHostException: No route to host
2025-07-09 18:08:18,029 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: dmidlkprdls04.svr.luc.edu:9866:DataXceiver error processing WRITE_BLOCK operation  src: /192.168.158.1:48916 dst: /192.168.158.4:9866
java.net.NoRouteToHostException: No route to host
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:205)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:586)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:550)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:794)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:178)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:112)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:291)
	at java.lang.Thread.run(Thread.java:750)
2025-07-09 18:08:18,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073741971_1147 src: /192.168.158.1:48930 dest: /192.168.158.4:9866
2025-07-09 18:08:18,069 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DataNode{data=FSDataset{dirpath='[/hdfs/d1/dfs/dn, /hdfs/d2/dfs/dn, /hdfs/d3/dfs/dn, /hdfs/d4/dfs/dn]'}, localName='dmidlkprdls04.svr.luc.edu:9866', datanodeUuid='be50c32a-aa23-4b9d-aa7f-05816b6e5f1a', xmitsInProgress=0}:Exception transfering block BP-1059995147-192.168.158.1-1752101929360:blk_1073741971_1147 to mirror 192.168.158.7:9866
java.net.NoRouteToHostException: No route to host
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:205)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:586)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:550)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:794)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:178)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:112)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:291)
	at java.lang.Thread.run(Thread.java:750)
2025-07-09 18:08:18,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: opWriteBlock BP-1059995147-192.168.158.1-1752101929360:blk_1073741971_1147 received exception java.net.NoRouteToHostException: No route to host
2025-07-09 18:08:18,070 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: dmidlkprdls04.svr.luc.edu:9866:DataXceiver error processing WRITE_BLOCK operation  src: /192.168.158.1:48930 dst: /192.168.158.4:9866
java.net.NoRouteToHostException: No route to host
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:205)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:586)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:550)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:794)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:178)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:112)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:291)
	at java.lang.Thread.run(Thread.java:750)
2025-07-09 18:08:18,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073741972_1148 src: /192.168.158.1:48932 dest: /192.168.158.4:9866
2025-07-09 18:08:18,082 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DataNode{data=FSDataset{dirpath='[/hdfs/d1/dfs/dn, /hdfs/d2/dfs/dn, /hdfs/d3/dfs/dn, /hdfs/d4/dfs/dn]'}, localName='dmidlkprdls04.svr.luc.edu:9866', datanodeUuid='be50c32a-aa23-4b9d-aa7f-05816b6e5f1a', xmitsInProgress=0}:Exception transfering block BP-1059995147-192.168.158.1-1752101929360:blk_1073741972_1148 to mirror 192.168.158.6:9866
java.net.NoRouteToHostException: No route to host
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:205)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:586)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:550)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:794)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:178)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:112)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:291)
	at java.lang.Thread.run(Thread.java:750)
2025-07-09 18:08:18,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: opWriteBlock BP-1059995147-192.168.158.1-1752101929360:blk_1073741972_1148 received exception java.net.NoRouteToHostException: No route to host
2025-07-09 18:08:18,083 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: dmidlkprdls04.svr.luc.edu:9866:DataXceiver error processing WRITE_BLOCK operation  src: /192.168.158.1:48932 dst: /192.168.158.4:9866
java.net.NoRouteToHostException: No route to host
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:205)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:586)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:550)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:794)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:178)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:112)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:291)
	at java.lang.Thread.run(Thread.java:750)
2025-07-09 18:09:29,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073741974_1150 src: /192.168.158.6:38264 dest: /192.168.158.4:9866
2025-07-09 18:09:33,553 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38264, dest: /192.168.158.4:9866, bytes: 134217728, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-613726210_1, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073741974_1150, duration(ns): 3704528446
2025-07-09 18:09:33,553 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073741974_1150, type=LAST_IN_PIPELINE terminating
2025-07-09 18:09:33,594 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073741975_1151 src: /192.168.158.1:35916 dest: /192.168.158.4:9866
2025-07-09 18:09:37,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35916, dest: /192.168.158.4:9866, bytes: 134217728, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-613726210_1, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073741975_1151, duration(ns): 4180643683
2025-07-09 18:09:37,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073741975_1151, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-09 18:09:37,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073741976_1152 src: /192.168.158.5:48706 dest: /192.168.158.4:9866
2025-07-09 18:09:41,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48706, dest: /192.168.158.4:9866, bytes: 134217728, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-613726210_1, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073741976_1152, duration(ns): 3468878244
2025-07-09 18:09:41,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073741976_1152, type=LAST_IN_PIPELINE terminating
2025-07-09 18:09:41,489 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073741977_1153 src: /192.168.158.1:43580 dest: /192.168.158.4:9866
2025-07-09 18:09:45,022 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43580, dest: /192.168.158.4:9866, bytes: 134217728, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-613726210_1, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073741977_1153, duration(ns): 3519127644
2025-07-09 18:09:45,022 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073741977_1153, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-09 18:09:47,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073741979_1155 src: /192.168.158.5:39044 dest: /192.168.158.4:9866
2025-07-09 18:09:47,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39044, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1643012612_381, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073741979_1155, duration(ns): 39357752
2025-07-09 18:09:47,814 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073741979_1155, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-09 18:09:48,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073741980_1156 src: /192.168.158.7:52408 dest: /192.168.158.4:9866
2025-07-09 18:09:49,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073741981_1157 src: /192.168.158.7:52418 dest: /192.168.158.4:9866
2025-07-09 18:09:49,412 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52418, dest: /192.168.158.4:9866, bytes: 1, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_920403633_1, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073741981_1157, duration(ns): 15959628
2025-07-09 18:09:49,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073741981_1157, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-09 18:09:49,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073741982_1158 src: /192.168.158.6:41686 dest: /192.168.158.4:9866
2025-07-09 18:09:49,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41686, dest: /192.168.158.4:9866, bytes: 1, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_920403633_1, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073741982_1158, duration(ns): 9131097
2025-07-09 18:09:49,887 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073741982_1158, type=LAST_IN_PIPELINE terminating
2025-07-09 18:09:50,665 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073741983_1159 src: /192.168.158.6:41690 dest: /192.168.158.4:9866
2025-07-09 18:09:50,678 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41690, dest: /192.168.158.4:9866, bytes: 1, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_920403633_1, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073741983_1159, duration(ns): 10558356
2025-07-09 18:09:50,678 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073741983_1159, type=LAST_IN_PIPELINE terminating
2025-07-09 18:09:51,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073741984_1160 src: /192.168.158.6:41698 dest: /192.168.158.4:9866
2025-07-09 18:09:51,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41698, dest: /192.168.158.4:9866, bytes: 1, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_920403633_1, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073741984_1160, duration(ns): 11644512
2025-07-09 18:09:51,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073741984_1160, type=LAST_IN_PIPELINE terminating
2025-07-09 18:09:52,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52408, dest: /192.168.158.4:9866, bytes: 134217728, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-613726210_1, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073741980_1156, duration(ns): 3520267073
2025-07-09 18:09:52,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073741980_1156, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-09 18:09:55,658 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073741986_1162 src: /192.168.158.5:39050 dest: /192.168.158.4:9866
2025-07-09 18:09:59,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39050, dest: /192.168.158.4:9866, bytes: 134217728, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-613726210_1, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073741986_1162, duration(ns): 3789866402
2025-07-09 18:09:59,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073741986_1162, type=LAST_IN_PIPELINE terminating
2025-07-09 18:10:42,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073741989_1165 src: /192.168.158.1:58412 dest: /192.168.158.4:9866
2025-07-09 18:10:42,875 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58412, dest: /192.168.158.4:9866, bytes: 134217728, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1876684061_1, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073741989_1165, duration(ns): 236777922
2025-07-09 18:10:42,875 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073741989_1165, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-09 18:10:43,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073741992_1168 src: /192.168.158.1:58416 dest: /192.168.158.4:9866
2025-07-09 18:10:43,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58416, dest: /192.168.158.4:9866, bytes: 68526802, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1876684061_1, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073741992_1168, duration(ns): 131827279
2025-07-09 18:10:43,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073741992_1168, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-09 18:10:47,764 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073741993_1169 src: /192.168.158.6:50498 dest: /192.168.158.4:9866
2025-07-09 18:10:47,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50498, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_942294351_381, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073741993_1169, duration(ns): 15424906
2025-07-09 18:10:47,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073741993_1169, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-09 18:11:52,765 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073741995_1171 src: /192.168.158.6:57344 dest: /192.168.158.4:9866
2025-07-09 18:11:52,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57344, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1769839588_381, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073741995_1171, duration(ns): 18599688
2025-07-09 18:11:52,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073741995_1171, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-09 18:20:20,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742003_1179 src: /192.168.158.5:36490 dest: /192.168.158.4:9866
2025-07-09 18:20:20,672 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36490, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-997825366_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742003_1179, duration(ns): 15132521
2025-07-09 18:20:20,673 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742003_1179, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-09 18:22:25,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742005_1181 src: /192.168.158.7:34028 dest: /192.168.158.4:9866
2025-07-09 18:22:25,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34028, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2146097507_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742005_1181, duration(ns): 14344223
2025-07-09 18:22:25,663 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742005_1181, type=LAST_IN_PIPELINE terminating
2025-07-09 18:24:25,632 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742007_1183 src: /192.168.158.9:54050 dest: /192.168.158.4:9866
2025-07-09 18:24:25,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54050, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_916406574_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742007_1183, duration(ns): 13822950
2025-07-09 18:24:25,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742007_1183, type=LAST_IN_PIPELINE terminating
2025-07-09 18:26:25,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742009_1185 src: /192.168.158.6:49658 dest: /192.168.158.4:9866
2025-07-09 18:26:25,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49658, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-998415113_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742009_1185, duration(ns): 16722728
2025-07-09 18:26:25,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742009_1185, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-09 18:27:25,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742010_1186 src: /192.168.158.7:56516 dest: /192.168.158.4:9866
2025-07-09 18:27:25,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56516, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-496487264_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742010_1186, duration(ns): 15364955
2025-07-09 18:27:25,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742010_1186, type=LAST_IN_PIPELINE terminating
2025-07-09 18:32:16,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742015_1191 src: /192.168.158.6:54390 dest: /192.168.158.4:9866
2025-07-09 18:32:16,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54390, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_610884486_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742015_1191, duration(ns): 14736265
2025-07-09 18:32:16,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742015_1191, type=LAST_IN_PIPELINE terminating
2025-07-09 18:35:26,484 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742018_1194 src: /192.168.158.9:49614 dest: /192.168.158.4:9866
2025-07-09 18:35:26,513 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49614, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1621889693_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742018_1194, duration(ns): 20742531
2025-07-09 18:35:26,513 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742018_1194, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-09 18:39:31,508 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742022_1198 src: /192.168.158.9:39736 dest: /192.168.158.4:9866
2025-07-09 18:39:31,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39736, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-331925291_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742022_1198, duration(ns): 14679931
2025-07-09 18:39:31,526 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742022_1198, type=LAST_IN_PIPELINE terminating
2025-07-09 18:40:31,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742023_1199 src: /192.168.158.9:56790 dest: /192.168.158.4:9866
2025-07-09 18:40:31,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56790, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1297297206_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742023_1199, duration(ns): 17313246
2025-07-09 18:40:31,526 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742023_1199, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-09 18:41:31,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742024_1200 src: /192.168.158.1:60088 dest: /192.168.158.4:9866
2025-07-09 18:41:31,531 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60088, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1222116312_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742024_1200, duration(ns): 23497827
2025-07-09 18:41:31,531 INFO org.apache.hadoop.hdfs.server.datanode.DataNode:
PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742024_1200, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-09 18:43:31,488 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742026_1202 src: /192.168.158.1:32908 dest: /192.168.158.4:9866 2025-07-09 18:43:31,522 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:32908, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1228526466_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742026_1202, duration(ns): 20275564 2025-07-09 18:43:31,522 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742026_1202, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-09 18:45:36,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742028_1204 src: /192.168.158.1:50178 dest: /192.168.158.4:9866 2025-07-09 18:45:36,551 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50178, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1664979317_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742028_1204, duration(ns): 21623732 2025-07-09 18:45:36,551 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742028_1204, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-09 18:46:36,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742029_1205 src: 
/192.168.158.1:50922 dest: /192.168.158.4:9866 2025-07-09 18:46:36,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50922, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_132856744_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742029_1205, duration(ns): 25400757 2025-07-09 18:46:36,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742029_1205, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-09 18:47:41,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742030_1206 src: /192.168.158.7:39356 dest: /192.168.158.4:9866 2025-07-09 18:47:41,522 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39356, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1017996551_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742030_1206, duration(ns): 15176918 2025-07-09 18:47:41,522 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742030_1206, type=LAST_IN_PIPELINE terminating 2025-07-09 18:48:41,509 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742031_1207 src: /192.168.158.8:56372 dest: /192.168.158.4:9866 2025-07-09 18:48:41,526 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56372, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1853643098_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742031_1207, duration(ns): 
14359112 2025-07-09 18:48:41,527 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742031_1207, type=LAST_IN_PIPELINE terminating 2025-07-09 18:49:41,506 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742032_1208 src: /192.168.158.6:38058 dest: /192.168.158.4:9866 2025-07-09 18:49:41,524 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38058, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2049300268_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742032_1208, duration(ns): 14411865 2025-07-09 18:49:41,524 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742032_1208, type=LAST_IN_PIPELINE terminating 2025-07-09 18:50:46,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742033_1209 src: /192.168.158.7:37058 dest: /192.168.158.4:9866 2025-07-09 18:50:46,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37058, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_331202703_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742033_1209, duration(ns): 19056069 2025-07-09 18:50:46,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742033_1209, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-09 18:51:16,037 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742034_1210 src: /192.168.158.5:52374 dest: /192.168.158.4:9866 2025-07-09 18:51:16,097 
INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52374, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1838817937_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742034_1210, duration(ns): 52459218 2025-07-09 18:51:16,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742034_1210, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-09 18:54:21,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742037_1213 src: /192.168.158.1:45656 dest: /192.168.158.4:9866 2025-07-09 18:54:21,736 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45656, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1666481024_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742037_1213, duration(ns): 64501030 2025-07-09 18:54:21,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742037_1213, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-09 18:55:26,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742038_1214 src: /192.168.158.1:54584 dest: /192.168.158.4:9866 2025-07-09 18:55:26,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54584, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1847493722_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742038_1214, duration(ns): 23074378 2025-07-09 
18:55:26,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742038_1214, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-09 18:56:26,642 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742039_1215 src: /192.168.158.5:49426 dest: /192.168.158.4:9866 2025-07-09 18:56:26,661 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49426, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1894779713_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742039_1215, duration(ns): 16087230 2025-07-09 18:56:26,661 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742039_1215, type=LAST_IN_PIPELINE terminating 2025-07-09 18:57:26,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742040_1216 src: /192.168.158.6:41224 dest: /192.168.158.4:9866 2025-07-09 18:57:26,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41224, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1366788961_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742040_1216, duration(ns): 17808630 2025-07-09 18:57:26,675 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742040_1216, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-09 18:58:26,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742041_1217 src: /192.168.158.9:41722 dest: 
/192.168.158.4:9866 2025-07-09 18:58:26,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41722, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_508021941_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742041_1217, duration(ns): 17580057 2025-07-09 18:58:26,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742041_1217, type=LAST_IN_PIPELINE terminating 2025-07-09 18:59:19,613 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742018_1194 replica FinalizedReplica, blk_1073742018_1194, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742018 for deletion 2025-07-09 18:59:19,617 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742022_1198 replica FinalizedReplica, blk_1073742022_1198, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742022 for deletion 2025-07-09 18:59:19,617 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742018_1194 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742018 2025-07-09 18:59:19,619 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742023_1199 replica FinalizedReplica, blk_1073742023_1199, FINALIZED getNumBytes() = 56 getBytesOnDisk() 
= 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742023 for deletion 2025-07-09 18:59:19,619 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742022_1198 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742022 2025-07-09 18:59:19,620 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742024_1200 replica FinalizedReplica, blk_1073742024_1200, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742024 for deletion 2025-07-09 18:59:19,621 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742023_1199 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742023 2025-07-09 18:59:19,621 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742026_1202 replica FinalizedReplica, blk_1073742026_1202, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742026 for deletion 2025-07-09 18:59:19,621 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742024_1200 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742024 2025-07-09 18:59:19,621 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742028_1204 replica FinalizedReplica, blk_1073742028_1204, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742028 for deletion 2025-07-09 18:59:19,622 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742026_1202 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742026 2025-07-09 18:59:19,622 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742028_1204 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742028 2025-07-09 18:59:19,622 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742029_1205 replica FinalizedReplica, blk_1073742029_1205, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742029 for deletion 2025-07-09 18:59:19,623 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742030_1206 replica FinalizedReplica, blk_1073742030_1206, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742030 for deletion 2025-07-09 18:59:19,623 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742029_1205 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742029 2025-07-09 18:59:19,623 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742031_1207 replica FinalizedReplica, blk_1073742031_1207, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742031 for deletion 2025-07-09 18:59:19,624 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742030_1206 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742030 2025-07-09 18:59:19,624 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742031_1207 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742031 2025-07-09 18:59:19,624 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742032_1208 replica FinalizedReplica, blk_1073742032_1208, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742032 for deletion 2025-07-09 18:59:19,625 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742033_1209 replica FinalizedReplica, blk_1073742033_1209, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742033 for deletion 2025-07-09 18:59:19,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742032_1208 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742032 2025-07-09 18:59:19,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073741970_1146 replica ReplicaBeingWritten, blk_1073741970_1146, RBW getNumBytes() = 0 getBytesOnDisk() = 0 getVisibleLength()= 0 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/rbw/blk_1073741970 bytesAcked=0 bytesOnDisk=0 for deletion 2025-07-09 18:59:19,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742033_1209 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742033 2025-07-09 18:59:19,626 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742034_1210 replica FinalizedReplica, blk_1073742034_1210, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742034 for deletion 2025-07-09 18:59:19,626 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073741970_1146 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/rbw/blk_1073741970 2025-07-09 18:59:19,626 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073741971_1147 replica ReplicaBeingWritten, blk_1073741971_1147, RBW getNumBytes() = 0 getBytesOnDisk() = 0 getVisibleLength()= 0 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/rbw/blk_1073741971 bytesAcked=0 bytesOnDisk=0 for deletion 2025-07-09 18:59:19,627 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742034_1210 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742034 2025-07-09 18:59:19,627 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073741971_1147 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/rbw/blk_1073741971 2025-07-09 18:59:19,627 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073741972_1148 replica ReplicaBeingWritten, blk_1073741972_1148, RBW getNumBytes() = 0 getBytesOnDisk() = 0 getVisibleLength()= 0 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/rbw/blk_1073741972 bytesAcked=0 bytesOnDisk=0 for deletion 2025-07-09 18:59:19,627 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742037_1213 replica FinalizedReplica, blk_1073742037_1213, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742037 for deletion 2025-07-09 18:59:19,628 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073741972_1148 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/rbw/blk_1073741972 2025-07-09 18:59:19,628 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742038_1214 replica FinalizedReplica, blk_1073742038_1214, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742038 for deletion 2025-07-09 18:59:19,628 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742037_1213 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742037 2025-07-09 18:59:19,628 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742039_1215 replica FinalizedReplica, blk_1073742039_1215, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742039 for deletion 2025-07-09 18:59:19,628 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742038_1214 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742038 2025-07-09 18:59:19,629 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742040_1216 replica FinalizedReplica, blk_1073742040_1216, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742040 for deletion 2025-07-09 18:59:19,629 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742039_1215 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742039 2025-07-09 18:59:19,629 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742041_1217 replica FinalizedReplica, blk_1073742041_1217, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742041 for deletion 2025-07-09 18:59:19,630 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742040_1216 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742040 2025-07-09 18:59:19,630 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073741979_1155 replica FinalizedReplica, blk_1073741979_1155, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073741979 for deletion 2025-07-09 18:59:19,630 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742041_1217 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742041
2025-07-09 18:59:19,631 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073741979_1155 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073741979
2025-07-09 18:59:19,631 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073741981_1157 replica FinalizedReplica, blk_1073741981_1157, FINALIZED getNumBytes() = 1 getBytesOnDisk() = 1 getVisibleLength()= 1 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073741981 for deletion
2025-07-09 18:59:19,631 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073741981_1157 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073741981
2025-07-09 18:59:19,631 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073741982_1158 replica FinalizedReplica, blk_1073741982_1158, FINALIZED getNumBytes() = 1 getBytesOnDisk() = 1 getVisibleLength()= 1 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073741982 for deletion
2025-07-09 18:59:19,632 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073741982_1158 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073741982
2025-07-09 18:59:19,632 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073741983_1159 replica FinalizedReplica, blk_1073741983_1159, FINALIZED getNumBytes() = 1 getBytesOnDisk() = 1 getVisibleLength()= 1 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073741983 for deletion
2025-07-09 18:59:19,633 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073741984_1160 replica FinalizedReplica, blk_1073741984_1160, FINALIZED getNumBytes() = 1 getBytesOnDisk() = 1 getVisibleLength()= 1 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073741984 for deletion
2025-07-09 18:59:19,633 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073741983_1159 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073741983
2025-07-09 18:59:19,633 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073741993_1169 replica FinalizedReplica, blk_1073741993_1169, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073741993 for deletion
2025-07-09 18:59:19,633 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073741984_1160 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073741984
2025-07-09 18:59:19,634 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073741995_1171 replica FinalizedReplica, blk_1073741995_1171, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073741995 for deletion
2025-07-09 18:59:19,634 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073741993_1169 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073741993
2025-07-09 18:59:19,634 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742003_1179 replica FinalizedReplica, blk_1073742003_1179, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742003 for deletion
2025-07-09 18:59:19,634 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073741995_1171 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073741995
2025-07-09 18:59:19,634 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742003_1179 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742003
2025-07-09 18:59:19,635 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742005_1181 replica FinalizedReplica, blk_1073742005_1181, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742005 for deletion
2025-07-09 18:59:19,635 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742005_1181 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742005
2025-07-09 18:59:19,635 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742007_1183 replica FinalizedReplica, blk_1073742007_1183, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742007 for deletion
2025-07-09 18:59:19,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742009_1185 replica FinalizedReplica, blk_1073742009_1185, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742009 for deletion
2025-07-09 18:59:19,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742007_1183 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742007
2025-07-09 18:59:19,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742009_1185 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742009
2025-07-09 18:59:19,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742010_1186 replica FinalizedReplica, blk_1073742010_1186, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742010 for deletion
2025-07-09 18:59:19,637 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742015_1191 replica FinalizedReplica, blk_1073742015_1191, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742015 for deletion
2025-07-09 18:59:19,637 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742010_1186 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742010
2025-07-09 18:59:19,637 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742015_1191 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742015
2025-07-09 19:00:36,640 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742043_1219 src: /192.168.158.1:38560 dest: /192.168.158.4:9866
2025-07-09 19:00:36,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38560, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-388856925_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742043_1219, duration(ns): 21332776
2025-07-09 19:00:36,675 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742043_1219, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-09 19:00:37,572 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742043_1219 replica FinalizedReplica, blk_1073742043_1219, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742043 for deletion
2025-07-09 19:00:37,574 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742043_1219 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742043
2025-07-09 19:03:36,672 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742046_1222 src: /192.168.158.8:44540 dest: /192.168.158.4:9866
2025-07-09 19:03:36,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44540, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1169983757_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742046_1222, duration(ns): 17582482
2025-07-09 19:03:36,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742046_1222, type=LAST_IN_PIPELINE terminating
2025-07-09 19:03:40,586 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742046_1222 replica FinalizedReplica, blk_1073742046_1222, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742046 for deletion
2025-07-09 19:03:40,588 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742046_1222 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742046
2025-07-09 19:08:41,672 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742051_1227 src: /192.168.158.1:47198 dest: /192.168.158.4:9866
2025-07-09 19:08:41,712 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47198, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_990154125_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742051_1227, duration(ns): 25913865
2025-07-09 19:08:41,716 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742051_1227, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-09 19:08:43,606 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742051_1227 replica FinalizedReplica, blk_1073742051_1227, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742051 for deletion
2025-07-09 19:08:43,608 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742051_1227 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742051
2025-07-09 19:09:46,680 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742052_1228 src: /192.168.158.1:45500 dest: /192.168.158.4:9866
2025-07-09 19:09:46,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45500, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_291302125_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742052_1228, duration(ns): 24459037
2025-07-09 19:09:46,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742052_1228, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-09 19:09:49,613 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742052_1228 replica FinalizedReplica, blk_1073742052_1228, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742052 for deletion
2025-07-09 19:09:49,615 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742052_1228 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742052
2025-07-09 19:11:51,656 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742054_1230 src: /192.168.158.1:37432 dest: /192.168.158.4:9866
2025-07-09 19:11:51,690 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37432, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1883087963_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742054_1230, duration(ns): 21695935
2025-07-09 19:11:51,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742054_1230, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-09 19:11:55,627 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742054_1230 replica FinalizedReplica, blk_1073742054_1230, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742054 for deletion
2025-07-09 19:11:55,628 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742054_1230 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742054
2025-07-09 19:13:51,654 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742056_1232 src: /192.168.158.5:60154 dest: /192.168.158.4:9866
2025-07-09 19:13:51,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60154, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_963820564_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742056_1232, duration(ns): 20676137
2025-07-09 19:13:51,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742056_1232, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-09 19:13:52,637 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742056_1232 replica FinalizedReplica, blk_1073742056_1232, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742056 for deletion
2025-07-09 19:13:52,638 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742056_1232 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742056
2025-07-09 19:15:51,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742058_1234 src: /192.168.158.5:33730 dest: /192.168.158.4:9866
2025-07-09 19:15:51,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33730, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-81139065_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742058_1234, duration(ns): 19637417
2025-07-09 19:15:51,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742058_1234, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-09 19:15:52,645 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742058_1234 replica FinalizedReplica, blk_1073742058_1234, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742058 for deletion
2025-07-09 19:15:52,646 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742058_1234 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742058
2025-07-09 19:18:56,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742061_1237 src: /192.168.158.6:49632 dest: /192.168.158.4:9866
2025-07-09 19:18:56,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49632, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-263451574_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742061_1237, duration(ns): 14637160
2025-07-09 19:18:56,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742061_1237, type=LAST_IN_PIPELINE terminating
2025-07-09 19:18:58,655 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742061_1237 replica FinalizedReplica, blk_1073742061_1237, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742061 for deletion
2025-07-09 19:18:58,657 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742061_1237 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742061
2025-07-09 19:21:56,734 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742064_1240 src: /192.168.158.8:57502 dest: /192.168.158.4:9866
2025-07-09 19:21:56,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57502, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1746081487_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742064_1240, duration(ns): 15708732
2025-07-09 19:21:56,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742064_1240, type=LAST_IN_PIPELINE terminating
2025-07-09 19:22:01,670 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742064_1240 replica FinalizedReplica, blk_1073742064_1240, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742064 for deletion
2025-07-09 19:22:01,672 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742064_1240 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742064
2025-07-09 19:23:56,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742066_1242 src: /192.168.158.7:49208 dest: /192.168.158.4:9866
2025-07-09 19:23:56,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49208, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_176095087_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742066_1242, duration(ns): 21356230
2025-07-09 19:23:56,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742066_1242, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-09 19:23:58,681 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742066_1242 replica FinalizedReplica, blk_1073742066_1242, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742066 for deletion
2025-07-09 19:23:58,683 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742066_1242 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742066
2025-07-09 19:25:01,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742067_1243 src: /192.168.158.7:51700 dest: /192.168.158.4:9866
2025-07-09 19:25:01,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51700, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-172441293_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742067_1243, duration(ns): 17759444
2025-07-09 19:25:01,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742067_1243, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-09 19:25:07,682 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742067_1243 replica FinalizedReplica, blk_1073742067_1243, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742067 for deletion
2025-07-09 19:25:07,684 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742067_1243 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742067
2025-07-09 19:29:11,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742071_1247 src: /192.168.158.7:46302 dest: /192.168.158.4:9866
2025-07-09 19:29:11,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46302, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_404916666_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742071_1247, duration(ns): 16679595
2025-07-09 19:29:11,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742071_1247, type=LAST_IN_PIPELINE terminating
2025-07-09 19:29:16,699 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742071_1247 replica FinalizedReplica, blk_1073742071_1247, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742071 for deletion
2025-07-09 19:29:16,700 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742071_1247 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742071
2025-07-09 19:30:11,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742072_1248 src: /192.168.158.5:49988 dest: /192.168.158.4:9866
2025-07-09 19:30:11,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49988, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-23576088_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742072_1248, duration(ns): 17329368
2025-07-09 19:30:11,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742072_1248, type=LAST_IN_PIPELINE terminating
2025-07-09 19:30:13,705 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742072_1248 replica FinalizedReplica, blk_1073742072_1248, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742072 for deletion
2025-07-09 19:30:13,707 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742072_1248 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742072
2025-07-09 19:33:26,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742075_1251 src: /192.168.158.5:51802 dest: /192.168.158.4:9866
2025-07-09 19:33:26,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51802, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1207948971_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742075_1251, duration(ns): 19983542
2025-07-09 19:33:26,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742075_1251, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-09 19:33:28,718 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742075_1251 replica FinalizedReplica, blk_1073742075_1251, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742075 for deletion
2025-07-09 19:33:28,720 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742075_1251 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742075
2025-07-09 19:35:26,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742077_1253 src: /192.168.158.1:57894 dest: /192.168.158.4:9866
2025-07-09 19:35:26,733 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57894, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1675745849_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742077_1253, duration(ns): 25128619
2025-07-09 19:35:26,733 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742077_1253, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-09 19:35:28,725 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742077_1253 replica FinalizedReplica, blk_1073742077_1253, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742077 for deletion
2025-07-09 19:35:28,726 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742077_1253 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742077
2025-07-09 19:36:26,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742078_1254 src: /192.168.158.5:56500 dest: /192.168.158.4:9866
2025-07-09 19:36:26,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56500, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2045829131_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742078_1254, duration(ns): 19776452
2025-07-09 19:36:26,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742078_1254, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-09 19:36:31,733 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742078_1254 replica FinalizedReplica, blk_1073742078_1254, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742078 for deletion
2025-07-09 19:36:31,735 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742078_1254 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073742078
2025-07-09 19:38:26,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742080_1256 src: /192.168.158.1:36322 dest: /192.168.158.4:9866
2025-07-09 19:38:26,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36322, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1403878554_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742080_1256, duration(ns): 23042919
2025-07-09 19:38:26,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742080_1256, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-09 19:38:28,739 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742080_1256 replica FinalizedReplica, blk_1073742080_1256, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742080 for deletion
2025-07-09 19:38:28,741 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742080_1256 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742080
2025-07-09 19:40:31,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742082_1258 src: /192.168.158.1:53010 dest: /192.168.158.4:9866
2025-07-09 19:40:31,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53010, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-606109657_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742082_1258, duration(ns): 24328543
2025-07-09 19:40:31,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742082_1258, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-09 19:40:34,747 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742082_1258 replica FinalizedReplica, blk_1073742082_1258, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742082 for deletion
2025-07-09 19:40:34,749 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742082_1258 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742082
2025-07-09 19:41:31,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742083_1259 src: /192.168.158.1:47652 dest: /192.168.158.4:9866
2025-07-09 19:41:31,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47652, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-889755305_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742083_1259, duration(ns): 23457061
2025-07-09 19:41:31,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742083_1259, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-09 19:41:34,753 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742083_1259 replica FinalizedReplica, blk_1073742083_1259, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742083 for deletion
2025-07-09 19:41:34,754 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742083_1259 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742083
2025-07-09 19:42:36,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742084_1260 src: /192.168.158.8:58140 dest: /192.168.158.4:9866
2025-07-09 19:42:36,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58140, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2056941111_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742084_1260, duration(ns): 20265952
2025-07-09 19:42:36,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742084_1260, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-09 19:42:37,756 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742084_1260 replica FinalizedReplica, blk_1073742084_1260, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742084 for deletion
2025-07-09 19:42:37,757 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742084_1260 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742084
2025-07-09 19:44:36,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742086_1262 src: /192.168.158.6:47476 dest: /192.168.158.4:9866
2025-07-09 19:44:36,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47476, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_40871341_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742086_1262, duration(ns): 15417468
2025-07-09 19:44:36,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742086_1262,
type=LAST_IN_PIPELINE terminating 2025-07-09 19:44:37,767 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742086_1262 replica FinalizedReplica, blk_1073742086_1262, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742086 for deletion 2025-07-09 19:44:37,768 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742086_1262 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742086 2025-07-09 19:46:41,736 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742088_1264 src: /192.168.158.8:51338 dest: /192.168.158.4:9866 2025-07-09 19:46:41,765 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51338, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1224783946_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742088_1264, duration(ns): 20984295 2025-07-09 19:46:41,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742088_1264, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-09 19:46:43,768 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742088_1264 replica FinalizedReplica, blk_1073742088_1264, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742088 for deletion 2025-07-09 19:46:43,770 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742088_1264 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742088 2025-07-09 19:47:41,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742089_1265 src: /192.168.158.9:55944 dest: /192.168.158.4:9866 2025-07-09 19:47:41,746 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55944, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1039515031_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742089_1265, duration(ns): 17971444 2025-07-09 19:47:41,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742089_1265, type=LAST_IN_PIPELINE terminating 2025-07-09 19:47:43,769 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742089_1265 replica FinalizedReplica, blk_1073742089_1265, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742089 for deletion 2025-07-09 19:47:43,771 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742089_1265 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742089 2025-07-09 19:51:47,585 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742093_1269 src: /192.168.158.1:51602 dest: /192.168.158.4:9866 2025-07-09 19:51:47,620 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51602, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-169022900_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742093_1269, duration(ns): 23104946 2025-07-09 19:51:47,620 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742093_1269, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-09 19:51:52,786 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742093_1269 replica FinalizedReplica, blk_1073742093_1269, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742093 for deletion 2025-07-09 19:51:52,787 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742093_1269 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742093 2025-07-09 19:52:46,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742094_1270 src: /192.168.158.1:46832 dest: /192.168.158.4:9866 2025-07-09 19:52:46,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46832, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1622462378_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742094_1270, duration(ns): 29926587 2025-07-09 19:52:46,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742094_1270, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-09 19:52:49,792 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742094_1270 replica FinalizedReplica, blk_1073742094_1270, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742094 for deletion 2025-07-09 19:52:49,793 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742094_1270 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742094 2025-07-09 19:53:46,755 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742095_1271 src: /192.168.158.7:52336 dest: /192.168.158.4:9866 2025-07-09 19:53:46,778 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1755952176_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742095_1271, duration(ns): 20165658 2025-07-09 19:53:46,778 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742095_1271, type=LAST_IN_PIPELINE terminating 2025-07-09 19:53:52,796 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742095_1271 replica FinalizedReplica, blk_1073742095_1271, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742095 for deletion 2025-07-09 19:53:52,797 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742095_1271 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742095 2025-07-09 19:55:51,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742097_1273 src: /192.168.158.1:45868 dest: /192.168.158.4:9866 2025-07-09 19:55:51,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45868, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-235977881_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742097_1273, duration(ns): 20893431 2025-07-09 19:55:51,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742097_1273, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-09 19:55:52,810 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742097_1273 replica FinalizedReplica, blk_1073742097_1273, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742097 for deletion 2025-07-09 
19:55:52,811 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742097_1273 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742097 2025-07-09 19:56:56,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742098_1274 src: /192.168.158.8:44470 dest: /192.168.158.4:9866 2025-07-09 19:56:56,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44470, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1318177674_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742098_1274, duration(ns): 20427116 2025-07-09 19:56:56,772 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742098_1274, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-09 19:56:58,815 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742098_1274 replica FinalizedReplica, blk_1073742098_1274, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742098 for deletion 2025-07-09 19:56:58,817 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742098_1274 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742098 2025-07-09 19:57:56,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742099_1275 
src: /192.168.158.1:50836 dest: /192.168.158.4:9866 2025-07-09 19:57:56,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50836, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_414987068_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742099_1275, duration(ns): 25454554 2025-07-09 19:57:56,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742099_1275, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-09 19:57:58,824 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742099_1275 replica FinalizedReplica, blk_1073742099_1275, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742099 for deletion 2025-07-09 19:57:58,825 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742099_1275 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742099 2025-07-09 19:59:01,730 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742100_1276 src: /192.168.158.1:46748 dest: /192.168.158.4:9866 2025-07-09 19:59:01,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46748, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1467471030_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742100_1276, duration(ns): 28067888 
2025-07-09 19:59:01,772 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742100_1276, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-09 19:59:04,829 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742100_1276 replica FinalizedReplica, blk_1073742100_1276, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742100 for deletion 2025-07-09 19:59:04,831 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742100_1276 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742100 2025-07-09 20:00:01,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742101_1277 src: /192.168.158.1:33242 dest: /192.168.158.4:9866 2025-07-09 20:00:01,778 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33242, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1115488134_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742101_1277, duration(ns): 22642257 2025-07-09 20:00:01,778 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742101_1277, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-09 20:00:04,836 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742101_1277 replica FinalizedReplica, 
blk_1073742101_1277, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742101 for deletion 2025-07-09 20:00:04,837 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742101_1277 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742101 2025-07-09 20:03:01,736 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742104_1280 src: /192.168.158.7:33062 dest: /192.168.158.4:9866 2025-07-09 20:03:01,762 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33062, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1543140794_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742104_1280, duration(ns): 18417071 2025-07-09 20:03:01,762 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742104_1280, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-09 20:03:01,853 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742104_1280 replica FinalizedReplica, blk_1073742104_1280, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742104 for deletion 2025-07-09 20:03:01,854 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 
blk_1073742104_1280 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742104 2025-07-09 20:04:01,746 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742105_1281 src: /192.168.158.7:53058 dest: /192.168.158.4:9866 2025-07-09 20:04:01,774 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53058, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1983603942_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742105_1281, duration(ns): 20749695 2025-07-09 20:04:01,775 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742105_1281, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-09 20:04:07,858 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742105_1281 replica FinalizedReplica, blk_1073742105_1281, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742105 for deletion 2025-07-09 20:04:07,859 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742105_1281 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742105 2025-07-09 20:05:01,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742106_1282 src: /192.168.158.1:60062 dest: /192.168.158.4:9866 2025-07-09 20:05:01,773 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:60062, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1418210991_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742106_1282, duration(ns): 23034021 2025-07-09 20:05:01,773 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742106_1282, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-09 20:05:01,861 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742106_1282 replica FinalizedReplica, blk_1073742106_1282, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742106 for deletion 2025-07-09 20:05:01,862 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742106_1282 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742106 2025-07-09 20:06:06,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742107_1283 src: /192.168.158.1:46956 dest: /192.168.158.4:9866 2025-07-09 20:06:06,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46956, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-431867519_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742107_1283, duration(ns): 23955146 2025-07-09 20:06:06,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073742107_1283, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-09 20:06:07,863 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742107_1283 replica FinalizedReplica, blk_1073742107_1283, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742107 for deletion 2025-07-09 20:06:07,864 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742107_1283 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742107 2025-07-09 20:08:06,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742109_1285 src: /192.168.158.8:40328 dest: /192.168.158.4:9866 2025-07-09 20:08:06,770 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40328, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1524635689_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742109_1285, duration(ns): 19501302 2025-07-09 20:08:06,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742109_1285, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-09 20:08:07,869 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742109_1285 replica FinalizedReplica, blk_1073742109_1285, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742109 for deletion 2025-07-09 20:08:07,870 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742109_1285 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742109 2025-07-09 20:11:11,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742112_1288 src: /192.168.158.7:54306 dest: /192.168.158.4:9866 2025-07-09 20:11:11,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54306, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1273140340_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742112_1288, duration(ns): 18823029 2025-07-09 20:11:11,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742112_1288, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-09 20:11:13,878 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742112_1288 replica FinalizedReplica, blk_1073742112_1288, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742112 for deletion 2025-07-09 20:11:13,879 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742112_1288 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742112
2025-07-09 20:14:16,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742115_1291 src: /192.168.158.1:33730 dest: /192.168.158.4:9866
2025-07-09 20:14:16,801 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33730, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-373268009_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742115_1291, duration(ns): 22853448
2025-07-09 20:14:16,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742115_1291, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-09 20:14:19,896 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742115_1291 replica FinalizedReplica, blk_1073742115_1291, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742115 for deletion
2025-07-09 20:14:19,897 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742115_1291 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742115
2025-07-09 20:15:16,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742116_1292 src: /192.168.158.5:45978 dest: /192.168.158.4:9866
2025-07-09 20:15:16,770 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45978, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_817008886_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742116_1292, duration(ns): 18495163
2025-07-09 20:15:16,770 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742116_1292, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-09 20:15:16,899 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742116_1292 replica FinalizedReplica, blk_1073742116_1292, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742116 for deletion
2025-07-09 20:15:16,900 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742116_1292 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742116
2025-07-09 20:16:16,764 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742117_1293 src: /192.168.158.5:51028 dest: /192.168.158.4:9866
2025-07-09 20:16:16,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51028, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1062787286_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742117_1293, duration(ns): 16132657
2025-07-09 20:16:16,784 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742117_1293, type=LAST_IN_PIPELINE terminating
2025-07-09 20:16:16,900 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742117_1293 replica FinalizedReplica, blk_1073742117_1293, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742117 for deletion
2025-07-09 20:16:16,902 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742117_1293 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742117
2025-07-09 20:18:16,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742119_1295 src: /192.168.158.5:39570 dest: /192.168.158.4:9866
2025-07-09 20:18:16,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39570, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_630196146_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742119_1295, duration(ns): 15013367
2025-07-09 20:18:16,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742119_1295, type=LAST_IN_PIPELINE terminating
2025-07-09 20:18:16,903 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742119_1295 replica FinalizedReplica, blk_1073742119_1295, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742119 for deletion
2025-07-09 20:18:16,904 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742119_1295 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742119
2025-07-09 20:19:16,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742120_1296 src: /192.168.158.1:35800 dest: /192.168.158.4:9866
2025-07-09 20:19:16,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35800, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2038173437_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742120_1296, duration(ns): 24767275
2025-07-09 20:19:16,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742120_1296, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-09 20:19:16,903 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742120_1296 replica FinalizedReplica, blk_1073742120_1296, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742120 for deletion
2025-07-09 20:19:16,904 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742120_1296 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742120
2025-07-09 20:21:16,772 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742122_1298 src: /192.168.158.5:50840 dest: /192.168.158.4:9866
2025-07-09 20:21:16,800 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50840, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-235845767_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742122_1298, duration(ns): 21499648
2025-07-09 20:21:16,801 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742122_1298, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-09 20:21:16,909 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742122_1298 replica FinalizedReplica, blk_1073742122_1298, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742122 for deletion
2025-07-09 20:21:16,911 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742122_1298 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742122
2025-07-09 20:22:16,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742123_1299 src: /192.168.158.5:51280 dest: /192.168.158.4:9866
2025-07-09 20:22:16,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51280, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-125600709_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742123_1299, duration(ns): 18687711
2025-07-09 20:22:16,795 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742123_1299, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-09 20:22:16,910 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742123_1299 replica FinalizedReplica, blk_1073742123_1299, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742123 for deletion
2025-07-09 20:22:16,912 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742123_1299 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742123
2025-07-09 20:23:21,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742124_1300 src: /192.168.158.7:42670 dest: /192.168.158.4:9866
2025-07-09 20:23:21,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42670, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1116254819_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742124_1300, duration(ns): 16298449
2025-07-09 20:23:21,789 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742124_1300, type=LAST_IN_PIPELINE terminating
2025-07-09 20:23:22,915 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742124_1300 replica FinalizedReplica, blk_1073742124_1300, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742124 for deletion
2025-07-09 20:23:22,917 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742124_1300 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742124
2025-07-09 20:25:26,780 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742126_1302 src: /192.168.158.1:41082 dest: /192.168.158.4:9866
2025-07-09 20:25:26,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41082, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1745752594_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742126_1302, duration(ns): 25710094
2025-07-09 20:25:26,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742126_1302, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-09 20:25:31,917 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742126_1302 replica FinalizedReplica, blk_1073742126_1302, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742126 for deletion
2025-07-09 20:25:31,919 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742126_1302 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742126
2025-07-09 20:27:31,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742128_1304 src: /192.168.158.1:48240 dest: /192.168.158.4:9866
2025-07-09 20:27:31,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48240, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1772687577_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742128_1304, duration(ns): 23817747
2025-07-09 20:27:31,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742128_1304, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-09 20:27:31,926 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742128_1304 replica FinalizedReplica, blk_1073742128_1304, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742128 for deletion
2025-07-09 20:27:31,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742128_1304 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742128
2025-07-09 20:28:31,790 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742129_1305 src: /192.168.158.5:50280 dest: /192.168.158.4:9866
2025-07-09 20:28:31,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50280, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1892790313_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742129_1305, duration(ns): 19459442
2025-07-09 20:28:31,817 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742129_1305, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-09 20:28:34,931 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742129_1305 replica FinalizedReplica, blk_1073742129_1305, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742129 for deletion
2025-07-09 20:28:34,932 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742129_1305 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742129
2025-07-09 20:29:31,789 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742130_1306 src: /192.168.158.7:41648 dest: /192.168.158.4:9866
2025-07-09 20:29:31,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41648, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_197906705_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742130_1306, duration(ns): 16292927
2025-07-09 20:29:31,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742130_1306, type=LAST_IN_PIPELINE terminating
2025-07-09 20:29:34,934 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742130_1306 replica FinalizedReplica, blk_1073742130_1306, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742130 for deletion
2025-07-09 20:29:34,935 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742130_1306 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742130
2025-07-09 20:31:31,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742132_1308 src: /192.168.158.7:57150 dest: /192.168.158.4:9866
2025-07-09 20:31:31,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57150, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-767316992_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742132_1308, duration(ns): 20579989
2025-07-09 20:31:31,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742132_1308, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-09 20:31:31,938 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742132_1308 replica FinalizedReplica, blk_1073742132_1308, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742132 for deletion
2025-07-09 20:31:31,939 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742132_1308 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742132
2025-07-09 20:32:31,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742133_1309 src: /192.168.158.7:50572 dest: /192.168.158.4:9866
2025-07-09 20:32:31,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50572, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_508837169_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742133_1309, duration(ns): 16675600
2025-07-09 20:32:31,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742133_1309, type=LAST_IN_PIPELINE terminating
2025-07-09 20:32:34,940 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742133_1309 replica FinalizedReplica, blk_1073742133_1309, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742133 for deletion
2025-07-09 20:32:34,941 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742133_1309 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742133
2025-07-09 20:33:31,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742134_1310 src: /192.168.158.1:53904 dest: /192.168.158.4:9866
2025-07-09 20:33:31,823 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53904, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1605180434_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742134_1310, duration(ns): 20626730
2025-07-09 20:33:31,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742134_1310, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-09 20:33:34,945 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742134_1310 replica FinalizedReplica, blk_1073742134_1310, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742134 for deletion
2025-07-09 20:33:34,946 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742134_1310 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742134
2025-07-09 20:34:31,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742135_1311 src: /192.168.158.8:44106 dest: /192.168.158.4:9866
2025-07-09 20:34:31,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44106, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2121027126_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742135_1311, duration(ns): 17232838
2025-07-09 20:34:31,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742135_1311, type=LAST_IN_PIPELINE terminating
2025-07-09 20:34:31,948 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742135_1311 replica FinalizedReplica, blk_1073742135_1311, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742135 for deletion
2025-07-09 20:34:31,950 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742135_1311 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742135
2025-07-09 20:35:31,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742136_1312 src: /192.168.158.1:34870 dest: /192.168.158.4:9866
2025-07-09 20:35:31,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34870, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-731701858_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742136_1312, duration(ns): 23061666
2025-07-09 20:35:31,846 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742136_1312, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-09 20:35:31,954 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742136_1312 replica FinalizedReplica, blk_1073742136_1312, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742136 for deletion
2025-07-09 20:35:31,955 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742136_1312 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742136
2025-07-09 20:39:36,812 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742140_1316 src: /192.168.158.7:45872 dest: /192.168.158.4:9866
2025-07-09 20:39:36,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45872, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_796494773_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742140_1316, duration(ns): 15418769
2025-07-09 20:39:36,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742140_1316, type=LAST_IN_PIPELINE terminating
2025-07-09 20:39:37,966 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742140_1316 replica FinalizedReplica, blk_1073742140_1316, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742140 for deletion
2025-07-09 20:39:37,967 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742140_1316 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742140
2025-07-09 20:40:41,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742141_1317 src: /192.168.158.1:54178 dest: /192.168.158.4:9866
2025-07-09 20:40:41,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54178, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1046904384_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742141_1317, duration(ns): 23319746
2025-07-09 20:40:41,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742141_1317, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-09 20:40:46,970 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742141_1317 replica FinalizedReplica, blk_1073742141_1317, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742141 for deletion
2025-07-09 20:40:46,971 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742141_1317 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742141
2025-07-09 20:45:46,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742146_1322 src: /192.168.158.8:50738 dest: /192.168.158.4:9866
2025-07-09 20:45:46,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50738, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1984336890_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742146_1322, duration(ns): 22601845
2025-07-09 20:45:46,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742146_1322, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-09 20:45:46,985 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742146_1322 replica FinalizedReplica, blk_1073742146_1322, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742146 for deletion
2025-07-09 20:45:46,987 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742146_1322 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742146
2025-07-09 20:49:51,857 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742150_1326 src: /192.168.158.5:55540 dest: /192.168.158.4:9866
2025-07-09 20:49:51,872 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55540, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_998571383_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742150_1326, duration(ns): 12487878
2025-07-09 20:49:51,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742150_1326, type=LAST_IN_PIPELINE terminating
2025-07-09 20:49:55,992 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742150_1326 replica FinalizedReplica, blk_1073742150_1326, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742150 for deletion
2025-07-09 20:49:55,993 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742150_1326 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742150
2025-07-09 20:51:56,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742152_1328 src: /192.168.158.1:56206 dest: /192.168.158.4:9866
2025-07-09 20:51:56,865 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56206, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2050349253_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742152_1328, duration(ns): 23451988
2025-07-09 20:51:56,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742152_1328, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-09 20:52:02,001 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742152_1328 replica FinalizedReplica, blk_1073742152_1328, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742152 for deletion
2025-07-09 20:52:02,002 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742152_1328 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742152
2025-07-09 20:54:01,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742154_1330 src: /192.168.158.7:39438 dest: /192.168.158.4:9866
2025-07-09 20:54:01,851 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39438, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-665336454_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742154_1330, duration(ns): 15181226
2025-07-09 20:54:01,851 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742154_1330, type=LAST_IN_PIPELINE terminating
2025-07-09 20:54:02,009 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742154_1330 replica FinalizedReplica, blk_1073742154_1330, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742154 for deletion
2025-07-09 20:54:02,010 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742154_1330 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742154
2025-07-09 20:55:01,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742155_1331 src: /192.168.158.7:52932 dest: /192.168.158.4:9866
2025-07-09 20:55:01,851 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52932, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-594545230_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742155_1331, duration(ns): 17019691
2025-07-09 20:55:01,852 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742155_1331, type=LAST_IN_PIPELINE terminating
2025-07-09 20:55:05,014 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742155_1331 replica FinalizedReplica, blk_1073742155_1331, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742155 for deletion
2025-07-09 20:55:05,016 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742155_1331 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742155
2025-07-09 20:56:01,862 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742156_1332 src: /192.168.158.1:36244 dest: /192.168.158.4:9866
2025-07-09 20:56:01,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36244, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2102457564_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742156_1332, duration(ns): 24458123
2025-07-09 20:56:01,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742156_1332, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-09 20:56:02,020 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742156_1332 replica FinalizedReplica, blk_1073742156_1332, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742156 for deletion
2025-07-09 20:56:02,022 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742156_1332 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742156
2025-07-09 20:59:06,852 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742159_1335 src: /192.168.158.1:57532 dest: /192.168.158.4:9866
2025-07-09 20:59:06,888 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57532, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_486172001_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742159_1335, duration(ns): 23812441
2025-07-09 20:59:06,889 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742159_1335, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-09 20:59:08,037 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742159_1335 replica FinalizedReplica, blk_1073742159_1335, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742159 for deletion
2025-07-09 20:59:08,039 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742159_1335 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742159
2025-07-09 21:02:11,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742162_1338 src: /192.168.158.9:33814 dest: /192.168.158.4:9866
2025-07-09 21:02:11,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33814, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_488565269_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742162_1338, duration(ns): 13066477
2025-07-09 21:02:11,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742162_1338, type=LAST_IN_PIPELINE terminating
2025-07-09 21:02:14,047 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742162_1338 replica FinalizedReplica, blk_1073742162_1338, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742162 for deletion
2025-07-09
21:02:14,048 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742162_1338 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742162 2025-07-09 21:03:11,894 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742163_1339 src: /192.168.158.6:47408 dest: /192.168.158.4:9866 2025-07-09 21:03:11,920 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47408, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_652577176_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742163_1339, duration(ns): 19468339 2025-07-09 21:03:11,920 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742163_1339, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-09 21:03:17,056 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742163_1339 replica FinalizedReplica, blk_1073742163_1339, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742163 for deletion 2025-07-09 21:03:17,057 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742163_1339 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742163 2025-07-09 21:04:11,881 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742164_1340 src: 
/192.168.158.1:46260 dest: /192.168.158.4:9866 2025-07-09 21:04:11,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46260, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1262274182_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742164_1340, duration(ns): 26044699 2025-07-09 21:04:11,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742164_1340, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-09 21:04:14,058 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742164_1340 replica FinalizedReplica, blk_1073742164_1340, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742164 for deletion 2025-07-09 21:04:14,059 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742164_1340 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742164 2025-07-09 21:05:16,875 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742165_1341 src: /192.168.158.1:37538 dest: /192.168.158.4:9866 2025-07-09 21:05:16,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37538, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-972782509_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742165_1341, duration(ns): 23454443 2025-07-09 
21:05:16,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742165_1341, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-09 21:05:20,060 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742165_1341 replica FinalizedReplica, blk_1073742165_1341, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742165 for deletion 2025-07-09 21:05:20,061 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742165_1341 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742165 2025-07-09 21:06:16,862 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742166_1342 src: /192.168.158.1:41730 dest: /192.168.158.4:9866 2025-07-09 21:06:16,896 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41730, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_549788645_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742166_1342, duration(ns): 22882647 2025-07-09 21:06:16,897 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742166_1342, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-09 21:06:17,064 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742166_1342 replica FinalizedReplica, 
blk_1073742166_1342, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742166 for deletion 2025-07-09 21:06:17,066 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742166_1342 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742166 2025-07-09 21:08:16,883 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742168_1344 src: /192.168.158.1:39812 dest: /192.168.158.4:9866 2025-07-09 21:08:16,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39812, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_165252204_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742168_1344, duration(ns): 23467605 2025-07-09 21:08:16,920 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742168_1344, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-09 21:08:20,073 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742168_1344 replica FinalizedReplica, blk_1073742168_1344, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742168 for deletion 2025-07-09 21:08:20,074 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073742168_1344 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742168 2025-07-09 21:11:21,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742171_1347 src: /192.168.158.6:36424 dest: /192.168.158.4:9866 2025-07-09 21:11:21,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36424, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1222304818_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742171_1347, duration(ns): 20703858 2025-07-09 21:11:21,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742171_1347, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-09 21:11:23,087 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742171_1347 replica FinalizedReplica, blk_1073742171_1347, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742171 for deletion 2025-07-09 21:11:23,088 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742171_1347 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742171 2025-07-09 21:12:21,893 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742172_1348 src: /192.168.158.7:54188 dest: /192.168.158.4:9866 2025-07-09 21:12:21,918 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54188, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2107509066_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742172_1348, duration(ns): 17918690 2025-07-09 21:12:21,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742172_1348, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-09 21:12:26,086 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742172_1348 replica FinalizedReplica, blk_1073742172_1348, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742172 for deletion 2025-07-09 21:12:26,088 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742172_1348 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742172 2025-07-09 21:16:26,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742176_1352 src: /192.168.158.7:46710 dest: /192.168.158.4:9866 2025-07-09 21:16:26,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46710, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1275551020_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742176_1352, duration(ns): 19518138 2025-07-09 21:16:26,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073742176_1352, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-09 21:16:32,093 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742176_1352 replica FinalizedReplica, blk_1073742176_1352, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742176 for deletion 2025-07-09 21:16:32,094 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742176_1352 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742176 2025-07-09 21:17:26,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742177_1353 src: /192.168.158.6:49212 dest: /192.168.158.4:9866 2025-07-09 21:17:26,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49212, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1263129084_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742177_1353, duration(ns): 15457711 2025-07-09 21:17:26,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742177_1353, type=LAST_IN_PIPELINE terminating 2025-07-09 21:17:29,100 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742177_1353 replica FinalizedReplica, blk_1073742177_1353, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742177 for deletion 2025-07-09 21:17:29,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742177_1353 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742177 2025-07-09 21:20:41,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742180_1356 src: /192.168.158.6:43826 dest: /192.168.158.4:9866 2025-07-09 21:20:41,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43826, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1045726858_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742180_1356, duration(ns): 17759303 2025-07-09 21:20:41,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742180_1356, type=LAST_IN_PIPELINE terminating 2025-07-09 21:20:44,105 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742180_1356 replica FinalizedReplica, blk_1073742180_1356, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742180 for deletion 2025-07-09 21:20:44,106 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742180_1356 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742180 2025-07-09 21:21:46,895 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742181_1357 src: /192.168.158.9:40380 dest: /192.168.158.4:9866 2025-07-09 21:21:46,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40380, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2125107310_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742181_1357, duration(ns): 17881555 2025-07-09 21:21:46,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742181_1357, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-09 21:21:47,106 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742181_1357 replica FinalizedReplica, blk_1073742181_1357, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742181 for deletion 2025-07-09 21:21:47,107 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742181_1357 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742181 2025-07-09 21:25:51,913 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742185_1361 src: /192.168.158.9:36684 dest: /192.168.158.4:9866 2025-07-09 21:25:51,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36684, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_469659922_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, 
blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742185_1361, duration(ns): 14561608 2025-07-09 21:25:51,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742185_1361, type=LAST_IN_PIPELINE terminating 2025-07-09 21:25:56,125 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742185_1361 replica FinalizedReplica, blk_1073742185_1361, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742185 for deletion 2025-07-09 21:25:56,127 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742185_1361 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742185 2025-07-09 21:31:01,919 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742190_1366 src: /192.168.158.8:46556 dest: /192.168.158.4:9866 2025-07-09 21:31:01,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46556, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1950276255_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742190_1366, duration(ns): 13980742 2025-07-09 21:31:01,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742190_1366, type=LAST_IN_PIPELINE terminating 2025-07-09 21:31:05,144 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742190_1366 replica FinalizedReplica, blk_1073742190_1366, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742190 for deletion 2025-07-09 21:31:05,145 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742190_1366 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742190 2025-07-09 21:35:11,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742194_1370 src: /192.168.158.1:46888 dest: /192.168.158.4:9866 2025-07-09 21:35:11,950 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46888, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1778584644_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742194_1370, duration(ns): 21675476 2025-07-09 21:35:11,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742194_1370, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-09 21:35:14,159 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742194_1370 replica FinalizedReplica, blk_1073742194_1370, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742194 for deletion 2025-07-09 21:35:14,160 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742194_1370 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742194 2025-07-09 21:37:16,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742196_1372 src: /192.168.158.1:33722 dest: /192.168.158.4:9866 2025-07-09 21:37:16,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33722, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-684812852_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742196_1372, duration(ns): 23695117 2025-07-09 21:37:16,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742196_1372, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-09 21:37:17,169 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742196_1372 replica FinalizedReplica, blk_1073742196_1372, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742196 for deletion 2025-07-09 21:37:17,170 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742196_1372 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742196 2025-07-09 21:38:16,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742197_1373 src: /192.168.158.7:39576 dest: /192.168.158.4:9866 2025-07-09 21:38:16,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.7:39576, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1263385039_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742197_1373, duration(ns): 16358249 2025-07-09 21:38:16,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742197_1373, type=LAST_IN_PIPELINE terminating 2025-07-09 21:38:17,172 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742197_1373 replica FinalizedReplica, blk_1073742197_1373, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742197 for deletion 2025-07-09 21:38:17,174 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742197_1373 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742197 2025-07-09 21:39:16,920 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742198_1374 src: /192.168.158.7:60464 dest: /192.168.158.4:9866 2025-07-09 21:39:16,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60464, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1051455653_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742198_1374, duration(ns): 17521493 2025-07-09 21:39:16,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742198_1374, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=1:[192.168.158.5:9866] terminating
2025-07-09 21:39:20,173 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742198_1374 replica FinalizedReplica, blk_1073742198_1374, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742198 for deletion
2025-07-09 21:39:20,174 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742198_1374 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742198
2025-07-09 21:41:21,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742200_1376 src: /192.168.158.5:39472 dest: /192.168.158.4:9866
2025-07-09 21:41:21,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-421190510_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742200_1376, duration(ns): 20629018
2025-07-09 21:41:21,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742200_1376, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-09 21:41:23,179 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742200_1376 replica FinalizedReplica, blk_1073742200_1376, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742200 for deletion
2025-07-09 21:41:23,180 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742200_1376 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742200
2025-07-09 21:42:26,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742201_1377 src: /192.168.158.5:37912 dest: /192.168.158.4:9866
2025-07-09 21:42:26,964 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37912, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1652131642_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742201_1377, duration(ns): 19855118
2025-07-09 21:42:26,965 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742201_1377, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-09 21:42:32,183 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742201_1377 replica FinalizedReplica, blk_1073742201_1377, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742201 for deletion
2025-07-09 21:42:32,184 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742201_1377 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742201
2025-07-09 21:43:26,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742202_1378 src: /192.168.158.1:49178 dest: /192.168.158.4:9866
2025-07-09 21:43:26,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49178, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_979394217_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742202_1378, duration(ns): 23680320
2025-07-09 21:43:26,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742202_1378, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-09 21:43:29,185 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742202_1378 replica FinalizedReplica, blk_1073742202_1378, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742202 for deletion
2025-07-09 21:43:29,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742202_1378 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742202
2025-07-09 21:45:31,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742204_1380 src: /192.168.158.9:39792 dest: /192.168.158.4:9866
2025-07-09 21:45:31,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39792, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1912471732_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742204_1380, duration(ns): 20383658
2025-07-09 21:45:31,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742204_1380, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-09 21:45:32,193 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742204_1380 replica FinalizedReplica, blk_1073742204_1380, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742204 for deletion
2025-07-09 21:45:32,195 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742204_1380 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742204
2025-07-09 21:48:36,939 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742207_1383 src: /192.168.158.1:45802 dest: /192.168.158.4:9866
2025-07-09 21:48:36,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45802, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1225760272_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742207_1383, duration(ns): 25709021
2025-07-09 21:48:36,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742207_1383, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-09 21:48:38,202 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742207_1383 replica FinalizedReplica, blk_1073742207_1383, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742207 for deletion
2025-07-09 21:48:38,204 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742207_1383 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742207
2025-07-09 21:50:36,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742209_1385 src: /192.168.158.5:52478 dest: /192.168.158.4:9866
2025-07-09 21:50:36,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52478, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1063285255_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742209_1385, duration(ns): 19436171
2025-07-09 21:50:36,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742209_1385, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-09 21:50:41,206 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742209_1385 replica FinalizedReplica, blk_1073742209_1385, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742209 for deletion
2025-07-09 21:50:41,207 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742209_1385 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742209
2025-07-09 21:54:41,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742213_1389 src: /192.168.158.9:56798 dest: /192.168.158.4:9866
2025-07-09 21:54:41,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56798, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-630549780_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742213_1389, duration(ns): 14744190
2025-07-09 21:54:41,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742213_1389, type=LAST_IN_PIPELINE terminating
2025-07-09 21:54:47,229 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742213_1389 replica FinalizedReplica, blk_1073742213_1389, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742213 for deletion
2025-07-09 21:54:47,230 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742213_1389 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742213
2025-07-09 21:57:41,964 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742216_1392 src: /192.168.158.6:36378 dest: /192.168.158.4:9866
2025-07-09 21:57:41,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36378, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-970049433_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742216_1392, duration(ns): 19188117
2025-07-09 21:57:41,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742216_1392, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-09 21:57:44,242 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742216_1392 replica FinalizedReplica, blk_1073742216_1392, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742216 for deletion
2025-07-09 21:57:44,244 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742216_1392 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742216
2025-07-09 21:59:41,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742218_1394 src: /192.168.158.9:46538 dest: /192.168.158.4:9866
2025-07-09 21:59:41,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46538, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1841951362_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742218_1394, duration(ns): 15299250
2025-07-09 21:59:41,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742218_1394, type=LAST_IN_PIPELINE terminating
2025-07-09 21:59:44,249 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742218_1394 replica FinalizedReplica, blk_1073742218_1394, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742218 for deletion
2025-07-09 21:59:44,250 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742218_1394 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742218
2025-07-09 22:02:46,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742221_1397 src: /192.168.158.8:34854 dest: /192.168.158.4:9866
2025-07-09 22:02:47,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34854, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1646317095_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742221_1397, duration(ns): 19495804
2025-07-09 22:02:47,022 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742221_1397, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-09 22:02:50,260 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742221_1397 replica FinalizedReplica, blk_1073742221_1397, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742221 for deletion
2025-07-09 22:02:50,261 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742221_1397 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742221
2025-07-09 22:09:56,996 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742228_1404 src: /192.168.158.5:57438 dest: /192.168.158.4:9866
2025-07-09 22:09:57,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57438, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1775432599_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742228_1404, duration(ns): 16155533
2025-07-09 22:09:57,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742228_1404, type=LAST_IN_PIPELINE terminating
2025-07-09 22:10:02,273 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742228_1404 replica FinalizedReplica, blk_1073742228_1404, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742228 for deletion
2025-07-09 22:10:02,274 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742228_1404 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742228
2025-07-09 22:14:01,996 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742232_1408 src: /192.168.158.7:47188 dest: /192.168.158.4:9866
2025-07-09 22:14:02,017 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47188, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1524217986_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742232_1408, duration(ns): 18838499
2025-07-09 22:14:02,017 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742232_1408, type=LAST_IN_PIPELINE terminating
2025-07-09 22:14:02,283 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742232_1408 replica FinalizedReplica, blk_1073742232_1408, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742232 for deletion
2025-07-09 22:14:02,284 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742232_1408 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742232
2025-07-09 22:15:01,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742233_1409 src: /192.168.158.8:55806 dest: /192.168.158.4:9866
2025-07-09 22:15:02,013 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55806, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_435469131_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742233_1409, duration(ns): 17249826
2025-07-09 22:15:02,013 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742233_1409, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-09 22:15:05,284 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742233_1409 replica FinalizedReplica, blk_1073742233_1409, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742233 for deletion
2025-07-09 22:15:05,286 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742233_1409 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742233
2025-07-09 22:16:01,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742234_1410 src: /192.168.158.1:51608 dest: /192.168.158.4:9866
2025-07-09 22:16:02,034 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51608, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1266143919_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742234_1410, duration(ns): 23336260
2025-07-09 22:16:02,034 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742234_1410, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-09 22:16:02,287 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742234_1410 replica FinalizedReplica, blk_1073742234_1410, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742234 for deletion
2025-07-09 22:16:02,289 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742234_1410 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742234
2025-07-09 22:17:06,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742235_1411 src: /192.168.158.1:58644 dest: /192.168.158.4:9866
2025-07-09 22:17:07,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58644, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2066923511_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742235_1411, duration(ns): 22164394
2025-07-09 22:17:07,034 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742235_1411, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-09 22:17:11,291 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742235_1411 replica FinalizedReplica, blk_1073742235_1411, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742235 for deletion
2025-07-09 22:17:11,293 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742235_1411 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742235
2025-07-09 22:18:06,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742236_1412 src: /192.168.158.8:52428 dest: /192.168.158.4:9866
2025-07-09 22:18:07,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52428, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-238888325_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742236_1412, duration(ns): 17774161
2025-07-09 22:18:07,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742236_1412, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-09 22:18:08,294 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742236_1412 replica FinalizedReplica, blk_1073742236_1412, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742236 for deletion
2025-07-09 22:18:08,296 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742236_1412 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742236
2025-07-09 22:20:12,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742238_1414 src: /192.168.158.5:37604 dest: /192.168.158.4:9866
2025-07-09 22:20:12,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37604, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1776553482_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742238_1414, duration(ns): 14329452
2025-07-09 22:20:12,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742238_1414, type=LAST_IN_PIPELINE terminating
2025-07-09 22:20:14,301 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742238_1414 replica FinalizedReplica, blk_1073742238_1414, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742238 for deletion
2025-07-09 22:20:14,303 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742238_1414 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742238
2025-07-09 22:23:17,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742241_1417 src: /192.168.158.1:58122 dest: /192.168.158.4:9866
2025-07-09 22:23:17,037 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58122, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-655897412_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742241_1417, duration(ns): 21683072
2025-07-09 22:23:17,037 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742241_1417, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-09 22:23:17,306 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742241_1417 replica FinalizedReplica, blk_1073742241_1417, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742241 for deletion
2025-07-09 22:23:17,307 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742241_1417 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742241
2025-07-09 22:24:17,018 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742242_1418 src: /192.168.158.7:35292 dest: /192.168.158.4:9866
2025-07-09 22:24:17,043 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35292, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1725699228_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742242_1418, duration(ns): 18724791
2025-07-09 22:24:17,043 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742242_1418, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-09 22:24:17,311 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742242_1418 replica FinalizedReplica, blk_1073742242_1418, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742242 for deletion
2025-07-09 22:24:17,312 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742242_1418 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742242
2025-07-09 22:25:17,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742243_1419 src: /192.168.158.6:43266 dest: /192.168.158.4:9866
2025-07-09 22:25:17,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43266, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1700361842_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742243_1419, duration(ns): 18473197
2025-07-09 22:25:17,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742243_1419, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-09 22:25:20,315 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742243_1419 replica FinalizedReplica, blk_1073742243_1419, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742243 for deletion
2025-07-09 22:25:20,316 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742243_1419 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742243
2025-07-09 22:27:17,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742245_1421 src: /192.168.158.9:56530 dest: /192.168.158.4:9866
2025-07-09 22:27:17,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56530, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1768385329_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742245_1421, duration(ns): 15072895
2025-07-09 22:27:17,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742245_1421, type=LAST_IN_PIPELINE terminating
2025-07-09 22:27:20,316 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742245_1421 replica FinalizedReplica, blk_1073742245_1421, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742245 for deletion
2025-07-09 22:27:20,317 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742245_1421 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742245
2025-07-09 22:28:17,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742246_1422 src: /192.168.158.9:55726 dest: /192.168.158.4:9866
2025-07-09 22:28:17,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55726, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1507667807_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742246_1422, duration(ns): 19540349
2025-07-09 22:28:17,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742246_1422, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-09 22:28:23,318 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742246_1422 replica FinalizedReplica, blk_1073742246_1422, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742246 for deletion
2025-07-09 22:28:23,319 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742246_1422 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742246
2025-07-09 22:30:22,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742248_1424 src: /192.168.158.7:40540 dest: /192.168.158.4:9866
2025-07-09 22:30:22,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40540, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2027914652_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742248_1424, duration(ns): 18246897
2025-07-09 22:30:22,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742248_1424, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-09 22:30:26,322 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742248_1424 replica FinalizedReplica, blk_1073742248_1424, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742248 for deletion
2025-07-09 22:30:26,323 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742248_1424 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742248
2025-07-09 22:31:22,017 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742249_1425 src: /192.168.158.1:53802 dest: /192.168.158.4:9866
2025-07-09 22:31:22,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53802, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-651781717_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742249_1425, duration(ns): 23024805
2025-07-09 22:31:22,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742249_1425, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-09 22:31:29,324 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742249_1425 replica FinalizedReplica, blk_1073742249_1425, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742249 for deletion
2025-07-09 22:31:29,326 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742249_1425 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742249
2025-07-09 22:32:22,065 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742250_1426 src: /192.168.158.9:45574 dest: /192.168.158.4:9866
2025-07-09 22:32:22,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45574, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1026809701_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742250_1426, duration(ns): 15677239
2025-07-09 22:32:22,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742250_1426, type=LAST_IN_PIPELINE terminating
2025-07-09 22:32:26,326 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742250_1426 replica FinalizedReplica, blk_1073742250_1426, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742250 for deletion
2025-07-09 22:32:26,327 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742250_1426 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742250
2025-07-09 22:34:27,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742252_1428 src: /192.168.158.8:45928 dest: /192.168.158.4:9866
2025-07-09 22:34:27,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45928, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1017350064_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742252_1428, duration(ns): 15183124
2025-07-09 22:34:27,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742252_1428, type=LAST_IN_PIPELINE terminating
2025-07-09 22:34:35,331 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742252_1428 replica FinalizedReplica, blk_1073742252_1428, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742252 for deletion
2025-07-09 22:34:35,332 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742252_1428 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742252
2025-07-09 22:37:37,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742255_1431 src: /192.168.158.6:41256 dest: /192.168.158.4:9866
2025-07-09 22:37:37,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41256, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1842657991_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073742255_1431, duration(ns): 18635937 2025-07-09 22:37:37,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742255_1431, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-09 22:37:44,334 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742255_1431 replica FinalizedReplica, blk_1073742255_1431, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742255 for deletion 2025-07-09 22:37:44,335 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742255_1431 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742255 2025-07-09 22:38:37,039 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742256_1432 src: /192.168.158.6:58080 dest: /192.168.158.4:9866 2025-07-09 22:38:37,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58080, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-182164208_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742256_1432, duration(ns): 17810625 2025-07-09 22:38:37,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742256_1432, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-09 22:38:44,337 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling 
blk_1073742256_1432 replica FinalizedReplica, blk_1073742256_1432, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742256 for deletion 2025-07-09 22:38:44,338 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742256_1432 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742256 2025-07-09 22:39:37,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742257_1433 src: /192.168.158.1:44136 dest: /192.168.158.4:9866 2025-07-09 22:39:37,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_102532759_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742257_1433, duration(ns): 21663077 2025-07-09 22:39:37,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742257_1433, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-09 22:39:41,337 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742257_1433 replica FinalizedReplica, blk_1073742257_1433, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742257 for deletion 2025-07-09 22:39:41,338 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742257_1433 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742257 2025-07-09 22:41:37,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742259_1435 src: /192.168.158.1:43708 dest: /192.168.158.4:9866 2025-07-09 22:41:37,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43708, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1029529341_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742259_1435, duration(ns): 22654681 2025-07-09 22:41:37,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742259_1435, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-09 22:41:41,344 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742259_1435 replica FinalizedReplica, blk_1073742259_1435, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742259 for deletion 2025-07-09 22:41:41,345 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742259_1435 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742259 2025-07-09 22:45:47,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742263_1439 
src: /192.168.158.1:57272 dest: /192.168.158.4:9866 2025-07-09 22:45:47,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57272, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_243036258_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742263_1439, duration(ns): 19745471 2025-07-09 22:45:47,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742263_1439, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-09 22:45:50,355 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742263_1439 replica FinalizedReplica, blk_1073742263_1439, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742263 for deletion 2025-07-09 22:45:50,356 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742263_1439 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742263 2025-07-09 22:48:57,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742266_1442 src: /192.168.158.7:50582 dest: /192.168.158.4:9866 2025-07-09 22:48:57,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50582, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2109690017_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742266_1442, duration(ns): 15937439 
2025-07-09 22:48:57,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742266_1442, type=LAST_IN_PIPELINE terminating 2025-07-09 22:49:02,364 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742266_1442 replica FinalizedReplica, blk_1073742266_1442, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742266 for deletion 2025-07-09 22:49:02,365 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742266_1442 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742266 2025-07-09 22:50:57,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742268_1444 src: /192.168.158.1:38438 dest: /192.168.158.4:9866 2025-07-09 22:50:57,096 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38438, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2113926793_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742268_1444, duration(ns): 22498510 2025-07-09 22:50:57,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742268_1444, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-09 22:51:02,372 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742268_1444 replica FinalizedReplica, blk_1073742268_1444, FINALIZED getNumBytes() = 56 getBytesOnDisk() 
= 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742268 for deletion 2025-07-09 22:51:02,373 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742268_1444 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742268 2025-07-09 22:51:57,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742269_1445 src: /192.168.158.8:40924 dest: /192.168.158.4:9866 2025-07-09 22:51:57,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40924, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-117001434_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742269_1445, duration(ns): 19322018 2025-07-09 22:51:57,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742269_1445, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-09 22:52:02,377 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742269_1445 replica FinalizedReplica, blk_1073742269_1445, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742269 for deletion 2025-07-09 22:52:02,378 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742269_1445 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742269 2025-07-09 22:52:57,073 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742270_1446 src: /192.168.158.5:55622 dest: /192.168.158.4:9866 2025-07-09 22:52:57,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55622, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1944148126_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742270_1446, duration(ns): 17049613 2025-07-09 22:52:57,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742270_1446, type=LAST_IN_PIPELINE terminating 2025-07-09 22:53:02,378 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742270_1446 replica FinalizedReplica, blk_1073742270_1446, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742270 for deletion 2025-07-09 22:53:02,380 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742270_1446 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742270 2025-07-09 22:54:57,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742272_1448 src: /192.168.158.9:51532 dest: /192.168.158.4:9866 2025-07-09 22:54:57,095 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51532, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, 
cliID: DFSClient_NONMAPREDUCE_-1341760987_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742272_1448, duration(ns): 18610772 2025-07-09 22:54:57,095 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742272_1448, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-09 22:55:02,386 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742272_1448 replica FinalizedReplica, blk_1073742272_1448, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742272 for deletion 2025-07-09 22:55:02,387 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742272_1448 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742272 2025-07-09 22:56:57,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742274_1450 src: /192.168.158.1:53376 dest: /192.168.158.4:9866 2025-07-09 22:56:57,108 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53376, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2080266837_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742274_1450, duration(ns): 21696874 2025-07-09 22:56:57,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742274_1450, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] 
terminating 2025-07-09 22:57:02,396 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742274_1450 replica FinalizedReplica, blk_1073742274_1450, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742274 for deletion 2025-07-09 22:57:02,397 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742274_1450 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742274 2025-07-09 22:58:02,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742275_1451 src: /192.168.158.1:59082 dest: /192.168.158.4:9866 2025-07-09 22:58:02,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59082, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_443040155_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742275_1451, duration(ns): 23429440 2025-07-09 22:58:02,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742275_1451, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-09 22:58:05,400 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742275_1451 replica FinalizedReplica, blk_1073742275_1451, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742275 for deletion 2025-07-09 22:58:05,401 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742275_1451 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742275 2025-07-09 23:00:02,087 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742277_1453 src: /192.168.158.6:42308 dest: /192.168.158.4:9866 2025-07-09 23:00:02,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42308, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-378525122_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742277_1453, duration(ns): 21971419 2025-07-09 23:00:02,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742277_1453, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-09 23:00:05,407 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742277_1453 replica FinalizedReplica, blk_1073742277_1453, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742277 for deletion 2025-07-09 23:00:05,408 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742277_1453 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742277 
2025-07-09 23:02:07,082 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742279_1455 src: /192.168.158.9:50506 dest: /192.168.158.4:9866 2025-07-09 23:02:07,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50506, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1027069615_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742279_1455, duration(ns): 17413204 2025-07-09 23:02:07,108 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742279_1455, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-09 23:02:11,411 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742279_1455 replica FinalizedReplica, blk_1073742279_1455, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742279 for deletion 2025-07-09 23:02:11,413 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742279_1455 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742279 2025-07-09 23:03:07,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742280_1456 src: /192.168.158.6:46978 dest: /192.168.158.4:9866 2025-07-09 23:03:07,111 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46978, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1719134247_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742280_1456, duration(ns): 13869767 2025-07-09 23:03:07,111 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742280_1456, type=LAST_IN_PIPELINE terminating 2025-07-09 23:03:14,416 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742280_1456 replica FinalizedReplica, blk_1073742280_1456, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742280 for deletion 2025-07-09 23:03:14,417 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742280_1456 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742280 2025-07-09 23:05:07,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742282_1458 src: /192.168.158.1:37254 dest: /192.168.158.4:9866 2025-07-09 23:05:07,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37254, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1413979771_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742282_1458, duration(ns): 23668289 2025-07-09 23:05:07,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742282_1458, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-09 23:05:11,420 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742282_1458 replica FinalizedReplica, blk_1073742282_1458, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742282 for deletion 2025-07-09 23:05:11,421 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742282_1458 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742282 2025-07-09 23:06:12,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742283_1459 src: /192.168.158.8:50682 dest: /192.168.158.4:9866 2025-07-09 23:06:12,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50682, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1895899628_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742283_1459, duration(ns): 16400533 2025-07-09 23:06:12,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742283_1459, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-09 23:06:17,422 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742283_1459 replica FinalizedReplica, blk_1073742283_1459, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742283 for deletion 2025-07-09 23:06:17,424 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742283_1459 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742283
2025-07-09 23:07:17,100 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742284_1460 src: /192.168.158.6:53488 dest: /192.168.158.4:9866
2025-07-09 23:07:17,117 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53488, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1060119075_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742284_1460, duration(ns): 14585007
2025-07-09 23:07:17,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742284_1460, type=LAST_IN_PIPELINE terminating
2025-07-09 23:07:20,425 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742284_1460 replica FinalizedReplica, blk_1073742284_1460, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742284 for deletion
2025-07-09 23:07:20,427 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742284_1460 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742284
2025-07-09 23:08:22,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742285_1461 src: /192.168.158.5:58616 dest: /192.168.158.4:9866
2025-07-09 23:08:22,121 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58616, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-929565778_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742285_1461, duration(ns): 20472959
2025-07-09 23:08:22,121 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742285_1461, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-09 23:08:29,429 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742285_1461 replica FinalizedReplica, blk_1073742285_1461, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742285 for deletion
2025-07-09 23:08:29,430 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742285_1461 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742285
2025-07-09 23:11:27,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742288_1464 src: /192.168.158.5:33000 dest: /192.168.158.4:9866
2025-07-09 23:11:27,129 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33000, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1112178833_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742288_1464, duration(ns): 16857840
2025-07-09 23:11:27,129 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742288_1464, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-09 23:11:32,436 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742288_1464 replica FinalizedReplica, blk_1073742288_1464, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742288 for deletion
2025-07-09 23:11:32,438 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742288_1464 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742288
2025-07-09 23:13:27,110 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742290_1466 src: /192.168.158.6:41342 dest: /192.168.158.4:9866
2025-07-09 23:13:27,136 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41342, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_970247930_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742290_1466, duration(ns): 19167906
2025-07-09 23:13:27,136 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742290_1466, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-09 23:13:32,440 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742290_1466 replica FinalizedReplica, blk_1073742290_1466, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742290 for deletion
2025-07-09 23:13:32,441 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742290_1466 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742290
2025-07-09 23:14:27,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742291_1467 src: /192.168.158.6:41548 dest: /192.168.158.4:9866
2025-07-09 23:14:27,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41548, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-650612265_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742291_1467, duration(ns): 16098906
2025-07-09 23:14:27,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742291_1467, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-09 23:14:35,442 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742291_1467 replica FinalizedReplica, blk_1073742291_1467, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742291 for deletion
2025-07-09 23:14:35,443 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742291_1467 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742291
2025-07-09 23:15:27,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742292_1468 src: /192.168.158.8:48662 dest: /192.168.158.4:9866
2025-07-09 23:15:27,128 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48662, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-159363579_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742292_1468, duration(ns): 27068490
2025-07-09 23:15:27,128 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742292_1468, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-09 23:15:32,441 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742292_1468 replica FinalizedReplica, blk_1073742292_1468, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742292 for deletion
2025-07-09 23:15:32,443 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742292_1468 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742292
2025-07-09 23:19:37,134 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742296_1472 src: /192.168.158.1:54348 dest: /192.168.158.4:9866
2025-07-09 23:19:37,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54348, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2126639083_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742296_1472, duration(ns): 21740904
2025-07-09 23:19:37,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742296_1472, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-09 23:19:41,455 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742296_1472 replica FinalizedReplica, blk_1073742296_1472, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742296 for deletion
2025-07-09 23:19:41,456 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742296_1472 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742296
2025-07-09 23:20:42,141 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742297_1473 src: /192.168.158.6:56212 dest: /192.168.158.4:9866
2025-07-09 23:20:42,159 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56212, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1639244986_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742297_1473, duration(ns): 15603653
2025-07-09 23:20:42,159 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742297_1473, type=LAST_IN_PIPELINE terminating
2025-07-09 23:20:50,456 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742297_1473 replica FinalizedReplica, blk_1073742297_1473, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742297 for deletion
2025-07-09 23:20:50,458 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742297_1473 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742297
2025-07-09 23:22:42,134 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742299_1475 src: /192.168.158.9:60650 dest: /192.168.158.4:9866
2025-07-09 23:22:42,153 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60650, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-371739806_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742299_1475, duration(ns): 16244082
2025-07-09 23:22:42,153 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742299_1475, type=LAST_IN_PIPELINE terminating
2025-07-09 23:22:47,461 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742299_1475 replica FinalizedReplica, blk_1073742299_1475, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742299 for deletion
2025-07-09 23:22:47,462 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742299_1475 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742299
2025-07-09 23:24:42,129 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742301_1477 src: /192.168.158.9:60192 dest: /192.168.158.4:9866
2025-07-09 23:24:42,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60192, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-551697239_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742301_1477, duration(ns): 20247228
2025-07-09 23:24:42,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742301_1477, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-09 23:24:47,465 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742301_1477 replica FinalizedReplica, blk_1073742301_1477, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742301 for deletion
2025-07-09 23:24:47,467 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742301_1477 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742301
2025-07-09 23:26:42,139 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742303_1479 src: /192.168.158.9:56740 dest: /192.168.158.4:9866
2025-07-09 23:26:42,157 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56740, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1665495095_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742303_1479, duration(ns): 15496611
2025-07-09 23:26:42,157 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742303_1479, type=LAST_IN_PIPELINE terminating
2025-07-09 23:26:47,469 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742303_1479 replica FinalizedReplica, blk_1073742303_1479, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742303 for deletion
2025-07-09 23:26:47,471 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742303_1479 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742303
2025-07-09 23:27:42,135 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742304_1480 src: /192.168.158.1:54700 dest: /192.168.158.4:9866
2025-07-09 23:27:42,170 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54700, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2141592576_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742304_1480, duration(ns): 24548379
2025-07-09 23:27:42,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742304_1480, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-09 23:27:47,473 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742304_1480 replica FinalizedReplica, blk_1073742304_1480, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742304 for deletion
2025-07-09 23:27:47,474 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742304_1480 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742304
2025-07-09 23:28:42,149 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742305_1481 src: /192.168.158.6:34206 dest: /192.168.158.4:9866
2025-07-09 23:28:42,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34206, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_601547900_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742305_1481, duration(ns): 20578760
2025-07-09 23:28:42,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742305_1481, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-09 23:28:47,476 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742305_1481 replica FinalizedReplica, blk_1073742305_1481, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742305 for deletion
2025-07-09 23:28:47,477 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742305_1481 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742305
2025-07-09 23:29:47,135 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742306_1482 src: /192.168.158.9:45270 dest: /192.168.158.4:9866
2025-07-09 23:29:47,159 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45270, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1901420872_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742306_1482, duration(ns): 17826000
2025-07-09 23:29:47,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742306_1482, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-09 23:29:53,479 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742306_1482 replica FinalizedReplica, blk_1073742306_1482, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742306 for deletion
2025-07-09 23:29:53,480 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742306_1482 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742306
2025-07-09 23:33:57,152 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742310_1486 src: /192.168.158.1:41410 dest: /192.168.158.4:9866
2025-07-09 23:33:57,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41410, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_129933355_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742310_1486, duration(ns): 22576459
2025-07-09 23:33:57,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742310_1486, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-09 23:34:02,491 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742310_1486 replica FinalizedReplica, blk_1073742310_1486, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742310 for deletion
2025-07-09 23:34:02,492 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742310_1486 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742310
2025-07-09 23:34:57,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742311_1487 src: /192.168.158.9:51446 dest: /192.168.158.4:9866
2025-07-09 23:34:57,170 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51446, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1779846052_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742311_1487, duration(ns): 16529593
2025-07-09 23:34:57,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742311_1487, type=LAST_IN_PIPELINE terminating
2025-07-09 23:35:02,493 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742311_1487 replica FinalizedReplica, blk_1073742311_1487, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742311 for deletion
2025-07-09 23:35:02,495 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742311_1487 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742311
2025-07-09 23:35:57,149 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742312_1488 src: /192.168.158.1:50020 dest: /192.168.158.4:9866
2025-07-09 23:35:57,183 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50020, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1027447842_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742312_1488, duration(ns): 22704360
2025-07-09 23:35:57,183 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742312_1488, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-09 23:36:02,498 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742312_1488 replica FinalizedReplica, blk_1073742312_1488, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742312 for deletion
2025-07-09 23:36:02,499 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742312_1488 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742312
2025-07-09 23:36:13,274 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-09 23:37:20,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f26, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 1 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-09 23:37:20,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-09 23:44:07,168 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742320_1496 src: /192.168.158.6:50148 dest: /192.168.158.4:9866
2025-07-09 23:44:07,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50148, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2067109148_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742320_1496, duration(ns): 20768950
2025-07-09 23:44:07,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742320_1496, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-09 23:44:11,519 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742320_1496 replica FinalizedReplica, blk_1073742320_1496, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742320 for deletion
2025-07-09 23:44:11,521 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742320_1496 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742320
2025-07-09 23:46:12,172 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742322_1498 src: /192.168.158.6:40136 dest: /192.168.158.4:9866
2025-07-09 23:46:12,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2005885740_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742322_1498, duration(ns): 20563184
2025-07-09 23:46:12,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742322_1498, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-09 23:46:20,522 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742322_1498 replica FinalizedReplica, blk_1073742322_1498, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742322 for deletion
2025-07-09 23:46:20,523 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742322_1498 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742322
2025-07-09 23:47:12,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742323_1499 src: /192.168.158.1:48040 dest: /192.168.158.4:9866
2025-07-09 23:47:12,196 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48040, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_920303882_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742323_1499, duration(ns): 20972632
2025-07-09 23:47:12,196 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742323_1499, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-09 23:47:17,526 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742323_1499 replica FinalizedReplica, blk_1073742323_1499, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742323 for deletion
2025-07-09 23:47:17,527 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742323_1499 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742323
2025-07-09 23:48:12,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742324_1500 src: /192.168.158.1:49338 dest: /192.168.158.4:9866
2025-07-09 23:48:12,203 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49338, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2121037852_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742324_1500, duration(ns): 20761387
2025-07-09 23:48:12,203 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742324_1500, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-09 23:48:17,529 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742324_1500 replica FinalizedReplica, blk_1073742324_1500, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742324 for deletion
2025-07-09 23:48:17,530 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742324_1500 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742324
2025-07-09 23:51:12,194 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742327_1503 src: /192.168.158.6:53198 dest: /192.168.158.4:9866
2025-07-09 23:51:12,224 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53198, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_509367190_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742327_1503, duration(ns): 21083267
2025-07-09 23:51:12,225 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742327_1503, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-09 23:51:20,537 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742327_1503 replica FinalizedReplica, blk_1073742327_1503, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742327 for deletion
2025-07-09 23:51:20,538 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742327_1503 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742327
2025-07-09 23:53:17,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742329_1505 src: /192.168.158.1:34124 dest: /192.168.158.4:9866
2025-07-09 23:53:17,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34124, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_850905499_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742329_1505, duration(ns): 21845943
2025-07-09 23:53:17,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742329_1505, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-09 23:53:20,537 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742329_1505 replica FinalizedReplica, blk_1073742329_1505, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742329 for deletion
2025-07-09 23:53:20,538 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742329_1505 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742329
2025-07-09 23:59:22,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742335_1511 src: /192.168.158.1:36240 dest: /192.168.158.4:9866
2025-07-09 23:59:22,244 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36240, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_897334090_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742335_1511, duration(ns): 22541955
2025-07-09 23:59:22,244 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742335_1511, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-09 23:59:26,549 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742335_1511 replica FinalizedReplica, blk_1073742335_1511, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742335 for deletion
2025-07-09 23:59:26,550 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742335_1511 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073742335
2025-07-10 00:00:22,217 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742336_1512 src: /192.168.158.1:59608 dest: /192.168.158.4:9866
2025-07-10 00:00:22,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59608, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1246859698_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742336_1512, duration(ns): 24233012
2025-07-10 00:00:22,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742336_1512, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-10 00:00:29,552 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742336_1512 replica FinalizedReplica, blk_1073742336_1512, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742336 for deletion
2025-07-10 00:00:29,554 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742336_1512 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742336
2025-07-10 00:05:27,217 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742341_1517 src: /192.168.158.1:54826 dest: /192.168.158.4:9866
2025-07-10 00:05:27,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54826, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-39840168_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742341_1517, duration(ns): 22241752
2025-07-10 00:05:27,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742341_1517, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-10 00:05:35,565 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742341_1517 replica FinalizedReplica, blk_1073742341_1517, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 
getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742341 for deletion 2025-07-10 00:05:35,566 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742341_1517 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742341 2025-07-10 00:09:27,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742345_1521 src: /192.168.158.8:52462 dest: /192.168.158.4:9866 2025-07-10 00:09:27,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52462, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_202230175_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742345_1521, duration(ns): 18309044 2025-07-10 00:09:27,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742345_1521, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-10 00:09:32,576 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742345_1521 replica FinalizedReplica, blk_1073742345_1521, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742345 for deletion 2025-07-10 00:09:32,578 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742345_1521 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742345 2025-07-10 00:10:27,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742346_1522 src: /192.168.158.7:54472 dest: /192.168.158.4:9866 2025-07-10 00:10:27,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-408012122_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742346_1522, duration(ns): 18236200 2025-07-10 00:10:27,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742346_1522, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-10 00:10:32,579 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742346_1522 replica FinalizedReplica, blk_1073742346_1522, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742346 for deletion 2025-07-10 00:10:32,580 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742346_1522 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742346 2025-07-10 00:11:32,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742347_1523 src: /192.168.158.8:55604 dest: /192.168.158.4:9866 2025-07-10 00:11:32,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55604, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1186581980_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742347_1523, duration(ns): 20238357 2025-07-10 00:11:32,259 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742347_1523, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-10 00:11:35,582 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742347_1523 replica FinalizedReplica, blk_1073742347_1523, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742347 for deletion 2025-07-10 00:11:35,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742347_1523 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742347 2025-07-10 00:13:37,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742349_1525 src: /192.168.158.7:34138 dest: /192.168.158.4:9866 2025-07-10 00:13:37,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34138, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-368603978_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742349_1525, duration(ns): 17494044 2025-07-10 00:13:37,270 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742349_1525, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=1:[192.168.158.6:9866] terminating 2025-07-10 00:13:41,584 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742349_1525 replica FinalizedReplica, blk_1073742349_1525, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742349 for deletion 2025-07-10 00:13:41,586 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742349_1525 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742349 2025-07-10 00:14:37,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742350_1526 src: /192.168.158.1:37374 dest: /192.168.158.4:9866 2025-07-10 00:14:37,276 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37374, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1089188704_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742350_1526, duration(ns): 21215756 2025-07-10 00:14:37,276 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742350_1526, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-10 00:14:44,586 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742350_1526 replica FinalizedReplica, blk_1073742350_1526, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742350 for deletion 2025-07-10 00:14:44,588 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742350_1526 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742350 2025-07-10 00:17:37,244 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742353_1529 src: /192.168.158.7:50650 dest: /192.168.158.4:9866 2025-07-10 00:17:37,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50650, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-842038591_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742353_1529, duration(ns): 19118519 2025-07-10 00:17:37,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742353_1529, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-10 00:17:41,594 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742353_1529 replica FinalizedReplica, blk_1073742353_1529, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742353 for deletion 2025-07-10 00:17:41,596 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742353_1529 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742353 
2025-07-10 00:21:42,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742357_1533 src: /192.168.158.9:37986 dest: /192.168.158.4:9866 2025-07-10 00:21:42,270 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37986, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1860352827_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742357_1533, duration(ns): 15593715 2025-07-10 00:21:42,270 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742357_1533, type=LAST_IN_PIPELINE terminating 2025-07-10 00:21:47,602 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742357_1533 replica FinalizedReplica, blk_1073742357_1533, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742357 for deletion 2025-07-10 00:21:47,603 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742357_1533 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742357 2025-07-10 00:23:42,253 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742359_1535 src: /192.168.158.7:34738 dest: /192.168.158.4:9866 2025-07-10 00:23:42,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34738, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_72543604_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073742359_1535, duration(ns): 20727073 2025-07-10 00:23:42,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742359_1535, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-10 00:23:47,605 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742359_1535 replica FinalizedReplica, blk_1073742359_1535, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742359 for deletion 2025-07-10 00:23:47,606 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742359_1535 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742359 2025-07-10 00:25:52,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742361_1537 src: /192.168.158.1:53000 dest: /192.168.158.4:9866 2025-07-10 00:25:52,289 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53000, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-612942079_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742361_1537, duration(ns): 21455476 2025-07-10 00:25:52,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742361_1537, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-10 00:25:56,609 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742361_1537 replica FinalizedReplica, blk_1073742361_1537, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742361 for deletion 2025-07-10 00:25:56,610 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742361_1537 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742361 2025-07-10 00:26:52,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742362_1538 src: /192.168.158.1:34522 dest: /192.168.158.4:9866 2025-07-10 00:26:52,288 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34522, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2033299891_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742362_1538, duration(ns): 22869681 2025-07-10 00:26:52,289 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742362_1538, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-10 00:26:59,612 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742362_1538 replica FinalizedReplica, blk_1073742362_1538, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742362 for deletion 2025-07-10 
00:26:59,613 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742362_1538 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742362 2025-07-10 00:28:52,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742364_1540 src: /192.168.158.8:33148 dest: /192.168.158.4:9866 2025-07-10 00:28:52,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33148, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_775897247_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742364_1540, duration(ns): 16095559 2025-07-10 00:28:52,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742364_1540, type=LAST_IN_PIPELINE terminating 2025-07-10 00:28:56,613 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742364_1540 replica FinalizedReplica, blk_1073742364_1540, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742364 for deletion 2025-07-10 00:28:56,615 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742364_1540 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742364 2025-07-10 00:29:52,284 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742365_1541 src: /192.168.158.9:39548 dest: 
/192.168.158.4:9866 2025-07-10 00:29:52,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39548, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-248835415_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742365_1541, duration(ns): 14462375 2025-07-10 00:29:52,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742365_1541, type=LAST_IN_PIPELINE terminating 2025-07-10 00:29:59,615 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742365_1541 replica FinalizedReplica, blk_1073742365_1541, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742365 for deletion 2025-07-10 00:29:59,617 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742365_1541 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742365 2025-07-10 00:30:57,286 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742366_1542 src: /192.168.158.8:60464 dest: /192.168.158.4:9866 2025-07-10 00:30:57,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60464, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1218165297_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742366_1542, duration(ns): 18879424 2025-07-10 00:30:57,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073742366_1542, type=LAST_IN_PIPELINE terminating 2025-07-10 00:31:02,615 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742366_1542 replica FinalizedReplica, blk_1073742366_1542, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742366 for deletion 2025-07-10 00:31:02,617 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742366_1542 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742366 2025-07-10 00:32:57,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742368_1544 src: /192.168.158.9:57378 dest: /192.168.158.4:9866 2025-07-10 00:32:57,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57378, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-667249918_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742368_1544, duration(ns): 14935256 2025-07-10 00:32:57,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742368_1544, type=LAST_IN_PIPELINE terminating 2025-07-10 00:33:02,620 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742368_1544 replica FinalizedReplica, blk_1073742368_1544, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742368 for deletion 2025-07-10 00:33:02,622 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742368_1544 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742368 2025-07-10 00:34:02,280 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742369_1545 src: /192.168.158.5:38870 dest: /192.168.158.4:9866 2025-07-10 00:34:02,296 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38870, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_852292538_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742369_1545, duration(ns): 13857354 2025-07-10 00:34:02,296 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742369_1545, type=LAST_IN_PIPELINE terminating 2025-07-10 00:34:08,622 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742369_1545 replica FinalizedReplica, blk_1073742369_1545, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742369 for deletion 2025-07-10 00:34:08,623 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742369_1545 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742369 2025-07-10 00:38:02,288 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742373_1549 src: /192.168.158.5:43472 dest: /192.168.158.4:9866 2025-07-10 00:38:02,307 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:43472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1210683015_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742373_1549, duration(ns): 15821399 2025-07-10 00:38:02,307 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742373_1549, type=LAST_IN_PIPELINE terminating 2025-07-10 00:38:05,631 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742373_1549 replica FinalizedReplica, blk_1073742373_1549, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742373 for deletion 2025-07-10 00:38:05,633 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742373_1549 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742373 2025-07-10 00:39:02,295 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742374_1550 src: /192.168.158.5:49610 dest: /192.168.158.4:9866 2025-07-10 00:39:02,312 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49610, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-155289439_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073742374_1550, duration(ns): 14170312
2025-07-10 00:39:02,312 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742374_1550, type=LAST_IN_PIPELINE terminating
2025-07-10 00:39:05,633 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742374_1550 replica FinalizedReplica, blk_1073742374_1550, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742374 for deletion
2025-07-10 00:39:05,635 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742374_1550 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742374
2025-07-10 00:41:07,300 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742376_1552 src: /192.168.158.5:36820 dest: /192.168.158.4:9866
2025-07-10 00:41:07,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36820, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1697126911_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742376_1552, duration(ns): 20361400
2025-07-10 00:41:07,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742376_1552, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 00:41:11,639 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742376_1552 replica FinalizedReplica, blk_1073742376_1552, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742376 for deletion
2025-07-10 00:41:11,640 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742376_1552 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742376
2025-07-10 00:42:12,293 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742377_1553 src: /192.168.158.8:54840 dest: /192.168.158.4:9866
2025-07-10 00:42:12,319 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54840, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2026692645_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742377_1553, duration(ns): 19915281
2025-07-10 00:42:12,320 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742377_1553, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 00:42:20,643 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742377_1553 replica FinalizedReplica, blk_1073742377_1553, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742377 for deletion
2025-07-10 00:42:20,644 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742377_1553 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742377
2025-07-10 00:44:12,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742379_1555 src: /192.168.158.8:43218 dest: /192.168.158.4:9866
2025-07-10 00:44:12,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43218, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1673985413_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742379_1555, duration(ns): 16398755
2025-07-10 00:44:12,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742379_1555, type=LAST_IN_PIPELINE terminating
2025-07-10 00:44:17,646 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742379_1555 replica FinalizedReplica, blk_1073742379_1555, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742379 for deletion
2025-07-10 00:44:17,647 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742379_1555 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742379
2025-07-10 00:47:17,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742382_1558 src: /192.168.158.7:47784 dest: /192.168.158.4:9866
2025-07-10 00:47:17,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47784, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-850776601_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742382_1558, duration(ns): 18296024
2025-07-10 00:47:17,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742382_1558, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 00:47:20,656 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742382_1558 replica FinalizedReplica, blk_1073742382_1558, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742382 for deletion
2025-07-10 00:47:20,657 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742382_1558 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742382
2025-07-10 00:48:17,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742383_1559 src: /192.168.158.6:45742 dest: /192.168.158.4:9866
2025-07-10 00:48:17,320 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45742, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1002146332_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742383_1559, duration(ns): 14266736
2025-07-10 00:48:17,320 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742383_1559, type=LAST_IN_PIPELINE terminating
2025-07-10 00:48:20,661 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742383_1559 replica FinalizedReplica, blk_1073742383_1559, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742383 for deletion
2025-07-10 00:48:20,662 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742383_1559 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742383
2025-07-10 00:50:22,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742385_1561 src: /192.168.158.1:60956 dest: /192.168.158.4:9866
2025-07-10 00:50:22,340 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60956, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1067632462_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742385_1561, duration(ns): 27620437
2025-07-10 00:50:22,340 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742385_1561, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-10 00:50:29,668 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742385_1561 replica FinalizedReplica, blk_1073742385_1561, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742385 for deletion
2025-07-10 00:50:29,670 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742385_1561 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742385
2025-07-10 00:52:22,317 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742387_1563 src: /192.168.158.8:43900 dest: /192.168.158.4:9866
2025-07-10 00:52:22,335 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43900, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_810699173_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742387_1563, duration(ns): 14811501
2025-07-10 00:52:22,335 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742387_1563, type=LAST_IN_PIPELINE terminating
2025-07-10 00:52:26,671 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742387_1563 replica FinalizedReplica, blk_1073742387_1563, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742387 for deletion
2025-07-10 00:52:26,672 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742387_1563 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742387
2025-07-10 00:56:22,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742391_1567 src: /192.168.158.1:40440 dest: /192.168.158.4:9866
2025-07-10 00:56:22,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40440, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1911813708_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742391_1567, duration(ns): 22723621
2025-07-10 00:56:22,389 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742391_1567, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-10 00:56:29,683 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742391_1567 replica FinalizedReplica, blk_1073742391_1567, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742391 for deletion
2025-07-10 00:56:29,684 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742391_1567 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742391
2025-07-10 00:58:32,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742393_1569 src: /192.168.158.9:40402 dest: /192.168.158.4:9866
2025-07-10 00:58:32,367 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40402, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_124094985_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742393_1569, duration(ns): 18259766
2025-07-10 00:58:32,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742393_1569, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 00:58:35,689 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742393_1569 replica FinalizedReplica, blk_1073742393_1569, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742393 for deletion
2025-07-10 00:58:35,690 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742393_1569 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742393
2025-07-10 00:59:32,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742394_1570 src: /192.168.158.1:60770 dest: /192.168.158.4:9866
2025-07-10 00:59:32,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60770, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-29196169_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742394_1570, duration(ns): 20500318
2025-07-10 00:59:32,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742394_1570, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-10 00:59:35,690 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742394_1570 replica FinalizedReplica, blk_1073742394_1570, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742394 for deletion
2025-07-10 00:59:35,692 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742394_1570 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742394
2025-07-10 01:00:32,340 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742395_1571 src: /192.168.158.7:41622 dest: /192.168.158.4:9866
2025-07-10 01:00:32,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41622, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1231841581_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742395_1571, duration(ns): 17817054
2025-07-10 01:00:32,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742395_1571, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 01:00:35,691 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742395_1571 replica FinalizedReplica, blk_1073742395_1571, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742395 for deletion
2025-07-10 01:00:35,692 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742395_1571 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742395
2025-07-10 01:04:32,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742399_1575 src: /192.168.158.7:55980 dest: /192.168.158.4:9866
2025-07-10 01:04:32,373 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55980, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1309521398_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742399_1575, duration(ns): 17644138
2025-07-10 01:04:32,373 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742399_1575, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 01:04:35,704 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742399_1575 replica FinalizedReplica, blk_1073742399_1575, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742399 for deletion
2025-07-10 01:04:35,705 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742399_1575 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742399
2025-07-10 01:09:47,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742404_1580 src: /192.168.158.9:48796 dest: /192.168.158.4:9866
2025-07-10 01:09:47,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48796, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-262898683_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742404_1580, duration(ns): 19124134
2025-07-10 01:09:47,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742404_1580, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 01:09:53,718 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742404_1580 replica FinalizedReplica, blk_1073742404_1580, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742404 for deletion
2025-07-10 01:09:53,719 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742404_1580 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742404
2025-07-10 01:11:52,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742406_1582 src: /192.168.158.8:53640 dest: /192.168.158.4:9866
2025-07-10 01:11:52,380 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53640, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_995242477_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742406_1582, duration(ns): 18000952
2025-07-10 01:11:52,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742406_1582, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 01:11:59,727 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742406_1582 replica FinalizedReplica, blk_1073742406_1582, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742406 for deletion
2025-07-10 01:11:59,729 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742406_1582 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742406
2025-07-10 01:12:52,362 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742407_1583 src: /192.168.158.6:54236 dest: /192.168.158.4:9866
2025-07-10 01:12:52,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54236, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1767506140_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742407_1583, duration(ns): 14218932
2025-07-10 01:12:52,380 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742407_1583, type=LAST_IN_PIPELINE terminating
2025-07-10 01:12:59,730 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742407_1583 replica FinalizedReplica, blk_1073742407_1583, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742407 for deletion
2025-07-10 01:12:59,731 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742407_1583 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742407
2025-07-10 01:14:52,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742409_1585 src: /192.168.158.7:59318 dest: /192.168.158.4:9866
2025-07-10 01:14:52,420 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59318, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1103083527_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742409_1585, duration(ns): 18773555
2025-07-10 01:14:52,421 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742409_1585, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 01:14:56,735 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742409_1585 replica FinalizedReplica, blk_1073742409_1585, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742409 for deletion
2025-07-10 01:14:56,736 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742409_1585 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742409
2025-07-10 01:15:52,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742410_1586 src: /192.168.158.1:53692 dest: /192.168.158.4:9866
2025-07-10 01:15:52,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53692, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_259437159_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742410_1586, duration(ns): 21125950
2025-07-10 01:15:52,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742410_1586, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-10 01:15:56,736 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742410_1586 replica FinalizedReplica, blk_1073742410_1586, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742410 for deletion
2025-07-10 01:15:56,738 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742410_1586 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742410
2025-07-10 01:16:52,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742411_1587 src: /192.168.158.9:53238 dest: /192.168.158.4:9866
2025-07-10 01:16:52,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53238, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1471357856_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742411_1587, duration(ns): 15080657
2025-07-10 01:16:52,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742411_1587, type=LAST_IN_PIPELINE terminating
2025-07-10 01:16:56,739 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742411_1587 replica FinalizedReplica, blk_1073742411_1587, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742411 for deletion
2025-07-10 01:16:56,740 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742411_1587 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742411
2025-07-10 01:17:52,391 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742412_1588 src: /192.168.158.9:55572 dest: /192.168.158.4:9866
2025-07-10 01:17:52,417 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55572, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2079586143_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742412_1588, duration(ns): 19949148
2025-07-10 01:17:52,417 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742412_1588, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 01:17:56,738 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742412_1588 replica FinalizedReplica, blk_1073742412_1588, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742412 for deletion
2025-07-10 01:17:56,740 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742412_1588 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742412
2025-07-10 01:21:57,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742416_1592 src: /192.168.158.5:46946 dest: /192.168.158.4:9866
2025-07-10 01:21:57,427 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46946, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2057545872_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742416_1592, duration(ns): 18097720
2025-07-10 01:21:57,427 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742416_1592, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 01:22:05,748 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742416_1592 replica FinalizedReplica, blk_1073742416_1592, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742416 for deletion
2025-07-10 01:22:05,749 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742416_1592 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742416
2025-07-10 01:22:57,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742417_1593 src: /192.168.158.5:33556 dest: /192.168.158.4:9866
2025-07-10 01:22:57,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33556, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2124603391_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742417_1593, duration(ns): 18652491
2025-07-10 01:22:57,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742417_1593, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 01:23:02,750 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742417_1593 replica FinalizedReplica, blk_1073742417_1593, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742417 for deletion
2025-07-10 01:23:02,751 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742417_1593 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742417
2025-07-10 01:25:02,404 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742419_1595 src: /192.168.158.5:44836 dest: /192.168.158.4:9866
2025-07-10 01:25:02,430 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44836, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_586007206_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742419_1595, duration(ns): 19554066
2025-07-10 01:25:02,433 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742419_1595, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 01:25:05,754 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742419_1595 replica FinalizedReplica, blk_1073742419_1595, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742419 for deletion
2025-07-10 01:25:05,755 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742419_1595 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742419
2025-07-10 01:30:07,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742424_1600 src: /192.168.158.9:54352 dest: /192.168.158.4:9866
2025-07-10 01:30:07,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54352, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1153244306_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742424_1600, duration(ns): 20111451
2025-07-10 01:30:07,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742424_1600, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 01:30:14,769 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742424_1600 replica FinalizedReplica, blk_1073742424_1600, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742424 for deletion
2025-07-10 01:30:14,771 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742424_1600 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742424
2025-07-10 01:38:22,423 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742432_1608 src: /192.168.158.9:34102 dest: /192.168.158.4:9866
2025-07-10 01:38:22,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34102, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_42300103_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742432_1608, duration(ns): 18810001
2025-07-10 01:38:22,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742432_1608, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 01:38:29,798 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742432_1608 replica FinalizedReplica, blk_1073742432_1608, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742432 for deletion
2025-07-10 01:38:29,799 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742432_1608 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742432
2025-07-10 01:40:27,436 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742434_1610 src: /192.168.158.9:58550 dest: /192.168.158.4:9866
2025-07-10 01:40:27,455 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58550, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_579787146_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742434_1610, duration(ns): 16153200
2025-07-10 01:40:27,455 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742434_1610, type=LAST_IN_PIPELINE terminating
2025-07-10 01:40:32,804 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742434_1610 replica FinalizedReplica, blk_1073742434_1610, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742434 for deletion
2025-07-10 01:40:32,805 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742434_1610 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742434
2025-07-10 01:42:32,427 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742436_1612 src: /192.168.158.1:39676 dest: /192.168.158.4:9866
2025-07-10 01:42:32,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39676, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1100397475_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742436_1612, duration(ns): 25935105
2025-07-10 01:42:32,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742436_1612, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-10 01:42:38,811 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742436_1612 replica FinalizedReplica, blk_1073742436_1612, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742436 for deletion
2025-07-10 01:42:38,812 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742436_1612 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742436
2025-07-10 01:43:32,436 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742437_1613 src: /192.168.158.9:48106 dest: /192.168.158.4:9866
2025-07-10 01:43:32,461 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48106, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-767068913_106, offset: 0, srvID:
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742437_1613, duration(ns): 19071192 2025-07-10 01:43:32,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742437_1613, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-10 01:43:35,812 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742437_1613 replica FinalizedReplica, blk_1073742437_1613, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742437 for deletion 2025-07-10 01:43:35,814 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742437_1613 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742437 2025-07-10 01:44:32,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742438_1614 src: /192.168.158.5:47812 dest: /192.168.158.4:9866 2025-07-10 01:44:32,466 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47812, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1551952235_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742438_1614, duration(ns): 20432023 2025-07-10 01:44:32,466 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742438_1614, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-10 01:44:35,818 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742438_1614 replica FinalizedReplica, blk_1073742438_1614, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742438 for deletion 2025-07-10 01:44:35,820 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742438_1614 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742438 2025-07-10 01:45:32,437 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742439_1615 src: /192.168.158.6:57408 dest: /192.168.158.4:9866 2025-07-10 01:45:32,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57408, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_964341133_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742439_1615, duration(ns): 18434041 2025-07-10 01:45:32,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742439_1615, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-10 01:45:35,819 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742439_1615 replica FinalizedReplica, blk_1073742439_1615, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742439 for deletion 2025-07-10 01:45:35,820 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742439_1615 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742439 2025-07-10 01:47:37,448 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742441_1617 src: /192.168.158.6:48534 dest: /192.168.158.4:9866 2025-07-10 01:47:37,474 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48534, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1678879247_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742441_1617, duration(ns): 18974861 2025-07-10 01:47:37,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742441_1617, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-10 01:47:41,823 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742441_1617 replica FinalizedReplica, blk_1073742441_1617, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742441 for deletion 2025-07-10 01:47:41,825 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742441_1617 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742441 2025-07-10 01:48:37,481 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742442_1618 src: 
/192.168.158.1:37102 dest: /192.168.158.4:9866 2025-07-10 01:48:37,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37102, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-900490083_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742442_1618, duration(ns): 21678381 2025-07-10 01:48:37,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742442_1618, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-10 01:48:41,826 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742442_1618 replica FinalizedReplica, blk_1073742442_1618, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742442 for deletion 2025-07-10 01:48:41,827 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742442_1618 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742442 2025-07-10 01:49:37,484 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742443_1619 src: /192.168.158.7:45438 dest: /192.168.158.4:9866 2025-07-10 01:49:37,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45438, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-49269935_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742443_1619, duration(ns): 14806433 2025-07-10 
01:49:37,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742443_1619, type=LAST_IN_PIPELINE terminating 2025-07-10 01:49:44,828 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742443_1619 replica FinalizedReplica, blk_1073742443_1619, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742443 for deletion 2025-07-10 01:49:44,829 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742443_1619 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742443 2025-07-10 01:54:47,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742448_1624 src: /192.168.158.7:42378 dest: /192.168.158.4:9866 2025-07-10 01:54:47,484 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42378, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-185676854_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742448_1624, duration(ns): 15815039 2025-07-10 01:54:47,484 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742448_1624, type=LAST_IN_PIPELINE terminating 2025-07-10 01:54:53,845 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742448_1624 replica FinalizedReplica, blk_1073742448_1624, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742448 for deletion 2025-07-10 01:54:53,846 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742448_1624 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742448 2025-07-10 01:55:47,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742449_1625 src: /192.168.158.8:60208 dest: /192.168.158.4:9866 2025-07-10 01:55:47,489 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60208, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-22583769_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742449_1625, duration(ns): 18194499 2025-07-10 01:55:47,492 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742449_1625, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-10 01:55:53,849 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742449_1625 replica FinalizedReplica, blk_1073742449_1625, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742449 for deletion 2025-07-10 01:55:53,851 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742449_1625 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742449 2025-07-10 
01:56:47,459 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742450_1626 src: /192.168.158.7:47114 dest: /192.168.158.4:9866 2025-07-10 01:56:47,478 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47114, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1491920123_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742450_1626, duration(ns): 15291624 2025-07-10 01:56:47,478 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742450_1626, type=LAST_IN_PIPELINE terminating 2025-07-10 01:56:50,853 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742450_1626 replica FinalizedReplica, blk_1073742450_1626, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742450 for deletion 2025-07-10 01:56:50,854 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742450_1626 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742450 2025-07-10 01:57:47,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742451_1627 src: /192.168.158.1:34272 dest: /192.168.158.4:9866 2025-07-10 01:57:47,499 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34272, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2069361173_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073742451_1627, duration(ns): 26038227 2025-07-10 01:57:47,499 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742451_1627, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-10 01:57:53,856 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742451_1627 replica FinalizedReplica, blk_1073742451_1627, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742451 for deletion 2025-07-10 01:57:53,857 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742451_1627 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742451 2025-07-10 01:58:47,477 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742452_1628 src: /192.168.158.5:34568 dest: /192.168.158.4:9866 2025-07-10 01:58:47,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34568, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-800368657_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742452_1628, duration(ns): 13729897 2025-07-10 01:58:47,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742452_1628, type=LAST_IN_PIPELINE terminating 2025-07-10 01:58:50,858 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742452_1628 replica 
FinalizedReplica, blk_1073742452_1628, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742452 for deletion 2025-07-10 01:58:50,859 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742452_1628 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742452 2025-07-10 02:01:52,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742455_1631 src: /192.168.158.1:52032 dest: /192.168.158.4:9866 2025-07-10 02:01:52,510 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52032, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2049969061_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742455_1631, duration(ns): 25948809 2025-07-10 02:01:52,510 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742455_1631, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-10 02:01:59,892 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742455_1631 replica FinalizedReplica, blk_1073742455_1631, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742455 for deletion 2025-07-10 02:01:59,893 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073742455_1631 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742455 2025-07-10 02:04:57,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742458_1634 src: /192.168.158.5:40522 dest: /192.168.158.4:9866 2025-07-10 02:04:57,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40522, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-597497165_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742458_1634, duration(ns): 19321849 2025-07-10 02:04:57,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742458_1634, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-10 02:05:02,902 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742458_1634 replica FinalizedReplica, blk_1073742458_1634, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742458 for deletion 2025-07-10 02:05:02,903 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742458_1634 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742458 2025-07-10 02:05:57,487 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742459_1635 src: /192.168.158.6:55612 dest: /192.168.158.4:9866 2025-07-10 02:05:57,504 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55612, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1351977165_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742459_1635, duration(ns): 14721001 2025-07-10 02:05:57,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742459_1635, type=LAST_IN_PIPELINE terminating 2025-07-10 02:06:05,906 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742459_1635 replica FinalizedReplica, blk_1073742459_1635, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742459 for deletion 2025-07-10 02:06:05,907 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742459_1635 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742459 2025-07-10 02:09:57,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742463_1639 src: /192.168.158.8:42976 dest: /192.168.158.4:9866 2025-07-10 02:09:57,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42976, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_91843525_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742463_1639, duration(ns): 18240809 2025-07-10 02:09:57,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073742463_1639, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-10 02:10:05,912 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742463_1639 replica FinalizedReplica, blk_1073742463_1639, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742463 for deletion 2025-07-10 02:10:05,913 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742463_1639 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742463 2025-07-10 02:14:02,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742467_1643 src: /192.168.158.1:32792 dest: /192.168.158.4:9866 2025-07-10 02:14:02,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:32792, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1701322646_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742467_1643, duration(ns): 22604798 2025-07-10 02:14:02,538 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742467_1643, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-10 02:14:05,923 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742467_1643 replica FinalizedReplica, blk_1073742467_1643, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742467 for deletion 2025-07-10 02:14:05,924 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742467_1643 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742467 2025-07-10 02:15:07,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742468_1644 src: /192.168.158.1:56070 dest: /192.168.158.4:9866 2025-07-10 02:15:07,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56070, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_329297907_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742468_1644, duration(ns): 19528976 2025-07-10 02:15:07,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742468_1644, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-10 02:15:14,926 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742468_1644 replica FinalizedReplica, blk_1073742468_1644, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742468 for deletion 2025-07-10 02:15:14,927 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742468_1644 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742468
2025-07-10 02:17:07,536 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742470_1646 src: /192.168.158.6:32890 dest: /192.168.158.4:9866
2025-07-10 02:17:07,551 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:32890, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1240621691_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742470_1646, duration(ns): 12186143
2025-07-10 02:17:07,551 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742470_1646, type=LAST_IN_PIPELINE terminating
2025-07-10 02:17:14,932 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742470_1646 replica FinalizedReplica, blk_1073742470_1646, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742470 for deletion
2025-07-10 02:17:14,933 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742470_1646 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742470
2025-07-10 02:20:17,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742473_1649 src: /192.168.158.1:41788 dest: /192.168.158.4:9866
2025-07-10 02:20:17,552 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41788, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1651838529_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742473_1649, duration(ns): 21895269
2025-07-10 02:20:17,553 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742473_1649, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-10 02:20:20,937 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742473_1649 replica FinalizedReplica, blk_1073742473_1649, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742473 for deletion
2025-07-10 02:20:20,938 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742473_1649 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742473
2025-07-10 02:23:22,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742476_1652 src: /192.168.158.7:49570 dest: /192.168.158.4:9866
2025-07-10 02:23:22,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49570, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1248609738_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742476_1652, duration(ns): 15126837
2025-07-10 02:23:22,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742476_1652, type=LAST_IN_PIPELINE terminating
2025-07-10 02:23:26,947 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742476_1652 replica FinalizedReplica, blk_1073742476_1652, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742476 for deletion
2025-07-10 02:23:26,949 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742476_1652 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742476
2025-07-10 02:25:27,526 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742478_1654 src: /192.168.158.1:59992 dest: /192.168.158.4:9866
2025-07-10 02:25:27,557 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59992, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1377539615_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742478_1654, duration(ns): 20397048
2025-07-10 02:25:27,557 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742478_1654, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-10 02:25:35,949 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742478_1654 replica FinalizedReplica, blk_1073742478_1654, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742478 for deletion
2025-07-10 02:25:35,950 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742478_1654 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742478
2025-07-10 02:27:27,530 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742480_1656 src: /192.168.158.5:41702 dest: /192.168.158.4:9866
2025-07-10 02:27:27,546 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41702, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1842971416_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742480_1656, duration(ns): 14054195
2025-07-10 02:27:27,547 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742480_1656, type=LAST_IN_PIPELINE terminating
2025-07-10 02:27:32,954 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742480_1656 replica FinalizedReplica, blk_1073742480_1656, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742480 for deletion
2025-07-10 02:27:32,955 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742480_1656 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742480
2025-07-10 02:29:27,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742482_1658 src: /192.168.158.1:43856 dest: /192.168.158.4:9866
2025-07-10 02:29:27,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43856, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-921955790_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742482_1658, duration(ns): 23036986
2025-07-10 02:29:27,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742482_1658, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-10 02:29:32,963 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742482_1658 replica FinalizedReplica, blk_1073742482_1658, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742482 for deletion
2025-07-10 02:29:32,964 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742482_1658 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742482
2025-07-10 02:30:27,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742483_1659 src: /192.168.158.1:50186 dest: /192.168.158.4:9866
2025-07-10 02:30:27,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50186, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1536473222_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742483_1659, duration(ns): 21121792
2025-07-10 02:30:27,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742483_1659, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-10 02:30:35,966 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742483_1659 replica FinalizedReplica, blk_1073742483_1659, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742483 for deletion
2025-07-10 02:30:35,967 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742483_1659 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742483
2025-07-10 02:31:27,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742484_1660 src: /192.168.158.6:40648 dest: /192.168.158.4:9866
2025-07-10 02:31:27,577 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40648, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1799700864_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742484_1660, duration(ns): 13834494
2025-07-10 02:31:27,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742484_1660, type=LAST_IN_PIPELINE terminating
2025-07-10 02:31:32,971 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742484_1660 replica FinalizedReplica, blk_1073742484_1660, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742484 for deletion
2025-07-10 02:31:32,972 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742484_1660 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742484
2025-07-10 02:32:32,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742485_1661 src: /192.168.158.1:51722 dest: /192.168.158.4:9866
2025-07-10 02:32:32,576 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51722, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2000523947_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742485_1661, duration(ns): 20203023
2025-07-10 02:32:32,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742485_1661, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-10 02:32:35,972 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742485_1661 replica FinalizedReplica, blk_1073742485_1661, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742485 for deletion
2025-07-10 02:32:35,973 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742485_1661 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742485
2025-07-10 02:33:32,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742486_1662 src: /192.168.158.7:33122 dest: /192.168.158.4:9866
2025-07-10 02:33:32,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33122, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-606430264_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742486_1662, duration(ns): 14153966
2025-07-10 02:33:32,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742486_1662, type=LAST_IN_PIPELINE terminating
2025-07-10 02:33:35,973 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742486_1662 replica FinalizedReplica, blk_1073742486_1662, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742486 for deletion
2025-07-10 02:33:35,974 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742486_1662 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742486
2025-07-10 02:36:37,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742489_1665 src: /192.168.158.1:44836 dest: /192.168.158.4:9866
2025-07-10 02:36:37,575 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44836, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_735245847_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742489_1665, duration(ns): 20809995
2025-07-10 02:36:37,575 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742489_1665, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-10 02:36:41,977 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742489_1665 replica FinalizedReplica, blk_1073742489_1665, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742489 for deletion
2025-07-10 02:36:41,978 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742489_1665 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742489
2025-07-10 02:38:37,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742491_1667 src: /192.168.158.1:47670 dest: /192.168.158.4:9866
2025-07-10 02:38:37,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47670, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_813500563_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742491_1667, duration(ns): 25256787
2025-07-10 02:38:37,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742491_1667, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-10 02:38:41,980 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742491_1667 replica FinalizedReplica, blk_1073742491_1667, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742491 for deletion
2025-07-10 02:38:41,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742491_1667 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742491
2025-07-10 02:39:37,557 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742492_1668 src: /192.168.158.1:37014 dest: /192.168.158.4:9866
2025-07-10 02:39:37,594 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37014, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_702219058_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742492_1668, duration(ns): 27090692
2025-07-10 02:39:37,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742492_1668, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-10 02:39:41,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742492_1668 replica FinalizedReplica, blk_1073742492_1668, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742492 for deletion
2025-07-10 02:39:41,987 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742492_1668 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742492
2025-07-10 02:43:52,570 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742496_1672 src: /192.168.158.9:52776 dest: /192.168.158.4:9866
2025-07-10 02:43:52,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52776, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1376364158_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742496_1672, duration(ns): 18683127
2025-07-10 02:43:52,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742496_1672, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 02:43:56,993 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742496_1672 replica FinalizedReplica, blk_1073742496_1672, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742496 for deletion
2025-07-10 02:43:56,994 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742496_1672 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742496
2025-07-10 02:44:52,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742497_1673 src: /192.168.158.9:45746 dest: /192.168.158.4:9866
2025-07-10 02:44:52,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45746, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1733971470_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742497_1673, duration(ns): 17716884
2025-07-10 02:44:52,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742497_1673, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 02:44:56,993 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742497_1673 replica FinalizedReplica, blk_1073742497_1673, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742497 for deletion
2025-07-10 02:44:56,994 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742497_1673 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742497
2025-07-10 02:45:52,569 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742498_1674 src: /192.168.158.8:57130 dest: /192.168.158.4:9866
2025-07-10 02:45:52,586 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57130, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2108256703_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742498_1674, duration(ns): 14016062
2025-07-10 02:45:52,586 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742498_1674, type=LAST_IN_PIPELINE terminating
2025-07-10 02:45:56,993 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742498_1674 replica FinalizedReplica, blk_1073742498_1674, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742498 for deletion
2025-07-10 02:45:56,994 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742498_1674 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742498
2025-07-10 02:51:52,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742504_1680 src: /192.168.158.9:51148 dest: /192.168.158.4:9866
2025-07-10 02:51:52,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51148, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1377330271_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742504_1680, duration(ns): 18879575
2025-07-10 02:51:52,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742504_1680, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 02:51:57,005 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742504_1680 replica FinalizedReplica, blk_1073742504_1680, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742504 for deletion
2025-07-10 02:51:57,007 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742504_1680 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742504
2025-07-10 02:55:57,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742508_1684 src: /192.168.158.6:37954 dest: /192.168.158.4:9866
2025-07-10 02:55:57,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37954, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1212423075_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742508_1684, duration(ns): 13802384
2025-07-10 02:55:57,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742508_1684, type=LAST_IN_PIPELINE terminating
2025-07-10 02:56:03,017 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742508_1684 replica FinalizedReplica, blk_1073742508_1684, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742508 for deletion
2025-07-10 02:56:03,018 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742508_1684 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742508
2025-07-10 02:57:57,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742510_1686 src: /192.168.158.9:54146 dest: /192.168.158.4:9866
2025-07-10 02:57:57,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54146, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_490804056_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742510_1686, duration(ns): 16705873
2025-07-10 02:57:57,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742510_1686, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 02:58:03,021 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742510_1686 replica FinalizedReplica, blk_1073742510_1686, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742510 for deletion
2025-07-10 02:58:03,022 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742510_1686 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742510
2025-07-10 02:59:57,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742512_1688 src: /192.168.158.9:33816 dest: /192.168.158.4:9866
2025-07-10 02:59:57,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33816, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1279236590_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742512_1688, duration(ns): 15285678
2025-07-10 02:59:57,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742512_1688, type=LAST_IN_PIPELINE terminating
2025-07-10 03:00:03,027 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742512_1688 replica FinalizedReplica, blk_1073742512_1688, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742512 for deletion
2025-07-10 03:00:03,028 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742512_1688 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742512
2025-07-10 03:00:57,599 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742513_1689 src: /192.168.158.8:37888 dest: /192.168.158.4:9866
2025-07-10 03:00:57,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37888, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_659509294_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742513_1689, duration(ns): 21184013
2025-07-10 03:00:57,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742513_1689, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 03:01:03,030 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742513_1689 replica FinalizedReplica, blk_1073742513_1689, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742513 for deletion
2025-07-10 03:01:03,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742513_1689 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742513
2025-07-10 03:03:02,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742515_1691 src: /192.168.158.5:41790 dest: /192.168.158.4:9866
2025-07-10 03:03:02,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41790, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2088105392_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742515_1691, duration(ns): 17662141
2025-07-10 03:03:02,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742515_1691, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-10 03:03:09,039 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742515_1691 replica FinalizedReplica, blk_1073742515_1691, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742515 for deletion
2025-07-10 03:03:09,040 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742515_1691 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742515
2025-07-10 03:04:02,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742516_1692 src: /192.168.158.8:54398 dest: /192.168.158.4:9866
2025-07-10 03:04:02,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54398, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_835271218_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742516_1692, duration(ns): 13353555
2025-07-10 03:04:02,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742516_1692, type=LAST_IN_PIPELINE terminating
2025-07-10 03:04:09,040 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742516_1692 replica FinalizedReplica, blk_1073742516_1692, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742516 for deletion
2025-07-10 03:04:09,042 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742516_1692 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742516
2025-07-10 03:06:02,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742518_1694 src: /192.168.158.1:46390 dest: /192.168.158.4:9866
2025-07-10 03:06:02,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46390, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-163734441_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742518_1694, duration(ns): 23500798
2025-07-10 03:06:02,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742518_1694, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-10 03:06:09,044 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742518_1694 replica FinalizedReplica, blk_1073742518_1694, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742518 for deletion
2025-07-10 03:06:09,045 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742518_1694 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742518
2025-07-10 03:07:02,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742519_1695 src: /192.168.158.9:36038 dest: /192.168.158.4:9866
2025-07-10 03:07:02,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36038, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-254276832_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742519_1695, duration(ns): 16158896
2025-07-10 03:07:02,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742519_1695, type=LAST_IN_PIPELINE terminating
2025-07-10 03:07:06,045 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742519_1695 replica FinalizedReplica, blk_1073742519_1695, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742519 for deletion
2025-07-10 03:07:06,046 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742519_1695 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742519
2025-07-10 03:10:02,604 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742522_1698 src: /192.168.158.1:52894 dest: /192.168.158.4:9866
2025-07-10 03:10:02,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52894, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1556842049_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742522_1698, duration(ns): 19423320
2025-07-10 03:10:02,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742522_1698, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-10 03:10:06,053 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742522_1698 replica FinalizedReplica, blk_1073742522_1698, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742522 for deletion
2025-07-10 03:10:06,054 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742522_1698 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742522
2025-07-10 03:13:07,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742525_1701 src: /192.168.158.9:46354 dest: /192.168.158.4:9866
2025-07-10 03:13:07,661 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46354, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1731747421_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742525_1701, duration(ns): 14661772
2025-07-10 03:13:07,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742525_1701, type=LAST_IN_PIPELINE terminating
2025-07-10 03:13:12,060 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742525_1701 replica FinalizedReplica, blk_1073742525_1701, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742525 for deletion
2025-07-10 03:13:12,061 INFO
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742525_1701 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742525 2025-07-10 03:15:12,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742527_1703 src: /192.168.158.1:52208 dest: /192.168.158.4:9866 2025-07-10 03:15:12,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52208, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1529610344_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742527_1703, duration(ns): 21323872 2025-07-10 03:15:12,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742527_1703, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-10 03:15:21,069 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742527_1703 replica FinalizedReplica, blk_1073742527_1703, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742527 for deletion 2025-07-10 03:15:21,070 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742527_1703 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742527 2025-07-10 03:16:17,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742528_1704 
src: /192.168.158.1:48700 dest: /192.168.158.4:9866 2025-07-10 03:16:17,680 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48700, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-713890841_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742528_1704, duration(ns): 20539035 2025-07-10 03:16:17,680 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742528_1704, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-10 03:16:21,070 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742528_1704 replica FinalizedReplica, blk_1073742528_1704, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742528 for deletion 2025-07-10 03:16:21,072 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742528_1704 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742528 2025-07-10 03:17:22,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742529_1705 src: /192.168.158.5:37278 dest: /192.168.158.4:9866 2025-07-10 03:17:22,685 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37278, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1398941965_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742529_1705, duration(ns): 15859673 
2025-07-10 03:17:22,685 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742529_1705, type=LAST_IN_PIPELINE terminating 2025-07-10 03:17:27,071 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742529_1705 replica FinalizedReplica, blk_1073742529_1705, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742529 for deletion 2025-07-10 03:17:27,072 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742529_1705 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742529 2025-07-10 03:18:27,646 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742530_1706 src: /192.168.158.8:47644 dest: /192.168.158.4:9866 2025-07-10 03:18:27,672 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47644, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_381265691_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742530_1706, duration(ns): 20470503 2025-07-10 03:18:27,673 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742530_1706, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-10 03:18:33,072 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742530_1706 replica FinalizedReplica, blk_1073742530_1706, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742530 for deletion 2025-07-10 03:18:33,074 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742530_1706 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742530 2025-07-10 03:19:27,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742531_1707 src: /192.168.158.1:42850 dest: /192.168.158.4:9866 2025-07-10 03:19:27,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42850, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1302505709_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742531_1707, duration(ns): 26744315 2025-07-10 03:19:27,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742531_1707, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-10 03:19:36,074 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742531_1707 replica FinalizedReplica, blk_1073742531_1707, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742531 for deletion 2025-07-10 03:19:36,075 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742531_1707 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742531 2025-07-10 03:20:27,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742532_1708 src: /192.168.158.9:44730 dest: /192.168.158.4:9866 2025-07-10 03:20:27,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44730, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1081482310_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742532_1708, duration(ns): 19862509 2025-07-10 03:20:27,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742532_1708, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-10 03:20:36,077 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742532_1708 replica FinalizedReplica, blk_1073742532_1708, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742532 for deletion 2025-07-10 03:20:36,078 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742532_1708 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742532 2025-07-10 03:22:27,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742534_1710 src: /192.168.158.1:40636 dest: /192.168.158.4:9866 2025-07-10 03:22:27,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40636, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2120313581_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742534_1710, duration(ns): 20988113 2025-07-10 03:22:27,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742534_1710, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-10 03:22:33,080 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742534_1710 replica FinalizedReplica, blk_1073742534_1710, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742534 for deletion 2025-07-10 03:22:33,081 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742534_1710 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742534 2025-07-10 03:23:32,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742535_1711 src: /192.168.158.6:57340 dest: /192.168.158.4:9866 2025-07-10 03:23:32,676 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57340, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1280500868_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742535_1711, duration(ns): 18717620 2025-07-10 03:23:32,676 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742535_1711, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-10 03:23:39,085 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742535_1711 replica FinalizedReplica, blk_1073742535_1711, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742535 for deletion 2025-07-10 03:23:39,086 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742535_1711 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742535 2025-07-10 03:25:32,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742537_1713 src: /192.168.158.6:46152 dest: /192.168.158.4:9866 2025-07-10 03:25:32,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46152, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1027367905_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742537_1713, duration(ns): 15477101 2025-07-10 03:25:32,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742537_1713, type=LAST_IN_PIPELINE terminating 2025-07-10 03:25:39,089 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742537_1713 replica FinalizedReplica, blk_1073742537_1713, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742537 for deletion 2025-07-10 03:25:39,090 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742537_1713 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742537 2025-07-10 03:29:37,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742541_1717 src: /192.168.158.9:59714 dest: /192.168.158.4:9866 2025-07-10 03:29:37,660 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59714, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1939793221_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742541_1717, duration(ns): 14164282 2025-07-10 03:29:37,660 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742541_1717, type=LAST_IN_PIPELINE terminating 2025-07-10 03:29:42,101 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742541_1717 replica FinalizedReplica, blk_1073742541_1717, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742541 for deletion 2025-07-10 03:29:42,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742541_1717 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742541 2025-07-10 03:36:52,674 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742548_1724 src: /192.168.158.1:45848 dest: /192.168.158.4:9866 2025-07-10 03:36:52,705 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45848, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_704041169_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742548_1724, duration(ns): 22054674 2025-07-10 03:36:52,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742548_1724, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-10 03:36:57,124 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742548_1724 replica FinalizedReplica, blk_1073742548_1724, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742548 for deletion 2025-07-10 03:36:57,125 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742548_1724 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742548 2025-07-10 03:41:57,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742553_1729 src: /192.168.158.1:39908 dest: /192.168.158.4:9866 2025-07-10 03:41:57,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39908, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-30034774_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742553_1729, duration(ns): 21035369 2025-07-10 03:41:57,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742553_1729, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-10 03:42:03,143 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742553_1729 replica FinalizedReplica, blk_1073742553_1729, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742553 for deletion 2025-07-10 03:42:03,144 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742553_1729 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742553 2025-07-10 03:43:02,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742554_1730 src: /192.168.158.1:55600 dest: /192.168.158.4:9866 2025-07-10 03:43:02,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55600, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1255735633_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742554_1730, duration(ns): 21616934 2025-07-10 03:43:02,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742554_1730, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-10 03:43:06,145 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742554_1730 replica FinalizedReplica, blk_1073742554_1730, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742554 for deletion 2025-07-10 03:43:06,146 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742554_1730 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742554 2025-07-10 03:49:07,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742560_1736 src: /192.168.158.8:37648 dest: /192.168.158.4:9866 2025-07-10 03:49:07,690 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37648, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1385527201_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742560_1736, duration(ns): 17163175 2025-07-10 03:49:07,690 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742560_1736, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-10 03:49:12,158 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742560_1736 replica FinalizedReplica, blk_1073742560_1736, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742560 for deletion 2025-07-10 03:49:12,159 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742560_1736 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742560 2025-07-10 03:52:12,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742563_1739 src: /192.168.158.1:51182 dest: /192.168.158.4:9866 2025-07-10 03:52:12,698 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51182, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_794648371_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742563_1739, duration(ns): 21266020 2025-07-10 03:52:12,698 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742563_1739, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-10 03:52:18,170 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742563_1739 replica FinalizedReplica, blk_1073742563_1739, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742563 for deletion 2025-07-10 03:52:18,172 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742563_1739 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742563 2025-07-10 03:53:12,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742564_1740 
src: /192.168.158.1:54160 dest: /192.168.158.4:9866 2025-07-10 03:53:12,705 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54160, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_904365398_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742564_1740, duration(ns): 20524490 2025-07-10 03:53:12,705 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742564_1740, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-10 03:53:21,172 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742564_1740 replica FinalizedReplica, blk_1073742564_1740, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742564 for deletion 2025-07-10 03:53:21,174 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742564_1740 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742564 2025-07-10 03:56:12,686 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742567_1743 src: /192.168.158.8:34890 dest: /192.168.158.4:9866 2025-07-10 03:56:12,710 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34890, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-485475432_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742567_1743, duration(ns): 17645083 
2025-07-10 03:56:12,710 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742567_1743, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 03:56:21,182 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742567_1743 replica FinalizedReplica, blk_1073742567_1743, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742567 for deletion
2025-07-10 03:56:21,184 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742567_1743 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742567
2025-07-10 03:57:12,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742568_1744 src: /192.168.158.7:51424 dest: /192.168.158.4:9866
2025-07-10 03:57:12,705 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51424, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-753796117_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742568_1744, duration(ns): 13995572
2025-07-10 03:57:12,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742568_1744, type=LAST_IN_PIPELINE terminating
2025-07-10 03:57:21,187 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742568_1744 replica FinalizedReplica, blk_1073742568_1744, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742568 for deletion
2025-07-10 03:57:21,188 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742568_1744 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742568
2025-07-10 03:59:17,703 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742570_1746 src: /192.168.158.8:54986 dest: /192.168.158.4:9866
2025-07-10 03:59:17,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54986, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_18903908_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742570_1746, duration(ns): 18419065
2025-07-10 03:59:17,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742570_1746, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 03:59:21,194 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742570_1746 replica FinalizedReplica, blk_1073742570_1746, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742570 for deletion
2025-07-10 03:59:21,196 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742570_1746 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742570
2025-07-10 04:00:17,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742571_1747 src: /192.168.158.1:58928 dest: /192.168.158.4:9866
2025-07-10 04:00:17,734 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58928, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1574577763_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742571_1747, duration(ns): 21706096
2025-07-10 04:00:17,734 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742571_1747, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-10 04:00:21,197 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742571_1747 replica FinalizedReplica, blk_1073742571_1747, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742571 for deletion
2025-07-10 04:00:21,198 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742571_1747 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742571
2025-07-10 04:02:22,708 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742573_1749 src: /192.168.158.6:55086 dest: /192.168.158.4:9866
2025-07-10 04:02:22,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55086, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_208054583_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742573_1749, duration(ns): 18274923
2025-07-10 04:02:22,733 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742573_1749, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 04:02:27,203 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742573_1749 replica FinalizedReplica, blk_1073742573_1749, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742573 for deletion
2025-07-10 04:02:27,204 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742573_1749 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742573
2025-07-10 04:06:32,719 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742577_1753 src: /192.168.158.5:39458 dest: /192.168.158.4:9866
2025-07-10 04:06:32,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39458, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1933081069_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742577_1753, duration(ns): 16447173
2025-07-10 04:06:32,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742577_1753, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-10 04:06:36,214 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742577_1753 replica FinalizedReplica, blk_1073742577_1753, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742577 for deletion
2025-07-10 04:06:36,215 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742577_1753 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742577
2025-07-10 04:07:32,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742578_1754 src: /192.168.158.9:38608 dest: /192.168.158.4:9866
2025-07-10 04:07:32,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38608, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1159149241_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742578_1754, duration(ns): 17324633
2025-07-10 04:07:32,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742578_1754, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 04:07:36,219 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742578_1754 replica FinalizedReplica, blk_1073742578_1754, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742578 for deletion
2025-07-10 04:07:36,220 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742578_1754 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742578
2025-07-10 04:08:37,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742579_1755 src: /192.168.158.6:45136 dest: /192.168.158.4:9866
2025-07-10 04:08:37,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1563728713_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742579_1755, duration(ns): 18321289
2025-07-10 04:08:37,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742579_1755, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-10 04:08:42,221 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742579_1755 replica FinalizedReplica, blk_1073742579_1755, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742579 for deletion
2025-07-10 04:08:42,222 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742579_1755 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742579
2025-07-10 04:10:37,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742581_1757 src: /192.168.158.7:59336 dest: /192.168.158.4:9866
2025-07-10 04:10:37,756 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1000014223_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742581_1757, duration(ns): 20801261
2025-07-10 04:10:37,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742581_1757, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 04:10:45,226 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742581_1757 replica FinalizedReplica, blk_1073742581_1757, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742581 for deletion
2025-07-10 04:10:45,228 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742581_1757 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742581
2025-07-10 04:13:37,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742584_1760 src: /192.168.158.6:40714 dest: /192.168.158.4:9866
2025-07-10 04:13:37,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40714, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1040988806_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742584_1760, duration(ns): 16946939
2025-07-10 04:13:37,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742584_1760, type=LAST_IN_PIPELINE terminating
2025-07-10 04:13:45,234 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742584_1760 replica FinalizedReplica, blk_1073742584_1760, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742584 for deletion
2025-07-10 04:13:45,235 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742584_1760 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742584
2025-07-10 04:15:37,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742586_1762 src: /192.168.158.1:59876 dest: /192.168.158.4:9866
2025-07-10 04:15:37,762 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59876, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1084808396_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742586_1762, duration(ns): 21898347
2025-07-10 04:15:37,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742586_1762, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-10 04:15:42,239 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742586_1762 replica FinalizedReplica, blk_1073742586_1762, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742586 for deletion
2025-07-10 04:15:42,240 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742586_1762 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742586
2025-07-10 04:19:42,730 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742590_1766 src: /192.168.158.6:43136 dest: /192.168.158.4:9866
2025-07-10 04:19:42,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1601223485_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742590_1766, duration(ns): 13767134
2025-07-10 04:19:42,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742590_1766, type=LAST_IN_PIPELINE terminating
2025-07-10 04:19:51,252 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742590_1766 replica FinalizedReplica, blk_1073742590_1766, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742590 for deletion
2025-07-10 04:19:51,254 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742590_1766 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742590
2025-07-10 04:20:47,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742591_1767 src: /192.168.158.1:36868 dest: /192.168.158.4:9866
2025-07-10 04:20:47,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36868, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1019204742_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742591_1767, duration(ns): 21987282
2025-07-10 04:20:47,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742591_1767, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-10 04:20:54,254 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742591_1767 replica FinalizedReplica, blk_1073742591_1767, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742591 for deletion
2025-07-10 04:20:54,255 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742591_1767 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073742591
2025-07-10 04:22:52,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742593_1769 src: /192.168.158.1:56332 dest: /192.168.158.4:9866
2025-07-10 04:22:52,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56332, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1429477016_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742593_1769, duration(ns): 22928429
2025-07-10 04:22:52,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742593_1769, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-10 04:22:57,259 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742593_1769 replica FinalizedReplica, blk_1073742593_1769, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742593 for deletion
2025-07-10 04:22:57,261 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742593_1769 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742593
2025-07-10 04:23:57,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742594_1770 src: /192.168.158.8:46458 dest: /192.168.158.4:9866
2025-07-10 04:23:57,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46458, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-186216501_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742594_1770, duration(ns): 17192256
2025-07-10 04:23:57,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742594_1770, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 04:24:03,263 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742594_1770 replica FinalizedReplica, blk_1073742594_1770, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742594 for deletion
2025-07-10 04:24:03,264 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742594_1770 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742594
2025-07-10 04:24:57,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742595_1771 src: /192.168.158.9:46302 dest: /192.168.158.4:9866
2025-07-10 04:24:57,760 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46302, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2090150731_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742595_1771, duration(ns): 14900239
2025-07-10 04:24:57,760 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742595_1771, type=LAST_IN_PIPELINE terminating
2025-07-10 04:25:06,267 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742595_1771 replica FinalizedReplica, blk_1073742595_1771, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742595 for deletion
2025-07-10 04:25:06,268 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742595_1771 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742595
2025-07-10 04:28:12,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742598_1774 src: /192.168.158.1:49782 dest: /192.168.158.4:9866
2025-07-10 04:28:12,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49782, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1383435477_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742598_1774, duration(ns): 24934811
2025-07-10 04:28:12,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742598_1774, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-10 04:28:18,278 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742598_1774 replica FinalizedReplica, blk_1073742598_1774, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742598 for deletion
2025-07-10 04:28:18,279 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742598_1774 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742598
2025-07-10 04:29:12,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742599_1775 src: /192.168.158.9:54726 dest: /192.168.158.4:9866
2025-07-10 04:29:12,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54726, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1131442414_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742599_1775, duration(ns): 16540002
2025-07-10 04:29:12,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742599_1775, type=LAST_IN_PIPELINE terminating
2025-07-10 04:29:18,281 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742599_1775 replica FinalizedReplica, blk_1073742599_1775, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742599 for deletion
2025-07-10 04:29:18,282 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742599_1775 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742599
2025-07-10 04:30:12,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742600_1776 src: /192.168.158.1:57220 dest: /192.168.158.4:9866
2025-07-10 04:30:12,786 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57220, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-23357444_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742600_1776, duration(ns): 22857211
2025-07-10 04:30:12,786 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742600_1776, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-10 04:30:18,284 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742600_1776 replica FinalizedReplica, blk_1073742600_1776, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742600 for deletion
2025-07-10 04:30:18,285 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742600_1776 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742600
2025-07-10 04:31:12,756 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742601_1777 src: /192.168.158.1:60734 dest: /192.168.158.4:9866
2025-07-10 04:31:12,789 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60734, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-470392553_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742601_1777, duration(ns): 22673593
2025-07-10 04:31:12,790 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742601_1777, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-10 04:31:18,287 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742601_1777 replica FinalizedReplica, blk_1073742601_1777, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742601 for deletion
2025-07-10 04:31:18,288 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742601_1777 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742601
2025-07-10 04:32:12,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742602_1778 src: /192.168.158.9:50842 dest: /192.168.158.4:9866
2025-07-10 04:32:12,778 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50842, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1649248251_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742602_1778, duration(ns): 18825610
2025-07-10 04:32:12,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742602_1778, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 04:32:18,290 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742602_1778 replica FinalizedReplica, blk_1073742602_1778, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742602 for deletion
2025-07-10 04:32:18,291 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742602_1778 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742602
2025-07-10 04:33:12,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742603_1779 src: /192.168.158.5:46866 dest: /192.168.158.4:9866
2025-07-10 04:33:12,774 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46866, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1516600183_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742603_1779, duration(ns): 17959605
2025-07-10 04:33:12,774 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742603_1779, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 04:33:18,294 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742603_1779 replica FinalizedReplica, blk_1073742603_1779, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742603 for deletion
2025-07-10 04:33:18,295 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742603_1779 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742603
2025-07-10 04:34:12,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742604_1780 src: /192.168.158.6:37420 dest: /192.168.158.4:9866
2025-07-10 04:34:12,782 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37420, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_751847330_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742604_1780, duration(ns): 18632699
2025-07-10 04:34:12,782 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742604_1780, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-10 04:34:21,296 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742604_1780 replica FinalizedReplica, blk_1073742604_1780, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742604 for deletion
2025-07-10 04:34:21,297 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742604_1780 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742604
2025-07-10 04:38:17,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742608_1784 src: /192.168.158.1:45282 dest: /192.168.158.4:9866
2025-07-10 04:38:17,810 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45282, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_170379083_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742608_1784, duration(ns): 20595346
2025-07-10 04:38:17,810 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742608_1784, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-10 04:38:21,305 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742608_1784 replica FinalizedReplica, blk_1073742608_1784, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742608 for deletion
2025-07-10 04:38:21,307 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742608_1784 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742608
2025-07-10 04:44:27,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742614_1790 src: /192.168.158.5:37906 dest: /192.168.158.4:9866
2025-07-10 04:44:27,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37906, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-585412012_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742614_1790, duration(ns): 14684926
2025-07-10 04:44:27,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742614_1790, type=LAST_IN_PIPELINE terminating
2025-07-10 04:44:33,331 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742614_1790 replica FinalizedReplica, blk_1073742614_1790, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742614 for deletion
2025-07-10 04:44:33,332 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742614_1790 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742614
2025-07-10 04:46:27,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742616_1792 src: /192.168.158.8:47208 dest: /192.168.158.4:9866
2025-07-10 04:46:27,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47208, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_636530151_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742616_1792, duration(ns): 17868670
2025-07-10 04:46:27,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742616_1792, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 04:46:36,337 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742616_1792 replica FinalizedReplica, blk_1073742616_1792, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742616 for deletion
2025-07-10 04:46:36,338 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742616_1792 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742616
2025-07-10 04:47:32,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742617_1793 src: /192.168.158.8:40534 dest: /192.168.158.4:9866
2025-07-10 04:47:32,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40534, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1139138312_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742617_1793, duration(ns): 16321361
2025-07-10 04:47:32,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742617_1793, type=LAST_IN_PIPELINE terminating
2025-07-10 04:47:36,338 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742617_1793 replica FinalizedReplica, blk_1073742617_1793, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742617 for deletion
2025-07-10 04:47:36,339 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742617_1793 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742617
2025-07-10 04:49:37,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742619_1795 src: /192.168.158.1:46554 dest: /192.168.158.4:9866
2025-07-10 
04:49:37,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46554, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1757743349_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742619_1795, duration(ns): 21064277 2025-07-10 04:49:37,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742619_1795, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-10 04:49:45,344 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742619_1795 replica FinalizedReplica, blk_1073742619_1795, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742619 for deletion 2025-07-10 04:49:45,345 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742619_1795 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742619 2025-07-10 04:53:52,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742623_1799 src: /192.168.158.1:55126 dest: /192.168.158.4:9866 2025-07-10 04:53:52,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55126, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-285145670_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742623_1799, duration(ns): 23855057 2025-07-10 04:53:52,866 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742623_1799, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-10 04:53:57,354 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742623_1799 replica FinalizedReplica, blk_1073742623_1799, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742623 for deletion 2025-07-10 04:53:57,355 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742623_1799 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742623 2025-07-10 04:56:52,810 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742626_1802 src: /192.168.158.1:45732 dest: /192.168.158.4:9866 2025-07-10 04:56:52,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45732, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1481410511_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742626_1802, duration(ns): 23665744 2025-07-10 04:56:52,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742626_1802, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-10 04:56:57,361 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742626_1802 replica FinalizedReplica, blk_1073742626_1802, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742626 for deletion 2025-07-10 04:56:57,362 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742626_1802 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742626 2025-07-10 05:00:57,824 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742630_1806 src: /192.168.158.6:36102 dest: /192.168.158.4:9866 2025-07-10 05:00:57,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36102, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1008019338_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742630_1806, duration(ns): 15717199 2025-07-10 05:00:57,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742630_1806, type=LAST_IN_PIPELINE terminating 2025-07-10 05:01:03,368 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742630_1806 replica FinalizedReplica, blk_1073742630_1806, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742630 for deletion 2025-07-10 05:01:03,369 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742630_1806 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742630 2025-07-10 05:04:07,821 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742633_1809 src: /192.168.158.5:60156 dest: /192.168.158.4:9866 2025-07-10 05:04:07,847 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60156, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1449616038_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742633_1809, duration(ns): 19877020 2025-07-10 05:04:07,847 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742633_1809, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-10 05:04:12,378 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742633_1809 replica FinalizedReplica, blk_1073742633_1809, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742633 for deletion 2025-07-10 05:04:12,379 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742633_1809 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742633 2025-07-10 05:09:12,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742638_1814 src: /192.168.158.1:49042 dest: /192.168.158.4:9866 2025-07-10 05:09:12,858 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49042, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-763419912_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742638_1814, duration(ns): 19914099 2025-07-10 05:09:12,858 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742638_1814, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-10 05:09:21,389 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742638_1814 replica FinalizedReplica, blk_1073742638_1814, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742638 for deletion 2025-07-10 05:09:21,390 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742638_1814 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742638 2025-07-10 05:12:17,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742641_1817 src: /192.168.158.1:57278 dest: /192.168.158.4:9866 2025-07-10 05:12:17,865 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57278, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_778082601_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742641_1817, duration(ns): 21953729 2025-07-10 05:12:17,865 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742641_1817, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-10 05:12:21,399 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742641_1817 replica FinalizedReplica, blk_1073742641_1817, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742641 for deletion 2025-07-10 05:12:21,400 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742641_1817 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742641 2025-07-10 05:15:17,848 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742644_1820 src: /192.168.158.9:36000 dest: /192.168.158.4:9866 2025-07-10 05:15:17,872 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36000, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1797629780_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742644_1820, duration(ns): 17588748 2025-07-10 05:15:17,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742644_1820, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-10 05:15:21,405 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742644_1820 replica FinalizedReplica, blk_1073742644_1820, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742644 for deletion 2025-07-10 05:15:21,406 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742644_1820 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742644 2025-07-10 05:16:17,846 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742645_1821 src: /192.168.158.1:57800 dest: /192.168.158.4:9866 2025-07-10 05:16:17,875 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57800, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-213725536_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742645_1821, duration(ns): 18753067 2025-07-10 05:16:17,875 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742645_1821, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-10 05:16:21,407 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742645_1821 replica FinalizedReplica, blk_1073742645_1821, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742645 for deletion 2025-07-10 05:16:21,408 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742645_1821 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742645 2025-07-10 05:18:22,841 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742647_1823 src: /192.168.158.1:34710 dest: /192.168.158.4:9866 2025-07-10 05:18:22,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34710, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1142127067_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742647_1823, duration(ns): 23064204 2025-07-10 05:18:22,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742647_1823, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-10 05:18:27,414 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742647_1823 replica FinalizedReplica, blk_1073742647_1823, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742647 for deletion 2025-07-10 05:18:27,415 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742647_1823 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742647 2025-07-10 05:22:22,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742651_1827 src: /192.168.158.1:44326 dest: /192.168.158.4:9866 2025-07-10 05:22:22,878 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:44326, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1356560858_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742651_1827, duration(ns): 21490157 2025-07-10 05:22:22,878 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742651_1827, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-10 05:22:30,429 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742651_1827 replica FinalizedReplica, blk_1073742651_1827, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742651 for deletion 2025-07-10 05:22:30,430 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742651_1827 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742651 2025-07-10 05:24:32,862 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742653_1829 src: /192.168.158.1:42118 dest: /192.168.158.4:9866 2025-07-10 05:24:32,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42118, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-987378416_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742653_1829, duration(ns): 23693464 2025-07-10 05:24:32,896 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073742653_1829, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-10 05:24:36,433 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742653_1829 replica FinalizedReplica, blk_1073742653_1829, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742653 for deletion 2025-07-10 05:24:36,434 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742653_1829 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742653 2025-07-10 05:29:32,857 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742658_1834 src: /192.168.158.9:34396 dest: /192.168.158.4:9866 2025-07-10 05:29:32,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34396, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1993165339_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742658_1834, duration(ns): 14562195 2025-07-10 05:29:32,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742658_1834, type=LAST_IN_PIPELINE terminating 2025-07-10 05:29:36,447 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742658_1834 replica FinalizedReplica, blk_1073742658_1834, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742658 for deletion 2025-07-10 05:29:36,449 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742658_1834 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742658 2025-07-10 05:33:37,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742662_1838 src: /192.168.158.6:33106 dest: /192.168.158.4:9866 2025-07-10 05:33:37,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33106, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_439557324_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742662_1838, duration(ns): 18664525 2025-07-10 05:33:37,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742662_1838, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-10 05:33:42,453 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742662_1838 replica FinalizedReplica, blk_1073742662_1838, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742662 for deletion 2025-07-10 05:33:42,454 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742662_1838 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742662 2025-07-10 
05:35:42,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742664_1840 src: /192.168.158.6:60536 dest: /192.168.158.4:9866 2025-07-10 05:35:42,908 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60536, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_966793579_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742664_1840, duration(ns): 20153214 2025-07-10 05:35:42,909 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742664_1840, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-10 05:35:48,456 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742664_1840 replica FinalizedReplica, blk_1073742664_1840, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742664 for deletion 2025-07-10 05:35:48,457 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742664_1840 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742664 2025-07-10 05:36:13,271 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0 2025-07-10 05:36:42,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742665_1841 src: /192.168.158.9:38274 dest: /192.168.158.4:9866 
2025-07-10 05:36:42,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38274, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1017973197_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742665_1841, duration(ns): 15488181
2025-07-10 05:36:42,911 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742665_1841, type=LAST_IN_PIPELINE terminating
2025-07-10 05:36:51,460 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742665_1841 replica FinalizedReplica, blk_1073742665_1841, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742665 for deletion
2025-07-10 05:36:51,461 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742665_1841 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742665
2025-07-10 05:37:21,466 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f27, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-10 05:37:21,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-10 05:41:42,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742670_1846 src: /192.168.158.6:49656 dest: /192.168.158.4:9866
2025-07-10 05:41:42,890 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49656, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-897636062_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742670_1846, duration(ns): 14143696
2025-07-10 05:41:42,890 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742670_1846, type=LAST_IN_PIPELINE terminating
2025-07-10 05:41:48,472 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742670_1846 replica FinalizedReplica, blk_1073742670_1846, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742670 for deletion
2025-07-10 05:41:48,474 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742670_1846 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742670
2025-07-10 05:43:42,880 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742672_1848 src: /192.168.158.8:50602 dest: /192.168.158.4:9866
2025-07-10 05:43:42,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50602, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-149970413_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742672_1848, duration(ns): 17703432
2025-07-10 05:43:42,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742672_1848, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 05:43:48,473 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742672_1848 replica FinalizedReplica, blk_1073742672_1848, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742672 for deletion
2025-07-10 05:43:48,474 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742672_1848 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742672
2025-07-10 05:45:42,893 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742674_1850 src: /192.168.158.8:43710 dest: /192.168.158.4:9866
2025-07-10 05:45:42,909 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43710, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1683134719_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742674_1850, duration(ns): 13696180
2025-07-10 05:45:42,909 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742674_1850, type=LAST_IN_PIPELINE terminating
2025-07-10 05:45:48,475 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742674_1850 replica FinalizedReplica, blk_1073742674_1850, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742674 for deletion
2025-07-10 05:45:48,476 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742674_1850 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742674
2025-07-10 05:47:42,889 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742676_1852 src: /192.168.158.7:39002 dest: /192.168.158.4:9866
2025-07-10 05:47:42,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39002, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1440891721_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742676_1852, duration(ns): 20405870
2025-07-10 05:47:42,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742676_1852, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 05:47:51,481 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742676_1852 replica FinalizedReplica, blk_1073742676_1852, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742676 for deletion
2025-07-10 05:47:51,482 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742676_1852 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742676
2025-07-10 05:48:42,888 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742677_1853 src: /192.168.158.1:60134 dest: /192.168.158.4:9866
2025-07-10 05:48:42,919 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60134, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1502728697_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742677_1853, duration(ns): 21106599
2025-07-10 05:48:42,919 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742677_1853, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-10 05:48:48,482 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742677_1853 replica FinalizedReplica, blk_1073742677_1853, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742677 for deletion
2025-07-10 05:48:48,483 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742677_1853 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742677
2025-07-10 05:49:47,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742678_1854 src: /192.168.158.1:60172 dest: /192.168.158.4:9866
2025-07-10 05:49:47,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60172, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-465051703_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742678_1854, duration(ns): 23715728
2025-07-10 05:49:47,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742678_1854, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-10 05:49:54,485 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742678_1854 replica FinalizedReplica, blk_1073742678_1854, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742678 for deletion
2025-07-10 05:49:54,486 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742678_1854 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742678
2025-07-10 05:50:47,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742679_1855 src: /192.168.158.9:35660 dest: /192.168.158.4:9866
2025-07-10 05:50:47,950 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35660, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1551266681_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742679_1855, duration(ns): 48264424
2025-07-10 05:50:47,950 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742679_1855, type=LAST_IN_PIPELINE terminating
2025-07-10 05:50:54,485 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742679_1855 replica FinalizedReplica, blk_1073742679_1855, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742679 for deletion
2025-07-10 05:50:54,486 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742679_1855 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742679
2025-07-10 05:51:47,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742680_1856 src: /192.168.158.1:42730 dest: /192.168.158.4:9866
2025-07-10 05:51:47,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42730, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1609431511_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742680_1856, duration(ns): 21339012
2025-07-10 05:51:47,935 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742680_1856, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-10 05:51:51,486 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742680_1856 replica FinalizedReplica, blk_1073742680_1856, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742680 for deletion
2025-07-10 05:51:51,487 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742680_1856 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742680
2025-07-10 05:52:47,931 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742681_1857 src: /192.168.158.8:46710 dest: /192.168.158.4:9866
2025-07-10 05:52:47,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46710, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_749854140_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742681_1857, duration(ns): 18837517
2025-07-10 05:52:47,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742681_1857, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 05:52:54,490 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742681_1857 replica FinalizedReplica, blk_1073742681_1857, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742681 for deletion
2025-07-10 05:52:54,491 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742681_1857 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742681
2025-07-10 05:54:52,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742683_1859 src: /192.168.158.1:52884 dest: /192.168.158.4:9866
2025-07-10 05:54:52,931 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52884, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_263998059_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742683_1859, duration(ns): 23204948
2025-07-10 05:54:52,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742683_1859, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-10 05:55:00,495 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742683_1859 replica FinalizedReplica, blk_1073742683_1859, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742683 for deletion
2025-07-10 05:55:00,496 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742683_1859 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742683
2025-07-10 05:55:57,913 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742684_1860 src: /192.168.158.9:35536 dest: /192.168.158.4:9866
2025-07-10 05:55:57,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35536, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1500751716_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742684_1860, duration(ns): 13923717
2025-07-10 05:55:57,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742684_1860, type=LAST_IN_PIPELINE terminating
2025-07-10 05:56:03,498 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742684_1860 replica FinalizedReplica, blk_1073742684_1860, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742684 for deletion
2025-07-10 05:56:03,500 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742684_1860 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742684
2025-07-10 05:57:02,908 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742685_1861 src: /192.168.158.5:44344 dest: /192.168.158.4:9866
2025-07-10 05:57:02,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44344, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-941311529_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742685_1861, duration(ns): 14311144
2025-07-10 05:57:02,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742685_1861, type=LAST_IN_PIPELINE terminating
2025-07-10 05:57:06,500 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742685_1861 replica FinalizedReplica, blk_1073742685_1861, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742685 for deletion
2025-07-10 05:57:06,502 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742685_1861 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742685
2025-07-10 06:00:07,907 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742688_1864 src: /192.168.158.1:38216 dest: /192.168.158.4:9866
2025-07-10 06:00:07,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38216, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-514130435_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742688_1864, duration(ns): 20876064
2025-07-10 06:00:07,939 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742688_1864, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-10 06:00:12,507 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742688_1864 replica FinalizedReplica, blk_1073742688_1864, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742688 for deletion
2025-07-10 06:00:12,508 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742688_1864 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742688
2025-07-10 06:03:07,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742691_1867 src: /192.168.158.1:49822 dest: /192.168.158.4:9866
2025-07-10 06:03:07,964 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49822, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1374493860_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742691_1867, duration(ns): 21585452
2025-07-10 06:03:07,964 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742691_1867, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-10 06:03:12,516 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742691_1867 replica FinalizedReplica, blk_1073742691_1867, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742691 for deletion
2025-07-10 06:03:12,518 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742691_1867 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742691
2025-07-10 06:04:07,935 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742692_1868 src: /192.168.158.6:56688 dest: /192.168.158.4:9866
2025-07-10 06:04:07,960 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56688, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_20302384_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742692_1868, duration(ns): 18753275
2025-07-10 06:04:07,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742692_1868, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 06:04:12,519 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742692_1868 replica FinalizedReplica, blk_1073742692_1868, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742692 for deletion
2025-07-10 06:04:12,520 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742692_1868 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742692
2025-07-10 06:08:12,935 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742696_1872 src: /192.168.158.6:33084 dest: /192.168.158.4:9866
2025-07-10 06:08:12,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33084, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-369077118_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742696_1872, duration(ns): 17634322
2025-07-10 06:08:12,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742696_1872, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 06:08:18,536 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742696_1872 replica FinalizedReplica, blk_1073742696_1872, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742696 for deletion
2025-07-10 06:08:18,537 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742696_1872 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742696
2025-07-10 06:09:12,931 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742697_1873 src: /192.168.158.5:40064 dest: /192.168.158.4:9866
2025-07-10 06:09:12,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40064, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_715878564_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742697_1873, duration(ns): 19068820
2025-07-10 06:09:12,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742697_1873, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 06:09:21,541 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742697_1873 replica FinalizedReplica, blk_1073742697_1873, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742697 for deletion
2025-07-10 06:09:21,542 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742697_1873 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742697
2025-07-10 06:10:12,928 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742698_1874 src: /192.168.158.9:46144 dest: /192.168.158.4:9866
2025-07-10 06:10:12,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46144, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1949864797_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742698_1874, duration(ns): 19676223
2025-07-10 06:10:12,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742698_1874, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 06:10:21,545 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742698_1874 replica FinalizedReplica, blk_1073742698_1874, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742698 for deletion
2025-07-10 06:10:21,546 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742698_1874 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742698
2025-07-10 06:11:12,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742699_1875 src: /192.168.158.9:59710 dest: /192.168.158.4:9866
2025-07-10 06:11:12,993 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59710, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1015844627_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742699_1875, duration(ns): 19043505
2025-07-10 06:11:12,993 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742699_1875, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 06:11:18,543 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742699_1875 replica FinalizedReplica, blk_1073742699_1875, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742699 for deletion
2025-07-10 06:11:18,544 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742699_1875 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742699
2025-07-10 06:12:12,940 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742700_1876 src: /192.168.158.6:44152 dest: /192.168.158.4:9866
2025-07-10 06:12:12,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44152, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_600148389_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742700_1876, duration(ns): 15277783
2025-07-10 06:12:12,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742700_1876, type=LAST_IN_PIPELINE terminating
2025-07-10 06:12:21,546 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742700_1876 replica FinalizedReplica, blk_1073742700_1876, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742700 for deletion
2025-07-10 06:12:21,547 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742700_1876 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742700
2025-07-10 06:13:17,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742701_1877 src: /192.168.158.1:40780 dest: /192.168.158.4:9866
2025-07-10 06:13:17,964 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1999405310_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742701_1877, duration(ns): 20818866
2025-07-10 06:13:17,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742701_1877, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-10 06:13:21,548 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742701_1877 replica FinalizedReplica, blk_1073742701_1877, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742701 for deletion
2025-07-10 06:13:21,550 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742701_1877 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742701
2025-07-10 06:17:17,940 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742705_1881 src: /192.168.158.5:37806 dest: /192.168.158.4:9866
2025-07-10 06:17:17,964 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37806, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_290922884_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742705_1881, duration(ns): 17893396
2025-07-10 06:17:17,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742705_1881, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-10 06:17:21,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742705_1881 replica FinalizedReplica, blk_1073742705_1881, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742705 for deletion
2025-07-10 06:17:21,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742705_1881 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742705
2025-07-10 06:18:17,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742706_1882 src: /192.168.158.5:47974 dest: /192.168.158.4:9866
2025-07-10 06:18:17,940 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47974, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-634589811_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742706_1882, duration(ns): 15271155
2025-07-10 06:18:17,942 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742706_1882, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 06:18:24,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742706_1882 replica FinalizedReplica, blk_1073742706_1882, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742706 for deletion
2025-07-10 06:18:24,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742706_1882 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742706
2025-07-10 06:19:17,942 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742707_1883 src: /192.168.158.9:34984 dest: /192.168.158.4:9866
2025-07-10 06:19:17,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34984, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1185495871_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742707_1883, duration(ns): 18860499
2025-07-10 06:19:17,969 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742707_1883, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 06:19:21,565 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742707_1883 replica FinalizedReplica, blk_1073742707_1883, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742707 for deletion
2025-07-10 06:19:21,566 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742707_1883 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742707
2025-07-10 06:20:22,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742708_1884 src: /192.168.158.1:33838 dest: /192.168.158.4:9866
2025-07-10 06:20:22,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33838, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1943342251_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742708_1884, duration(ns): 20581961
2025-07-10 06:20:22,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742708_1884, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-10 06:20:27,567 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742708_1884 replica FinalizedReplica, blk_1073742708_1884, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742708 for deletion
2025-07-10 06:20:27,568 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742708_1884 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742708
2025-07-10 06:21:22,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742709_1885 src: /192.168.158.6:37376 dest: /192.168.158.4:9866
2025-07-10 06:21:22,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37376, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1146359950_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742709_1885, duration(ns): 12873945
2025-07-10 
06:21:22,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742709_1885, type=LAST_IN_PIPELINE terminating 2025-07-10 06:21:30,569 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742709_1885 replica FinalizedReplica, blk_1073742709_1885, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742709 for deletion 2025-07-10 06:21:30,570 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742709_1885 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742709 2025-07-10 06:25:22,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742713_1889 src: /192.168.158.9:42884 dest: /192.168.158.4:9866 2025-07-10 06:25:22,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42884, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_933913833_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742713_1889, duration(ns): 19931637 2025-07-10 06:25:22,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742713_1889, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-10 06:25:27,578 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742713_1889 replica FinalizedReplica, blk_1073742713_1889, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 
getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742713 for deletion 2025-07-10 06:25:27,579 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742713_1889 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742713 2025-07-10 06:27:23,000 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742715_1891 src: /192.168.158.7:47480 dest: /192.168.158.4:9866 2025-07-10 06:27:23,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47480, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_486614466_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742715_1891, duration(ns): 18147145 2025-07-10 06:27:23,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742715_1891, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-10 06:27:30,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742715_1891 replica FinalizedReplica, blk_1073742715_1891, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742715 for deletion 2025-07-10 06:27:30,584 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742715_1891 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742715 2025-07-10 06:29:22,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742717_1893 src: /192.168.158.5:53964 dest: /192.168.158.4:9866 2025-07-10 06:29:22,998 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53964, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1221515890_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742717_1893, duration(ns): 12887374 2025-07-10 06:29:22,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742717_1893, type=LAST_IN_PIPELINE terminating 2025-07-10 06:29:27,589 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742717_1893 replica FinalizedReplica, blk_1073742717_1893, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742717 for deletion 2025-07-10 06:29:27,590 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742717_1893 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742717 2025-07-10 06:30:22,975 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742718_1894 src: /192.168.158.6:43788 dest: /192.168.158.4:9866 2025-07-10 06:30:22,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43788, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1529159771_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742718_1894, duration(ns): 14456686 2025-07-10 06:30:22,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742718_1894, type=LAST_IN_PIPELINE terminating 2025-07-10 06:30:27,594 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742718_1894 replica FinalizedReplica, blk_1073742718_1894, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742718 for deletion 2025-07-10 06:30:27,595 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742718_1894 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742718 2025-07-10 06:31:27,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742719_1895 src: /192.168.158.8:40292 dest: /192.168.158.4:9866 2025-07-10 06:31:28,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40292, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-976776726_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742719_1895, duration(ns): 15732524 2025-07-10 06:31:28,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742719_1895, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-10 06:31:36,596 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742719_1895 replica FinalizedReplica, blk_1073742719_1895, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742719 for deletion 2025-07-10 06:31:36,598 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742719_1895 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742719 2025-07-10 06:34:32,969 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742722_1898 src: /192.168.158.1:50398 dest: /192.168.158.4:9866 2025-07-10 06:34:33,001 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50398, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1025599439_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742722_1898, duration(ns): 21753160 2025-07-10 06:34:33,001 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742722_1898, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-10 06:34:36,604 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742722_1898 replica FinalizedReplica, blk_1073742722_1898, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742722 for deletion 2025-07-10 
06:34:36,605 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742722_1898 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742722 2025-07-10 06:39:38,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742727_1903 src: /192.168.158.8:56524 dest: /192.168.158.4:9866 2025-07-10 06:39:38,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56524, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_107092770_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742727_1903, duration(ns): 15011531 2025-07-10 06:39:38,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742727_1903, type=LAST_IN_PIPELINE terminating 2025-07-10 06:39:42,620 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742727_1903 replica FinalizedReplica, blk_1073742727_1903, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742727 for deletion 2025-07-10 06:39:42,621 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742727_1903 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742727 2025-07-10 06:40:42,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742728_1904 src: /192.168.158.6:45710 dest: 
/192.168.158.4:9866 2025-07-10 06:40:43,008 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45710, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-799531775_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742728_1904, duration(ns): 17704337 2025-07-10 06:40:43,008 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742728_1904, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-10 06:40:48,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742728_1904 replica FinalizedReplica, blk_1073742728_1904, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742728 for deletion 2025-07-10 06:40:48,627 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742728_1904 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742728 2025-07-10 06:43:43,001 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742731_1907 src: /192.168.158.5:35858 dest: /192.168.158.4:9866 2025-07-10 06:43:43,027 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35858, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-788839053_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742731_1907, duration(ns): 19521660 2025-07-10 06:43:43,027 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742731_1907, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-10 06:43:51,635 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742731_1907 replica FinalizedReplica, blk_1073742731_1907, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742731 for deletion 2025-07-10 06:43:51,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742731_1907 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742731 2025-07-10 06:44:43,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742732_1908 src: /192.168.158.8:57148 dest: /192.168.158.4:9866 2025-07-10 06:44:43,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57148, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-749576748_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742732_1908, duration(ns): 15709911 2025-07-10 06:44:43,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742732_1908, type=LAST_IN_PIPELINE terminating 2025-07-10 06:44:48,638 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742732_1908 replica FinalizedReplica, blk_1073742732_1908, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742732 for deletion 2025-07-10 06:44:48,639 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742732_1908 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742732 2025-07-10 06:45:48,012 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742733_1909 src: /192.168.158.6:35938 dest: /192.168.158.4:9866 2025-07-10 06:45:48,032 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35938, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1453567035_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742733_1909, duration(ns): 18182930 2025-07-10 06:45:48,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742733_1909, type=LAST_IN_PIPELINE terminating 2025-07-10 06:45:51,643 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742733_1909 replica FinalizedReplica, blk_1073742733_1909, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742733 for deletion 2025-07-10 06:45:51,644 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742733_1909 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742733 2025-07-10 
06:46:48,026 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742734_1910 src: /192.168.158.7:40914 dest: /192.168.158.4:9866 2025-07-10 06:46:48,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40914, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-448882187_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742734_1910, duration(ns): 16091852 2025-07-10 06:46:48,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742734_1910, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-10 06:46:51,645 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742734_1910 replica FinalizedReplica, blk_1073742734_1910, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742734 for deletion 2025-07-10 06:46:51,646 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742734_1910 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742734 2025-07-10 06:47:48,014 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742735_1911 src: /192.168.158.6:55726 dest: /192.168.158.4:9866 2025-07-10 06:47:48,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55726, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2070413229_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742735_1911, duration(ns): 21330533 2025-07-10 06:47:48,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742735_1911, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-10 06:47:51,650 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742735_1911 replica FinalizedReplica, blk_1073742735_1911, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742735 for deletion 2025-07-10 06:47:51,651 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742735_1911 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742735 2025-07-10 06:48:48,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742736_1912 src: /192.168.158.8:53708 dest: /192.168.158.4:9866 2025-07-10 06:48:48,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53708, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2113875553_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742736_1912, duration(ns): 16322252 2025-07-10 06:48:48,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742736_1912, type=LAST_IN_PIPELINE terminating 2025-07-10 06:48:54,652 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling 
blk_1073742736_1912 replica FinalizedReplica, blk_1073742736_1912, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742736 for deletion 2025-07-10 06:48:54,653 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742736_1912 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742736 2025-07-10 06:50:48,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742738_1914 src: /192.168.158.5:54904 dest: /192.168.158.4:9866 2025-07-10 06:50:48,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54904, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-929755282_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742738_1914, duration(ns): 17613895 2025-07-10 06:50:48,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742738_1914, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-10 06:50:51,659 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742738_1914 replica FinalizedReplica, blk_1073742738_1914, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742738 for deletion 2025-07-10 06:50:51,660 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073742738_1914 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742738 2025-07-10 06:58:53,039 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742746_1922 src: /192.168.158.7:59010 dest: /192.168.158.4:9866 2025-07-10 06:58:53,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59010, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1800162044_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742746_1922, duration(ns): 18440594 2025-07-10 06:58:53,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742746_1922, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-10 06:58:57,678 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742746_1922 replica FinalizedReplica, blk_1073742746_1922, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742746 for deletion 2025-07-10 06:58:57,679 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742746_1922 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742746 2025-07-10 06:59:53,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742747_1923 src: /192.168.158.1:46518 dest: /192.168.158.4:9866 2025-07-10 06:59:53,131 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46518, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1567740505_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742747_1923, duration(ns): 52552038 2025-07-10 06:59:53,132 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742747_1923, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-10 06:59:57,681 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742747_1923 replica FinalizedReplica, blk_1073742747_1923, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742747 for deletion 2025-07-10 06:59:57,682 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742747_1923 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742747 2025-07-10 07:02:53,044 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742750_1926 src: /192.168.158.8:41924 dest: /192.168.158.4:9866 2025-07-10 07:02:53,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41924, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1908737983_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742750_1926, duration(ns): 17118558 2025-07-10 07:02:53,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742750_1926, type=LAST_IN_PIPELINE terminating
2025-07-10 07:02:57,687 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742750_1926 replica FinalizedReplica, blk_1073742750_1926, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742750 for deletion
2025-07-10 07:02:57,688 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742750_1926 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742750
2025-07-10 07:03:58,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742751_1927 src: /192.168.158.9:43788 dest: /192.168.158.4:9866
2025-07-10 07:03:58,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43788, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1261602485_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742751_1927, duration(ns): 19402734
2025-07-10 07:03:58,086 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742751_1927, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 07:04:06,692 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742751_1927 replica FinalizedReplica, blk_1073742751_1927, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742751 for deletion
2025-07-10 07:04:06,693 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742751_1927 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742751
2025-07-10 07:04:58,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742752_1928 src: /192.168.158.1:57836 dest: /192.168.158.4:9866
2025-07-10 07:04:58,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57836, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2073098101_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742752_1928, duration(ns): 47274803
2025-07-10 07:04:58,108 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742752_1928, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-10 07:05:06,695 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742752_1928 replica FinalizedReplica, blk_1073742752_1928, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742752 for deletion
2025-07-10 07:05:06,696 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742752_1928 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742752
2025-07-10 07:09:03,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742756_1932 src: /192.168.158.1:57868 dest: /192.168.158.4:9866
2025-07-10 07:09:03,065 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57868, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1506142471_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742756_1932, duration(ns): 25151607
2025-07-10 07:09:03,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742756_1932, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-10 07:09:09,702 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742756_1932 replica FinalizedReplica, blk_1073742756_1932, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742756 for deletion
2025-07-10 07:09:09,703 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742756_1932 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742756
2025-07-10 07:13:13,055 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742760_1936 src: /192.168.158.1:36294 dest: /192.168.158.4:9866
2025-07-10 07:13:13,087 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36294, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1011383967_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742760_1936, duration(ns): 22049243
2025-07-10 07:13:13,087 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742760_1936, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-10 07:13:18,711 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742760_1936 replica FinalizedReplica, blk_1073742760_1936, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742760 for deletion
2025-07-10 07:13:18,712 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742760_1936 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742760
2025-07-10 07:14:13,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742761_1937 src: /192.168.158.9:48388 dest: /192.168.158.4:9866
2025-07-10 07:14:13,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48388, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1592853541_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742761_1937, duration(ns): 17770917
2025-07-10 07:14:13,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742761_1937, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 07:14:18,714 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742761_1937 replica FinalizedReplica, blk_1073742761_1937, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742761 for deletion
2025-07-10 07:14:18,715 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742761_1937 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742761
2025-07-10 07:20:23,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742767_1943 src: /192.168.158.8:56540 dest: /192.168.158.4:9866
2025-07-10 07:20:23,095 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56540, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-771440798_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742767_1943, duration(ns): 15791771
2025-07-10 07:20:23,095 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742767_1943, type=LAST_IN_PIPELINE terminating
2025-07-10 07:20:30,724 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742767_1943 replica FinalizedReplica, blk_1073742767_1943, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742767 for deletion
2025-07-10 07:20:30,725 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742767_1943 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742767
2025-07-10 07:22:23,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742769_1945 src: /192.168.158.1:53036 dest: /192.168.158.4:9866
2025-07-10 07:22:23,104 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53036, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1351140233_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742769_1945, duration(ns): 23527211
2025-07-10 07:22:23,104 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742769_1945, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-10 07:22:27,732 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742769_1945 replica FinalizedReplica, blk_1073742769_1945, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742769 for deletion
2025-07-10 07:22:27,733 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742769_1945 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742769
2025-07-10 07:24:28,086 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742771_1947 src: /192.168.158.9:37580 dest: /192.168.158.4:9866
2025-07-10 07:24:28,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37580, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1668393256_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742771_1947, duration(ns): 16849891
2025-07-10 07:24:28,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742771_1947, type=LAST_IN_PIPELINE terminating
2025-07-10 07:24:36,735 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742771_1947 replica FinalizedReplica, blk_1073742771_1947, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742771 for deletion
2025-07-10 07:24:36,736 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742771_1947 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742771
2025-07-10 07:26:33,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742773_1949 src: /192.168.158.6:47738 dest: /192.168.158.4:9866
2025-07-10 07:26:33,113 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47738, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1669171263_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742773_1949, duration(ns): 17216689
2025-07-10 07:26:33,114 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742773_1949, type=LAST_IN_PIPELINE terminating
2025-07-10 07:26:36,740 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742773_1949 replica FinalizedReplica, blk_1073742773_1949, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742773 for deletion
2025-07-10 07:26:36,741 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742773_1949 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742773
2025-07-10 07:29:33,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742776_1952 src: /192.168.158.5:46448 dest: /192.168.158.4:9866
2025-07-10 07:29:33,110 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46448, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_981910281_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742776_1952, duration(ns): 14594657
2025-07-10 07:29:33,111 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742776_1952, type=LAST_IN_PIPELINE terminating
2025-07-10 07:29:36,751 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742776_1952 replica FinalizedReplica, blk_1073742776_1952, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742776 for deletion
2025-07-10 07:29:36,752 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742776_1952 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742776
2025-07-10 07:30:33,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742777_1953 src: /192.168.158.1:43882 dest: /192.168.158.4:9866
2025-07-10 07:30:33,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43882, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-85059218_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742777_1953, duration(ns): 20557814
2025-07-10 07:30:33,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742777_1953, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-10 07:30:36,756 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742777_1953 replica FinalizedReplica, blk_1073742777_1953, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742777 for deletion
2025-07-10 07:30:36,757 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742777_1953 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742777
2025-07-10 07:32:38,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742779_1955 src: /192.168.158.5:33204 dest: /192.168.158.4:9866
2025-07-10 07:32:38,123 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33204, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1369358330_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742779_1955, duration(ns): 17890620
2025-07-10 07:32:38,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742779_1955, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 07:32:45,762 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742779_1955 replica FinalizedReplica, blk_1073742779_1955, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742779 for deletion
2025-07-10 07:32:45,764 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742779_1955 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742779
2025-07-10 07:33:38,102 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742780_1956 src: /192.168.158.6:38484 dest: /192.168.158.4:9866
2025-07-10 07:33:38,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38484, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_337196471_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742780_1956, duration(ns): 18335894
2025-07-10 07:33:38,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742780_1956, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 07:33:42,764 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742780_1956 replica FinalizedReplica, blk_1073742780_1956, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742780 for deletion
2025-07-10 07:33:42,765 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742780_1956 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742780
2025-07-10 07:34:38,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742781_1957 src: /192.168.158.1:57342 dest: /192.168.158.4:9866
2025-07-10 07:34:38,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57342, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1610059291_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742781_1957, duration(ns): 18816780
2025-07-10 07:34:38,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742781_1957, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-10 07:34:45,767 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742781_1957 replica FinalizedReplica, blk_1073742781_1957, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742781 for deletion
2025-07-10 07:34:45,768 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742781_1957 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742781
2025-07-10 07:38:38,110 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742785_1961 src: /192.168.158.8:38442 dest: /192.168.158.4:9866
2025-07-10 07:38:38,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38442, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-813010664_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742785_1961, duration(ns): 13444467
2025-07-10 07:38:38,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742785_1961, type=LAST_IN_PIPELINE terminating
2025-07-10 07:38:42,777 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742785_1961 replica FinalizedReplica, blk_1073742785_1961, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742785 for deletion
2025-07-10 07:38:42,778 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742785_1961 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742785
2025-07-10 07:40:38,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742787_1963 src: /192.168.158.5:58938 dest: /192.168.158.4:9866
2025-07-10 07:40:38,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58938, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-288391330_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742787_1963, duration(ns): 14520038
2025-07-10 07:40:38,143 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742787_1963, type=LAST_IN_PIPELINE terminating
2025-07-10 07:40:45,780 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742787_1963 replica FinalizedReplica, blk_1073742787_1963, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742787 for deletion
2025-07-10 07:40:45,781 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742787_1963 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742787
2025-07-10 07:41:38,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742788_1964 src: /192.168.158.6:55742 dest: /192.168.158.4:9866
2025-07-10 07:41:38,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55742, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_446662148_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742788_1964, duration(ns): 18088343
2025-07-10 07:41:38,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742788_1964, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-10 07:41:42,782 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742788_1964 replica FinalizedReplica, blk_1073742788_1964, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742788 for deletion
2025-07-10 07:41:42,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742788_1964 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742788
2025-07-10 07:42:38,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742789_1965 src: /192.168.158.5:53000 dest: /192.168.158.4:9866
2025-07-10 07:42:38,145 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53000, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1698296601_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742789_1965, duration(ns): 15210041
2025-07-10 07:42:38,145 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742789_1965, type=LAST_IN_PIPELINE terminating
2025-07-10 07:42:45,787 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742789_1965 replica FinalizedReplica, blk_1073742789_1965, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742789 for deletion
2025-07-10 07:42:45,789 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742789_1965 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742789
2025-07-10 07:45:38,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742792_1968 src: /192.168.158.1:46464 dest: /192.168.158.4:9866
2025-07-10 07:45:38,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46464, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2084153214_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742792_1968, duration(ns): 20959006
2025-07-10 07:45:38,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742792_1968, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-10 07:45:45,800 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742792_1968 replica FinalizedReplica, blk_1073742792_1968, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742792 for deletion
2025-07-10 07:45:45,802 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742792_1968 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742792
2025-07-10 07:49:48,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742796_1972 src: /192.168.158.7:37136 dest: /192.168.158.4:9866
2025-07-10 07:49:48,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_926109746_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742796_1972, duration(ns): 15470581
2025-07-10 07:49:48,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742796_1972, type=LAST_IN_PIPELINE terminating
2025-07-10 07:49:54,812 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742796_1972 replica FinalizedReplica, blk_1073742796_1972, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742796 for deletion
2025-07-10 07:49:54,813 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742796_1972 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742796
2025-07-10 07:51:48,140 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742798_1974 src: /192.168.158.6:37606 dest: /192.168.158.4:9866
2025-07-10 07:51:48,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37606, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_715771505_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742798_1974, duration(ns): 14665581
2025-07-10 07:51:48,157 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742798_1974, type=LAST_IN_PIPELINE terminating
2025-07-10 07:51:51,817 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742798_1974 replica FinalizedReplica, blk_1073742798_1974, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742798 for deletion
2025-07-10 07:51:51,819 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742798_1974 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742798
2025-07-10 07:56:48,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742803_1979 src: /192.168.158.7:56414 dest: /192.168.158.4:9866
2025-07-10 07:56:48,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56414, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1125488934_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742803_1979, duration(ns): 15041628
2025-07-10 07:56:48,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742803_1979, type=LAST_IN_PIPELINE terminating
2025-07-10 07:56:51,829 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742803_1979 replica FinalizedReplica, blk_1073742803_1979, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742803 for deletion
2025-07-10 07:56:51,830 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742803_1979 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742803
2025-07-10 07:59:48,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742806_1982 src: /192.168.158.6:48956 dest: /192.168.158.4:9866
2025-07-10 07:59:48,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48956, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1992983900_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742806_1982, duration(ns): 17947308
2025-07-10 07:59:48,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742806_1982, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-10 07:59:54,834 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742806_1982 replica FinalizedReplica, blk_1073742806_1982, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742806 for deletion
2025-07-10 07:59:54,836 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742806_1982 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742806
2025-07-10 08:00:48,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742807_1983 src: /192.168.158.1:49654 dest: /192.168.158.4:9866
2025-07-10 08:00:48,205 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49654, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1954814436_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742807_1983, duration(ns): 24761054
2025-07-10 08:00:48,205 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742807_1983, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-10 08:00:51,835 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742807_1983 replica FinalizedReplica, blk_1073742807_1983, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742807 for deletion
2025-07-10 08:00:51,837 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742807_1983 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742807
2025-07-10 08:04:58,183 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742811_1987 src: /192.168.158.5:44080 dest: /192.168.158.4:9866
2025-07-10 08:04:58,202 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44080, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1817256058_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742811_1987, duration(ns): 17056348
2025-07-10 08:04:58,202 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742811_1987, type=LAST_IN_PIPELINE terminating
2025-07-10 08:05:03,843 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742811_1987 replica FinalizedReplica, blk_1073742811_1987, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742811 for deletion
2025-07-10 08:05:03,844 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742811_1987 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742811
2025-07-10 08:06:58,210
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742813_1989 src: /192.168.158.5:44176 dest: /192.168.158.4:9866 2025-07-10 08:06:58,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44176, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_228899491_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742813_1989, duration(ns): 13544398 2025-07-10 08:06:58,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742813_1989, type=LAST_IN_PIPELINE terminating 2025-07-10 08:07:06,847 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742813_1989 replica FinalizedReplica, blk_1073742813_1989, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742813 for deletion 2025-07-10 08:07:06,848 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742813_1989 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742813 2025-07-10 08:12:03,195 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742818_1994 src: /192.168.158.8:60554 dest: /192.168.158.4:9866 2025-07-10 08:12:03,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60554, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_642831985_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073742818_1994, duration(ns): 16608947 2025-07-10 08:12:03,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742818_1994, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-10 08:12:06,859 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742818_1994 replica FinalizedReplica, blk_1073742818_1994, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742818 for deletion 2025-07-10 08:12:06,860 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742818_1994 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742818 2025-07-10 08:15:08,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742821_1997 src: /192.168.158.9:58764 dest: /192.168.158.4:9866 2025-07-10 08:15:08,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58764, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1378159507_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742821_1997, duration(ns): 16882137 2025-07-10 08:15:08,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742821_1997, type=LAST_IN_PIPELINE terminating 2025-07-10 08:15:12,863 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742821_1997 replica FinalizedReplica, 
blk_1073742821_1997, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742821 for deletion 2025-07-10 08:15:12,864 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742821_1997 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742821 2025-07-10 08:16:08,221 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742822_1998 src: /192.168.158.8:55698 dest: /192.168.158.4:9866 2025-07-10 08:16:08,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55698, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2032215942_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742822_1998, duration(ns): 14206491 2025-07-10 08:16:08,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742822_1998, type=LAST_IN_PIPELINE terminating 2025-07-10 08:16:12,868 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742822_1998 replica FinalizedReplica, blk_1073742822_1998, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742822 for deletion 2025-07-10 08:16:12,869 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742822_1998 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742822 2025-07-10 08:19:13,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742825_2001 src: /192.168.158.1:36398 dest: /192.168.158.4:9866 2025-07-10 08:19:13,259 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36398, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1151456876_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742825_2001, duration(ns): 20920635 2025-07-10 08:19:13,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742825_2001, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-10 08:19:18,878 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742825_2001 replica FinalizedReplica, blk_1073742825_2001, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742825 for deletion 2025-07-10 08:19:18,880 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742825_2001 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742825 2025-07-10 08:20:13,221 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742826_2002 src: /192.168.158.9:42888 dest: /192.168.158.4:9866 2025-07-10 08:20:13,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.9:42888, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-735429389_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742826_2002, duration(ns): 16128667 2025-07-10 08:20:13,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742826_2002, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-10 08:20:21,881 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742826_2002 replica FinalizedReplica, blk_1073742826_2002, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742826 for deletion 2025-07-10 08:20:21,882 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742826_2002 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742826 2025-07-10 08:21:13,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742827_2003 src: /192.168.158.1:56782 dest: /192.168.158.4:9866 2025-07-10 08:21:13,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56782, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1579668216_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742827_2003, duration(ns): 21857734 2025-07-10 08:21:13,288 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742827_2003, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-10 08:21:21,884 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742827_2003 replica FinalizedReplica, blk_1073742827_2003, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742827 for deletion 2025-07-10 08:21:21,885 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742827_2003 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742827 2025-07-10 08:24:18,232 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742830_2006 src: /192.168.158.9:32932 dest: /192.168.158.4:9866 2025-07-10 08:24:18,259 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:32932, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_442362056_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742830_2006, duration(ns): 20301505 2025-07-10 08:24:18,259 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742830_2006, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-10 08:24:21,890 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742830_2006 replica FinalizedReplica, blk_1073742830_2006, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742830 for deletion 2025-07-10 08:24:21,891 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742830_2006 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742830 2025-07-10 08:26:23,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742832_2008 src: /192.168.158.8:60842 dest: /192.168.158.4:9866 2025-07-10 08:26:23,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60842, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_394689310_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742832_2008, duration(ns): 14245782 2025-07-10 08:26:23,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742832_2008, type=LAST_IN_PIPELINE terminating 2025-07-10 08:26:30,895 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742832_2008 replica FinalizedReplica, blk_1073742832_2008, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742832 for deletion 2025-07-10 08:26:30,896 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742832_2008 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742832 2025-07-10 08:29:28,237 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742835_2011 src: /192.168.158.1:48512 dest: /192.168.158.4:9866 2025-07-10 08:29:28,283 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48512, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_626345331_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742835_2011, duration(ns): 36253011 2025-07-10 08:29:28,283 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742835_2011, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-10 08:29:33,898 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742835_2011 replica FinalizedReplica, blk_1073742835_2011, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742835 for deletion 2025-07-10 08:29:33,899 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742835_2011 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742835 2025-07-10 08:32:28,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742838_2014 src: /192.168.158.6:58186 dest: /192.168.158.4:9866 2025-07-10 08:32:28,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58186, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-708846659_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742838_2014, duration(ns): 19286913 2025-07-10 08:32:28,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742838_2014, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-10 08:32:33,901 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742838_2014 replica FinalizedReplica, blk_1073742838_2014, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742838 for deletion 2025-07-10 08:32:33,902 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742838_2014 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742838 2025-07-10 08:33:33,268 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742839_2015 src: /192.168.158.5:55110 dest: /192.168.158.4:9866 2025-07-10 08:33:33,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55110, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1536536745_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742839_2015, duration(ns): 14539414 2025-07-10 08:33:33,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742839_2015, type=LAST_IN_PIPELINE terminating 2025-07-10 08:33:39,901 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling 
blk_1073742839_2015 replica FinalizedReplica, blk_1073742839_2015, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742839 for deletion 2025-07-10 08:33:39,902 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742839_2015 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742839 2025-07-10 08:34:38,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742840_2016 src: /192.168.158.7:43036 dest: /192.168.158.4:9866 2025-07-10 08:34:38,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43036, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2057073625_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742840_2016, duration(ns): 19609739 2025-07-10 08:34:38,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742840_2016, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-10 08:34:45,904 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742840_2016 replica FinalizedReplica, blk_1073742840_2016, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742840 for deletion 2025-07-10 08:34:45,906 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073742840_2016 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742840 2025-07-10 08:35:38,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742841_2017 src: /192.168.158.6:41336 dest: /192.168.158.4:9866 2025-07-10 08:35:38,276 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1354948991_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742841_2017, duration(ns): 19784405 2025-07-10 08:35:38,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742841_2017, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-10 08:35:42,908 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742841_2017 replica FinalizedReplica, blk_1073742841_2017, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742841 for deletion 2025-07-10 08:35:42,909 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742841_2017 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742841 2025-07-10 08:36:38,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742842_2018 src: /192.168.158.1:39906 dest: /192.168.158.4:9866 2025-07-10 08:36:38,290 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39906, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_371149143_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742842_2018, duration(ns): 24099655 2025-07-10 08:36:38,290 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742842_2018, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-10 08:36:45,912 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742842_2018 replica FinalizedReplica, blk_1073742842_2018, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742842 for deletion 2025-07-10 08:36:45,913 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742842_2018 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742842 2025-07-10 08:37:38,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742843_2019 src: /192.168.158.6:38090 dest: /192.168.158.4:9866 2025-07-10 08:37:38,259 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38090, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-757552517_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742843_2019, duration(ns): 17223481 2025-07-10 08:37:38,259 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742843_2019, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-10 08:37:42,914 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742843_2019 replica FinalizedReplica, blk_1073742843_2019, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742843 for deletion 2025-07-10 08:37:42,915 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742843_2019 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742843 2025-07-10 08:40:48,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742846_2022 src: /192.168.158.8:59170 dest: /192.168.158.4:9866 2025-07-10 08:40:48,283 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59170, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1955599557_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742846_2022, duration(ns): 21226231 2025-07-10 08:40:48,284 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742846_2022, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-10 08:40:51,923 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742846_2022 replica FinalizedReplica, blk_1073742846_2022, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742846 for deletion 2025-07-10 08:40:51,924 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742846_2022 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073742846 2025-07-10 08:45:48,268 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742851_2027 src: /192.168.158.1:46438 dest: /192.168.158.4:9866 2025-07-10 08:45:48,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46438, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1942849667_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742851_2027, duration(ns): 23966838 2025-07-10 08:45:48,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742851_2027, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-10 08:45:51,936 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742851_2027 replica FinalizedReplica, blk_1073742851_2027, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742851 for deletion 2025-07-10 08:45:51,937 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742851_2027 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742851 2025-07-10 08:46:48,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742852_2028 src: /192.168.158.9:32896 dest: /192.168.158.4:9866 2025-07-10 08:46:48,275 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:32896, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_326557147_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742852_2028, duration(ns): 16055557 2025-07-10 08:46:48,275 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742852_2028, type=LAST_IN_PIPELINE terminating 2025-07-10 08:46:51,940 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742852_2028 replica FinalizedReplica, blk_1073742852_2028, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742852 for deletion 2025-07-10 08:46:51,941 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742852_2028 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742852 2025-07-10 08:48:48,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742854_2030 src: /192.168.158.7:40188 dest: /192.168.158.4:9866 2025-07-10 08:48:48,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40188, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, 
cliID: DFSClient_NONMAPREDUCE_2124946648_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742854_2030, duration(ns): 18907086
2025-07-10 08:48:48,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742854_2030, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 08:48:54,945 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742854_2030 replica FinalizedReplica, blk_1073742854_2030, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742854 for deletion
2025-07-10 08:48:54,946 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742854_2030 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742854
2025-07-10 08:49:53,265 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742855_2031 src: /192.168.158.9:35732 dest: /192.168.158.4:9866
2025-07-10 08:49:53,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35732, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2110422429_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742855_2031, duration(ns): 19507768
2025-07-10 08:49:53,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742855_2031, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 08:50:00,948 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742855_2031 replica FinalizedReplica, blk_1073742855_2031, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742855 for deletion
2025-07-10 08:50:00,950 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742855_2031 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742855
2025-07-10 08:50:53,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742856_2032 src: /192.168.158.6:60150 dest: /192.168.158.4:9866
2025-07-10 08:50:53,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60150, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1654165187_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742856_2032, duration(ns): 15788122
2025-07-10 08:50:53,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742856_2032, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 08:51:00,950 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742856_2032 replica FinalizedReplica, blk_1073742856_2032, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742856 for deletion
2025-07-10 08:51:00,951 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742856_2032 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742856
2025-07-10 08:51:53,270 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742857_2033 src: /192.168.158.8:44352 dest: /192.168.158.4:9866
2025-07-10 08:51:53,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44352, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-556318856_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742857_2033, duration(ns): 18051579
2025-07-10 08:51:53,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742857_2033, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 08:51:57,952 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742857_2033 replica FinalizedReplica, blk_1073742857_2033, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742857 for deletion
2025-07-10 08:51:57,953 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742857_2033 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742857
2025-07-10 08:54:53,289 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742860_2036 src: /192.168.158.5:45178 dest: /192.168.158.4:9866
2025-07-10 08:54:53,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45178, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1449247177_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742860_2036, duration(ns): 13850683
2025-07-10 08:54:53,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742860_2036, type=LAST_IN_PIPELINE terminating
2025-07-10 08:54:57,961 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742860_2036 replica FinalizedReplica, blk_1073742860_2036, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742860 for deletion
2025-07-10 08:54:57,962 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742860_2036 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742860
2025-07-10 08:55:53,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742861_2037 src: /192.168.158.1:52186 dest: /192.168.158.4:9866
2025-07-10 08:55:53,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52186, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-247010768_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742861_2037, duration(ns): 21843060
2025-07-10 08:55:53,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742861_2037, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-10 08:55:57,963 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742861_2037 replica FinalizedReplica, blk_1073742861_2037, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742861 for deletion
2025-07-10 08:55:57,964 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742861_2037 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742861
2025-07-10 08:56:58,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742862_2038 src: /192.168.158.8:57718 dest: /192.168.158.4:9866
2025-07-10 08:56:58,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57718, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_851174219_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742862_2038, duration(ns): 14709713
2025-07-10 08:56:58,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742862_2038, type=LAST_IN_PIPELINE terminating
2025-07-10 08:57:00,966 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742862_2038 replica FinalizedReplica, blk_1073742862_2038, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742862 for deletion
2025-07-10 08:57:00,967 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742862_2038 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742862
2025-07-10 08:57:58,278 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742863_2039 src: /192.168.158.6:33170 dest: /192.168.158.4:9866
2025-07-10 08:57:58,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33170, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-704017949_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742863_2039, duration(ns): 19405872
2025-07-10 08:57:58,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742863_2039, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 08:58:00,969 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742863_2039 replica FinalizedReplica, blk_1073742863_2039, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742863 for deletion
2025-07-10 08:58:00,971 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742863_2039 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742863
2025-07-10 08:58:58,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742864_2040 src: /192.168.158.5:48928 dest: /192.168.158.4:9866
2025-07-10 08:58:58,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48928, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-240504399_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742864_2040, duration(ns): 18361238
2025-07-10 08:58:58,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742864_2040, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 08:59:00,973 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742864_2040 replica FinalizedReplica, blk_1073742864_2040, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742864 for deletion
2025-07-10 08:59:00,975 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742864_2040 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742864
2025-07-10 09:03:03,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742868_2044 src: /192.168.158.6:57014 dest: /192.168.158.4:9866
2025-07-10 09:03:03,314 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57014, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-786137074_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742868_2044, duration(ns): 17488449
2025-07-10 09:03:03,314 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742868_2044, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 09:03:06,983 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742868_2044 replica FinalizedReplica, blk_1073742868_2044, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742868 for deletion
2025-07-10 09:03:06,984 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742868_2044 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742868
2025-07-10 09:04:03,295 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742869_2045 src: /192.168.158.5:36218 dest: /192.168.158.4:9866
2025-07-10 09:04:03,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36218, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_362767312_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742869_2045, duration(ns): 20025607
2025-07-10 09:04:03,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742869_2045, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 09:04:09,984 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742869_2045 replica FinalizedReplica, blk_1073742869_2045, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742869 for deletion
2025-07-10 09:04:09,985 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742869_2045 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742869
2025-07-10 09:05:03,296 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742870_2046 src: /192.168.158.9:52026 dest: /192.168.158.4:9866
2025-07-10 09:05:03,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52026, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-214740507_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742870_2046, duration(ns): 18472007
2025-07-10 09:05:03,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742870_2046, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 09:05:06,987 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742870_2046 replica FinalizedReplica, blk_1073742870_2046, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742870 for deletion
2025-07-10 09:05:06,988 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742870_2046 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742870
2025-07-10 09:06:03,300 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742871_2047 src: /192.168.158.1:41566 dest: /192.168.158.4:9866
2025-07-10 09:06:03,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41566, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1320056306_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742871_2047, duration(ns): 22294691
2025-07-10 09:06:03,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742871_2047, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-10 09:06:09,990 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742871_2047 replica FinalizedReplica, blk_1073742871_2047, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742871 for deletion
2025-07-10 09:06:09,991 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742871_2047 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742871
2025-07-10 09:11:13,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742876_2052 src: /192.168.158.8:46066 dest: /192.168.158.4:9866
2025-07-10 09:11:13,338 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46066, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_974758399_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742876_2052, duration(ns): 15084507
2025-07-10 09:11:13,338 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742876_2052, type=LAST_IN_PIPELINE terminating
2025-07-10 09:11:16,005 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742876_2052 replica FinalizedReplica, blk_1073742876_2052, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742876 for deletion
2025-07-10 09:11:16,006 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742876_2052 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742876
2025-07-10 09:17:23,337 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742882_2058 src: /192.168.158.7:40260 dest: /192.168.158.4:9866
2025-07-10 09:17:23,361 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40260, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1141389681_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742882_2058, duration(ns): 18401041
2025-07-10 09:17:23,362 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742882_2058, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 09:17:31,016 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742882_2058 replica FinalizedReplica, blk_1073742882_2058, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742882 for deletion
2025-07-10 09:17:31,018 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742882_2058 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742882
2025-07-10 09:19:23,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742884_2060 src: /192.168.158.1:34878 dest: /192.168.158.4:9866
2025-07-10 09:19:23,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34878, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-499317524_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742884_2060, duration(ns): 21661419
2025-07-10 09:19:23,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742884_2060, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-10 09:19:28,020 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742884_2060 replica FinalizedReplica, blk_1073742884_2060, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742884 for deletion
2025-07-10 09:19:28,021 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742884_2060 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742884
2025-07-10 09:20:28,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742885_2061 src: /192.168.158.5:48652 dest: /192.168.158.4:9866
2025-07-10 09:20:28,357 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48652, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-171058956_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742885_2061, duration(ns): 17852567
2025-07-10 09:20:28,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742885_2061, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 09:20:31,022 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742885_2061 replica FinalizedReplica, blk_1073742885_2061, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742885 for deletion
2025-07-10 09:20:31,024 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742885_2061 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742885
2025-07-10 09:21:28,343 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742886_2062 src: /192.168.158.8:33500 dest: /192.168.158.4:9866
2025-07-10 09:21:28,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33500, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_694310040_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742886_2062, duration(ns): 20109823
2025-07-10 09:21:28,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742886_2062, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 09:21:31,027 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742886_2062 replica FinalizedReplica, blk_1073742886_2062, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742886 for deletion
2025-07-10 09:21:31,028 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742886_2062 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742886
2025-07-10 09:22:28,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742887_2063 src: /192.168.158.6:41532 dest: /192.168.158.4:9866
2025-07-10 09:22:28,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41532, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1310297242_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742887_2063, duration(ns): 13857990
2025-07-10 09:22:28,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742887_2063, type=LAST_IN_PIPELINE terminating
2025-07-10 09:22:34,030 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742887_2063 replica FinalizedReplica, blk_1073742887_2063, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742887 for deletion
2025-07-10 09:22:34,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742887_2063 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742887
2025-07-10 09:23:33,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742888_2064 src: /192.168.158.7:36556 dest: /192.168.158.4:9866
2025-07-10 09:23:33,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36556, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1193970851_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742888_2064, duration(ns): 17721737
2025-07-10 09:23:33,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742888_2064, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 09:23:37,032 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742888_2064 replica FinalizedReplica, blk_1073742888_2064, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742888 for deletion
2025-07-10 09:23:37,033 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742888_2064 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742888
2025-07-10 09:26:33,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742891_2067 src: /192.168.158.5:51330 dest: /192.168.158.4:9866
2025-07-10 09:26:33,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51330, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1127557654_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742891_2067, duration(ns): 19107979
2025-07-10 09:26:33,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742891_2067, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-10 09:26:37,038 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742891_2067 replica FinalizedReplica, blk_1073742891_2067, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742891 for deletion
2025-07-10 09:26:37,039 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742891_2067 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742891
2025-07-10 09:27:33,350 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742892_2068 src: /192.168.158.5:38886 dest: /192.168.158.4:9866
2025-07-10 09:27:33,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38886, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1523351157_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742892_2068, duration(ns): 17615265
2025-07-10 09:27:33,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742892_2068, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 09:27:40,039 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742892_2068 replica FinalizedReplica, blk_1073742892_2068, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742892 for deletion
2025-07-10 09:27:40,040 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742892_2068 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742892
2025-07-10 09:28:33,361 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742893_2069 src: /192.168.158.9:34872 dest: /192.168.158.4:9866
2025-07-10 09:28:33,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34872, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1380133857_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742893_2069, duration(ns): 15001711
2025-07-10 09:28:33,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742893_2069, type=LAST_IN_PIPELINE terminating
2025-07-10 09:28:37,042 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742893_2069 replica FinalizedReplica, blk_1073742893_2069, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742893 for deletion
2025-07-10 09:28:37,043 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742893_2069 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742893
2025-07-10 09:31:38,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742896_2072 src: /192.168.158.5:47116 dest: /192.168.158.4:9866
2025-07-10 09:31:38,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47116, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-753711069_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742896_2072, duration(ns): 19310636
2025-07-10 09:31:38,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742896_2072, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 09:31:43,050 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742896_2072 replica FinalizedReplica, blk_1073742896_2072, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742896 for deletion
2025-07-10 09:31:43,051 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742896_2072 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742896
2025-07-10 09:32:38,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742897_2073 src: /192.168.158.7:50220 dest: /192.168.158.4:9866
2025-07-10 09:32:38,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50220, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1977275055_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742897_2073, duration(ns): 18128266
2025-07-10 09:32:38,404 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742897_2073, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 09:32:43,052 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742897_2073 replica FinalizedReplica, blk_1073742897_2073, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742897 for deletion 2025-07-10 09:32:43,054 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742897_2073 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742897 2025-07-10 09:35:43,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742900_2076 src: /192.168.158.7:46240 dest: /192.168.158.4:9866 2025-07-10 09:35:43,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46240, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1802466392_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742900_2076, duration(ns): 20551201 2025-07-10 09:35:43,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742900_2076, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-10 09:35:49,064 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742900_2076 replica FinalizedReplica, blk_1073742900_2076, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742900 for deletion 
2025-07-10 09:35:49,065 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742900_2076 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742900 2025-07-10 09:36:43,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742901_2077 src: /192.168.158.7:54532 dest: /192.168.158.4:9866 2025-07-10 09:36:43,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54532, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2079222033_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742901_2077, duration(ns): 14142630 2025-07-10 09:36:43,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742901_2077, type=LAST_IN_PIPELINE terminating 2025-07-10 09:36:46,065 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742901_2077 replica FinalizedReplica, blk_1073742901_2077, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742901 for deletion 2025-07-10 09:36:46,066 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742901_2077 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742901 2025-07-10 09:37:43,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742902_2078 src: /192.168.158.6:54940 dest: 
/192.168.158.4:9866 2025-07-10 09:37:43,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54940, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-559009140_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742902_2078, duration(ns): 13209308 2025-07-10 09:37:43,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742902_2078, type=LAST_IN_PIPELINE terminating 2025-07-10 09:37:46,066 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742902_2078 replica FinalizedReplica, blk_1073742902_2078, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742902 for deletion 2025-07-10 09:37:46,068 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742902_2078 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742902 2025-07-10 09:41:48,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742906_2082 src: /192.168.158.1:47734 dest: /192.168.158.4:9866 2025-07-10 09:41:48,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47734, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1954015907_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742906_2082, duration(ns): 22958846 2025-07-10 09:41:48,430 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073742906_2082, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-10 09:41:52,075 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742906_2082 replica FinalizedReplica, blk_1073742906_2082, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742906 for deletion 2025-07-10 09:41:52,077 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742906_2082 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742906 2025-07-10 09:42:48,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742907_2083 src: /192.168.158.7:49914 dest: /192.168.158.4:9866 2025-07-10 09:42:48,422 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49914, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_700613797_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742907_2083, duration(ns): 20431516 2025-07-10 09:42:48,423 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742907_2083, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-10 09:42:55,079 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742907_2083 replica FinalizedReplica, blk_1073742907_2083, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742907 for deletion 2025-07-10 09:42:55,080 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742907_2083 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742907 2025-07-10 09:46:53,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742911_2087 src: /192.168.158.1:47634 dest: /192.168.158.4:9866 2025-07-10 09:46:53,430 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47634, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1484850853_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742911_2087, duration(ns): 24703925 2025-07-10 09:46:53,431 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742911_2087, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-10 09:46:58,092 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742911_2087 replica FinalizedReplica, blk_1073742911_2087, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742911 for deletion 2025-07-10 09:46:58,093 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742911_2087 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742911 2025-07-10 09:47:53,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742912_2088 src: /192.168.158.6:40564 dest: /192.168.158.4:9866 2025-07-10 09:47:53,426 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40564, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-979459803_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742912_2088, duration(ns): 15516759 2025-07-10 09:47:53,427 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742912_2088, type=LAST_IN_PIPELINE terminating 2025-07-10 09:47:58,095 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742912_2088 replica FinalizedReplica, blk_1073742912_2088, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742912 for deletion 2025-07-10 09:47:58,096 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742912_2088 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742912 2025-07-10 09:52:03,412 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742916_2092 src: /192.168.158.5:59954 dest: /192.168.158.4:9866 2025-07-10 09:52:03,430 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59954, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, 
cliID: DFSClient_NONMAPREDUCE_-1655987881_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742916_2092, duration(ns): 15284746 2025-07-10 09:52:03,430 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742916_2092, type=LAST_IN_PIPELINE terminating 2025-07-10 09:52:07,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742916_2092 replica FinalizedReplica, blk_1073742916_2092, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742916 for deletion 2025-07-10 09:52:07,103 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742916_2092 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742916 2025-07-10 09:54:04,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742918_2094 src: /192.168.158.6:54702 dest: /192.168.158.4:9866 2025-07-10 09:54:04,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54702, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2039532697_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742918_2094, duration(ns): 13702721 2025-07-10 09:54:04,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742918_2094, type=LAST_IN_PIPELINE terminating 2025-07-10 09:54:07,107 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742918_2094 replica FinalizedReplica, blk_1073742918_2094, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742918 for deletion 2025-07-10 09:54:07,108 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742918_2094 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742918 2025-07-10 09:56:08,424 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742920_2096 src: /192.168.158.6:56314 dest: /192.168.158.4:9866 2025-07-10 09:56:08,447 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56314, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1050652208_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742920_2096, duration(ns): 17664195 2025-07-10 09:56:08,447 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742920_2096, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-10 09:56:13,111 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742920_2096 replica FinalizedReplica, blk_1073742920_2096, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742920 for deletion 2025-07-10 09:56:13,112 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742920_2096 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742920 2025-07-10 09:57:08,412 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742921_2097 src: /192.168.158.5:33600 dest: /192.168.158.4:9866 2025-07-10 09:57:08,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33600, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1523185370_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742921_2097, duration(ns): 15147704 2025-07-10 09:57:08,430 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742921_2097, type=LAST_IN_PIPELINE terminating 2025-07-10 09:57:13,115 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742921_2097 replica FinalizedReplica, blk_1073742921_2097, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742921 for deletion 2025-07-10 09:57:13,116 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742921_2097 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742921 2025-07-10 09:59:13,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742923_2099 src: /192.168.158.8:38482 dest: /192.168.158.4:9866 2025-07-10 
09:59:13,437 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38482, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2129330907_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742923_2099, duration(ns): 17137569 2025-07-10 09:59:13,437 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742923_2099, type=LAST_IN_PIPELINE terminating 2025-07-10 09:59:19,119 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742923_2099 replica FinalizedReplica, blk_1073742923_2099, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742923 for deletion 2025-07-10 09:59:19,121 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742923_2099 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742923 2025-07-10 10:01:18,428 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742925_2101 src: /192.168.158.8:55936 dest: /192.168.158.4:9866 2025-07-10 10:01:18,446 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55936, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-771269259_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742925_2101, duration(ns): 14796239 2025-07-10 10:01:18,446 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073742925_2101, type=LAST_IN_PIPELINE terminating 2025-07-10 10:01:25,122 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742925_2101 replica FinalizedReplica, blk_1073742925_2101, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742925 for deletion 2025-07-10 10:01:25,123 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742925_2101 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742925 2025-07-10 10:03:18,433 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742927_2103 src: /192.168.158.7:51482 dest: /192.168.158.4:9866 2025-07-10 10:03:18,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51482, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1255430111_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742927_2103, duration(ns): 16088246 2025-07-10 10:03:18,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742927_2103, type=LAST_IN_PIPELINE terminating 2025-07-10 10:03:22,129 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742927_2103 replica FinalizedReplica, blk_1073742927_2103, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742927 for deletion 2025-07-10 10:03:22,130 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742927_2103 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742927 2025-07-10 10:05:28,435 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742929_2105 src: /192.168.158.9:36296 dest: /192.168.158.4:9866 2025-07-10 10:05:28,459 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36296, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_520386270_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742929_2105, duration(ns): 18270029 2025-07-10 10:05:28,459 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742929_2105, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-10 10:05:31,136 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742929_2105 replica FinalizedReplica, blk_1073742929_2105, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742929 for deletion 2025-07-10 10:05:31,137 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742929_2105 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742929 2025-07-10 
10:06:33,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742930_2106 src: /192.168.158.8:42256 dest: /192.168.158.4:9866 2025-07-10 10:06:33,459 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42256, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1207809388_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742930_2106, duration(ns): 17568683 2025-07-10 10:06:33,459 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742930_2106, type=LAST_IN_PIPELINE terminating 2025-07-10 10:06:40,141 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742930_2106 replica FinalizedReplica, blk_1073742930_2106, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742930 for deletion 2025-07-10 10:06:40,142 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742930_2106 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742930 2025-07-10 10:08:33,440 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742932_2108 src: /192.168.158.8:56794 dest: /192.168.158.4:9866 2025-07-10 10:08:33,458 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56794, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1057117924_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073742932_2108, duration(ns): 15443061 2025-07-10 10:08:33,458 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742932_2108, type=LAST_IN_PIPELINE terminating 2025-07-10 10:08:37,145 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742932_2108 replica FinalizedReplica, blk_1073742932_2108, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742932 for deletion 2025-07-10 10:08:37,146 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742932_2108 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742932 2025-07-10 10:10:33,430 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742934_2110 src: /192.168.158.1:40932 dest: /192.168.158.4:9866 2025-07-10 10:10:33,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40932, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1705482704_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742934_2110, duration(ns): 22159706 2025-07-10 10:10:33,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742934_2110, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-10 10:10:37,152 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742934_2110 
replica FinalizedReplica, blk_1073742934_2110, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742934 for deletion
2025-07-10 10:10:37,153 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742934_2110 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742934
2025-07-10 10:11:33,487 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742935_2111 src: /192.168.158.1:60518 dest: /192.168.158.4:9866
2025-07-10 10:11:33,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60518, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1590680620_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742935_2111, duration(ns): 22473497
2025-07-10 10:11:33,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742935_2111, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-10 10:11:40,155 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742935_2111 replica FinalizedReplica, blk_1073742935_2111, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742935 for deletion
2025-07-10 10:11:40,156 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742935_2111 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742935
2025-07-10 10:13:33,446 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742937_2113 src: /192.168.158.5:36344 dest: /192.168.158.4:9866
2025-07-10 10:13:33,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36344, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1867932134_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742937_2113, duration(ns): 15536872
2025-07-10 10:13:33,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742937_2113, type=LAST_IN_PIPELINE terminating
2025-07-10 10:13:37,158 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742937_2113 replica FinalizedReplica, blk_1073742937_2113, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742937 for deletion
2025-07-10 10:13:37,159 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742937_2113 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742937
2025-07-10 10:17:38,466 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742941_2117 src: /192.168.158.1:33052 dest: /192.168.158.4:9866
2025-07-10 10:17:38,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33052, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1545126818_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742941_2117, duration(ns): 22842454
2025-07-10 10:17:38,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742941_2117, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-10 10:17:46,168 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742941_2117 replica FinalizedReplica, blk_1073742941_2117, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742941 for deletion
2025-07-10 10:17:46,170 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742941_2117 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742941
2025-07-10 10:18:38,460 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742942_2118 src: /192.168.158.1:39092 dest: /192.168.158.4:9866
2025-07-10 10:18:38,490 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39092, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2126770735_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742942_2118, duration(ns): 20835032
2025-07-10 10:18:38,490 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742942_2118, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-10 10:18:46,171 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742942_2118 replica FinalizedReplica, blk_1073742942_2118, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742942 for deletion
2025-07-10 10:18:46,173 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742942_2118 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742942
2025-07-10 10:19:38,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742943_2119 src: /192.168.158.5:53556 dest: /192.168.158.4:9866
2025-07-10 10:19:38,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53556, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1986295937_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742943_2119, duration(ns): 18076604
2025-07-10 10:19:38,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742943_2119, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 10:19:46,174 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742943_2119 replica FinalizedReplica, blk_1073742943_2119, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742943 for deletion
2025-07-10 10:19:46,176 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742943_2119 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742943
2025-07-10 10:21:43,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742945_2121 src: /192.168.158.9:48538 dest: /192.168.158.4:9866
2025-07-10 10:21:43,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48538, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1976741996_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742945_2121, duration(ns): 12706754
2025-07-10 10:21:43,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742945_2121, type=LAST_IN_PIPELINE terminating
2025-07-10 10:21:46,180 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742945_2121 replica FinalizedReplica, blk_1073742945_2121, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742945 for deletion
2025-07-10 10:21:46,181 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742945_2121 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742945
2025-07-10 10:22:43,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742946_2122 src: /192.168.158.9:53544 dest: /192.168.158.4:9866
2025-07-10 10:22:43,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53544, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_307514464_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742946_2122, duration(ns): 18438267
2025-07-10 10:22:43,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742946_2122, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 10:22:49,182 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742946_2122 replica FinalizedReplica, blk_1073742946_2122, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742946 for deletion
2025-07-10 10:22:49,183 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742946_2122 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742946
2025-07-10 10:25:53,480 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742949_2125 src: /192.168.158.1:39772 dest: /192.168.158.4:9866
2025-07-10 10:25:53,513 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39772, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-65318328_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742949_2125, duration(ns): 23440033
2025-07-10 10:25:53,515 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742949_2125, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-10 10:25:58,188 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742949_2125 replica FinalizedReplica, blk_1073742949_2125, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742949 for deletion
2025-07-10 10:25:58,189 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742949_2125 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742949
2025-07-10 10:28:58,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742952_2128 src: /192.168.158.5:37562 dest: /192.168.158.4:9866
2025-07-10 10:28:58,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37562, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1865804882_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742952_2128, duration(ns): 17229730
2025-07-10 10:28:58,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742952_2128, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 10:29:01,198 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742952_2128 replica FinalizedReplica, blk_1073742952_2128, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742952 for deletion
2025-07-10 10:29:01,200 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742952_2128 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742952
2025-07-10 10:30:58,489 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742954_2130 src: /192.168.158.9:43154 dest: /192.168.158.4:9866
2025-07-10 10:30:58,509 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43154, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1734398797_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742954_2130, duration(ns): 17595502
2025-07-10 10:30:58,509 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742954_2130, type=LAST_IN_PIPELINE terminating
2025-07-10 10:31:01,205 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742954_2130 replica FinalizedReplica, blk_1073742954_2130, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742954 for deletion
2025-07-10 10:31:01,206 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742954_2130 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742954
2025-07-10 10:33:08,480 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742956_2132 src: /192.168.158.8:60364 dest: /192.168.158.4:9866
2025-07-10 10:33:08,501 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60364, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1613008004_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742956_2132, duration(ns): 15382520
2025-07-10 10:33:08,501 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742956_2132, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 10:33:16,210 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742956_2132 replica FinalizedReplica, blk_1073742956_2132, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742956 for deletion
2025-07-10 10:33:16,211 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742956_2132 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742956
2025-07-10 10:34:13,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742957_2133 src: /192.168.158.1:35384 dest: /192.168.158.4:9866
2025-07-10 10:34:13,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35384, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1347118364_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742957_2133, duration(ns): 22554134
2025-07-10 10:34:13,515 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742957_2133, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-10 10:34:16,213 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742957_2133 replica FinalizedReplica, blk_1073742957_2133, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742957 for deletion
2025-07-10 10:34:16,214 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742957_2133 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742957
2025-07-10 10:35:13,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742958_2134 src: /192.168.158.5:51072 dest: /192.168.158.4:9866
2025-07-10 10:35:13,510 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51072, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2119373898_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742958_2134, duration(ns): 14884858
2025-07-10 10:35:13,510 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742958_2134, type=LAST_IN_PIPELINE terminating
2025-07-10 10:35:16,218 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742958_2134 replica FinalizedReplica, blk_1073742958_2134, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742958 for deletion
2025-07-10 10:35:16,219 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742958_2134 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742958
2025-07-10 10:36:18,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742959_2135 src: /192.168.158.9:57590 dest: /192.168.158.4:9866
2025-07-10 10:36:18,507 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57590, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-294620980_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742959_2135, duration(ns): 13653800
2025-07-10 10:36:18,507 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742959_2135, type=LAST_IN_PIPELINE terminating
2025-07-10 10:36:25,217 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742959_2135 replica FinalizedReplica, blk_1073742959_2135, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742959 for deletion
2025-07-10 10:36:25,218 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742959_2135 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742959
2025-07-10 10:38:18,511 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742961_2137 src: /192.168.158.8:55810 dest: /192.168.158.4:9866
2025-07-10 10:38:18,529 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55810, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_847082442_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742961_2137, duration(ns): 14809485
2025-07-10 10:38:18,529 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742961_2137, type=LAST_IN_PIPELINE terminating
2025-07-10 10:38:22,224 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742961_2137 replica FinalizedReplica, blk_1073742961_2137, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742961 for deletion
2025-07-10 10:38:22,226 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742961_2137 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742961
2025-07-10 10:40:23,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742963_2139 src: /192.168.158.7:45524 dest: /192.168.158.4:9866
2025-07-10 10:40:23,524 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45524, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_418500406_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742963_2139, duration(ns): 16640620
2025-07-10 10:40:23,524 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742963_2139, type=LAST_IN_PIPELINE terminating
2025-07-10 10:40:31,229 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742963_2139 replica FinalizedReplica, blk_1073742963_2139, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742963 for deletion
2025-07-10 10:40:31,230 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742963_2139 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742963
2025-07-10 10:43:28,511 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742966_2142 src: /192.168.158.1:54126 dest: /192.168.158.4:9866
2025-07-10 10:43:28,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54126, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-409667731_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742966_2142, duration(ns): 19956522
2025-07-10 10:43:28,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742966_2142, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-10 10:43:31,236 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742966_2142 replica FinalizedReplica, blk_1073742966_2142, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742966 for deletion
2025-07-10 10:43:31,237 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742966_2142 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742966
2025-07-10 10:44:33,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742967_2143 src: /192.168.158.9:38040 dest: /192.168.158.4:9866
2025-07-10 10:44:33,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38040, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-26051502_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742967_2143, duration(ns): 17029920
2025-07-10 10:44:33,539 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742967_2143, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 10:44:37,238 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742967_2143 replica FinalizedReplica, blk_1073742967_2143, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742967 for deletion
2025-07-10 10:44:37,239 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742967_2143 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742967
2025-07-10 10:47:33,523 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742970_2146 src: /192.168.158.1:60264 dest: /192.168.158.4:9866
2025-07-10 10:47:33,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60264, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1554037536_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742970_2146, duration(ns): 24308229
2025-07-10 10:47:33,557 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742970_2146, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-10 10:47:37,245 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742970_2146 replica FinalizedReplica, blk_1073742970_2146, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742970 for deletion
2025-07-10 10:47:37,246 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742970_2146 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742970
2025-07-10 10:52:38,539 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742975_2151 src: /192.168.158.5:52320 dest: /192.168.158.4:9866
2025-07-10 10:52:38,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52320, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2112297777_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742975_2151, duration(ns): 19612375
2025-07-10 10:52:38,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742975_2151, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 10:52:43,253 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742975_2151 replica FinalizedReplica, blk_1073742975_2151, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742975 for deletion
2025-07-10 10:52:43,255 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742975_2151 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742975
2025-07-10 10:53:38,544 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742976_2152 src: /192.168.158.9:38444 dest: /192.168.158.4:9866
2025-07-10 10:53:38,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38444, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1639839316_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742976_2152, duration(ns): 15071647
2025-07-10 10:53:38,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742976_2152, type=LAST_IN_PIPELINE terminating
2025-07-10 10:53:46,255 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742976_2152 replica FinalizedReplica, blk_1073742976_2152, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742976 for deletion
2025-07-10 10:53:46,257 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742976_2152 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742976
2025-07-10 10:56:38,528 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742979_2155 src: /192.168.158.1:50882 dest: /192.168.158.4:9866
2025-07-10 10:56:38,560 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50882, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-657562362_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742979_2155, duration(ns): 22304992
2025-07-10 10:56:38,560 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742979_2155, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-10 10:56:43,264 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742979_2155 replica FinalizedReplica, blk_1073742979_2155, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742979 for deletion
2025-07-10 10:56:43,265 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742979_2155 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742979
2025-07-10 11:00:43,539 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742983_2159 src: /192.168.158.7:52956 dest: /192.168.158.4:9866
2025-07-10 11:00:43,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52956, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-192093298_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742983_2159, duration(ns): 17867898
2025-07-10 11:00:43,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742983_2159, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 11:00:46,274 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742983_2159 replica FinalizedReplica, blk_1073742983_2159, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742983 for deletion
2025-07-10 11:00:46,275 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742983_2159 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742983
2025-07-10 11:03:48,540 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742986_2162 src: /192.168.158.1:43198 dest: /192.168.158.4:9866
2025-07-10 11:03:48,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43198, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1326012484_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742986_2162, duration(ns): 22682535
2025-07-10 11:03:48,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742986_2162, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-10 11:03:55,275 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742986_2162 replica FinalizedReplica, blk_1073742986_2162, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742986 for deletion
2025-07-10 11:03:55,276 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742986_2162 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742986
2025-07-10 11:05:48,547 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742988_2164 src: /192.168.158.9:35852 dest: /192.168.158.4:9866
2025-07-10 11:05:48,570 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35852, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-620908986_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742988_2164, duration(ns): 17457972
2025-07-10 11:05:48,570 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742988_2164, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 11:05:52,278 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742988_2164 replica FinalizedReplica, blk_1073742988_2164, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742988 for deletion
2025-07-10 11:05:52,279 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742988_2164 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742988
2025-07-10 11:10:48,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073742993_2169 src: /192.168.158.7:53490 dest: /192.168.158.4:9866
2025-07-10 11:10:48,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53490, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1597516200_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073742993_2169, duration(ns): 19960892
2025-07-10 11:10:48,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073742993_2169, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 11:10:55,289 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073742993_2169 replica FinalizedReplica, blk_1073742993_2169, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742993 for deletion
2025-07-10 11:10:55,290 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073742993_2169 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073742993
2025-07-10 11:23:08,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743005_2181 src: /192.168.158.6:51786 dest: /192.168.158.4:9866
2025-07-10 11:23:08,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51786, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_274042434_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743005_2181, duration(ns): 19433688
2025-07-10 11:23:08,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743005_2181, type=HAS_DOWNSTREAM_IN_PIPELINE,
downstreams=1:[192.168.158.9:9866] terminating 2025-07-10 11:23:16,318 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743005_2181 replica FinalizedReplica, blk_1073743005_2181, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743005 for deletion 2025-07-10 11:23:16,319 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743005_2181 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743005 2025-07-10 11:24:08,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743006_2182 src: /192.168.158.9:57678 dest: /192.168.158.4:9866 2025-07-10 11:24:08,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57678, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1798170211_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743006_2182, duration(ns): 16862262 2025-07-10 11:24:08,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743006_2182, type=LAST_IN_PIPELINE terminating 2025-07-10 11:24:13,320 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743006_2182 replica FinalizedReplica, blk_1073743006_2182, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743006 for deletion 
2025-07-10 11:24:13,321 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743006_2182 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743006 2025-07-10 11:25:08,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743007_2183 src: /192.168.158.1:44208 dest: /192.168.158.4:9866 2025-07-10 11:25:08,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44208, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1988959009_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743007_2183, duration(ns): 21411376 2025-07-10 11:25:08,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743007_2183, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-10 11:25:13,323 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743007_2183 replica FinalizedReplica, blk_1073743007_2183, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743007 for deletion 2025-07-10 11:25:13,324 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743007_2183 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743007 2025-07-10 11:27:08,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073743009_2185 src: /192.168.158.1:40078 dest: /192.168.158.4:9866 2025-07-10 11:27:08,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40078, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1153242681_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743009_2185, duration(ns): 21723644 2025-07-10 11:27:08,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743009_2185, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-10 11:27:13,328 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743009_2185 replica FinalizedReplica, blk_1073743009_2185, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743009 for deletion 2025-07-10 11:27:13,329 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743009_2185 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743009 2025-07-10 11:28:08,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743010_2186 src: /192.168.158.1:39874 dest: /192.168.158.4:9866 2025-07-10 11:28:08,640 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39874, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1986481506_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073743010_2186, duration(ns): 23155413 2025-07-10 11:28:08,640 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743010_2186, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-10 11:28:13,331 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743010_2186 replica FinalizedReplica, blk_1073743010_2186, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743010 for deletion 2025-07-10 11:28:13,332 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743010_2186 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743010 2025-07-10 11:32:18,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743014_2190 src: /192.168.158.5:48902 dest: /192.168.158.4:9866 2025-07-10 11:32:18,604 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48902, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_18539610_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743014_2190, duration(ns): 18557618 2025-07-10 11:32:18,604 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743014_2190, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-10 11:32:22,344 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743014_2190 replica FinalizedReplica, blk_1073743014_2190, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743014 for deletion 2025-07-10 11:32:22,345 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743014_2190 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743014 2025-07-10 11:34:18,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743016_2192 src: /192.168.158.1:49350 dest: /192.168.158.4:9866 2025-07-10 11:34:18,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49350, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1295808571_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743016_2192, duration(ns): 23287404 2025-07-10 11:34:18,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743016_2192, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-10 11:34:22,352 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743016_2192 replica FinalizedReplica, blk_1073743016_2192, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743016 for deletion 2025-07-10 
11:34:22,353 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743016_2192 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743016 2025-07-10 11:35:23,584 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743017_2193 src: /192.168.158.1:52420 dest: /192.168.158.4:9866 2025-07-10 11:35:23,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52420, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1165896209_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743017_2193, duration(ns): 22524919 2025-07-10 11:35:23,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743017_2193, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-10 11:35:31,354 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743017_2193 replica FinalizedReplica, blk_1073743017_2193, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743017 for deletion 2025-07-10 11:35:31,355 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743017_2193 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743017 2025-07-10 11:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool 
BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0 2025-07-10 11:37:19,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f28, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 1 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5. 2025-07-10 11:37:19,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360 2025-07-10 11:37:28,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743019_2195 src: /192.168.158.1:46346 dest: /192.168.158.4:9866 2025-07-10 11:37:28,624 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46346, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1896518304_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743019_2195, duration(ns): 22937238 2025-07-10 11:37:28,624 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743019_2195, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-10 11:37:31,361 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743019_2195 replica FinalizedReplica, blk_1073743019_2195, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743019 for deletion 2025-07-10 11:37:31,362 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743019_2195 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743019 2025-07-10 11:38:28,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743020_2196 src: /192.168.158.6:39288 dest: /192.168.158.4:9866 2025-07-10 11:38:28,619 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39288, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1975641623_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743020_2196, duration(ns): 16219194 2025-07-10 11:38:28,619 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743020_2196, type=LAST_IN_PIPELINE terminating 2025-07-10 11:38:31,364 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743020_2196 replica FinalizedReplica, blk_1073743020_2196, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743020 for deletion 2025-07-10 11:38:31,365 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743020_2196 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743020 2025-07-10 11:39:28,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743021_2197 src: /192.168.158.7:42146 dest: /192.168.158.4:9866 2025-07-10 
11:39:28,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42146, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1487216174_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743021_2197, duration(ns): 15281773 2025-07-10 11:39:28,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743021_2197, type=LAST_IN_PIPELINE terminating 2025-07-10 11:39:31,367 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743021_2197 replica FinalizedReplica, blk_1073743021_2197, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743021 for deletion 2025-07-10 11:39:31,369 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743021_2197 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743021 2025-07-10 11:44:33,609 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743026_2202 src: /192.168.158.6:43392 dest: /192.168.158.4:9866 2025-07-10 11:44:33,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43392, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1901089687_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743026_2202, duration(ns): 15621410 2025-07-10 11:44:33,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073743026_2202, type=LAST_IN_PIPELINE terminating 2025-07-10 11:44:37,379 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743026_2202 replica FinalizedReplica, blk_1073743026_2202, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743026 for deletion 2025-07-10 11:44:37,380 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743026_2202 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743026 2025-07-10 11:46:43,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743028_2204 src: /192.168.158.5:33286 dest: /192.168.158.4:9866 2025-07-10 11:46:43,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33286, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1500966415_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743028_2204, duration(ns): 18726133 2025-07-10 11:46:43,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743028_2204, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-10 11:46:49,383 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743028_2204 replica FinalizedReplica, blk_1073743028_2204, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743028 for deletion 2025-07-10 11:46:49,384 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743028_2204 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743028 2025-07-10 11:49:53,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743031_2207 src: /192.168.158.1:44122 dest: /192.168.158.4:9866 2025-07-10 11:49:53,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44122, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1767767287_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743031_2207, duration(ns): 22793280 2025-07-10 11:49:53,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743031_2207, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-10 11:50:01,396 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743031_2207 replica FinalizedReplica, blk_1073743031_2207, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743031 for deletion 2025-07-10 11:50:01,397 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743031_2207 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743031 2025-07-10 11:50:58,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743032_2208 src: /192.168.158.1:43354 dest: /192.168.158.4:9866 2025-07-10 11:50:58,657 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43354, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1981673605_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743032_2208, duration(ns): 21966066 2025-07-10 11:50:58,657 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743032_2208, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-10 11:51:01,399 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743032_2208 replica FinalizedReplica, blk_1073743032_2208, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743032 for deletion 2025-07-10 11:51:01,400 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743032_2208 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743032 2025-07-10 11:51:58,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743033_2209 src: /192.168.158.1:56772 dest: /192.168.158.4:9866 2025-07-10 11:51:58,638 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:56772, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1884405508_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743033_2209, duration(ns): 22316010 2025-07-10 11:51:58,638 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743033_2209, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-10 11:52:01,402 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743033_2209 replica FinalizedReplica, blk_1073743033_2209, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743033 for deletion 2025-07-10 11:52:01,403 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743033_2209 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743033 2025-07-10 11:56:03,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743037_2213 src: /192.168.158.1:43660 dest: /192.168.158.4:9866 2025-07-10 11:56:03,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43660, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_430943_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743037_2213, duration(ns): 21532120 2025-07-10 11:56:03,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073743037_2213, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-10 11:56:07,409 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743037_2213 replica FinalizedReplica, blk_1073743037_2213, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743037 for deletion
2025-07-10 11:56:07,410 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743037_2213 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743037
2025-07-10 11:58:08,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743039_2215 src: /192.168.158.9:40292 dest: /192.168.158.4:9866
2025-07-10 11:58:08,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40292, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1911237823_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743039_2215, duration(ns): 17445732
2025-07-10 11:58:08,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743039_2215, type=LAST_IN_PIPELINE terminating
2025-07-10 11:58:13,412 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743039_2215 replica FinalizedReplica, blk_1073743039_2215, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743039 for deletion
2025-07-10 11:58:13,413 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743039_2215 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743039
2025-07-10 12:00:13,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743041_2217 src: /192.168.158.7:42282 dest: /192.168.158.4:9866
2025-07-10 12:00:13,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42282, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1195786903_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743041_2217, duration(ns): 15107351
2025-07-10 12:00:13,632 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743041_2217, type=LAST_IN_PIPELINE terminating
2025-07-10 12:00:19,416 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743041_2217 replica FinalizedReplica, blk_1073743041_2217, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743041 for deletion
2025-07-10 12:00:19,418 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743041_2217 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743041
2025-07-10 12:02:13,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743043_2219 src: /192.168.158.1:60166 dest: /192.168.158.4:9866
2025-07-10 12:02:13,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60166, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1197144682_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743043_2219, duration(ns): 22779676
2025-07-10 12:02:13,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743043_2219, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-10 12:02:16,421 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743043_2219 replica FinalizedReplica, blk_1073743043_2219, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743043 for deletion
2025-07-10 12:02:16,422 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743043_2219 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743043
2025-07-10 12:04:18,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743045_2221 src: /192.168.158.8:54190 dest: /192.168.158.4:9866
2025-07-10 12:04:18,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54190, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-156359272_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743045_2221, duration(ns): 19103349
2025-07-10 12:04:18,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743045_2221, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 12:04:25,427 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743045_2221 replica FinalizedReplica, blk_1073743045_2221, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743045 for deletion
2025-07-10 12:04:25,429 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743045_2221 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743045
2025-07-10 12:08:23,617 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743049_2225 src: /192.168.158.8:52228 dest: /192.168.158.4:9866
2025-07-10 12:08:23,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52228, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_104051953_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743049_2225, duration(ns): 19768575
2025-07-10 12:08:23,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743049_2225, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 12:08:28,431 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743049_2225 replica FinalizedReplica, blk_1073743049_2225, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743049 for deletion
2025-07-10 12:08:28,432 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743049_2225 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743049
2025-07-10 12:10:28,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743051_2227 src: /192.168.158.5:58572 dest: /192.168.158.4:9866
2025-07-10 12:10:28,656 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58572, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_729031284_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743051_2227, duration(ns): 19591540
2025-07-10 12:10:28,657 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743051_2227, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-10 12:10:31,434 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743051_2227 replica FinalizedReplica, blk_1073743051_2227, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743051 for deletion
2025-07-10 12:10:31,436 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743051_2227 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743051
2025-07-10 12:12:28,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743053_2229 src: /192.168.158.9:55002 dest: /192.168.158.4:9866
2025-07-10 12:12:28,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55002, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_834803243_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743053_2229, duration(ns): 14097011
2025-07-10 12:12:28,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743053_2229, type=LAST_IN_PIPELINE terminating
2025-07-10 12:12:31,441 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743053_2229 replica FinalizedReplica, blk_1073743053_2229, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743053 for deletion
2025-07-10 12:12:31,442 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743053_2229 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743053
2025-07-10 12:13:28,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743054_2230 src: /192.168.158.1:38042 dest: /192.168.158.4:9866
2025-07-10 12:13:28,661 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38042, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1138166492_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743054_2230, duration(ns): 21434971
2025-07-10 12:13:28,661 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743054_2230, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-10 12:13:34,444 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743054_2230 replica FinalizedReplica, blk_1073743054_2230, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743054 for deletion
2025-07-10 12:13:34,445 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743054_2230 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743054
2025-07-10 12:14:33,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743055_2231 src: /192.168.158.5:54734 dest: /192.168.158.4:9866
2025-07-10 12:14:33,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54734, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1862609186_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743055_2231, duration(ns): 15642890
2025-07-10 12:14:33,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743055_2231, type=LAST_IN_PIPELINE terminating
2025-07-10 12:14:37,445 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743055_2231 replica FinalizedReplica, blk_1073743055_2231, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743055 for deletion
2025-07-10 12:14:37,446 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743055_2231 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743055
2025-07-10 12:16:38,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743057_2233 src: /192.168.158.6:44448 dest: /192.168.158.4:9866
2025-07-10 12:16:38,654 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44448, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-722077414_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743057_2233, duration(ns): 16477252
2025-07-10 12:16:38,655 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743057_2233, type=LAST_IN_PIPELINE terminating
2025-07-10 12:16:43,449 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743057_2233 replica FinalizedReplica, blk_1073743057_2233, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743057 for deletion
2025-07-10 12:16:43,451 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743057_2233 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743057
2025-07-10 12:20:43,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743061_2237 src: /192.168.158.1:46068 dest: /192.168.158.4:9866
2025-07-10 12:20:43,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46068, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-845230313_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743061_2237, duration(ns): 22334746
2025-07-10 12:20:43,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743061_2237, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-10 12:20:46,459 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743061_2237 replica FinalizedReplica, blk_1073743061_2237, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743061 for deletion
2025-07-10 12:20:46,460 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743061_2237 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743061
2025-07-10 12:21:43,640 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743062_2238 src: /192.168.158.1:50606 dest: /192.168.158.4:9866
2025-07-10 12:21:43,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50606, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1340292970_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743062_2238, duration(ns): 25002167
2025-07-10 12:21:43,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743062_2238, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-10 12:21:46,463 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743062_2238 replica FinalizedReplica, blk_1073743062_2238, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743062 for deletion
2025-07-10 12:21:46,464 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743062_2238 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743062
2025-07-10 12:25:48,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743066_2242 src: /192.168.158.1:59978 dest: /192.168.158.4:9866
2025-07-10 12:25:48,665 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59978, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_353359838_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743066_2242, duration(ns): 22271727
2025-07-10 12:25:48,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743066_2242, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-10 12:25:55,479 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743066_2242 replica FinalizedReplica, blk_1073743066_2242, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743066 for deletion
2025-07-10 12:25:55,480 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743066_2242 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743066
2025-07-10 12:27:48,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743068_2244 src: /192.168.158.1:40198 dest: /192.168.158.4:9866
2025-07-10 12:27:48,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40198, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-934422806_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743068_2244, duration(ns): 22359334
2025-07-10 12:27:48,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743068_2244, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-10 12:27:52,484 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743068_2244 replica FinalizedReplica, blk_1073743068_2244, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743068 for deletion
2025-07-10 12:27:52,485 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743068_2244 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743068
2025-07-10 12:28:48,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743069_2245 src: /192.168.158.5:37404 dest: /192.168.158.4:9866
2025-07-10 12:28:48,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37404, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-866212203_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743069_2245, duration(ns): 14423583
2025-07-10 12:28:48,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743069_2245, type=LAST_IN_PIPELINE terminating
2025-07-10 12:28:55,486 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743069_2245 replica FinalizedReplica, blk_1073743069_2245, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743069 for deletion
2025-07-10 12:28:55,487 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743069_2245 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743069
2025-07-10 12:29:48,658 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743070_2246 src: /192.168.158.8:40238 dest: /192.168.158.4:9866
2025-07-10 12:29:48,676 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40238, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-325730820_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743070_2246, duration(ns): 15864162
2025-07-10 12:29:48,676 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743070_2246, type=LAST_IN_PIPELINE terminating
2025-07-10 12:29:55,488 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743070_2246 replica FinalizedReplica, blk_1073743070_2246, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743070 for deletion
2025-07-10 12:29:55,490 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743070_2246 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743070
2025-07-10 12:30:48,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743071_2247 src: /192.168.158.5:40364 dest: /192.168.158.4:9866
2025-07-10 12:30:48,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40364, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1624038962_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743071_2247, duration(ns): 18577176
2025-07-10 12:30:48,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743071_2247, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-10 12:30:52,491 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743071_2247 replica FinalizedReplica, blk_1073743071_2247, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743071 for deletion
2025-07-10 12:30:52,492 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743071_2247 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743071
2025-07-10 12:32:53,658 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743073_2249 src: /192.168.158.5:43482 dest: /192.168.158.4:9866
2025-07-10 12:32:53,676 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:43482, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_86563406_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743073_2249, duration(ns): 15496458
2025-07-10 12:32:53,676 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743073_2249, type=LAST_IN_PIPELINE terminating
2025-07-10 12:33:01,497 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743073_2249 replica FinalizedReplica, blk_1073743073_2249, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743073 for deletion
2025-07-10 12:33:01,498 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743073_2249 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743073
2025-07-10 12:33:53,657 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743074_2250 src: /192.168.158.9:33868 dest: /192.168.158.4:9866
2025-07-10 12:33:53,675 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33868, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_967846534_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743074_2250, duration(ns): 15746184
2025-07-10 12:33:53,675 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743074_2250, type=LAST_IN_PIPELINE terminating
2025-07-10 12:33:58,497 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743074_2250 replica FinalizedReplica, blk_1073743074_2250, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743074 for deletion
2025-07-10 12:33:58,498 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743074_2250 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743074
2025-07-10 12:35:53,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743076_2252 src: /192.168.158.7:41434 dest: /192.168.158.4:9866
2025-07-10 12:35:53,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41434, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-561778793_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743076_2252, duration(ns): 19912274
2025-07-10 12:35:53,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743076_2252, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-10 12:36:01,504 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743076_2252 replica FinalizedReplica, blk_1073743076_2252, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743076 for deletion
2025-07-10 12:36:01,505 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743076_2252 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743076
2025-07-10 12:36:53,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743077_2253 src: /192.168.158.5:54890 dest: /192.168.158.4:9866
2025-07-10 12:36:53,681 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54890, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_582409682_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743077_2253, duration(ns): 14973893
2025-07-10 12:36:53,681 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743077_2253, type=LAST_IN_PIPELINE terminating
2025-07-10 12:37:01,505 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743077_2253 replica FinalizedReplica, blk_1073743077_2253, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743077 for deletion
2025-07-10 12:37:01,506 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743077_2253 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743077
2025-07-10 12:37:58,661 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743078_2254 src: /192.168.158.8:45492 dest: /192.168.158.4:9866
2025-07-10 12:37:58,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45492, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-882718010_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743078_2254, duration(ns): 19294168
2025-07-10 12:37:58,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743078_2254, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 12:38:01,510 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743078_2254 replica FinalizedReplica, blk_1073743078_2254, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743078 for deletion
2025-07-10 12:38:01,511 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743078_2254 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743078
2025-07-10 12:39:03,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743079_2255 src: /192.168.158.5:44462 dest: /192.168.158.4:9866
2025-07-10 12:39:03,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44462, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1735434255_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743079_2255, duration(ns): 13959065
2025-07-10 12:39:03,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743079_2255, type=LAST_IN_PIPELINE terminating
2025-07-10 12:39:07,513 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743079_2255 replica FinalizedReplica, blk_1073743079_2255, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743079 for deletion
2025-07-10 12:39:07,514 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743079_2255 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743079
2025-07-10 12:40:03,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743080_2256 src: /192.168.158.8:37098 dest: /192.168.158.4:9866
2025-07-10 12:40:03,686 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37098, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-223341716_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743080_2256, duration(ns): 17606780
2025-07-10 12:40:03,686 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743080_2256, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 12:40:07,516 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743080_2256 replica FinalizedReplica, blk_1073743080_2256, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743080 for deletion
2025-07-10 12:40:07,517 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743080_2256 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743080
2025-07-10 12:42:08,660 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743082_2258 src: /192.168.158.1:46022 dest: /192.168.158.4:9866
2025-07-10 12:42:08,691 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46022, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-750212773_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743082_2258, duration(ns): 21909425
2025-07-10 12:42:08,691 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743082_2258, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-10 12:42:16,521 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743082_2258 replica FinalizedReplica, blk_1073743082_2258, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743082 for deletion
2025-07-10 12:42:16,522 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743082_2258 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743082
2025-07-10 12:44:08,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743084_2260 src: /192.168.158.1:50444 dest: /192.168.158.4:9866
2025-07-10 12:44:08,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50444, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-835711644_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743084_2260, duration(ns): 22927257
2025-07-10 12:44:08,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743084_2260, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-10 12:44:13,529 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743084_2260 replica FinalizedReplica, blk_1073743084_2260, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743084 for deletion
2025-07-10 12:44:13,530 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743084_2260 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743084
2025-07-10 12:45:08,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743085_2261 src: /192.168.158.1:39412 dest: /192.168.158.4:9866
2025-07-10 12:45:08,712 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39412, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1468299256_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743085_2261, duration(ns): 20149117
2025-07-10 12:45:08,712 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743085_2261, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-10 12:45:16,529 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743085_2261 replica FinalizedReplica, blk_1073743085_2261, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743085 for deletion 2025-07-10 12:45:16,530 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743085_2261 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743085 2025-07-10 12:46:13,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743086_2262 src: /192.168.158.9:56332 dest: /192.168.158.4:9866 2025-07-10 12:46:13,697 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56332, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_277740492_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743086_2262, duration(ns): 17099717 2025-07-10 12:46:13,698 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743086_2262, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-10 12:46:16,532 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743086_2262 replica FinalizedReplica, blk_1073743086_2262, 
FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743086 for deletion 2025-07-10 12:46:16,533 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743086_2262 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743086 2025-07-10 12:49:18,691 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743089_2265 src: /192.168.158.5:35668 dest: /192.168.158.4:9866 2025-07-10 12:49:18,716 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35668, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1741833915_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743089_2265, duration(ns): 18864482 2025-07-10 12:49:18,716 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743089_2265, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-10 12:49:25,539 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743089_2265 replica FinalizedReplica, blk_1073743089_2265, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743089 for deletion 2025-07-10 12:49:25,540 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743089_2265 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743089 2025-07-10 12:51:18,713 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743091_2267 src: /192.168.158.9:35322 dest: /192.168.158.4:9866 2025-07-10 12:51:18,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35322, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1083612968_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743091_2267, duration(ns): 19969830 2025-07-10 12:51:18,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743091_2267, type=LAST_IN_PIPELINE terminating 2025-07-10 12:51:22,546 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743091_2267 replica FinalizedReplica, blk_1073743091_2267, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743091 for deletion 2025-07-10 12:51:22,547 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743091_2267 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743091 2025-07-10 12:52:23,711 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743092_2268 src: /192.168.158.8:59020 dest: /192.168.158.4:9866 2025-07-10 12:52:23,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59020, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2145635008_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743092_2268, duration(ns): 15417722 2025-07-10 12:52:23,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743092_2268, type=LAST_IN_PIPELINE terminating 2025-07-10 12:52:28,550 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743092_2268 replica FinalizedReplica, blk_1073743092_2268, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743092 for deletion 2025-07-10 12:52:28,551 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743092_2268 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743092 2025-07-10 12:55:28,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743095_2271 src: /192.168.158.1:49502 dest: /192.168.158.4:9866 2025-07-10 12:55:28,720 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49502, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1911962119_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743095_2271, duration(ns): 26076365 2025-07-10 12:55:28,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743095_2271, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-10 12:55:34,558 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743095_2271 replica FinalizedReplica, blk_1073743095_2271, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743095 for deletion 2025-07-10 12:55:34,560 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743095_2271 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743095 2025-07-10 12:56:33,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743096_2272 src: /192.168.158.1:35342 dest: /192.168.158.4:9866 2025-07-10 12:56:33,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35342, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2103722121_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743096_2272, duration(ns): 21902404 2025-07-10 12:56:33,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743096_2272, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-10 12:56:37,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743096_2272 replica FinalizedReplica, blk_1073743096_2272, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743096 for deletion 
2025-07-10 12:56:37,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743096_2272 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743096 2025-07-10 12:58:33,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743098_2274 src: /192.168.158.9:47548 dest: /192.168.158.4:9866 2025-07-10 12:58:33,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47548, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1050067203_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743098_2274, duration(ns): 18010976 2025-07-10 12:58:33,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743098_2274, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-10 12:58:37,569 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743098_2274 replica FinalizedReplica, blk_1073743098_2274, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743098 for deletion 2025-07-10 12:58:37,570 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743098_2274 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743098 2025-07-10 13:00:33,708 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073743100_2276 src: /192.168.158.6:36760 dest: /192.168.158.4:9866 2025-07-10 13:00:33,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36760, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1272536147_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743100_2276, duration(ns): 18661995 2025-07-10 13:00:33,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743100_2276, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-10 13:00:37,577 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743100_2276 replica FinalizedReplica, blk_1073743100_2276, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743100 for deletion 2025-07-10 13:00:37,578 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743100_2276 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743100 2025-07-10 13:02:33,736 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743102_2278 src: /192.168.158.9:41538 dest: /192.168.158.4:9866 2025-07-10 13:02:33,759 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41538, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_656602245_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073743102_2278, duration(ns): 17949278 2025-07-10 13:02:33,759 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743102_2278, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-10 13:02:37,584 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743102_2278 replica FinalizedReplica, blk_1073743102_2278, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743102 for deletion 2025-07-10 13:02:37,585 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743102_2278 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073743102 2025-07-10 13:06:38,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743106_2282 src: /192.168.158.1:56418 dest: /192.168.158.4:9866 2025-07-10 13:06:38,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56418, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-573800449_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743106_2282, duration(ns): 22384070 2025-07-10 13:06:38,793 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743106_2282, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-10 13:06:43,595 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743106_2282 replica FinalizedReplica, blk_1073743106_2282, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743106 for deletion 2025-07-10 13:06:43,596 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743106_2282 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743106 2025-07-10 13:08:38,746 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743108_2284 src: /192.168.158.6:32912 dest: /192.168.158.4:9866 2025-07-10 13:08:38,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:32912, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_710147737_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743108_2284, duration(ns): 15167115 2025-07-10 13:08:38,764 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743108_2284, type=LAST_IN_PIPELINE terminating 2025-07-10 13:08:43,599 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743108_2284 replica FinalizedReplica, blk_1073743108_2284, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743108 for deletion 2025-07-10 13:08:43,600 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743108_2284 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743108 2025-07-10 13:09:43,736 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743109_2285 src: /192.168.158.1:59902 dest: /192.168.158.4:9866 2025-07-10 13:09:43,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59902, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1484616710_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743109_2285, duration(ns): 20440042 2025-07-10 13:09:43,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743109_2285, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-10 13:09:49,601 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743109_2285 replica FinalizedReplica, blk_1073743109_2285, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743109 for deletion 2025-07-10 13:09:49,602 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743109_2285 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743109 2025-07-10 13:10:43,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743110_2286 
src: /192.168.158.9:35504 dest: /192.168.158.4:9866 2025-07-10 13:10:43,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35504, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_371155207_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743110_2286, duration(ns): 16638235 2025-07-10 13:10:43,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743110_2286, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-10 13:10:46,600 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743110_2286 replica FinalizedReplica, blk_1073743110_2286, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743110 for deletion 2025-07-10 13:10:46,601 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743110_2286 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743110 2025-07-10 13:17:53,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743117_2293 src: /192.168.158.8:48676 dest: /192.168.158.4:9866 2025-07-10 13:17:53,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48676, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-780376986_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743117_2293, duration(ns): 13790022 2025-07-10 13:17:53,770 
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743117_2293, type=LAST_IN_PIPELINE terminating 2025-07-10 13:17:58,621 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743117_2293 replica FinalizedReplica, blk_1073743117_2293, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743117 for deletion 2025-07-10 13:17:58,622 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743117_2293 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743117 2025-07-10 13:18:53,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743118_2294 src: /192.168.158.1:60184 dest: /192.168.158.4:9866 2025-07-10 13:18:53,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60184, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1409758203_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743118_2294, duration(ns): 21786608 2025-07-10 13:18:53,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743118_2294, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-10 13:19:01,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743118_2294 replica FinalizedReplica, blk_1073743118_2294, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 
56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743118 for deletion 2025-07-10 13:19:01,626 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743118_2294 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743118 2025-07-10 13:21:58,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743121_2297 src: /192.168.158.5:50474 dest: /192.168.158.4:9866 2025-07-10 13:21:58,774 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50474, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_360156062_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743121_2297, duration(ns): 19779789 2025-07-10 13:21:58,775 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743121_2297, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-10 13:22:01,630 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743121_2297 replica FinalizedReplica, blk_1073743121_2297, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743121 for deletion 2025-07-10 13:22:01,631 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743121_2297 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743121 2025-07-10 13:22:58,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743122_2298 src: /192.168.158.8:46842 dest: /192.168.158.4:9866 2025-07-10 13:22:58,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46842, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-37301249_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743122_2298, duration(ns): 17146344 2025-07-10 13:22:58,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743122_2298, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-10 13:23:01,631 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743122_2298 replica FinalizedReplica, blk_1073743122_2298, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743122 for deletion 2025-07-10 13:23:01,633 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743122_2298 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743122 2025-07-10 13:23:58,739 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743123_2299 src: /192.168.158.1:34054 dest: /192.168.158.4:9866 2025-07-10 13:23:58,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34054, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-947553421_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743123_2299, duration(ns): 20625501 2025-07-10 13:23:58,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743123_2299, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-10 13:24:01,635 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743123_2299 replica FinalizedReplica, blk_1073743123_2299, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743123 for deletion 2025-07-10 13:24:01,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743123_2299 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743123 2025-07-10 13:29:13,756 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743128_2304 src: /192.168.158.1:32998 dest: /192.168.158.4:9866 2025-07-10 13:29:13,786 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:32998, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-184197871_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743128_2304, duration(ns): 21770175 2025-07-10 13:29:13,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743128_2304, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-10 13:29:19,647 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743128_2304 replica FinalizedReplica, blk_1073743128_2304, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743128 for deletion 2025-07-10 13:29:19,649 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743128_2304 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743128 2025-07-10 13:31:18,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743130_2306 src: /192.168.158.8:53276 dest: /192.168.158.4:9866 2025-07-10 13:31:18,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53276, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-85300020_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743130_2306, duration(ns): 12746463 2025-07-10 13:31:18,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743130_2306, type=LAST_IN_PIPELINE terminating 2025-07-10 13:31:22,653 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743130_2306 replica FinalizedReplica, blk_1073743130_2306, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743130 for deletion 2025-07-10 13:31:22,654 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743130_2306 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743130 2025-07-10 13:32:23,768 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743131_2307 src: /192.168.158.6:34558 dest: /192.168.158.4:9866 2025-07-10 13:32:23,786 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34558, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1376657294_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743131_2307, duration(ns): 15826683 2025-07-10 13:32:23,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743131_2307, type=LAST_IN_PIPELINE terminating 2025-07-10 13:32:31,655 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743131_2307 replica FinalizedReplica, blk_1073743131_2307, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743131 for deletion 2025-07-10 13:32:31,656 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743131_2307 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743131 2025-07-10 13:35:28,768 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743134_2310 src: /192.168.158.9:51568 dest: /192.168.158.4:9866 2025-07-10 13:35:28,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51568, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1648081513_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743134_2310, duration(ns): 15644507 2025-07-10 13:35:28,786 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743134_2310, type=LAST_IN_PIPELINE terminating 2025-07-10 13:35:31,663 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743134_2310 replica FinalizedReplica, blk_1073743134_2310, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743134 for deletion 2025-07-10 13:35:31,664 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743134_2310 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743134 2025-07-10 13:40:33,774 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743139_2315 src: /192.168.158.1:58460 dest: /192.168.158.4:9866 2025-07-10 13:40:33,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58460, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1419387093_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073743139_2315, duration(ns): 20078343 2025-07-10 13:40:33,806 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743139_2315, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-10 13:40:37,676 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743139_2315 replica FinalizedReplica, blk_1073743139_2315, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743139 for deletion 2025-07-10 13:40:37,677 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743139_2315 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743139 2025-07-10 13:44:38,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743143_2319 src: /192.168.158.7:52920 dest: /192.168.158.4:9866 2025-07-10 13:44:38,805 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52920, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1103511395_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743143_2319, duration(ns): 16348153 2025-07-10 13:44:38,805 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743143_2319, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-10 13:44:43,687 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743143_2319 replica FinalizedReplica, blk_1073743143_2319, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743143 for deletion 2025-07-10 13:44:43,688 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743143_2319 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743143 2025-07-10 13:46:38,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743145_2321 src: /192.168.158.9:46922 dest: /192.168.158.4:9866 2025-07-10 13:46:38,806 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46922, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1586929873_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743145_2321, duration(ns): 14394418 2025-07-10 13:46:38,806 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743145_2321, type=LAST_IN_PIPELINE terminating 2025-07-10 13:46:43,694 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743145_2321 replica FinalizedReplica, blk_1073743145_2321, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743145 for deletion 2025-07-10 13:46:43,695 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743145_2321 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743145 2025-07-10 13:50:43,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743149_2325 src: /192.168.158.1:46528 dest: /192.168.158.4:9866 2025-07-10 13:50:43,821 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46528, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1714261644_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743149_2325, duration(ns): 20819507 2025-07-10 13:50:43,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743149_2325, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-10 13:50:49,702 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743149_2325 replica FinalizedReplica, blk_1073743149_2325, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743149 for deletion 2025-07-10 13:50:49,703 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743149_2325 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743149 2025-07-10 13:51:48,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743150_2326 
src: /192.168.158.9:45998 dest: /192.168.158.4:9866 2025-07-10 13:51:48,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45998, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2013400529_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743150_2326, duration(ns): 18467196 2025-07-10 13:51:48,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743150_2326, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-10 13:51:52,704 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743150_2326 replica FinalizedReplica, blk_1073743150_2326, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743150 for deletion 2025-07-10 13:51:52,706 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743150_2326 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743150 2025-07-10 13:56:58,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743155_2331 src: /192.168.158.7:48656 dest: /192.168.158.4:9866 2025-07-10 13:56:58,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48656, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1687461377_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743155_2331, duration(ns): 18259403 2025-07-10 13:56:58,839 
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743155_2331, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-10 13:57:01,716 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743155_2331 replica FinalizedReplica, blk_1073743155_2331, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743155 for deletion 2025-07-10 13:57:01,717 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743155_2331 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743155 2025-07-10 13:57:58,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743156_2332 src: /192.168.158.9:60406 dest: /192.168.158.4:9866 2025-07-10 13:57:58,840 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60406, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-738730092_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743156_2332, duration(ns): 16601812 2025-07-10 13:57:58,840 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743156_2332, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-10 13:58:01,720 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743156_2332 replica FinalizedReplica, blk_1073743156_2332, FINALIZED getNumBytes() = 56 
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743156 for deletion 2025-07-10 13:58:01,722 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743156_2332 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743156 2025-07-10 14:00:03,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743158_2334 src: /192.168.158.6:42160 dest: /192.168.158.4:9866 2025-07-10 14:00:03,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42160, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1060466398_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743158_2334, duration(ns): 18217725 2025-07-10 14:00:03,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743158_2334, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-10 14:00:07,729 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743158_2334 replica FinalizedReplica, blk_1073743158_2334, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743158 for deletion 2025-07-10 14:00:07,730 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743158_2334 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743158 2025-07-10 14:01:03,820 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743159_2335 src: /192.168.158.9:56748 dest: /192.168.158.4:9866 2025-07-10 14:01:03,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56748, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1611969823_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743159_2335, duration(ns): 16826037 2025-07-10 14:01:03,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743159_2335, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-10 14:01:07,733 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743159_2335 replica FinalizedReplica, blk_1073743159_2335, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743159 for deletion 2025-07-10 14:01:07,734 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743159_2335 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743159 2025-07-10 14:04:13,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743162_2338 src: /192.168.158.1:58114 dest: /192.168.158.4:9866 2025-07-10 14:04:13,847 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58114, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1973381424_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743162_2338, duration(ns): 21766628 2025-07-10 14:04:13,848 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743162_2338, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-10 14:04:16,737 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743162_2338 replica FinalizedReplica, blk_1073743162_2338, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743162 for deletion 2025-07-10 14:04:16,738 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743162_2338 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743162 2025-07-10 14:06:18,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743164_2340 src: /192.168.158.7:34766 dest: /192.168.158.4:9866 2025-07-10 14:06:18,840 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34766, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-859946131_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743164_2340, duration(ns): 15279410 2025-07-10 14:06:18,840 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743164_2340, type=LAST_IN_PIPELINE 
terminating 2025-07-10 14:06:22,744 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743164_2340 replica FinalizedReplica, blk_1073743164_2340, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743164 for deletion 2025-07-10 14:06:22,745 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743164_2340 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743164 2025-07-10 14:08:28,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743166_2342 src: /192.168.158.1:57166 dest: /192.168.158.4:9866 2025-07-10 14:08:28,848 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57166, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1619300588_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743166_2342, duration(ns): 23389552 2025-07-10 14:08:28,848 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743166_2342, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-10 14:08:31,748 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743166_2342 replica FinalizedReplica, blk_1073743166_2342, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743166 for deletion 2025-07-10 14:08:31,750 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743166_2342 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743166 2025-07-10 14:09:28,862 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743167_2343 src: /192.168.158.9:54132 dest: /192.168.158.4:9866 2025-07-10 14:09:28,879 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54132, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_37718869_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743167_2343, duration(ns): 14699853 2025-07-10 14:09:28,879 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743167_2343, type=LAST_IN_PIPELINE terminating 2025-07-10 14:09:31,751 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743167_2343 replica FinalizedReplica, blk_1073743167_2343, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743167 for deletion 2025-07-10 14:09:31,752 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743167_2343 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743167 2025-07-10 14:10:28,853 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743168_2344 src: /192.168.158.6:32900 dest: /192.168.158.4:9866 2025-07-10 14:10:28,878 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:32900, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-296660585_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743168_2344, duration(ns): 19097605 2025-07-10 14:10:28,878 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743168_2344, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-10 14:10:31,752 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743168_2344 replica FinalizedReplica, blk_1073743168_2344, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743168 for deletion 2025-07-10 14:10:31,754 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743168_2344 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743168 2025-07-10 14:15:38,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743173_2349 src: /192.168.158.6:52256 dest: /192.168.158.4:9866 2025-07-10 14:15:38,859 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52256, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-907770224_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, 
blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743173_2349, duration(ns): 15331912 2025-07-10 14:15:38,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743173_2349, type=LAST_IN_PIPELINE terminating 2025-07-10 14:15:43,762 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743173_2349 replica FinalizedReplica, blk_1073743173_2349, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743173 for deletion 2025-07-10 14:15:43,763 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743173_2349 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743173 2025-07-10 14:19:43,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743177_2353 src: /192.168.158.7:47658 dest: /192.168.158.4:9866 2025-07-10 14:19:43,872 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47658, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1422640877_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743177_2353, duration(ns): 15088517 2025-07-10 14:19:43,872 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743177_2353, type=LAST_IN_PIPELINE terminating 2025-07-10 14:19:46,770 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743177_2353 replica FinalizedReplica, blk_1073743177_2353, FINALIZED 
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743177 for deletion
2025-07-10 14:19:46,771 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743177_2353 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743177
2025-07-10 14:20:43,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743178_2354 src: /192.168.158.9:45480 dest: /192.168.158.4:9866
2025-07-10 14:20:43,870 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45480, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1982158697_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743178_2354, duration(ns): 18848419
2025-07-10 14:20:43,870 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743178_2354, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 14:20:49,771 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743178_2354 replica FinalizedReplica, blk_1073743178_2354, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743178 for deletion
2025-07-10 14:20:49,772 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743178_2354 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743178
2025-07-10 14:23:48,880 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743181_2357 src: /192.168.158.1:46226 dest: /192.168.158.4:9866
2025-07-10 14:23:48,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46226, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-589776468_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743181_2357, duration(ns): 24720168
2025-07-10 14:23:48,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743181_2357, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-10 14:23:55,780 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743181_2357 replica FinalizedReplica, blk_1073743181_2357, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743181 for deletion
2025-07-10 14:23:55,781 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743181_2357 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743181
2025-07-10 14:25:48,857 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743183_2359 src: /192.168.158.1:48668 dest: /192.168.158.4:9866
2025-07-10 14:25:48,893 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48668, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-982822648_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743183_2359, duration(ns): 17004638
2025-07-10 14:25:48,893 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743183_2359, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-10 14:25:52,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743183_2359 replica FinalizedReplica, blk_1073743183_2359, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743183 for deletion
2025-07-10 14:25:52,785 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743183_2359 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743183
2025-07-10 14:29:58,862 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743187_2363 src: /192.168.158.8:58712 dest: /192.168.158.4:9866
2025-07-10 14:29:58,888 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1753490916_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743187_2363, duration(ns): 20229025
2025-07-10 14:29:58,888 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743187_2363, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-10 14:30:04,792 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743187_2363 replica FinalizedReplica, blk_1073743187_2363, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743187 for deletion
2025-07-10 14:30:04,793 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743187_2363 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743187
2025-07-10 14:32:03,872 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743189_2365 src: /192.168.158.8:60846 dest: /192.168.158.4:9866
2025-07-10 14:32:03,896 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60846, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_798356265_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743189_2365, duration(ns): 17960335
2025-07-10 14:32:03,896 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743189_2365, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 14:32:07,798 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743189_2365 replica FinalizedReplica, blk_1073743189_2365, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743189 for deletion
2025-07-10 14:32:07,799 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743189_2365 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743189
2025-07-10 14:37:13,881 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743194_2370 src: /192.168.158.9:57844 dest: /192.168.158.4:9866
2025-07-10 14:37:13,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57844, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-470441578_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743194_2370, duration(ns): 16511232
2025-07-10 14:37:13,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743194_2370, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 14:37:19,813 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743194_2370 replica FinalizedReplica, blk_1073743194_2370, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743194 for deletion
2025-07-10 14:37:19,814 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743194_2370 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743194
2025-07-10 14:40:18,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743197_2373 src: /192.168.158.1:38584 dest: /192.168.158.4:9866
2025-07-10 14:40:18,931 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38584, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1083189215_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743197_2373, duration(ns): 47037781
2025-07-10 14:40:18,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743197_2373, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-10 14:40:22,820 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743197_2373 replica FinalizedReplica, blk_1073743197_2373, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743197 for deletion
2025-07-10 14:40:22,821 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743197_2373 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743197
2025-07-10 14:46:23,906 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743203_2379 src: /192.168.158.6:52488 dest: /192.168.158.4:9866
2025-07-10 14:46:23,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52488, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1637277775_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743203_2379, duration(ns): 20696882
2025-07-10 14:46:23,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743203_2379, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 14:46:28,835 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743203_2379 replica FinalizedReplica, blk_1073743203_2379, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743203 for deletion
2025-07-10 14:46:28,837 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743203_2379 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743203
2025-07-10 14:47:23,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743204_2380 src: /192.168.158.8:47336 dest: /192.168.158.4:9866
2025-07-10 14:47:23,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1758764410_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743204_2380, duration(ns): 15509446
2025-07-10 14:47:23,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743204_2380, type=LAST_IN_PIPELINE terminating
2025-07-10 14:47:28,838 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743204_2380 replica FinalizedReplica, blk_1073743204_2380, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743204 for deletion
2025-07-10 14:47:28,839 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743204_2380 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743204
2025-07-10 14:49:28,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743206_2382 src: /192.168.158.6:49788 dest: /192.168.158.4:9866
2025-07-10 14:49:28,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49788, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2100704212_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743206_2382, duration(ns): 16686967
2025-07-10 14:49:28,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743206_2382, type=LAST_IN_PIPELINE terminating
2025-07-10 14:49:34,841 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743206_2382 replica FinalizedReplica, blk_1073743206_2382, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743206 for deletion
2025-07-10 14:49:34,842 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743206_2382 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743206
2025-07-10 14:51:38,897 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743208_2384 src: /192.168.158.1:56138 dest: /192.168.158.4:9866
2025-07-10 14:51:38,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56138, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_390338311_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743208_2384, duration(ns): 20866291
2025-07-10 14:51:38,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743208_2384, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-10 14:51:43,847 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743208_2384 replica FinalizedReplica, blk_1073743208_2384, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743208 for deletion
2025-07-10 14:51:43,848 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743208_2384 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743208
2025-07-10 14:52:38,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743209_2385 src: /192.168.158.1:36886 dest: /192.168.158.4:9866
2025-07-10 14:52:38,923 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36886, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-790869231_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743209_2385, duration(ns): 21592693
2025-07-10 14:52:38,923 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743209_2385, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-10 14:52:43,848 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743209_2385 replica FinalizedReplica, blk_1073743209_2385, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743209 for deletion
2025-07-10 14:52:43,850 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743209_2385 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743209
2025-07-10 14:55:43,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743212_2388 src: /192.168.158.8:58830 dest: /192.168.158.4:9866
2025-07-10 14:55:43,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58830, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1561978724_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743212_2388, duration(ns): 19352528
2025-07-10 14:55:43,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743212_2388, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 14:55:46,855 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743212_2388 replica FinalizedReplica, blk_1073743212_2388, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743212 for deletion
2025-07-10 14:55:46,856 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743212_2388 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743212
2025-07-10 14:56:43,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743213_2389 src: /192.168.158.8:50334 dest: /192.168.158.4:9866
2025-07-10 14:56:43,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50334, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-79460732_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743213_2389, duration(ns): 19737913
2025-07-10 14:56:43,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743213_2389, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-10 14:56:46,858 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743213_2389 replica FinalizedReplica, blk_1073743213_2389, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743213 for deletion
2025-07-10 14:56:46,859 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743213_2389 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743213
2025-07-10 14:58:43,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743215_2391 src: /192.168.158.5:56830 dest: /192.168.158.4:9866
2025-07-10 14:58:43,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56830, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1221976499_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743215_2391, duration(ns): 15694546
2025-07-10 14:58:43,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743215_2391, type=LAST_IN_PIPELINE terminating
2025-07-10 14:58:46,861 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743215_2391 replica FinalizedReplica, blk_1073743215_2391, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743215 for deletion
2025-07-10 14:58:46,862 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743215_2391 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743215
2025-07-10 14:59:43,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743216_2392 src: /192.168.158.6:44712 dest: /192.168.158.4:9866
2025-07-10 14:59:43,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2127281557_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743216_2392, duration(ns): 20094630
2025-07-10 14:59:43,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743216_2392, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 14:59:46,866 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743216_2392 replica FinalizedReplica, blk_1073743216_2392, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743216 for deletion
2025-07-10 14:59:46,867 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743216_2392 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743216
2025-07-10 15:00:43,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743217_2393 src: /192.168.158.5:59476 dest: /192.168.158.4:9866
2025-07-10 15:00:43,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59476, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1821162526_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743217_2393, duration(ns): 18020064
2025-07-10 15:00:43,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743217_2393, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 15:00:49,866 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743217_2393 replica FinalizedReplica, blk_1073743217_2393, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743217 for deletion
2025-07-10 15:00:49,867 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743217_2393 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743217
2025-07-10 15:01:43,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743218_2394 src: /192.168.158.8:60132 dest: /192.168.158.4:9866
2025-07-10 15:01:43,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60132, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-320527490_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743218_2394, duration(ns): 19613074
2025-07-10 15:01:43,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743218_2394, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 15:01:46,870 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743218_2394 replica FinalizedReplica, blk_1073743218_2394, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743218 for deletion
2025-07-10 15:01:46,871 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743218_2394 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743218
2025-07-10 15:03:43,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743220_2396 src: /192.168.158.7:50160 dest: /192.168.158.4:9866
2025-07-10 15:03:43,935 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50160, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1060556621_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743220_2396, duration(ns): 14844138
2025-07-10 15:03:43,935 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743220_2396, type=LAST_IN_PIPELINE terminating
2025-07-10 15:03:46,879 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743220_2396 replica FinalizedReplica, blk_1073743220_2396, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743220 for deletion
2025-07-10 15:03:46,880 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743220_2396 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743220
2025-07-10 15:05:43,939 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743222_2398 src: /192.168.158.1:34850 dest: /192.168.158.4:9866
2025-07-10 15:05:43,970 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34850, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1222409408_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743222_2398, duration(ns): 21460822
2025-07-10 15:05:43,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743222_2398, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-10 15:05:49,884 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743222_2398 replica FinalizedReplica, blk_1073743222_2398, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743222 for deletion
2025-07-10 15:05:49,886 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743222_2398 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743222
2025-07-10 15:07:53,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743224_2400 src: /192.168.158.6:35666 dest: /192.168.158.4:9866
2025-07-10 15:07:53,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35666, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_498553580_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743224_2400, duration(ns): 16143453
2025-07-10 15:07:53,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743224_2400, type=LAST_IN_PIPELINE terminating
2025-07-10 15:07:58,888 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743224_2400 replica FinalizedReplica, blk_1073743224_2400, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743224 for deletion
2025-07-10 15:07:58,889 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743224_2400 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743224
2025-07-10 15:11:53,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743228_2404 src: /192.168.158.5:33994 dest: /192.168.158.4:9866
2025-07-10 15:11:53,950 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33994, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_199126928_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743228_2404, duration(ns): 18421069
2025-07-10 15:11:53,950 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743228_2404, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 15:11:58,895 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743228_2404 replica FinalizedReplica, blk_1073743228_2404, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743228 for deletion
2025-07-10 15:11:58,896 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743228_2404 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743228
2025-07-10 15:12:53,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743229_2405 src: /192.168.158.6:39212 dest: /192.168.158.4:9866
2025-07-10 15:12:53,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39212, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1520321443_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743229_2405, duration(ns): 16135449
2025-07-10 15:12:53,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743229_2405, type=LAST_IN_PIPELINE terminating
2025-07-10 15:13:01,898 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743229_2405 replica FinalizedReplica, blk_1073743229_2405, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743229 for deletion
2025-07-10 15:13:01,899 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743229_2405 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743229
2025-07-10 15:13:53,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743230_2406 src: /192.168.158.5:38930 dest: /192.168.158.4:9866
2025-07-10 15:13:53,953 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38930, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_782431722_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743230_2406, duration(ns): 16186080
2025-07-10 15:13:53,953 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743230_2406, type=LAST_IN_PIPELINE terminating
2025-07-10 15:14:01,901 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743230_2406 replica FinalizedReplica, blk_1073743230_2406, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743230 for deletion
2025-07-10 15:14:01,902 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743230_2406 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743230
2025-07-10 15:15:53,924 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743232_2408 src: /192.168.158.1:38926 dest: /192.168.158.4:9866
2025-07-10 15:15:53,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38926, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_390918929_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743232_2408, duration(ns): 23021508
2025-07-10 15:15:53,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743232_2408, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-10 15:15:58,905 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743232_2408 replica FinalizedReplica, blk_1073743232_2408, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743232 for deletion
2025-07-10 15:15:58,907 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743232_2408 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743232
2025-07-10 15:16:45,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743233_2409 src: /192.168.158.6:39544 dest: /192.168.158.4:9866
2025-07-10 15:16:45,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39544, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1256593597_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743233_2409, duration(ns): 60522195
2025-07-10 15:16:45,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743233_2409, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-10 15:16:49,908 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743233_2409 replica FinalizedReplica, blk_1073743233_2409, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743233 for deletion
2025-07-10 15:16:49,910 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743233_2409 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743233
2025-07-10 15:17:45,774 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743234_2410 src: /192.168.158.1:33170 dest: /192.168.158.4:9866
2025-07-10 15:17:45,812 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33170, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2110482677_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743234_2410, duration(ns): 28458111
2025-07-10 15:17:45,812
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743234_2410, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-10 15:17:52,909 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743234_2410 replica FinalizedReplica, blk_1073743234_2410, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743234 for deletion 2025-07-10 15:17:52,910 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743234_2410 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743234 2025-07-10 15:18:50,765 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743235_2411 src: /192.168.158.9:51016 dest: /192.168.158.4:9866 2025-07-10 15:18:50,790 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51016, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1537058515_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743235_2411, duration(ns): 19629834 2025-07-10 15:18:50,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743235_2411, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-10 15:18:58,913 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743235_2411 replica FinalizedReplica, blk_1073743235_2411, FINALIZED getNumBytes() 
= 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743235 for deletion 2025-07-10 15:18:58,914 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743235_2411 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743235 2025-07-10 15:19:50,795 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743236_2412 src: /192.168.158.8:57018 dest: /192.168.158.4:9866 2025-07-10 15:19:50,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57018, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1798345156_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743236_2412, duration(ns): 17828108 2025-07-10 15:19:50,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743236_2412, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-10 15:19:55,914 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743236_2412 replica FinalizedReplica, blk_1073743236_2412, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743236 for deletion 2025-07-10 15:19:55,915 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743236_2412 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743236 2025-07-10 15:23:00,756 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743239_2415 src: /192.168.158.9:48524 dest: /192.168.158.4:9866 2025-07-10 15:23:00,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48524, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2048311421_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743239_2415, duration(ns): 18966997 2025-07-10 15:23:00,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743239_2415, type=LAST_IN_PIPELINE terminating 2025-07-10 15:23:07,916 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743239_2415 replica FinalizedReplica, blk_1073743239_2415, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743239 for deletion 2025-07-10 15:23:07,917 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743239_2415 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743239 2025-07-10 15:24:05,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743240_2416 src: /192.168.158.1:56688 dest: /192.168.158.4:9866 2025-07-10 15:24:05,775 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56688, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, 
cliID: DFSClient_NONMAPREDUCE_-2057136452_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743240_2416, duration(ns): 22296952 2025-07-10 15:24:05,775 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743240_2416, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-10 15:24:10,918 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743240_2416 replica FinalizedReplica, blk_1073743240_2416, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743240 for deletion 2025-07-10 15:24:10,919 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743240_2416 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743240 2025-07-10 15:27:10,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743243_2419 src: /192.168.158.7:45836 dest: /192.168.158.4:9866 2025-07-10 15:27:10,784 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45836, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2054509100_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743243_2419, duration(ns): 20938438 2025-07-10 15:27:10,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743243_2419, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] 
terminating 2025-07-10 15:27:13,925 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743243_2419 replica FinalizedReplica, blk_1073743243_2419, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743243 for deletion 2025-07-10 15:27:13,926 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743243_2419 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743243 2025-07-10 15:31:15,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743247_2423 src: /192.168.158.6:41692 dest: /192.168.158.4:9866 2025-07-10 15:31:15,773 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41692, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_425951037_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743247_2423, duration(ns): 16051721 2025-07-10 15:31:15,774 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743247_2423, type=LAST_IN_PIPELINE terminating 2025-07-10 15:31:19,933 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743247_2423 replica FinalizedReplica, blk_1073743247_2423, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743247 for deletion 2025-07-10 15:31:19,934 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743247_2423 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743247 2025-07-10 15:34:15,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743250_2426 src: /192.168.158.1:48770 dest: /192.168.158.4:9866 2025-07-10 15:34:15,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48770, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-33014224_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743250_2426, duration(ns): 21621248 2025-07-10 15:34:15,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743250_2426, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-10 15:34:22,936 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743250_2426 replica FinalizedReplica, blk_1073743250_2426, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743250 for deletion 2025-07-10 15:34:22,937 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743250_2426 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743250 2025-07-10 15:35:15,793 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743251_2427 
src: /192.168.158.7:57328 dest: /192.168.158.4:9866 2025-07-10 15:35:15,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57328, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_272775797_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743251_2427, duration(ns): 15561064 2025-07-10 15:35:15,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743251_2427, type=LAST_IN_PIPELINE terminating 2025-07-10 15:35:19,938 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743251_2427 replica FinalizedReplica, blk_1073743251_2427, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743251 for deletion 2025-07-10 15:35:19,939 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743251_2427 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743251 2025-07-10 15:40:20,778 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743256_2432 src: /192.168.158.7:41632 dest: /192.168.158.4:9866 2025-07-10 15:40:20,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41632, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-465911299_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743256_2432, duration(ns): 18450954 2025-07-10 15:40:20,799 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743256_2432, type=LAST_IN_PIPELINE terminating 2025-07-10 15:40:28,951 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743256_2432 replica FinalizedReplica, blk_1073743256_2432, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743256 for deletion 2025-07-10 15:40:28,952 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743256_2432 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743256 2025-07-10 15:43:25,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743259_2435 src: /192.168.158.1:50972 dest: /192.168.158.4:9866 2025-07-10 15:43:25,812 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50972, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1253844259_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743259_2435, duration(ns): 26224314 2025-07-10 15:43:25,812 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743259_2435, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-10 15:43:28,958 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743259_2435 replica FinalizedReplica, blk_1073743259_2435, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 
getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743259 for deletion 2025-07-10 15:43:28,959 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743259_2435 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743259 2025-07-10 15:46:25,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743262_2438 src: /192.168.158.5:52488 dest: /192.168.158.4:9866 2025-07-10 15:46:25,814 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52488, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-760453643_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743262_2438, duration(ns): 15619658 2025-07-10 15:46:25,814 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743262_2438, type=LAST_IN_PIPELINE terminating 2025-07-10 15:46:28,966 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743262_2438 replica FinalizedReplica, blk_1073743262_2438, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743262 for deletion 2025-07-10 15:46:28,967 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743262_2438 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743262 
2025-07-10 15:48:30,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743264_2440 src: /192.168.158.8:45740 dest: /192.168.158.4:9866 2025-07-10 15:48:30,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45740, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1496070379_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743264_2440, duration(ns): 15717370 2025-07-10 15:48:30,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743264_2440, type=LAST_IN_PIPELINE terminating 2025-07-10 15:48:37,972 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743264_2440 replica FinalizedReplica, blk_1073743264_2440, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743264 for deletion 2025-07-10 15:48:37,973 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743264_2440 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743264 2025-07-10 15:49:30,793 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743265_2441 src: /192.168.158.1:37036 dest: /192.168.158.4:9866 2025-07-10 15:49:30,824 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37036, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-377762_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073743265_2441, duration(ns): 21228220 2025-07-10 15:49:30,824 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743265_2441, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-10 15:49:37,976 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743265_2441 replica FinalizedReplica, blk_1073743265_2441, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743265 for deletion 2025-07-10 15:49:37,977 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743265_2441 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743265 2025-07-10 15:53:35,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743269_2445 src: /192.168.158.1:40800 dest: /192.168.158.4:9866 2025-07-10 15:53:35,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40800, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1224803034_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743269_2445, duration(ns): 22325422 2025-07-10 15:53:35,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743269_2445, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-10 15:53:40,982 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743269_2445 replica FinalizedReplica, blk_1073743269_2445, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743269 for deletion 2025-07-10 15:53:40,983 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743269_2445 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743269 2025-07-10 15:56:40,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743272_2448 src: /192.168.158.8:50240 dest: /192.168.158.4:9866 2025-07-10 15:56:40,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50240, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-25427623_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743272_2448, duration(ns): 19758568 2025-07-10 15:56:40,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743272_2448, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-10 15:56:46,988 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743272_2448 replica FinalizedReplica, blk_1073743272_2448, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743272 for deletion 2025-07-10 15:56:46,989 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743272_2448 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743272 2025-07-10 15:58:50,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743274_2450 src: /192.168.158.8:41146 dest: /192.168.158.4:9866 2025-07-10 15:58:50,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41146, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_926673808_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743274_2450, duration(ns): 19226330 2025-07-10 15:58:50,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743274_2450, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-10 15:58:58,992 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743274_2450 replica FinalizedReplica, blk_1073743274_2450, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743274 for deletion 2025-07-10 15:58:58,993 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743274_2450 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743274 2025-07-10 15:59:50,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743275_2451 src: 
/192.168.158.1:44654 dest: /192.168.158.4:9866
2025-07-10 15:59:50,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44654, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-80800513_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743275_2451, duration(ns): 23720984
2025-07-10 15:59:50,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743275_2451, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-10 15:59:58,994 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743275_2451 replica FinalizedReplica, blk_1073743275_2451, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743275 for deletion
2025-07-10 15:59:58,996 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743275_2451 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743275
2025-07-10 16:00:55,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743276_2452 src: /192.168.158.5:34232 dest: /192.168.158.4:9866
2025-07-10 16:00:55,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34232, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1352674197_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743276_2452, duration(ns): 18521301
2025-07-10 16:00:55,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743276_2452, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 16:01:01,999 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743276_2452 replica FinalizedReplica, blk_1073743276_2452, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743276 for deletion
2025-07-10 16:01:02,000 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743276_2452 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743276
2025-07-10 16:06:10,807 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743281_2457 src: /192.168.158.1:39712 dest: /192.168.158.4:9866
2025-07-10 16:06:10,838 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-871235249_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743281_2457, duration(ns): 21534626
2025-07-10 16:06:10,838 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743281_2457, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-10 16:06:17,012 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743281_2457 replica FinalizedReplica, blk_1073743281_2457, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743281 for deletion
2025-07-10 16:06:17,014 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743281_2457 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743281
2025-07-10 16:08:10,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743283_2459 src: /192.168.158.1:49312 dest: /192.168.158.4:9866
2025-07-10 16:08:10,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49312, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1139298703_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743283_2459, duration(ns): 24788228
2025-07-10 16:08:10,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743283_2459, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-10 16:08:14,017 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743283_2459 replica FinalizedReplica, blk_1073743283_2459, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743283 for deletion
2025-07-10 16:08:14,018 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743283_2459 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743283
2025-07-10 16:09:10,838 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743284_2460 src: /192.168.158.7:57698 dest: /192.168.158.4:9866
2025-07-10 16:09:10,858 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57698, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1500803814_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743284_2460, duration(ns): 18343914
2025-07-10 16:09:10,859 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743284_2460, type=LAST_IN_PIPELINE terminating
2025-07-10 16:09:17,018 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743284_2460 replica FinalizedReplica, blk_1073743284_2460, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743284 for deletion
2025-07-10 16:09:17,019 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743284_2460 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743284
2025-07-10 16:10:10,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743285_2461 src: /192.168.158.1:49294 dest: /192.168.158.4:9866
2025-07-10 16:10:10,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49294, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_269174192_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743285_2461, duration(ns): 20814239
2025-07-10 16:10:10,834 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743285_2461, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-10 16:10:14,019 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743285_2461 replica FinalizedReplica, blk_1073743285_2461, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743285 for deletion
2025-07-10 16:10:14,020 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743285_2461 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743285
2025-07-10 16:11:10,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743286_2462 src: /192.168.158.6:38930 dest: /192.168.158.4:9866
2025-07-10 16:11:10,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38930, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1207028002_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743286_2462, duration(ns): 15901718
2025-07-10 16:11:10,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743286_2462, type=LAST_IN_PIPELINE terminating
2025-07-10 16:11:14,024 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743286_2462 replica FinalizedReplica, blk_1073743286_2462, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743286 for deletion
2025-07-10 16:11:14,025 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743286_2462 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743286
2025-07-10 16:12:10,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743287_2463 src: /192.168.158.6:38776 dest: /192.168.158.4:9866
2025-07-10 16:12:10,840 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38776, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_865596472_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743287_2463, duration(ns): 15983752
2025-07-10 16:12:10,841 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743287_2463, type=LAST_IN_PIPELINE terminating
2025-07-10 16:12:17,029 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743287_2463 replica FinalizedReplica, blk_1073743287_2463, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743287 for deletion
2025-07-10 16:12:17,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743287_2463 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743287
2025-07-10 16:13:10,812 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743288_2464 src: /192.168.158.8:58400 dest: /192.168.158.4:9866
2025-07-10 16:13:10,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58400, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-932027721_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743288_2464, duration(ns): 21008012
2025-07-10 16:13:10,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743288_2464, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 16:13:14,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743288_2464 replica FinalizedReplica, blk_1073743288_2464, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743288 for deletion
2025-07-10 16:13:14,033 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743288_2464 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743288
2025-07-10 16:14:10,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743289_2465 src: /192.168.158.7:45550 dest: /192.168.158.4:9866
2025-07-10 16:14:10,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45550, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-12233509_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743289_2465, duration(ns): 15938740
2025-07-10 16:14:10,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743289_2465, type=LAST_IN_PIPELINE terminating
2025-07-10 16:14:17,033 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743289_2465 replica FinalizedReplica, blk_1073743289_2465, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743289 for deletion
2025-07-10 16:14:17,035 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743289_2465 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743289
2025-07-10 16:15:10,812 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743290_2466 src: /192.168.158.9:32802 dest: /192.168.158.4:9866
2025-07-10 16:15:10,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:32802, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2097198465_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743290_2466, duration(ns): 18166927
2025-07-10 16:15:10,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743290_2466, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 16:15:14,033 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743290_2466 replica FinalizedReplica, blk_1073743290_2466, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743290 for deletion
2025-07-10 16:15:14,034 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743290_2466 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743290
2025-07-10 16:17:15,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743292_2468 src: /192.168.158.8:51478 dest: /192.168.158.4:9866
2025-07-10 16:17:15,847 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51478, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1571617182_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743292_2468, duration(ns): 16393444
2025-07-10 16:17:15,847 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743292_2468, type=LAST_IN_PIPELINE terminating
2025-07-10 16:17:20,038 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743292_2468 replica FinalizedReplica, blk_1073743292_2468, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743292 for deletion
2025-07-10 16:17:20,039 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743292_2468 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743292
2025-07-10 16:18:20,820 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743293_2469 src: /192.168.158.8:51894 dest: /192.168.158.4:9866
2025-07-10 16:18:20,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51894, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-880392174_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743293_2469, duration(ns): 16763425
2025-07-10 16:18:20,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743293_2469, type=LAST_IN_PIPELINE terminating
2025-07-10 16:18:26,042 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743293_2469 replica FinalizedReplica, blk_1073743293_2469, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743293 for deletion
2025-07-10 16:18:26,043 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743293_2469 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743293
2025-07-10 16:19:20,826 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743294_2470 src: /192.168.158.1:53484 dest: /192.168.158.4:9866
2025-07-10 16:19:20,857 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53484, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_372058673_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743294_2470, duration(ns): 22064526
2025-07-10 16:19:20,857 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743294_2470, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-10 16:19:26,046 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743294_2470 replica FinalizedReplica, blk_1073743294_2470, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743294 for deletion
2025-07-10 16:19:26,047 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743294_2470 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743294
2025-07-10 16:20:20,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743295_2471 src: /192.168.158.5:46608 dest: /192.168.158.4:9866
2025-07-10 16:20:20,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46608, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_625543400_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743295_2471, duration(ns): 14767548
2025-07-10 16:20:20,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743295_2471, type=LAST_IN_PIPELINE terminating
2025-07-10 16:20:23,047 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743295_2471 replica FinalizedReplica, blk_1073743295_2471, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743295 for deletion
2025-07-10 16:20:23,048 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743295_2471 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743295
2025-07-10 16:21:25,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743296_2472 src: /192.168.158.7:58642 dest: /192.168.158.4:9866
2025-07-10 16:21:25,868 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58642, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1919881845_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743296_2472, duration(ns): 16303138
2025-07-10 16:21:25,868 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743296_2472, type=LAST_IN_PIPELINE terminating
2025-07-10 16:21:29,050 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743296_2472 replica FinalizedReplica, blk_1073743296_2472, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743296 for deletion
2025-07-10 16:21:29,051 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743296_2472 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743296
2025-07-10 16:22:25,863 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743297_2473 src: /192.168.158.1:60732 dest: /192.168.158.4:9866
2025-07-10 16:22:25,894 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60732, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-964171319_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743297_2473, duration(ns): 21945679
2025-07-10 16:22:25,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743297_2473, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-10 16:22:29,051 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743297_2473 replica FinalizedReplica, blk_1073743297_2473, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743297 for deletion
2025-07-10 16:22:29,053 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743297_2473 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743297
2025-07-10 16:23:30,848 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743298_2474 src: /192.168.158.6:49974 dest: /192.168.158.4:9866
2025-07-10 16:23:30,867 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49974, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1312220191_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743298_2474, duration(ns): 16276986
2025-07-10 16:23:30,867 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743298_2474, type=LAST_IN_PIPELINE terminating
2025-07-10 16:23:35,052 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743298_2474 replica FinalizedReplica, blk_1073743298_2474, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743298 for deletion
2025-07-10 16:23:35,053 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743298_2474 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743298
2025-07-10 16:24:30,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743299_2475 src: /192.168.158.8:38806 dest: /192.168.158.4:9866
2025-07-10 16:24:30,868 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38806, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1136029863_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743299_2475, duration(ns): 18413085
2025-07-10 16:24:30,868 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743299_2475, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 16:24:38,055 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743299_2475 replica FinalizedReplica, blk_1073743299_2475, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743299 for deletion
2025-07-10 16:24:38,056 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743299_2475 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743299
2025-07-10 16:25:30,863 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743300_2476 src: /192.168.158.5:52684 dest: /192.168.158.4:9866
2025-07-10 16:25:30,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52684, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1392084205_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743300_2476, duration(ns): 17259968
2025-07-10 16:25:30,887 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743300_2476, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 16:25:35,058 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743300_2476 replica FinalizedReplica, blk_1073743300_2476, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743300 for deletion
2025-07-10 16:25:35,060 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743300_2476 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743300
2025-07-10 16:26:30,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743301_2477 src: /192.168.158.8:55370 dest: /192.168.158.4:9866
2025-07-10 16:26:30,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55370, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1274984955_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743301_2477, duration(ns): 18094441
2025-07-10 16:26:30,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743301_2477, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 16:26:35,059 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743301_2477 replica FinalizedReplica, blk_1073743301_2477, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743301 for deletion
2025-07-10 16:26:35,060 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743301_2477 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743301
2025-07-10 16:29:40,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743304_2480 src: /192.168.158.5:47662 dest: /192.168.158.4:9866
2025-07-10 16:29:40,879 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47662, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_801265713_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743304_2480, duration(ns): 18378353
2025-07-10 16:29:40,880 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743304_2480, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 16:29:47,066 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743304_2480 replica FinalizedReplica, blk_1073743304_2480, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743304 for deletion
2025-07-10 16:29:47,067 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743304_2480 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743304
2025-07-10 16:31:40,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743306_2482 src: /192.168.158.7:60006 dest: /192.168.158.4:9866
2025-07-10 16:31:40,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60006, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1914303689_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743306_2482, duration(ns): 18255042
2025-07-10 16:31:40,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743306_2482, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 16:31:44,072 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743306_2482 replica FinalizedReplica, blk_1073743306_2482, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743306 for deletion
2025-07-10 16:31:44,073 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743306_2482 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743306
2025-07-10 16:34:45,852 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743309_2485 src: /192.168.158.1:46492 dest: /192.168.158.4:9866
2025-07-10 16:34:45,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46492, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_241951865_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743309_2485, duration(ns): 22469945
2025-07-10 16:34:45,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743309_2485, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-10 16:34:50,080 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743309_2485 replica FinalizedReplica, blk_1073743309_2485, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743309 for deletion
2025-07-10 16:34:50,082 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743309_2485 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743309
2025-07-10 16:37:50,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743312_2488 src: /192.168.158.1:38610 dest: /192.168.158.4:9866
2025-07-10 16:37:50,920 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38610, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_742053328_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743312_2488, duration(ns): 24975860
2025-07-10 16:37:50,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743312_2488, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-10 16:37:56,088 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743312_2488 replica FinalizedReplica, blk_1073743312_2488, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743312 for deletion
2025-07-10 16:37:56,089 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743312_2488 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743312
2025-07-10 16:38:50,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743313_2489 src: /192.168.158.1:46412 dest: /192.168.158.4:9866
2025-07-10 16:38:50,915 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46412, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1947931897_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743313_2489, duration(ns): 21463101
2025-07-10 16:38:50,915 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743313_2489, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-10 16:38:59,090 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743313_2489 replica FinalizedReplica, blk_1073743313_2489, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743313 for deletion
2025-07-10 16:38:59,091 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743313_2489 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743313
2025-07-10 16:41:55,893 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743316_2492 src: /192.168.158.1:54780 dest: /192.168.158.4:9866
2025-07-10 16:41:55,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2141876515_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743316_2492, duration(ns): 23862633
2025-07-10 16:41:55,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743316_2492, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-10 16:41:59,097 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743316_2492 replica FinalizedReplica, blk_1073743316_2492, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743316 for deletion
2025-07-10 16:41:59,098 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743316_2492 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743316
2025-07-10 16:47:00,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving
BP-1059995147-192.168.158.1-1752101929360:blk_1073743321_2497 src: /192.168.158.6:40826 dest: /192.168.158.4:9866 2025-07-10 16:47:00,907 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40826, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_482754156_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743321_2497, duration(ns): 12542692 2025-07-10 16:47:00,907 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743321_2497, type=LAST_IN_PIPELINE terminating 2025-07-10 16:47:08,117 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743321_2497 replica FinalizedReplica, blk_1073743321_2497, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743321 for deletion 2025-07-10 16:47:08,118 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743321_2497 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743321 2025-07-10 16:48:00,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743322_2498 src: /192.168.158.1:47500 dest: /192.168.158.4:9866 2025-07-10 16:48:00,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47500, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1458979043_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743322_2498, duration(ns): 21495001 2025-07-10 
16:48:00,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743322_2498, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-10 16:48:05,119 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743322_2498 replica FinalizedReplica, blk_1073743322_2498, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743322 for deletion 2025-07-10 16:48:05,120 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743322_2498 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743322 2025-07-10 16:49:05,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743323_2499 src: /192.168.158.1:52334 dest: /192.168.158.4:9866 2025-07-10 16:49:05,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52334, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1665218456_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743323_2499, duration(ns): 23544973 2025-07-10 16:49:05,928 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743323_2499, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-10 16:49:08,122 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743323_2499 replica FinalizedReplica, 
blk_1073743323_2499, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743323 for deletion 2025-07-10 16:49:08,123 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743323_2499 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743323 2025-07-10 16:52:05,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743326_2502 src: /192.168.158.5:52548 dest: /192.168.158.4:9866 2025-07-10 16:52:05,920 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52548, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1265420319_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743326_2502, duration(ns): 15687422 2025-07-10 16:52:05,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743326_2502, type=LAST_IN_PIPELINE terminating 2025-07-10 16:52:08,130 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743326_2502 replica FinalizedReplica, blk_1073743326_2502, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743326 for deletion 2025-07-10 16:52:08,131 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743326_2502 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743326 2025-07-10 16:53:05,909 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743327_2503 src: /192.168.158.6:56294 dest: /192.168.158.4:9866 2025-07-10 16:53:05,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56294, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1202639125_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743327_2503, duration(ns): 18058254 2025-07-10 16:53:05,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743327_2503, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-10 16:53:08,134 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743327_2503 replica FinalizedReplica, blk_1073743327_2503, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743327 for deletion 2025-07-10 16:53:08,135 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743327_2503 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743327 2025-07-10 16:58:15,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743332_2508 src: /192.168.158.7:45162 dest: /192.168.158.4:9866 2025-07-10 16:58:15,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45162, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_806001028_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743332_2508, duration(ns): 22697870 2025-07-10 16:58:15,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743332_2508, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-10 16:58:20,144 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743332_2508 replica FinalizedReplica, blk_1073743332_2508, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743332 for deletion 2025-07-10 16:58:20,145 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743332_2508 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743332 2025-07-10 17:00:15,940 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743334_2510 src: /192.168.158.1:51916 dest: /192.168.158.4:9866 2025-07-10 17:00:15,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51916, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1330876197_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743334_2510, duration(ns): 22640942 2025-07-10 17:00:15,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743334_2510, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-10 17:00:20,147 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743334_2510 replica FinalizedReplica, blk_1073743334_2510, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743334 for deletion 2025-07-10 17:00:20,148 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743334_2510 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743334 2025-07-10 17:02:15,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743336_2512 src: /192.168.158.1:56408 dest: /192.168.158.4:9866 2025-07-10 17:02:15,979 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56408, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_42718431_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743336_2512, duration(ns): 24374851 2025-07-10 17:02:15,979 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743336_2512, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-10 17:02:23,157 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743336_2512 replica FinalizedReplica, blk_1073743336_2512, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743336 for deletion 2025-07-10 17:02:23,158 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743336_2512 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743336 2025-07-10 17:03:20,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743337_2513 src: /192.168.158.8:55432 dest: /192.168.158.4:9866 2025-07-10 17:03:20,960 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55432, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-961776329_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743337_2513, duration(ns): 13031679 2025-07-10 17:03:20,960 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743337_2513, type=LAST_IN_PIPELINE terminating 2025-07-10 17:03:26,160 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743337_2513 replica FinalizedReplica, blk_1073743337_2513, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743337 for deletion 2025-07-10 17:03:26,161 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743337_2513 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743337 2025-07-10 17:05:20,947 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743339_2515 src: /192.168.158.8:55248 dest: /192.168.158.4:9866 2025-07-10 17:05:20,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55248, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_528716247_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743339_2515, duration(ns): 19979675 2025-07-10 17:05:20,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743339_2515, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-10 17:05:23,171 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743339_2515 replica FinalizedReplica, blk_1073743339_2515, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743339 for deletion 2025-07-10 17:05:23,172 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743339_2515 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743339 2025-07-10 17:06:25,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743340_2516 src: /192.168.158.8:44878 dest: /192.168.158.4:9866 2025-07-10 17:06:26,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44878, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1572614174_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, 
blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743340_2516, duration(ns): 12405149 2025-07-10 17:06:26,008 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743340_2516, type=LAST_IN_PIPELINE terminating 2025-07-10 17:06:29,175 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743340_2516 replica FinalizedReplica, blk_1073743340_2516, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743340 for deletion 2025-07-10 17:06:29,176 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743340_2516 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743340 2025-07-10 17:07:30,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743341_2517 src: /192.168.158.9:46108 dest: /192.168.158.4:9866 2025-07-10 17:07:30,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46108, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-829843544_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743341_2517, duration(ns): 15486131 2025-07-10 17:07:30,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743341_2517, type=LAST_IN_PIPELINE terminating 2025-07-10 17:07:38,178 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743341_2517 replica FinalizedReplica, blk_1073743341_2517, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743341 for deletion 2025-07-10 17:07:38,179 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743341_2517 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743341 2025-07-10 17:08:30,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743342_2518 src: /192.168.158.7:54116 dest: /192.168.158.4:9866 2025-07-10 17:08:30,981 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54116, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1554688543_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743342_2518, duration(ns): 20666471 2025-07-10 17:08:30,981 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743342_2518, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-10 17:08:35,179 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743342_2518 replica FinalizedReplica, blk_1073743342_2518, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743342 for deletion 2025-07-10 17:08:35,180 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743342_2518 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743342 2025-07-10 17:11:35,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743345_2521 src: /192.168.158.1:59940 dest: /192.168.158.4:9866 2025-07-10 17:11:36,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59940, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_385893747_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743345_2521, duration(ns): 20444491 2025-07-10 17:11:36,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743345_2521, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-10 17:11:38,189 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743345_2521 replica FinalizedReplica, blk_1073743345_2521, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743345 for deletion 2025-07-10 17:11:38,190 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743345_2521 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743345 2025-07-10 17:12:40,953 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743346_2522 src: /192.168.158.1:40962 dest: /192.168.158.4:9866 2025-07-10 17:12:40,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:40962, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1221245887_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743346_2522, duration(ns): 23781225 2025-07-10 17:12:40,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743346_2522, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-10 17:12:47,192 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743346_2522 replica FinalizedReplica, blk_1073743346_2522, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743346 for deletion 2025-07-10 17:12:47,193 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743346_2522 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743346 2025-07-10 17:16:46,824 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743350_2526 src: /192.168.158.5:58062 dest: /192.168.158.4:9866 2025-07-10 17:16:46,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58062, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_946257012_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743350_2526, duration(ns): 15140704 2025-07-10 17:16:46,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073743350_2526, type=LAST_IN_PIPELINE terminating 2025-07-10 17:16:53,204 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743350_2526 replica FinalizedReplica, blk_1073743350_2526, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743350 for deletion 2025-07-10 17:16:53,205 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743350_2526 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743350 2025-07-10 17:17:50,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743351_2527 src: /192.168.158.1:43734 dest: /192.168.158.4:9866 2025-07-10 17:17:50,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43734, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1816391961_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743351_2527, duration(ns): 25210288 2025-07-10 17:17:50,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743351_2527, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-10 17:17:53,208 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743351_2527 replica FinalizedReplica, blk_1073743351_2527, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743351 for deletion 2025-07-10 17:17:53,209 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743351_2527 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743351 2025-07-10 17:18:55,964 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743352_2528 src: /192.168.158.9:36944 dest: /192.168.158.4:9866 2025-07-10 17:18:55,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36944, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-546309487_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743352_2528, duration(ns): 15676008 2025-07-10 17:18:55,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743352_2528, type=LAST_IN_PIPELINE terminating 2025-07-10 17:18:59,211 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743352_2528 replica FinalizedReplica, blk_1073743352_2528, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743352 for deletion 2025-07-10 17:18:59,212 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743352_2528 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743352 2025-07-10 17:22:00,980 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743355_2531 src: /192.168.158.9:38964 dest: /192.168.158.4:9866
2025-07-10 17:22:01,331 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38964, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1715482576_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743355_2531, duration(ns): 18519209
2025-07-10 17:22:01,331 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743355_2531, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 17:22:05,212 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743355_2531 replica FinalizedReplica, blk_1073743355_2531, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743355 for deletion
2025-07-10 17:22:05,213 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743355_2531 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743355
2025-07-10 17:24:10,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743357_2533 src: /192.168.158.7:39364 dest: /192.168.158.4:9866
2025-07-10 17:24:11,001 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39364, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1901826147_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743357_2533, duration(ns): 14523441
2025-07-10 17:24:11,001 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743357_2533, type=LAST_IN_PIPELINE terminating
2025-07-10 17:24:14,214 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743357_2533 replica FinalizedReplica, blk_1073743357_2533, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743357 for deletion
2025-07-10 17:24:14,215 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743357_2533 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073743357
2025-07-10 17:29:10,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743362_2538 src: /192.168.158.1:43656 dest: /192.168.158.4:9866
2025-07-10 17:29:10,998 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43656, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1469843808_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743362_2538, duration(ns): 20945889
2025-07-10 17:29:10,998 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743362_2538, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-10 17:29:17,218 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743362_2538 replica FinalizedReplica, blk_1073743362_2538, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743362 for deletion
2025-07-10 17:29:17,219 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743362_2538 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743362
2025-07-10 17:31:10,975 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743364_2540 src: /192.168.158.5:34284 dest: /192.168.158.4:9866
2025-07-10 17:31:10,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34284, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1922067713_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743364_2540, duration(ns): 14177546
2025-07-10 17:31:10,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743364_2540, type=LAST_IN_PIPELINE terminating
2025-07-10 17:31:14,221 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743364_2540 replica FinalizedReplica, blk_1073743364_2540, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743364 for deletion
2025-07-10 17:31:14,222 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743364_2540 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743364
2025-07-10 17:33:15,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743366_2542 src: /192.168.158.6:40464 dest: /192.168.158.4:9866
2025-07-10 17:33:15,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40464, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_264936366_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743366_2542, duration(ns): 15566456
2025-07-10 17:33:15,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743366_2542, type=LAST_IN_PIPELINE terminating
2025-07-10 17:33:23,226 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743366_2542 replica FinalizedReplica, blk_1073743366_2542, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743366 for deletion
2025-07-10 17:33:23,228 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743366_2542 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743366
2025-07-10 17:36:13,268 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-10 17:36:16,004 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743369_2545 src: /192.168.158.6:33256 dest: /192.168.158.4:9866
2025-07-10 17:36:16,027 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33256, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_548185371_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743369_2545, duration(ns): 17369185
2025-07-10 17:36:16,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743369_2545, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 17:36:20,228 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743369_2545 replica FinalizedReplica, blk_1073743369_2545, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743369 for deletion
2025-07-10 17:36:20,229 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743369_2545 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743369
2025-07-10 17:37:20,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f29, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 5 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-10 17:37:20,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-10 17:39:20,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743372_2548 src: /192.168.158.6:47700 dest: /192.168.158.4:9866
2025-07-10 17:39:21,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47700, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1217925771_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743372_2548, duration(ns): 14714720
2025-07-10 17:39:21,012 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743372_2548, type=LAST_IN_PIPELINE terminating
2025-07-10 17:39:23,233 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743372_2548 replica FinalizedReplica, blk_1073743372_2548, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743372 for deletion
2025-07-10 17:39:23,234 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743372_2548 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743372
2025-07-10 17:42:20,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743375_2551 src: /192.168.158.6:43282 dest: /192.168.158.4:9866
2025-07-10 17:42:21,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43282, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2132959744_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743375_2551, duration(ns): 15432695
2025-07-10 17:42:21,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743375_2551, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-10 17:42:23,238 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743375_2551 replica FinalizedReplica, blk_1073743375_2551, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743375 for deletion
2025-07-10 17:42:23,239 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743375_2551 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743375
2025-07-10 17:43:26,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743376_2552 src: /192.168.158.5:53754 dest: /192.168.158.4:9866
2025-07-10 17:43:26,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53754, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1272776780_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743376_2552, duration(ns): 15111358
2025-07-10 17:43:26,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743376_2552, type=LAST_IN_PIPELINE terminating
2025-07-10 17:43:32,242 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743376_2552 replica FinalizedReplica, blk_1073743376_2552, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743376 for deletion
2025-07-10 17:43:32,243 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743376_2552 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743376
2025-07-10 17:44:25,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743377_2553 src: /192.168.158.9:57098 dest: /192.168.158.4:9866
2025-07-10 17:44:26,022 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57098, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_250823024_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743377_2553, duration(ns): 19278854
2025-07-10 17:44:26,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743377_2553, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 17:44:29,245 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743377_2553 replica FinalizedReplica, blk_1073743377_2553, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743377 for deletion
2025-07-10 17:44:29,246 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743377_2553 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743377
2025-07-10 17:48:31,012 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743381_2557 src: /192.168.158.1:50536 dest: /192.168.158.4:9866
2025-07-10 17:48:31,043 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50536, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1655342286_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743381_2557, duration(ns): 22132530
2025-07-10 17:48:31,043 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743381_2557, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-10 17:48:38,250 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743381_2557 replica FinalizedReplica, blk_1073743381_2557, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743381 for deletion
2025-07-10 17:48:38,251 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743381_2557 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743381
2025-07-10 17:51:36,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743384_2560 src: /192.168.158.1:54038 dest: /192.168.158.4:9866
2025-07-10 17:51:36,054 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54038, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1951114935_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743384_2560, duration(ns): 20731003
2025-07-10 17:51:36,054 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743384_2560, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-10 17:51:38,252 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743384_2560 replica FinalizedReplica, blk_1073743384_2560, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743384 for deletion
2025-07-10 17:51:38,253 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743384_2560 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743384
2025-07-10 17:55:46,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743388_2564 src: /192.168.158.1:50060 dest: /192.168.158.4:9866
2025-07-10 17:55:46,076 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50060, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_56723374_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743388_2564, duration(ns): 21227624
2025-07-10 17:55:46,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743388_2564, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-10 17:55:53,255 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743388_2564 replica FinalizedReplica, blk_1073743388_2564, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743388 for deletion
2025-07-10 17:55:53,257 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743388_2564 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743388
2025-07-10 17:56:46,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743389_2565 src: /192.168.158.1:55202 dest: /192.168.158.4:9866
2025-07-10 17:56:46,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55202, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1240606424_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743389_2565, duration(ns): 21412837
2025-07-10 17:56:46,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743389_2565, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-10 17:56:50,256 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743389_2565 replica FinalizedReplica, blk_1073743389_2565, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743389 for deletion
2025-07-10 17:56:50,257 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743389_2565 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743389
2025-07-10 18:00:46,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743393_2569 src: /192.168.158.1:44888 dest: /192.168.158.4:9866
2025-07-10 18:00:46,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44888, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_822920948_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743393_2569, duration(ns): 23387202
2025-07-10 18:00:46,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743393_2569, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-10 18:00:53,269 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743393_2569 replica FinalizedReplica, blk_1073743393_2569, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743393 for deletion
2025-07-10 18:00:53,270 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743393_2569 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743393
2025-07-10 18:10:01,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743402_2578 src: /192.168.158.7:44618 dest: /192.168.158.4:9866
2025-07-10 18:10:01,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44618, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-268792994_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743402_2578, duration(ns): 14773061
2025-07-10 18:10:01,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743402_2578, type=LAST_IN_PIPELINE terminating
2025-07-10 18:10:08,293 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743402_2578 replica FinalizedReplica, blk_1073743402_2578, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743402 for deletion
2025-07-10 18:10:08,294 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743402_2578 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743402
2025-07-10 18:15:06,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743407_2583 src: /192.168.158.8:50046 dest: /192.168.158.4:9866
2025-07-10 18:15:06,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50046, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1263724413_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743407_2583, duration(ns): 18284484
2025-07-10 18:15:06,100 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743407_2583, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 18:15:08,304 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743407_2583 replica FinalizedReplica, blk_1073743407_2583, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743407 for deletion
2025-07-10 18:15:08,305 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743407_2583 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743407
2025-07-10 18:17:11,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743409_2585 src: /192.168.158.9:35166 dest: /192.168.158.4:9866
2025-07-10 18:17:11,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35166, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1722928710_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743409_2585, duration(ns): 13875005
2025-07-10 18:17:11,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743409_2585, type=LAST_IN_PIPELINE terminating
2025-07-10 18:17:14,312 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743409_2585 replica FinalizedReplica, blk_1073743409_2585, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743409 for deletion
2025-07-10 18:17:14,314 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743409_2585 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743409
2025-07-10 18:18:11,076 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743410_2586 src: /192.168.158.1:60402 dest: /192.168.158.4:9866
2025-07-10 18:18:11,111 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60402, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2147081425_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743410_2586, duration(ns): 25514573
2025-07-10 18:18:11,112 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743410_2586, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-10 18:18:14,318 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743410_2586 replica FinalizedReplica, blk_1073743410_2586, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743410 for deletion
2025-07-10 18:18:14,319 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743410_2586 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743410
2025-07-10 18:20:11,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743412_2588 src: /192.168.158.6:54630 dest: /192.168.158.4:9866
2025-07-10 18:20:11,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54630, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1045194340_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743412_2588, duration(ns): 17301519
2025-07-10 18:20:11,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743412_2588, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 18:20:17,325 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743412_2588 replica FinalizedReplica, blk_1073743412_2588, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743412 for deletion
2025-07-10 18:20:17,326 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743412_2588 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743412
2025-07-10 18:22:11,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743414_2590 src: /192.168.158.6:50666 dest: /192.168.158.4:9866
2025-07-10 18:22:11,108 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50666, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1902921552_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743414_2590, duration(ns): 13953218
2025-07-10 18:22:11,108 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743414_2590, type=LAST_IN_PIPELINE terminating
2025-07-10 18:22:14,331 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743414_2590 replica FinalizedReplica, blk_1073743414_2590, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743414 for deletion
2025-07-10 18:22:14,332 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743414_2590 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743414
2025-07-10 18:24:16,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743416_2592 src: /192.168.158.7:60348 dest: /192.168.158.4:9866
2025-07-10 18:24:16,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60348, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1167914954_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743416_2592, duration(ns): 18671298
2025-07-10 18:24:16,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743416_2592, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 18:24:20,336 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743416_2592 replica FinalizedReplica, blk_1073743416_2592, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743416 for deletion
2025-07-10 18:24:20,338 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743416_2592 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743416
2025-07-10 18:25:16,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743417_2593 src: /192.168.158.8:53830 dest: /192.168.158.4:9866
2025-07-10 18:25:16,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53830, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-284924028_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743417_2593, duration(ns): 14581811
2025-07-10 18:25:16,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743417_2593, type=LAST_IN_PIPELINE terminating
2025-07-10 18:25:23,338 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743417_2593 replica FinalizedReplica, blk_1073743417_2593, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743417 for deletion
2025-07-10 18:25:23,339 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743417_2593 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743417
2025-07-10 18:26:16,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743418_2594 src: /192.168.158.1:51626 dest: /192.168.158.4:9866
2025-07-10 18:26:16,123 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51626, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_331687479_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743418_2594, duration(ns): 19996403
2025-07-10 18:26:16,123 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743418_2594, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-10 18:26:23,340 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743418_2594 replica FinalizedReplica, blk_1073743418_2594, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743418 for deletion
2025-07-10 18:26:23,342 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743418_2594 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743418
2025-07-10 18:27:16,101 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743419_2595 src: /192.168.158.7:44008 dest: /192.168.158.4:9866
2025-07-10 18:27:16,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44008, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-650910946_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743419_2595, duration(ns): 14838978
2025-07-10 18:27:16,119 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743419_2595, type=LAST_IN_PIPELINE terminating
2025-07-10 18:27:20,342 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743419_2595 replica FinalizedReplica, blk_1073743419_2595, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743419 for deletion
2025-07-10 18:27:20,344 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743419_2595 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743419
2025-07-10 18:29:16,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743421_2597 src: /192.168.158.8:55652 dest:
/192.168.158.4:9866 2025-07-10 18:29:16,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55652, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1175494686_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743421_2597, duration(ns): 17257221 2025-07-10 18:29:16,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743421_2597, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-10 18:29:23,346 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743421_2597 replica FinalizedReplica, blk_1073743421_2597, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743421 for deletion 2025-07-10 18:29:23,347 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743421_2597 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743421 2025-07-10 18:30:16,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743422_2598 src: /192.168.158.1:43394 dest: /192.168.158.4:9866 2025-07-10 18:30:16,136 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43394, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_268075806_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743422_2598, duration(ns): 22010070 2025-07-10 18:30:16,136 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743422_2598, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-10 18:30:20,347 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743422_2598 replica FinalizedReplica, blk_1073743422_2598, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743422 for deletion 2025-07-10 18:30:20,348 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743422_2598 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743422 2025-07-10 18:31:21,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743423_2599 src: /192.168.158.8:60268 dest: /192.168.158.4:9866 2025-07-10 18:31:21,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60268, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-348060841_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743423_2599, duration(ns): 17569926 2025-07-10 18:31:21,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743423_2599, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-10 18:31:26,348 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743423_2599 replica FinalizedReplica, blk_1073743423_2599, FINALIZED getNumBytes() = 56 
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743423 for deletion 2025-07-10 18:31:26,349 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743423_2599 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743423 2025-07-10 18:32:21,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743424_2600 src: /192.168.158.1:33940 dest: /192.168.158.4:9866 2025-07-10 18:32:21,168 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33940, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-32681887_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743424_2600, duration(ns): 25796372 2025-07-10 18:32:21,169 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743424_2600, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-10 18:32:26,351 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743424_2600 replica FinalizedReplica, blk_1073743424_2600, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743424 for deletion 2025-07-10 18:32:26,352 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743424_2600 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743424 2025-07-10 18:34:21,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743426_2602 src: /192.168.158.6:47192 dest: /192.168.158.4:9866 2025-07-10 18:34:21,143 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47192, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1370869343_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743426_2602, duration(ns): 19600620 2025-07-10 18:34:21,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743426_2602, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-10 18:34:23,359 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743426_2602 replica FinalizedReplica, blk_1073743426_2602, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743426 for deletion 2025-07-10 18:34:23,360 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743426_2602 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743426 2025-07-10 18:37:21,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743429_2605 src: /192.168.158.1:51656 dest: /192.168.158.4:9866 2025-07-10 18:37:21,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51656, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_406117746_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743429_2605, duration(ns): 20743471 2025-07-10 18:37:21,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743429_2605, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-10 18:37:23,364 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743429_2605 replica FinalizedReplica, blk_1073743429_2605, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743429 for deletion 2025-07-10 18:37:23,366 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743429_2605 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743429 2025-07-10 18:38:21,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743430_2606 src: /192.168.158.7:49484 dest: /192.168.158.4:9866 2025-07-10 18:38:21,157 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49484, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_880616157_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743430_2606, duration(ns): 20452198 2025-07-10 18:38:21,157 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743430_2606, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-10 18:38:23,365 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743430_2606 replica FinalizedReplica, blk_1073743430_2606, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743430 for deletion 2025-07-10 18:38:23,366 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743430_2606 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743430 2025-07-10 18:39:21,135 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743431_2607 src: /192.168.158.9:53392 dest: /192.168.158.4:9866 2025-07-10 18:39:21,153 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53392, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1448761345_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743431_2607, duration(ns): 15934091 2025-07-10 18:39:21,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743431_2607, type=LAST_IN_PIPELINE terminating 2025-07-10 18:39:23,367 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743431_2607 replica FinalizedReplica, blk_1073743431_2607, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743431 for deletion 2025-07-10 18:39:23,368 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743431_2607 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743431 2025-07-10 18:43:21,141 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743435_2611 src: /192.168.158.6:45776 dest: /192.168.158.4:9866 2025-07-10 18:43:21,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45776, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_505675647_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743435_2611, duration(ns): 20099637 2025-07-10 18:43:21,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743435_2611, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-10 18:43:23,376 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743435_2611 replica FinalizedReplica, blk_1073743435_2611, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743435 for deletion 2025-07-10 18:43:23,377 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743435_2611 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743435 2025-07-10 
18:44:21,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743436_2612 src: /192.168.158.6:47992 dest: /192.168.158.4:9866 2025-07-10 18:44:21,168 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47992, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1083311612_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743436_2612, duration(ns): 18288295 2025-07-10 18:44:21,168 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743436_2612, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-10 18:44:26,378 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743436_2612 replica FinalizedReplica, blk_1073743436_2612, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743436 for deletion 2025-07-10 18:44:26,380 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743436_2612 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743436 2025-07-10 18:45:21,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743437_2613 src: /192.168.158.6:46794 dest: /192.168.158.4:9866 2025-07-10 18:45:21,172 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46794, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-941247707_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743437_2613, duration(ns): 18933342 2025-07-10 18:45:21,172 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743437_2613, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-10 18:45:23,380 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743437_2613 replica FinalizedReplica, blk_1073743437_2613, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743437 for deletion 2025-07-10 18:45:23,381 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743437_2613 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743437 2025-07-10 18:49:21,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743441_2617 src: /192.168.158.6:51456 dest: /192.168.158.4:9866 2025-07-10 18:49:21,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51456, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_484425808_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743441_2617, duration(ns): 18865967 2025-07-10 18:49:21,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743441_2617, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-10 18:49:23,386 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743441_2617 replica FinalizedReplica, blk_1073743441_2617, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743441 for deletion 2025-07-10 18:49:23,388 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743441_2617 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743441 2025-07-10 18:50:26,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743442_2618 src: /192.168.158.8:46498 dest: /192.168.158.4:9866 2025-07-10 18:50:26,179 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46498, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_997290484_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743442_2618, duration(ns): 18269109 2025-07-10 18:50:26,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743442_2618, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-10 18:50:29,389 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743442_2618 replica FinalizedReplica, blk_1073743442_2618, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743442 for deletion 2025-07-10 18:50:29,390 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743442_2618 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743442 2025-07-10 18:51:26,163 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743443_2619 src: /192.168.158.5:51346 dest: /192.168.158.4:9866 2025-07-10 18:51:26,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51346, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1408447955_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743443_2619, duration(ns): 13426320 2025-07-10 18:51:26,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743443_2619, type=LAST_IN_PIPELINE terminating 2025-07-10 18:51:29,393 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743443_2619 replica FinalizedReplica, blk_1073743443_2619, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743443 for deletion 2025-07-10 18:51:29,394 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743443_2619 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743443 2025-07-10 18:53:26,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743445_2621 src: /192.168.158.9:58688 dest: /192.168.158.4:9866 2025-07-10 
18:53:26,195 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58688, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1925722428_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743445_2621, duration(ns): 18491837 2025-07-10 18:53:26,195 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743445_2621, type=LAST_IN_PIPELINE terminating 2025-07-10 18:53:32,397 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743445_2621 replica FinalizedReplica, blk_1073743445_2621, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743445 for deletion 2025-07-10 18:53:32,398 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743445_2621 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743445 2025-07-10 18:54:26,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743446_2622 src: /192.168.158.8:47194 dest: /192.168.158.4:9866 2025-07-10 18:54:26,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47194, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1988261618_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743446_2622, duration(ns): 15323271 2025-07-10 18:54:26,199 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073743446_2622, type=LAST_IN_PIPELINE terminating 2025-07-10 18:54:32,400 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743446_2622 replica FinalizedReplica, blk_1073743446_2622, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743446 for deletion 2025-07-10 18:54:32,401 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743446_2622 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743446 2025-07-10 18:56:31,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743448_2624 src: /192.168.158.6:55210 dest: /192.168.158.4:9866 2025-07-10 18:56:31,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55210, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1944583968_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743448_2624, duration(ns): 15112680 2025-07-10 18:56:31,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743448_2624, type=LAST_IN_PIPELINE terminating 2025-07-10 18:56:35,404 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743448_2624 replica FinalizedReplica, blk_1073743448_2624, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743448 for deletion 2025-07-10 18:56:35,405 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743448_2624 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743448 2025-07-10 18:57:31,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743449_2625 src: /192.168.158.7:44708 dest: /192.168.158.4:9866 2025-07-10 18:57:31,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44708, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1548771987_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743449_2625, duration(ns): 14127193 2025-07-10 18:57:31,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743449_2625, type=LAST_IN_PIPELINE terminating 2025-07-10 18:57:35,409 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743449_2625 replica FinalizedReplica, blk_1073743449_2625, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743449 for deletion 2025-07-10 18:57:35,410 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743449_2625 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743449 2025-07-10 19:01:31,189 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743453_2629 src: /192.168.158.8:59262 dest: /192.168.158.4:9866
2025-07-10 19:01:31,215 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59262, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-761256183_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743453_2629, duration(ns): 20519078
2025-07-10 19:01:31,216 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743453_2629, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 19:01:35,420 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743453_2629 replica FinalizedReplica, blk_1073743453_2629, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743453 for deletion
2025-07-10 19:01:35,421 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743453_2629 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743453
2025-07-10 19:06:36,202 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743458_2634 src: /192.168.158.7:54376 dest: /192.168.158.4:9866
2025-07-10 19:06:36,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54376, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1886384196_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743458_2634, duration(ns): 19295031
2025-07-10 19:06:36,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743458_2634, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 19:06:38,435 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743458_2634 replica FinalizedReplica, blk_1073743458_2634, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743458 for deletion
2025-07-10 19:06:38,436 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743458_2634 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743458
2025-07-10 19:10:41,204 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743462_2638 src: /192.168.158.5:41278 dest: /192.168.158.4:9866
2025-07-10 19:10:41,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41278, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1013028119_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743462_2638, duration(ns): 16322949
2025-07-10 19:10:41,228 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743462_2638, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 19:10:47,446 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743462_2638 replica FinalizedReplica, blk_1073743462_2638, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743462 for deletion
2025-07-10 19:10:47,447 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743462_2638 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743462
2025-07-10 19:11:46,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743463_2639 src: /192.168.158.6:55736 dest: /192.168.158.4:9866
2025-07-10 19:11:46,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55736, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1302352697_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743463_2639, duration(ns): 45208178
2025-07-10 19:11:46,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743463_2639, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 19:11:50,448 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743463_2639 replica FinalizedReplica, blk_1073743463_2639, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743463 for deletion
2025-07-10 19:11:50,449 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743463_2639 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743463
2025-07-10 19:15:51,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743467_2643 src: /192.168.158.8:47160 dest: /192.168.158.4:9866
2025-07-10 19:15:51,224 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47160, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1596093474_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743467_2643, duration(ns): 13998208
2025-07-10 19:15:51,225 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743467_2643, type=LAST_IN_PIPELINE terminating
2025-07-10 19:15:53,456 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743467_2643 replica FinalizedReplica, blk_1073743467_2643, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743467 for deletion
2025-07-10 19:15:53,457 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743467_2643 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743467
2025-07-10 19:19:51,212 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743471_2647 src: /192.168.158.6:60734 dest: /192.168.158.4:9866
2025-07-10 19:19:51,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60734, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1734734177_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743471_2647, duration(ns): 15713032
2025-07-10 19:19:51,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743471_2647, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 19:19:56,464 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743471_2647 replica FinalizedReplica, blk_1073743471_2647, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743471 for deletion
2025-07-10 19:19:56,465 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743471_2647 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743471
2025-07-10 19:21:56,217 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743473_2649 src: /192.168.158.6:52646 dest: /192.168.158.4:9866
2025-07-10 19:21:56,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52646, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1185474599_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743473_2649, duration(ns): 14512903
2025-07-10 19:21:56,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743473_2649, type=LAST_IN_PIPELINE terminating
2025-07-10 19:22:02,470 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743473_2649 replica FinalizedReplica, blk_1073743473_2649, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743473 for deletion
2025-07-10 19:22:02,472 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743473_2649 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743473
2025-07-10 19:22:56,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743474_2650 src: /192.168.158.1:54588 dest: /192.168.158.4:9866
2025-07-10 19:22:56,249 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54588, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_31211966_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743474_2650, duration(ns): 20170808
2025-07-10 19:22:56,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743474_2650, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-10 19:23:02,476 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743474_2650 replica FinalizedReplica, blk_1073743474_2650, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743474 for deletion
2025-07-10 19:23:02,477 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743474_2650 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743474
2025-07-10 19:24:56,225 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743476_2652 src: /192.168.158.1:40376 dest: /192.168.158.4:9866
2025-07-10 19:24:56,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40376, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1112984729_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743476_2652, duration(ns): 20291268
2025-07-10 19:24:56,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743476_2652, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-10 19:24:59,485 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743476_2652 replica FinalizedReplica, blk_1073743476_2652, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743476 for deletion
2025-07-10 19:24:59,486 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743476_2652 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743476
2025-07-10 19:28:06,221 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743479_2655 src: /192.168.158.1:40926 dest: /192.168.158.4:9866
2025-07-10 19:28:06,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40926, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1977522074_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743479_2655, duration(ns): 20114315
2025-07-10 19:28:06,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743479_2655, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-10 19:28:11,492 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743479_2655 replica FinalizedReplica, blk_1073743479_2655, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743479 for deletion
2025-07-10 19:28:11,494 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743479_2655 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743479
2025-07-10 19:29:06,224 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743480_2656 src: /192.168.158.1:33532 dest: /192.168.158.4:9866
2025-07-10 19:29:06,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33532, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1690611959_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743480_2656, duration(ns): 20281349
2025-07-10 19:29:06,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743480_2656, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-10 19:29:08,492 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743480_2656 replica FinalizedReplica, blk_1073743480_2656, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743480 for deletion
2025-07-10 19:29:08,493 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743480_2656 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743480
2025-07-10 19:30:06,265 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743481_2657 src: /192.168.158.1:58168 dest: /192.168.158.4:9866
2025-07-10 19:30:06,297 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58168, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2125907732_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743481_2657, duration(ns): 22026942
2025-07-10 19:30:06,297 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743481_2657, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-10 19:30:11,492 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743481_2657 replica FinalizedReplica, blk_1073743481_2657, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743481 for deletion
2025-07-10 19:30:11,494 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743481_2657 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743481
2025-07-10 19:32:11,235 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743483_2659 src: /192.168.158.5:55122 dest: /192.168.158.4:9866
2025-07-10 19:32:11,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55122, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-213953320_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743483_2659, duration(ns): 17990933
2025-07-10 19:32:11,259 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743483_2659, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 19:32:14,497 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743483_2659 replica FinalizedReplica, blk_1073743483_2659, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743483 for deletion
2025-07-10 19:32:14,498 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743483_2659 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743483
2025-07-10 19:34:11,238 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743485_2661 src: /192.168.158.7:35612 dest: /192.168.158.4:9866
2025-07-10 19:34:11,255 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35612, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-142498616_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743485_2661, duration(ns): 15762905
2025-07-10 19:34:11,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743485_2661, type=LAST_IN_PIPELINE terminating
2025-07-10 19:34:14,504 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743485_2661 replica FinalizedReplica, blk_1073743485_2661, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743485 for deletion
2025-07-10 19:34:14,505 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743485_2661 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743485
2025-07-10 19:35:11,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743486_2662 src: /192.168.158.1:52984 dest: /192.168.158.4:9866
2025-07-10 19:35:11,283 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52984, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_520694308_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743486_2662, duration(ns): 23654692
2025-07-10 19:35:11,284 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743486_2662, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-10 19:35:14,505 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743486_2662 replica FinalizedReplica, blk_1073743486_2662, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743486 for deletion
2025-07-10 19:35:14,507 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743486_2662 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743486
2025-07-10 19:36:16,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743487_2663 src: /192.168.158.6:46742 dest: /192.168.158.4:9866
2025-07-10 19:36:16,274 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46742, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_621934847_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743487_2663, duration(ns): 18069699
2025-07-10 19:36:16,276 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743487_2663, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 19:36:20,508 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743487_2663 replica FinalizedReplica, blk_1073743487_2663, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743487 for deletion
2025-07-10 19:36:20,509 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743487_2663 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743487
2025-07-10 19:38:16,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743489_2665 src: /192.168.158.9:35586 dest: /192.168.158.4:9866
2025-07-10 19:38:16,297 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35586, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-72803776_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743489_2665, duration(ns): 19692949
2025-07-10 19:38:16,297 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743489_2665, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 19:38:20,514 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743489_2665 replica FinalizedReplica, blk_1073743489_2665, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743489 for deletion
2025-07-10 19:38:20,515 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743489_2665 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743489
2025-07-10 19:39:16,261 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743490_2666 src: /192.168.158.8:42562 dest: /192.168.158.4:9866
2025-07-10 19:39:16,280 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42562, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1057031430_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743490_2666, duration(ns): 17839929
2025-07-10 19:39:16,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743490_2666, type=LAST_IN_PIPELINE terminating
2025-07-10 19:39:20,514 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743490_2666 replica FinalizedReplica, blk_1073743490_2666, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743490 for deletion
2025-07-10 19:39:20,516 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743490_2666 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743490
2025-07-10 19:41:26,272 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743492_2668 src: /192.168.158.9:55154 dest: /192.168.158.4:9866
2025-07-10 19:41:26,295 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55154, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_171806765_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743492_2668, duration(ns): 17657915
2025-07-10 19:41:26,295 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743492_2668, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 19:41:32,523 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743492_2668 replica FinalizedReplica, blk_1073743492_2668, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743492 for deletion
2025-07-10 19:41:32,524 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743492_2668 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743492
2025-07-10 19:42:26,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743493_2669 src: /192.168.158.1:48580 dest: /192.168.158.4:9866
2025-07-10 19:42:26,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48580, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1037748332_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743493_2669, duration(ns): 20607669
2025-07-10 19:42:26,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743493_2669, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-10 19:42:29,525 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743493_2669 replica FinalizedReplica, blk_1073743493_2669, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743493 for deletion
2025-07-10 19:42:29,526 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743493_2669 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743493
2025-07-10 19:43:26,248 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743494_2670 src: /192.168.158.8:49074 dest: /192.168.158.4:9866
2025-07-10 19:43:26,265 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49074, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1106966980_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743494_2670, duration(ns): 14586865
2025-07-10 19:43:26,265 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743494_2670, type=LAST_IN_PIPELINE terminating
2025-07-10 19:43:29,527 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743494_2670 replica FinalizedReplica, blk_1073743494_2670, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743494 for deletion
2025-07-10 19:43:29,529 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743494_2670 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743494
2025-07-10 19:46:31,278 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743497_2673 src: /192.168.158.9:53148 dest: /192.168.158.4:9866
2025-07-10 19:46:31,295 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53148, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-176529925_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743497_2673, duration(ns): 15425833
2025-07-10 19:46:31,296 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743497_2673, type=LAST_IN_PIPELINE terminating
2025-07-10 19:46:38,530 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743497_2673 replica FinalizedReplica, blk_1073743497_2673, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743497 for deletion
2025-07-10 19:46:38,531 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743497_2673 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743497
2025-07-10 19:49:36,274 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743500_2676 src: /192.168.158.7:57710 dest: /192.168.158.4:9866
2025-07-10 19:49:36,290 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57710, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_514533383_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743500_2676, duration(ns): 13960592
2025-07-10 19:49:36,290 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743500_2676, type=LAST_IN_PIPELINE terminating
2025-07-10 19:49:38,534 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743500_2676 replica FinalizedReplica, blk_1073743500_2676, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743500 for deletion
2025-07-10 19:49:38,535 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743500_2676 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743500
2025-07-10 19:50:36,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743501_2677 src: /192.168.158.9:44050 dest: /192.168.158.4:9866
2025-07-10 19:50:36,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44050, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1917970089_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743501_2677, duration(ns): 17847207
2025-07-10 19:50:36,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743501_2677, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 19:50:38,535 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743501_2677 replica FinalizedReplica, blk_1073743501_2677, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743501 for deletion
2025-07-10 19:50:38,537 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743501_2677 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743501
2025-07-10 19:54:36,284 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743505_2681 src: /192.168.158.1:40578 dest: /192.168.158.4:9866
2025-07-10 19:54:36,315 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40578, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1601771338_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a,
blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743505_2681, duration(ns): 21577308 2025-07-10 19:54:36,315 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743505_2681, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-10 19:54:38,546 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743505_2681 replica FinalizedReplica, blk_1073743505_2681, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743505 for deletion 2025-07-10 19:54:38,547 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743505_2681 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743505 2025-07-10 19:58:41,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743509_2685 src: /192.168.158.8:59086 dest: /192.168.158.4:9866 2025-07-10 19:58:41,317 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59086, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1590465461_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743509_2685, duration(ns): 17783814 2025-07-10 19:58:41,317 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743509_2685, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-10 19:58:47,554 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743509_2685 replica FinalizedReplica, blk_1073743509_2685, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743509 for deletion 2025-07-10 19:58:47,555 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743509_2685 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743509 2025-07-10 20:00:46,299 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743511_2687 src: /192.168.158.1:36994 dest: /192.168.158.4:9866 2025-07-10 20:00:46,332 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36994, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-985916903_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743511_2687, duration(ns): 23394302 2025-07-10 20:00:46,332 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743511_2687, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-10 20:00:50,560 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743511_2687 replica FinalizedReplica, blk_1073743511_2687, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743511 for deletion 2025-07-10 
20:00:50,561 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743511_2687 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743511 2025-07-10 20:01:46,330 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743512_2688 src: /192.168.158.1:33940 dest: /192.168.158.4:9866 2025-07-10 20:01:46,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33940, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-468454192_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743512_2688, duration(ns): 23615861 2025-07-10 20:01:46,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743512_2688, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-10 20:01:53,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743512_2688 replica FinalizedReplica, blk_1073743512_2688, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743512 for deletion 2025-07-10 20:01:53,564 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743512_2688 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743512 2025-07-10 20:02:46,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073743513_2689 src: /192.168.158.1:49698 dest: /192.168.158.4:9866 2025-07-10 20:02:46,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49698, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_897217728_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743513_2689, duration(ns): 21397964 2025-07-10 20:02:46,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743513_2689, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-10 20:02:50,566 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743513_2689 replica FinalizedReplica, blk_1073743513_2689, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743513 for deletion 2025-07-10 20:02:50,567 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743513_2689 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743513 2025-07-10 20:05:56,301 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743516_2692 src: /192.168.158.1:33966 dest: /192.168.158.4:9866 2025-07-10 20:05:56,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33966, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1783720913_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073743516_2692, duration(ns): 19289133 2025-07-10 20:05:56,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743516_2692, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-10 20:05:59,569 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743516_2692 replica FinalizedReplica, blk_1073743516_2692, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743516 for deletion 2025-07-10 20:05:59,570 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743516_2692 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743516 2025-07-10 20:10:01,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743520_2696 src: /192.168.158.8:43022 dest: /192.168.158.4:9866 2025-07-10 20:10:01,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43022, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1981016599_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743520_2696, duration(ns): 17791124 2025-07-10 20:10:01,335 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743520_2696, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-10 20:10:08,580 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743520_2696 replica FinalizedReplica, blk_1073743520_2696, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743520 for deletion 2025-07-10 20:10:08,581 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743520_2696 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743520 2025-07-10 20:11:01,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743521_2697 src: /192.168.158.1:36228 dest: /192.168.158.4:9866 2025-07-10 20:11:01,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36228, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-906503506_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743521_2697, duration(ns): 21701495 2025-07-10 20:11:01,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743521_2697, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-10 20:11:05,585 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743521_2697 replica FinalizedReplica, blk_1073743521_2697, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743521 for deletion 2025-07-10 
20:11:05,586 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743521_2697 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743521 2025-07-10 20:14:01,348 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743524_2700 src: /192.168.158.6:37694 dest: /192.168.158.4:9866 2025-07-10 20:14:01,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37694, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1559087427_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743524_2700, duration(ns): 17211209 2025-07-10 20:14:01,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743524_2700, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-10 20:14:05,590 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743524_2700 replica FinalizedReplica, blk_1073743524_2700, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743524 for deletion 2025-07-10 20:14:05,591 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743524_2700 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743524 2025-07-10 20:15:01,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743525_2701 
src: /192.168.158.8:46040 dest: /192.168.158.4:9866 2025-07-10 20:15:01,371 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46040, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_743300147_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743525_2701, duration(ns): 17254436 2025-07-10 20:15:01,371 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743525_2701, type=LAST_IN_PIPELINE terminating 2025-07-10 20:15:05,593 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743525_2701 replica FinalizedReplica, blk_1073743525_2701, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743525 for deletion 2025-07-10 20:15:05,594 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743525_2701 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743525 2025-07-10 20:16:01,330 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743526_2702 src: /192.168.158.5:58666 dest: /192.168.158.4:9866 2025-07-10 20:16:01,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58666, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1597179260_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743526_2702, duration(ns): 19582961 2025-07-10 20:16:01,355 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743526_2702, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-10 20:16:05,594 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743526_2702 replica FinalizedReplica, blk_1073743526_2702, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743526 for deletion 2025-07-10 20:16:05,596 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743526_2702 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743526 2025-07-10 20:17:01,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743527_2703 src: /192.168.158.1:52666 dest: /192.168.158.4:9866 2025-07-10 20:17:01,362 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52666, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_124402771_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743527_2703, duration(ns): 25482797 2025-07-10 20:17:01,362 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743527_2703, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-10 20:17:05,595 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743527_2703 replica FinalizedReplica, blk_1073743527_2703, FINALIZED getNumBytes() = 56 
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743527 for deletion 2025-07-10 20:17:05,596 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743527_2703 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743527 2025-07-10 20:18:01,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743528_2704 src: /192.168.158.9:45728 dest: /192.168.158.4:9866 2025-07-10 20:18:01,350 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45728, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1853406913_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743528_2704, duration(ns): 13865681 2025-07-10 20:18:01,350 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743528_2704, type=LAST_IN_PIPELINE terminating 2025-07-10 20:18:05,596 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743528_2704 replica FinalizedReplica, blk_1073743528_2704, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743528 for deletion 2025-07-10 20:18:05,598 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743528_2704 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743528 2025-07-10 20:20:01,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743530_2706 src: /192.168.158.9:41834 dest: /192.168.158.4:9866 2025-07-10 20:20:01,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41834, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1596315953_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743530_2706, duration(ns): 19891285 2025-07-10 20:20:01,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743530_2706, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-10 20:20:08,601 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743530_2706 replica FinalizedReplica, blk_1073743530_2706, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743530 for deletion 2025-07-10 20:20:08,602 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743530_2706 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743530 2025-07-10 20:28:11,341 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743538_2714 src: /192.168.158.1:37740 dest: /192.168.158.4:9866 2025-07-10 20:28:11,371 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37740, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1589096202_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743538_2714, duration(ns): 22053624 2025-07-10 20:28:11,374 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743538_2714, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-10 20:28:14,622 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743538_2714 replica FinalizedReplica, blk_1073743538_2714, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743538 for deletion 2025-07-10 20:28:14,624 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743538_2714 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743538 2025-07-10 20:30:11,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743540_2716 src: /192.168.158.1:50476 dest: /192.168.158.4:9866 2025-07-10 20:30:11,422 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50476, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-542259371_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743540_2716, duration(ns): 20789613 2025-07-10 20:30:11,424 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743540_2716, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-10 20:30:14,624 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743540_2716 replica FinalizedReplica, blk_1073743540_2716, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743540 for deletion 2025-07-10 20:30:14,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743540_2716 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743540 2025-07-10 20:32:11,348 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743542_2718 src: /192.168.158.1:35082 dest: /192.168.158.4:9866 2025-07-10 20:32:11,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35082, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1203934118_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743542_2718, duration(ns): 21908701 2025-07-10 20:32:11,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743542_2718, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-10 20:32:14,626 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743542_2718 replica FinalizedReplica, blk_1073743542_2718, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743542 for deletion 2025-07-10 20:32:14,627 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743542_2718 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743542 2025-07-10 20:34:11,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743544_2720 src: /192.168.158.9:36822 dest: /192.168.158.4:9866 2025-07-10 20:34:11,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36822, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-894835485_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743544_2720, duration(ns): 16813896 2025-07-10 20:34:11,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743544_2720, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-10 20:34:17,629 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743544_2720 replica FinalizedReplica, blk_1073743544_2720, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743544 for deletion 2025-07-10 20:34:17,630 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743544_2720 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743544 
2025-07-10 20:35:11,371 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743545_2721 src: /192.168.158.8:46616 dest: /192.168.158.4:9866
2025-07-10 20:35:11,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46616, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1442366763_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743545_2721, duration(ns): 14982402
2025-07-10 20:35:11,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743545_2721, type=LAST_IN_PIPELINE terminating
2025-07-10 20:35:14,632 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743545_2721 replica FinalizedReplica, blk_1073743545_2721, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743545 for deletion
2025-07-10 20:35:14,633 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743545_2721 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743545
2025-07-10 20:37:11,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743547_2723 src: /192.168.158.7:42568 dest: /192.168.158.4:9866
2025-07-10 20:37:11,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42568, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1312607657_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743547_2723, duration(ns): 13805026
2025-07-10 20:37:11,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743547_2723, type=LAST_IN_PIPELINE terminating
2025-07-10 20:37:17,640 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743547_2723 replica FinalizedReplica, blk_1073743547_2723, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743547 for deletion
2025-07-10 20:37:17,641 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743547_2723 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743547
2025-07-10 20:38:11,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743548_2724 src: /192.168.158.8:44428 dest: /192.168.158.4:9866
2025-07-10 20:38:11,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44428, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_311314088_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743548_2724, duration(ns): 17707184
2025-07-10 20:38:11,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743548_2724, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 20:38:14,641 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743548_2724 replica FinalizedReplica, blk_1073743548_2724, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743548 for deletion
2025-07-10 20:38:14,642 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743548_2724 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743548
2025-07-10 20:39:11,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743549_2725 src: /192.168.158.5:59656 dest: /192.168.158.4:9866
2025-07-10 20:39:11,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59656, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-156914706_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743549_2725, duration(ns): 17477690
2025-07-10 20:39:11,407 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743549_2725, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 20:39:14,642 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743549_2725 replica FinalizedReplica, blk_1073743549_2725, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743549 for deletion
2025-07-10 20:39:14,644 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743549_2725 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743549
2025-07-10 20:50:31,423 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743560_2736 src: /192.168.158.5:57176 dest: /192.168.158.4:9866
2025-07-10 20:50:31,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57176, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1594927203_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743560_2736, duration(ns): 17184191
2025-07-10 20:50:31,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743560_2736, type=LAST_IN_PIPELINE terminating
2025-07-10 20:50:38,656 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743560_2736 replica FinalizedReplica, blk_1073743560_2736, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743560 for deletion
2025-07-10 20:50:38,657 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743560_2736 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743560
2025-07-10 20:55:51,423 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743565_2741 src: /192.168.158.8:54904 dest: /192.168.158.4:9866
2025-07-10 20:55:51,437 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54904, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1132653667_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743565_2741, duration(ns): 12232813
2025-07-10 20:55:51,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743565_2741, type=LAST_IN_PIPELINE terminating
2025-07-10 20:55:53,667 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743565_2741 replica FinalizedReplica, blk_1073743565_2741, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743565 for deletion
2025-07-10 20:55:53,669 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743565_2741 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743565
2025-07-10 20:56:51,428 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743566_2742 src: /192.168.158.1:43686 dest: /192.168.158.4:9866
2025-07-10 20:56:51,463 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43686, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-233097791_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743566_2742, duration(ns): 24123665
2025-07-10 20:56:51,463 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743566_2742, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-10 20:56:53,671 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743566_2742 replica FinalizedReplica, blk_1073743566_2742, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743566 for deletion
2025-07-10 20:56:53,673 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743566_2742 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743566
2025-07-10 20:58:56,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743568_2744 src: /192.168.158.1:38238 dest: /192.168.158.4:9866
2025-07-10 20:58:56,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38238, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1787789319_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743568_2744, duration(ns): 21193838
2025-07-10 20:58:56,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743568_2744, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-10 20:59:02,677 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743568_2744 replica FinalizedReplica, blk_1073743568_2744, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743568 for deletion
2025-07-10 20:59:02,678 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743568_2744 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743568
2025-07-10 20:59:56,446 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743569_2745 src: /192.168.158.5:56472 dest: /192.168.158.4:9866
2025-07-10 20:59:56,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-545982590_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743569_2745, duration(ns): 16338706
2025-07-10 20:59:56,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743569_2745, type=LAST_IN_PIPELINE terminating
2025-07-10 20:59:59,681 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743569_2745 replica FinalizedReplica, blk_1073743569_2745, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743569 for deletion
2025-07-10 20:59:59,682 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743569_2745 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743569
2025-07-10 21:01:56,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743571_2747 src: /192.168.158.7:39902 dest: /192.168.158.4:9866
2025-07-10 21:01:56,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39902, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_973525795_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743571_2747, duration(ns): 18772060
2025-07-10 21:01:56,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743571_2747, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-10 21:02:02,686 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743571_2747 replica FinalizedReplica, blk_1073743571_2747, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743571 for deletion
2025-07-10 21:02:02,687 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743571_2747 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743571
2025-07-10 21:03:56,428 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743573_2749 src: /192.168.158.6:56146 dest: /192.168.158.4:9866
2025-07-10 21:03:56,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56146, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1819989676_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743573_2749, duration(ns): 17744017
2025-07-10 21:03:56,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743573_2749, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 21:04:02,687 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743573_2749 replica FinalizedReplica, blk_1073743573_2749, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743573 for deletion
2025-07-10 21:04:02,689 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743573_2749 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743573
2025-07-10 21:10:06,463 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743579_2755 src: /192.168.158.1:47884 dest: /192.168.158.4:9866
2025-07-10 21:10:06,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47884, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1861237633_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743579_2755, duration(ns): 21668883
2025-07-10 21:10:06,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743579_2755, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-10 21:10:11,697 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743579_2755 replica FinalizedReplica, blk_1073743579_2755, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743579 for deletion
2025-07-10 21:10:11,698 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743579_2755 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743579
2025-07-10 21:11:11,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743580_2756 src: /192.168.158.5:54174 dest: /192.168.158.4:9866
2025-07-10 21:11:11,466 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54174, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-665062861_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743580_2756, duration(ns): 15013641
2025-07-10 21:11:11,466 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743580_2756, type=LAST_IN_PIPELINE terminating
2025-07-10 21:11:14,699 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743580_2756 replica FinalizedReplica, blk_1073743580_2756, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743580 for deletion
2025-07-10 21:11:14,701 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743580_2756 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743580
2025-07-10 21:14:21,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743583_2759 src: /192.168.158.7:54944 dest: /192.168.158.4:9866
2025-07-10 21:14:21,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54944, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1735861179_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743583_2759, duration(ns): 13365659
2025-07-10 21:14:21,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743583_2759, type=LAST_IN_PIPELINE terminating
2025-07-10 21:14:23,706 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743583_2759 replica FinalizedReplica, blk_1073743583_2759, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743583 for deletion
2025-07-10 21:14:23,707 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743583_2759 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743583
2025-07-10 21:15:26,459 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743584_2760 src: /192.168.158.7:37176 dest: /192.168.158.4:9866
2025-07-10 21:15:26,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37176, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1012003636_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743584_2760, duration(ns): 14997258
2025-07-10 21:15:26,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743584_2760, type=LAST_IN_PIPELINE terminating
2025-07-10 21:15:32,707 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743584_2760 replica FinalizedReplica, blk_1073743584_2760, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743584 for deletion
2025-07-10 21:15:32,709 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743584_2760 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743584
2025-07-10 21:17:31,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743586_2762 src: /192.168.158.7:57560 dest: /192.168.158.4:9866
2025-07-10 21:17:31,479 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57560, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1219065532_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743586_2762, duration(ns): 14861085
2025-07-10 21:17:31,479 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743586_2762, type=LAST_IN_PIPELINE terminating
2025-07-10 21:17:38,714 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743586_2762 replica FinalizedReplica, blk_1073743586_2762, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743586 for deletion
2025-07-10 21:17:38,715 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743586_2762 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743586
2025-07-10 21:19:36,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743588_2764 src: /192.168.158.1:35862 dest: /192.168.158.4:9866
2025-07-10 21:19:36,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35862, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1730030067_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743588_2764, duration(ns): 23879780
2025-07-10 21:19:36,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743588_2764, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-10 21:19:41,717 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743588_2764 replica FinalizedReplica, blk_1073743588_2764, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743588 for deletion
2025-07-10 21:19:41,718 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743588_2764 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743588
2025-07-10 21:20:36,478 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743589_2765 src: /192.168.158.5:33814 dest: /192.168.158.4:9866
2025-07-10 21:20:36,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33814, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-909088556_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743589_2765, duration(ns): 18584255
2025-07-10 21:20:36,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743589_2765, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 21:20:41,719 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743589_2765 replica FinalizedReplica, blk_1073743589_2765, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743589 for deletion
2025-07-10 21:20:41,721 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743589_2765 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743589
2025-07-10 21:22:36,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743591_2767 src: /192.168.158.1:51506 dest: /192.168.158.4:9866
2025-07-10 21:22:36,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51506, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1294205251_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743591_2767, duration(ns): 21978388
2025-07-10 21:22:36,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743591_2767, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-10 21:22:38,722 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743591_2767 replica FinalizedReplica, blk_1073743591_2767, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743591 for deletion
2025-07-10 21:22:38,723 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743591_2767 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743591
2025-07-10 21:23:36,481 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743592_2768 src: /192.168.158.5:44184 dest: /192.168.158.4:9866
2025-07-10 21:23:36,499 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44184, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_362474258_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743592_2768, duration(ns): 16386504
2025-07-10 21:23:36,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743592_2768, type=LAST_IN_PIPELINE terminating
2025-07-10 21:23:38,726 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743592_2768 replica FinalizedReplica, blk_1073743592_2768, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743592 for deletion
2025-07-10 21:23:38,727 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743592_2768 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743592
2025-07-10 21:24:36,480 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743593_2769 src: /192.168.158.8:50828 dest: /192.168.158.4:9866
2025-07-10 21:24:36,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50828, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_619615495_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743593_2769, duration(ns): 18350678
2025-07-10 21:24:36,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743593_2769, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 21:24:38,727 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743593_2769 replica FinalizedReplica, blk_1073743593_2769, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743593 for deletion
2025-07-10 21:24:38,729 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743593_2769 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743593
2025-07-10 21:26:36,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743595_2771 src: /192.168.158.8:51598 dest: /192.168.158.4:9866
2025-07-10 21:26:36,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51598, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-512777388_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743595_2771, duration(ns): 14379523
2025-07-10 21:26:36,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743595_2771, type=LAST_IN_PIPELINE terminating
2025-07-10 21:26:38,732 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743595_2771 replica FinalizedReplica, blk_1073743595_2771, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743595 for deletion
2025-07-10 21:26:38,733 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743595_2771 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743595
2025-07-10 21:27:36,479 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743596_2772 src: /192.168.158.6:53826 dest: /192.168.158.4:9866
2025-07-10 21:27:36,501 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53826, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_573879906_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743596_2772, duration(ns): 16558802
2025-07-10 21:27:36,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743596_2772, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 21:27:38,734 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743596_2772 replica FinalizedReplica, blk_1073743596_2772, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743596 for deletion
2025-07-10 21:27:38,736 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743596_2772 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743596
2025-07-10 21:29:36,490 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743598_2774 src: /192.168.158.8:50188 dest: /192.168.158.4:9866
2025-07-10 21:29:36,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50188, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1631004360_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743598_2774, duration(ns): 20520622
2025-07-10 21:29:36,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743598_2774, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 21:29:41,744 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743598_2774 replica FinalizedReplica, blk_1073743598_2774, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743598 for deletion
2025-07-10 21:29:41,746 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743598_2774 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743598
2025-07-10 21:30:41,499 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743599_2775 src: /192.168.158.9:44312 dest: /192.168.158.4:9866
2025-07-10 21:30:41,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44312, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1621297132_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743599_2775, duration(ns): 19753246
2025-07-10 21:30:41,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743599_2775, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 21:30:44,746 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743599_2775 replica FinalizedReplica, blk_1073743599_2775, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743599 for deletion
2025-07-10 21:30:44,747 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743599_2775 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743599
2025-07-10 21:31:41,515 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743600_2776 src: /192.168.158.1:35490 dest: /192.168.158.4:9866
2025-07-10 21:31:41,547 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35490, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-760562389_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743600_2776, duration(ns): 22838397
2025-07-10 21:31:41,547 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743600_2776, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-10 21:31:44,749 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743600_2776 replica FinalizedReplica, blk_1073743600_2776, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743600 for deletion
2025-07-10 21:31:44,750 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743600_2776 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743600
2025-07-10 21:33:41,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743602_2778 src: /192.168.158.7:40224 dest: /192.168.158.4:9866
2025-07-10 21:33:41,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40224, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1940739575_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743602_2778, duration(ns): 19295821
2025-07-10 21:33:41,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743602_2778, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 21:33:47,753 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743602_2778 replica FinalizedReplica, blk_1073743602_2778, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743602 for deletion
2025-07-10 21:33:47,754 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743602_2778 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743602
2025-07-10 21:34:41,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743603_2779 src: /192.168.158.6:35288 dest: /192.168.158.4:9866
2025-07-10 21:34:41,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35288, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1973635627_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743603_2779, duration(ns): 18046329
2025-07-10 21:34:41,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743603_2779, type=LAST_IN_PIPELINE terminating
2025-07-10 21:34:44,753 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743603_2779 replica FinalizedReplica, blk_1073743603_2779, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743603 for deletion
2025-07-10 21:34:44,755 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743603_2779 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743603
2025-07-10 21:37:41,507 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743606_2782 src: /192.168.158.1:54584 dest: /192.168.158.4:9866
2025-07-10 21:37:41,538 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54584, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_696561373_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743606_2782, duration(ns): 21817931
2025-07-10 21:37:41,538 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743606_2782, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-10 21:37:47,761 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743606_2782 replica FinalizedReplica, blk_1073743606_2782, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743606 for deletion
2025-07-10 21:37:47,763 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743606_2782 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743606
2025-07-10 21:40:41,523 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743609_2785 src: /192.168.158.8:53296 dest: /192.168.158.4:9866
2025-07-10 21:40:41,548 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53296, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1619610542_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743609_2785, duration(ns): 19251709
2025-07-10 21:40:41,548 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743609_2785, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 21:40:44,767 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743609_2785 replica FinalizedReplica, blk_1073743609_2785, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743609 for deletion
2025-07-10 21:40:44,768 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743609_2785 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743609
2025-07-10 21:42:41,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743611_2787 src: /192.168.158.8:53590 dest: /192.168.158.4:9866
2025-07-10 21:42:41,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53590, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1751146350_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743611_2787, duration(ns): 14739454
2025-07-10 21:42:41,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743611_2787, type=LAST_IN_PIPELINE terminating
2025-07-10 21:42:44,774 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743611_2787 replica FinalizedReplica, blk_1073743611_2787, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743611 for deletion
2025-07-10 21:42:44,776 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743611_2787 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743611
2025-07-10 21:43:41,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743612_2788 src: /192.168.158.1:43468 dest: /192.168.158.4:9866
2025-07-10 21:43:41,546 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43468, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1493318163_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743612_2788, duration(ns): 22677724
2025-07-10 21:43:41,547 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743612_2788, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-10 21:43:44,775 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743612_2788 replica FinalizedReplica, blk_1073743612_2788, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743612 for deletion
2025-07-10 21:43:44,777 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743612_2788 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743612
2025-07-10 21:44:41,522 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743613_2789 src: /192.168.158.6:34356 dest: /192.168.158.4:9866
2025-07-10 21:44:41,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34356, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-763670461_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743613_2789, duration(ns): 17356960
2025-07-10 21:44:41,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743613_2789, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 21:44:47,777 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743613_2789 replica FinalizedReplica, blk_1073743613_2789, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743613 for deletion
2025-07-10 21:44:47,779 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743613_2789 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073743613
2025-07-10 21:47:46,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743616_2792 src: /192.168.158.9:44534 dest: /192.168.158.4:9866
2025-07-10 21:47:46,576 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44534, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1042166628_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743616_2792, duration(ns): 20296085
2025-07-10 21:47:46,576 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743616_2792, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 21:47:53,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743616_2792 replica FinalizedReplica, blk_1073743616_2792, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743616 for deletion
2025-07-10 21:47:53,784 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743616_2792 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743616
2025-07-10 21:48:51,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743617_2793 src: /192.168.158.5:43804 dest: /192.168.158.4:9866
2025-07-10 21:48:51,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:43804, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1378396803_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743617_2793, duration(ns): 15859830
2025-07-10 21:48:51,551 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743617_2793, type=LAST_IN_PIPELINE terminating
2025-07-10 21:48:56,785 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743617_2793 replica FinalizedReplica, blk_1073743617_2793, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743617 for deletion
2025-07-10 21:48:56,786 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743617_2793 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743617
2025-07-10 21:50:51,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743619_2795 src: /192.168.158.7:55682 dest: /192.168.158.4:9866
2025-07-10 21:50:51,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55682, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-317678926_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743619_2795, duration(ns): 15563916
2025-07-10 21:50:51,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743619_2795, type=LAST_IN_PIPELINE terminating
2025-07-10 21:50:53,789 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743619_2795 replica FinalizedReplica, blk_1073743619_2795, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743619 for deletion
2025-07-10 21:50:53,790 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743619_2795 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743619
2025-07-10 21:53:51,531 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743622_2798 src: /192.168.158.6:45336 dest: /192.168.158.4:9866
2025-07-10 21:53:51,548 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_84224899_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743622_2798, duration(ns): 13964360
2025-07-10 21:53:51,548 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743622_2798, type=LAST_IN_PIPELINE terminating
2025-07-10 21:53:56,791 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743622_2798 replica FinalizedReplica, blk_1073743622_2798, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743622 for deletion
2025-07-10 21:53:56,793 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743622_2798 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743622
2025-07-10 21:54:56,546 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743623_2799 src: /192.168.158.8:56278 dest: /192.168.158.4:9866
2025-07-10 21:54:56,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56278, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-870072785_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743623_2799, duration(ns): 13344486
2025-07-10 21:54:56,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743623_2799, type=LAST_IN_PIPELINE terminating
2025-07-10 21:54:59,795 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743623_2799 replica FinalizedReplica, blk_1073743623_2799, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743623 for deletion
2025-07-10 21:54:59,796 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743623_2799 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743623
2025-07-10 21:55:56,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743624_2800 src: /192.168.158.8:46006 dest: /192.168.158.4:9866
2025-07-10 21:55:56,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46006, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-365657657_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743624_2800, duration(ns): 15612603
2025-07-10 21:55:56,594 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743624_2800, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 21:55:59,796 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743624_2800 replica FinalizedReplica, blk_1073743624_2800, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743624 for deletion
2025-07-10 21:55:59,797 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743624_2800 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743624
2025-07-10 21:56:56,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743625_2801 src: /192.168.158.9:48504 dest: /192.168.158.4:9866
2025-07-10 21:56:56,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48504, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1203094177_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743625_2801, duration(ns): 14109092
2025-07-10 21:56:56,599 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743625_2801, type=LAST_IN_PIPELINE terminating
2025-07-10 21:56:59,797 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743625_2801 replica FinalizedReplica, blk_1073743625_2801, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743625 for deletion
2025-07-10 21:56:59,799 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743625_2801 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743625
2025-07-10 21:57:56,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743626_2802 src: /192.168.158.8:44270 dest: /192.168.158.4:9866
2025-07-10 21:57:56,587 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44270, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-600503222_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743626_2802, duration(ns): 19288391
2025-07-10 21:57:56,587 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743626_2802, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-10 21:58:02,801 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743626_2802 replica FinalizedReplica, blk_1073743626_2802, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743626 for deletion
2025-07-10 21:58:02,802 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743626_2802 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743626
2025-07-10 21:58:56,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743627_2803 src: /192.168.158.1:46712 dest: /192.168.158.4:9866
2025-07-10 21:58:56,594 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-947149025_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743627_2803, duration(ns): 22539830
2025-07-10 21:58:56,594 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743627_2803, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-10 21:58:59,802 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743627_2803 replica FinalizedReplica, blk_1073743627_2803, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743627 for deletion
2025-07-10 21:58:59,804 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743627_2803 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743627
2025-07-10 22:01:01,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743629_2805 src: /192.168.158.6:48464 dest: /192.168.158.4:9866
2025-07-10 22:01:01,594 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48464, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2061723649_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743629_2805, duration(ns): 16786288
2025-07-10 22:01:01,594 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743629_2805, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 22:01:05,806 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743629_2805 replica FinalizedReplica, blk_1073743629_2805, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743629 for deletion
2025-07-10 22:01:05,807 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743629_2805 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743629
2025-07-10 22:03:01,551 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743631_2807 src: /192.168.158.7:51028 dest: /192.168.158.4:9866
2025-07-10 22:03:01,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51028, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1815800215_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743631_2807, duration(ns): 14417091
2025-07-10 22:03:01,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743631_2807, type=LAST_IN_PIPELINE terminating
2025-07-10 22:03:05,809 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743631_2807 replica FinalizedReplica, blk_1073743631_2807, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743631 for deletion
2025-07-10 22:03:05,811 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743631_2807 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743631
2025-07-10 22:04:01,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743632_2808 src: /192.168.158.8:57006 dest: /192.168.158.4:9866
2025-07-10 22:04:01,591 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57006, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1438476565_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743632_2808, duration(ns): 17434932
2025-07-10 22:04:01,591 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743632_2808, type=LAST_IN_PIPELINE terminating
2025-07-10 22:04:05,811 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743632_2808 replica FinalizedReplica, blk_1073743632_2808, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743632 for deletion
2025-07-10 22:04:05,813 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743632_2808 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743632
2025-07-10 22:06:06,558 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743634_2810 src: /192.168.158.5:44754 dest: /192.168.158.4:9866
2025-07-10 22:06:06,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44754, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1070611673_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743634_2810, duration(ns): 12723745
2025-07-10 22:06:06,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743634_2810, type=LAST_IN_PIPELINE terminating
2025-07-10 22:06:11,813 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743634_2810 replica FinalizedReplica, blk_1073743634_2810, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743634 for deletion
2025-07-10 22:06:11,814 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743634_2810 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743634
2025-07-10 22:08:06,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743636_2812 src: /192.168.158.8:49376 dest: /192.168.158.4:9866
2025-07-10 22:08:06,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49376, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2125134636_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743636_2812, duration(ns): 16064042
2025-07-10 22:08:06,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743636_2812, type=LAST_IN_PIPELINE terminating
2025-07-10 22:08:08,815 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743636_2812 replica FinalizedReplica, blk_1073743636_2812, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743636 for deletion
2025-07-10 22:08:08,816 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743636_2812 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743636
2025-07-10 22:09:11,552 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743637_2813 src: /192.168.158.1:36856 dest: /192.168.158.4:9866
2025-07-10 22:09:11,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36856, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1325555340_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743637_2813, duration(ns): 23605267
2025-07-10 22:09:11,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743637_2813, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-10 22:09:14,814 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743637_2813 replica 
FinalizedReplica, blk_1073743637_2813, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743637 for deletion 2025-07-10 22:09:14,816 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743637_2813 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743637 2025-07-10 22:11:21,591 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743639_2815 src: /192.168.158.8:42974 dest: /192.168.158.4:9866 2025-07-10 22:11:21,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42974, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_600066595_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743639_2815, duration(ns): 18791584 2025-07-10 22:11:21,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743639_2815, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-10 22:11:26,818 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743639_2815 replica FinalizedReplica, blk_1073743639_2815, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743639 for deletion 2025-07-10 22:11:26,819 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073743639_2815 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743639 2025-07-10 22:23:36,611 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743651_2827 src: /192.168.158.7:49552 dest: /192.168.158.4:9866 2025-07-10 22:23:36,628 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49552, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_465908184_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743651_2827, duration(ns): 15148192 2025-07-10 22:23:36,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743651_2827, type=LAST_IN_PIPELINE terminating 2025-07-10 22:23:38,843 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743651_2827 replica FinalizedReplica, blk_1073743651_2827, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743651 for deletion 2025-07-10 22:23:38,844 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743651_2827 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743651 2025-07-10 22:24:36,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743652_2828 src: /192.168.158.1:51540 dest: /192.168.158.4:9866 2025-07-10 22:24:36,655 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:51540, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1051141755_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743652_2828, duration(ns): 19602443 2025-07-10 22:24:36,655 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743652_2828, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-10 22:24:41,845 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743652_2828 replica FinalizedReplica, blk_1073743652_2828, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743652 for deletion 2025-07-10 22:24:41,846 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743652_2828 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743652 2025-07-10 22:25:41,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743653_2829 src: /192.168.158.6:33752 dest: /192.168.158.4:9866 2025-07-10 22:25:41,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33752, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2031153637_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743653_2829, duration(ns): 17211757 2025-07-10 22:25:41,628 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073743653_2829, type=LAST_IN_PIPELINE terminating 2025-07-10 22:25:44,844 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743653_2829 replica FinalizedReplica, blk_1073743653_2829, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743653 for deletion 2025-07-10 22:25:44,846 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743653_2829 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743653 2025-07-10 22:26:46,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743654_2830 src: /192.168.158.8:49288 dest: /192.168.158.4:9866 2025-07-10 22:26:46,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49288, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1723108384_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743654_2830, duration(ns): 18710412 2025-07-10 22:26:46,640 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743654_2830, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-10 22:26:50,846 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743654_2830 replica FinalizedReplica, blk_1073743654_2830, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743654 for deletion 2025-07-10 22:26:50,847 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743654_2830 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743654 2025-07-10 22:27:51,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743655_2831 src: /192.168.158.1:51526 dest: /192.168.158.4:9866 2025-07-10 22:27:51,638 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51526, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1591074706_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743655_2831, duration(ns): 22665983 2025-07-10 22:27:51,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743655_2831, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-10 22:27:56,847 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743655_2831 replica FinalizedReplica, blk_1073743655_2831, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743655 for deletion 2025-07-10 22:27:56,848 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743655_2831 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743655 2025-07-10 22:28:56,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743656_2832 src: /192.168.158.1:33234 dest: /192.168.158.4:9866 2025-07-10 22:28:56,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33234, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-813212247_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743656_2832, duration(ns): 20964637 2025-07-10 22:28:56,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743656_2832, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-10 22:28:59,848 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743656_2832 replica FinalizedReplica, blk_1073743656_2832, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743656 for deletion 2025-07-10 22:28:59,850 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743656_2832 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743656 2025-07-10 22:32:01,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743659_2835 src: /192.168.158.5:60242 dest: /192.168.158.4:9866 2025-07-10 22:32:01,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.5:60242, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_326367124_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743659_2835, duration(ns): 14991289 2025-07-10 22:32:01,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743659_2835, type=LAST_IN_PIPELINE terminating 2025-07-10 22:32:05,849 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743659_2835 replica FinalizedReplica, blk_1073743659_2835, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743659 for deletion 2025-07-10 22:32:05,850 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743659_2835 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743659 2025-07-10 22:33:01,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743660_2836 src: /192.168.158.8:33104 dest: /192.168.158.4:9866 2025-07-10 22:33:01,672 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33104, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1423226811_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743660_2836, duration(ns): 17945739 2025-07-10 22:33:01,672 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743660_2836, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=1:[192.168.158.6:9866] terminating 2025-07-10 22:33:08,851 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743660_2836 replica FinalizedReplica, blk_1073743660_2836, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743660 for deletion 2025-07-10 22:33:08,852 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743660_2836 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743660 2025-07-10 22:34:01,657 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743661_2837 src: /192.168.158.8:47080 dest: /192.168.158.4:9866 2025-07-10 22:34:01,676 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47080, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_187293895_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743661_2837, duration(ns): 17125840 2025-07-10 22:34:01,676 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743661_2837, type=LAST_IN_PIPELINE terminating 2025-07-10 22:34:08,851 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743661_2837 replica FinalizedReplica, blk_1073743661_2837, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743661 for deletion 
2025-07-10 22:34:08,853 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743661_2837 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743661 2025-07-10 22:35:01,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743662_2838 src: /192.168.158.5:36326 dest: /192.168.158.4:9866 2025-07-10 22:35:01,671 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36326, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1446571928_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743662_2838, duration(ns): 15751673 2025-07-10 22:35:01,671 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743662_2838, type=LAST_IN_PIPELINE terminating 2025-07-10 22:35:08,853 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743662_2838 replica FinalizedReplica, blk_1073743662_2838, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743662 for deletion 2025-07-10 22:35:08,856 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743662_2838 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743662 2025-07-10 22:36:01,654 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743663_2839 src: /192.168.158.5:42106 dest: 
/192.168.158.4:9866 2025-07-10 22:36:01,672 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42106, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1460757674_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743663_2839, duration(ns): 15585763 2025-07-10 22:36:01,672 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743663_2839, type=LAST_IN_PIPELINE terminating 2025-07-10 22:36:05,856 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743663_2839 replica FinalizedReplica, blk_1073743663_2839, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743663 for deletion 2025-07-10 22:36:05,858 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743663_2839 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743663 2025-07-10 22:38:01,638 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743665_2841 src: /192.168.158.9:40080 dest: /192.168.158.4:9866 2025-07-10 22:38:01,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40080, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1516450180_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743665_2841, duration(ns): 18276831 2025-07-10 22:38:01,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073743665_2841, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-10 22:38:05,860 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743665_2841 replica FinalizedReplica, blk_1073743665_2841, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743665 for deletion 2025-07-10 22:38:05,862 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743665_2841 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743665 2025-07-10 22:39:01,624 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743666_2842 src: /192.168.158.6:56386 dest: /192.168.158.4:9866 2025-07-10 22:39:01,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56386, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-231615970_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743666_2842, duration(ns): 18396685 2025-07-10 22:39:01,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743666_2842, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-10 22:39:05,865 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743666_2842 replica FinalizedReplica, blk_1073743666_2842, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn 
getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743666 for deletion 2025-07-10 22:39:05,866 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743666_2842 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743666 2025-07-10 22:40:01,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743667_2843 src: /192.168.158.7:34012 dest: /192.168.158.4:9866 2025-07-10 22:40:01,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34012, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2120051492_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743667_2843, duration(ns): 16337452 2025-07-10 22:40:01,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743667_2843, type=LAST_IN_PIPELINE terminating 2025-07-10 22:40:08,865 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743667_2843 replica FinalizedReplica, blk_1073743667_2843, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743667 for deletion 2025-07-10 22:40:08,866 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743667_2843 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743667 2025-07-10 22:41:01,640 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743668_2844 src: /192.168.158.7:55878 dest: /192.168.158.4:9866 2025-07-10 22:41:01,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55878, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1765056506_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743668_2844, duration(ns): 20516986 2025-07-10 22:41:01,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743668_2844, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-10 22:41:05,867 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743668_2844 replica FinalizedReplica, blk_1073743668_2844, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743668 for deletion 2025-07-10 22:41:05,868 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743668_2844 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743668 2025-07-10 22:44:11,640 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743671_2847 src: /192.168.158.9:60956 dest: /192.168.158.4:9866 2025-07-10 22:44:11,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60956, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-538611390_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, 
blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743671_2847, duration(ns): 19955307 2025-07-10 22:44:11,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743671_2847, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-10 22:44:14,880 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743671_2847 replica FinalizedReplica, blk_1073743671_2847, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743671 for deletion 2025-07-10 22:44:14,882 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743671_2847 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743671 2025-07-10 22:46:21,647 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743673_2849 src: /192.168.158.9:48436 dest: /192.168.158.4:9866 2025-07-10 22:46:21,673 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48436, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-843372923_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743673_2849, duration(ns): 20323252 2025-07-10 22:46:21,673 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743673_2849, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-10 22:46:23,885 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073743673_2849 replica FinalizedReplica, blk_1073743673_2849, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743673 for deletion
2025-07-10 22:46:23,886 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743673_2849 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743673
2025-07-10 22:49:31,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743676_2852 src: /192.168.158.9:57842 dest: /192.168.158.4:9866
2025-07-10 22:49:31,705 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57842, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_869128889_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743676_2852, duration(ns): 20600792
2025-07-10 22:49:31,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743676_2852, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 22:49:35,886 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743676_2852 replica FinalizedReplica, blk_1073743676_2852, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743676 for deletion
2025-07-10 22:49:35,888 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743676_2852 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743676
2025-07-10 22:51:31,746 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743678_2854 src: /192.168.158.1:54752 dest: /192.168.158.4:9866
2025-07-10 22:51:31,781 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54752, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_337945163_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743678_2854, duration(ns): 25927145
2025-07-10 22:51:31,781 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743678_2854, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-10 22:51:35,889 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743678_2854 replica FinalizedReplica, blk_1073743678_2854, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743678 for deletion
2025-07-10 22:51:35,890 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743678_2854 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743678
2025-07-10 22:54:36,655 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743681_2857 src: /192.168.158.7:39122 dest: /192.168.158.4:9866
2025-07-10 22:54:36,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39122, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_888696134_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743681_2857, duration(ns): 18247316
2025-07-10 22:54:36,680 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743681_2857, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 22:54:38,893 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743681_2857 replica FinalizedReplica, blk_1073743681_2857, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743681 for deletion
2025-07-10 22:54:38,894 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743681_2857 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743681
2025-07-10 22:57:41,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743684_2860 src: /192.168.158.1:57570 dest: /192.168.158.4:9866
2025-07-10 22:57:41,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57570, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1043309968_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743684_2860, duration(ns): 21379259
2025-07-10 22:57:41,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743684_2860, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-10 22:57:47,902 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743684_2860 replica FinalizedReplica, blk_1073743684_2860, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743684 for deletion
2025-07-10 22:57:47,903 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743684_2860 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743684
2025-07-10 22:58:41,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743685_2861 src: /192.168.158.7:39954 dest: /192.168.158.4:9866
2025-07-10 22:58:41,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39954, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_980958926_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743685_2861, duration(ns): 14763568
2025-07-10 22:58:41,688 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743685_2861, type=LAST_IN_PIPELINE terminating
2025-07-10 22:58:44,906 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743685_2861 replica FinalizedReplica, blk_1073743685_2861, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743685 for deletion
2025-07-10 22:58:44,907 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743685_2861 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743685
2025-07-10 22:59:41,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743686_2862 src: /192.168.158.9:39592 dest: /192.168.158.4:9866
2025-07-10 22:59:41,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39592, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_421574128_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743686_2862, duration(ns): 15523671
2025-07-10 22:59:41,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743686_2862, type=LAST_IN_PIPELINE terminating
2025-07-10 22:59:44,907 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743686_2862 replica FinalizedReplica, blk_1073743686_2862, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743686 for deletion
2025-07-10 22:59:44,909 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743686_2862 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743686
2025-07-10 23:05:51,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743692_2868 src: /192.168.158.1:40518 dest: /192.168.158.4:9866
2025-07-10 23:05:51,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40518, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1027802423_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743692_2868, duration(ns): 23756269
2025-07-10 23:05:51,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743692_2868, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-10 23:05:56,923 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743692_2868 replica FinalizedReplica, blk_1073743692_2868, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743692 for deletion
2025-07-10 23:05:56,924 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743692_2868 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743692
2025-07-10 23:06:51,713 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743693_2869 src: /192.168.158.5:40794 dest: /192.168.158.4:9866
2025-07-10 23:06:51,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40794, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_107701530_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743693_2869, duration(ns): 15347579
2025-07-10 23:06:51,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743693_2869, type=LAST_IN_PIPELINE terminating
2025-07-10 23:06:56,927 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743693_2869 replica FinalizedReplica, blk_1073743693_2869, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743693 for deletion
2025-07-10 23:06:56,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743693_2869 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743693
2025-07-10 23:07:51,719 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743694_2870 src: /192.168.158.8:46504 dest: /192.168.158.4:9866
2025-07-10 23:07:51,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46504, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1697080178_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743694_2870, duration(ns): 15684751
2025-07-10 23:07:51,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743694_2870, type=LAST_IN_PIPELINE terminating
2025-07-10 23:07:53,930 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743694_2870 replica FinalizedReplica, blk_1073743694_2870, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743694 for deletion
2025-07-10 23:07:53,931 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743694_2870 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743694
2025-07-10 23:08:56,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743695_2871 src: /192.168.158.7:40680 dest: /192.168.158.4:9866
2025-07-10 23:08:56,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40680, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1967778487_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743695_2871, duration(ns): 18773657
2025-07-10 23:08:56,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743695_2871, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-10 23:09:02,936 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743695_2871 replica FinalizedReplica, blk_1073743695_2871, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743695 for deletion
2025-07-10 23:09:02,937 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743695_2871 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743695
2025-07-10 23:10:56,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743697_2873 src: /192.168.158.6:36562 dest: /192.168.158.4:9866
2025-07-10 23:10:56,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36562, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_913420260_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743697_2873, duration(ns): 14313487
2025-07-10 23:10:56,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743697_2873, type=LAST_IN_PIPELINE terminating
2025-07-10 23:11:02,940 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743697_2873 replica FinalizedReplica, blk_1073743697_2873, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743697 for deletion
2025-07-10 23:11:02,942 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743697_2873 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743697
2025-07-10 23:12:56,716 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743699_2875 src: /192.168.158.9:47574 dest: /192.168.158.4:9866
2025-07-10 23:12:56,733 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47574, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1563096386_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743699_2875, duration(ns): 14711747
2025-07-10 23:12:56,734 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743699_2875, type=LAST_IN_PIPELINE terminating
2025-07-10 23:13:02,945 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743699_2875 replica FinalizedReplica, blk_1073743699_2875, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743699 for deletion
2025-07-10 23:13:02,946 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743699_2875 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743699
2025-07-10 23:14:01,730 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743700_2876 src: /192.168.158.6:50984 dest: /192.168.158.4:9866
2025-07-10 23:14:01,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50984, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1082258119_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743700_2876, duration(ns): 15084110
2025-07-10 23:14:01,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743700_2876, type=LAST_IN_PIPELINE terminating
2025-07-10 23:14:05,948 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743700_2876 replica FinalizedReplica, blk_1073743700_2876, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743700 for deletion
2025-07-10 23:14:05,949 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743700_2876 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743700
2025-07-10 23:15:01,734 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743701_2877 src: /192.168.158.5:59886 dest: /192.168.158.4:9866
2025-07-10 23:15:01,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59886, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1352648994_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743701_2877, duration(ns): 18337455
2025-07-10 23:15:01,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743701_2877, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-10 23:15:08,951 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743701_2877 replica FinalizedReplica, blk_1073743701_2877, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743701 for deletion
2025-07-10 23:15:08,952 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743701_2877 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743701
2025-07-10 23:16:06,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743702_2878 src: /192.168.158.6:37080 dest: /192.168.158.4:9866
2025-07-10 23:16:06,773 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37080, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-558704661_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743702_2878, duration(ns): 18580812
2025-07-10 23:16:06,773 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743702_2878, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-10 23:16:11,955 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743702_2878 replica FinalizedReplica, blk_1073743702_2878, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743702 for deletion
2025-07-10 23:16:11,956 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743702_2878 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743702
2025-07-10 23:17:06,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743703_2879 src: /192.168.158.8:39316 dest: /192.168.158.4:9866
2025-07-10 23:17:06,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39316, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1133556592_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743703_2879, duration(ns): 15126915
2025-07-10 23:17:06,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743703_2879, type=LAST_IN_PIPELINE terminating
2025-07-10 23:17:11,955 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743703_2879 replica FinalizedReplica, blk_1073743703_2879, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743703 for deletion
2025-07-10 23:17:11,956 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743703_2879 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743703
2025-07-10 23:18:06,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743704_2880 src: /192.168.158.8:33690 dest: /192.168.158.4:9866
2025-07-10 23:18:06,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33690, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1242530745_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743704_2880, duration(ns): 20163664
2025-07-10 23:18:06,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743704_2880, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-10 23:18:11,957 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743704_2880 replica FinalizedReplica, blk_1073743704_2880, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743704 for deletion
2025-07-10 23:18:11,959 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743704_2880 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743704
2025-07-10 23:24:16,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743710_2886 src: /192.168.158.5:44208 dest: /192.168.158.4:9866
2025-07-10 23:24:16,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44208, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_329990188_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743710_2886, duration(ns): 20681844
2025-07-10 23:24:16,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743710_2886, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-10 23:24:20,970 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743710_2886 replica FinalizedReplica, blk_1073743710_2886, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743710 for deletion
2025-07-10 23:24:20,971 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743710_2886 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743710
2025-07-10 23:26:16,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743712_2888 src: /192.168.158.9:46912 dest: /192.168.158.4:9866
2025-07-10 23:26:16,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46912, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1146356269_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743712_2888, duration(ns): 21081335
2025-07-10 23:26:16,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743712_2888, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-10 23:26:20,978 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743712_2888 replica FinalizedReplica, blk_1073743712_2888, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743712 for deletion
2025-07-10 23:26:20,980 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743712_2888 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743712
2025-07-10 23:27:16,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743713_2889 src: /192.168.158.5:36894 dest: /192.168.158.4:9866
2025-07-10 23:27:16,782 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36894, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1320363467_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743713_2889, duration(ns): 13930155
2025-07-10 23:27:16,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743713_2889, type=LAST_IN_PIPELINE terminating
2025-07-10 23:27:20,983 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743713_2889 replica FinalizedReplica, blk_1073743713_2889, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743713 for deletion
2025-07-10 23:27:20,984 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743713_2889 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743713
2025-07-10 23:29:16,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743715_2891 src: /192.168.158.1:49470 dest: /192.168.158.4:9866
2025-07-10 23:29:16,800 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49470, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-91161821_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743715_2891, duration(ns): 22745330
2025-07-10 23:29:16,800 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743715_2891, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-10 23:29:20,987 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743715_2891 replica FinalizedReplica, blk_1073743715_2891, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743715 for deletion
2025-07-10 23:29:20,988 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743715_2891 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743715
2025-07-10 23:30:21,774 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743716_2892 src: /192.168.158.1:47056 dest: /192.168.158.4:9866
2025-07-10 23:30:21,805 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47056, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1693779377_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743716_2892, duration(ns): 22290400
2025-07-10 23:30:21,805 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743716_2892, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-10 23:30:23,989 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743716_2892 replica FinalizedReplica, blk_1073743716_2892, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743716 for deletion
2025-07-10 23:30:23,991 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743716_2892 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743716
2025-07-10 23:34:31,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743720_2896 src: /192.168.158.1:45016 dest: /192.168.158.4:9866
2025-07-10 23:34:31,795 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45016, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_716353762_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743720_2896, duration(ns): 23018573
2025-07-10 23:34:31,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743720_2896, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-10 23:34:36,004 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743720_2896 replica FinalizedReplica, blk_1073743720_2896, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743720 for deletion
2025-07-10 23:34:36,005 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743720_2896 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743720
2025-07-10 23:35:31,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743721_2897 src: /192.168.158.5:54994 dest: /192.168.158.4:9866
2025-07-10 23:35:31,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54994, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-696148418_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743721_2897, duration(ns): 14024936
2025-07-10 23:35:31,784 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743721_2897, type=LAST_IN_PIPELINE terminating
2025-07-10 23:35:36,008 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743721_2897 replica FinalizedReplica, blk_1073743721_2897, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743721 for deletion
2025-07-10 23:35:36,009 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743721_2897 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743721
2025-07-10 23:36:13,268 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-10 23:37:21,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f2a, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 5 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-10 23:37:21,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-10 23:37:31,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743723_2899 src: /192.168.158.1:33676 dest: /192.168.158.4:9866
2025-07-10 23:37:31,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33676, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1088270449_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743723_2899, duration(ns): 23162252
2025-07-10 23:37:31,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743723_2899, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-10 23:37:36,011 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743723_2899 replica FinalizedReplica, blk_1073743723_2899, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743723 for deletion
2025-07-10
23:37:36,012 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743723_2899 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743723 2025-07-10 23:43:36,800 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743729_2905 src: /192.168.158.8:44248 dest: /192.168.158.4:9866 2025-07-10 23:43:36,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44248, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1228465268_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743729_2905, duration(ns): 19781216 2025-07-10 23:43:36,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743729_2905, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-10 23:43:39,019 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743729_2905 replica FinalizedReplica, blk_1073743729_2905, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743729 for deletion 2025-07-10 23:43:39,020 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743729_2905 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743729 2025-07-10 23:45:36,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743731_2907 
src: /192.168.158.5:60276 dest: /192.168.158.4:9866 2025-07-10 23:45:36,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60276, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1257758351_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743731_2907, duration(ns): 14004939 2025-07-10 23:45:36,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743731_2907, type=LAST_IN_PIPELINE terminating 2025-07-10 23:45:39,022 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743731_2907 replica FinalizedReplica, blk_1073743731_2907, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743731 for deletion 2025-07-10 23:45:39,023 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743731_2907 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743731 2025-07-10 23:51:46,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743737_2913 src: /192.168.158.9:38412 dest: /192.168.158.4:9866 2025-07-10 23:51:46,847 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38412, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_985766272_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743737_2913, duration(ns): 17495554 2025-07-10 23:51:46,847 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743737_2913, type=LAST_IN_PIPELINE terminating 2025-07-10 23:51:51,043 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743737_2913 replica FinalizedReplica, blk_1073743737_2913, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743737 for deletion 2025-07-10 23:51:51,044 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743737_2913 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743737 2025-07-10 23:53:46,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743739_2915 src: /192.168.158.1:43270 dest: /192.168.158.4:9866 2025-07-10 23:53:46,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43270, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-529266213_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743739_2915, duration(ns): 24016959 2025-07-10 23:53:46,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743739_2915, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-10 23:53:51,052 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743739_2915 replica FinalizedReplica, blk_1073743739_2915, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 
getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743739 for deletion 2025-07-10 23:53:51,053 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743739_2915 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743739 2025-07-10 23:56:51,810 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743742_2918 src: /192.168.158.1:59454 dest: /192.168.158.4:9866 2025-07-10 23:56:51,840 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59454, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_495662664_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743742_2918, duration(ns): 20960733 2025-07-10 23:56:51,841 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743742_2918, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-10 23:56:54,061 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743742_2918 replica FinalizedReplica, blk_1073743742_2918, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743742 for deletion 2025-07-10 23:56:54,062 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743742_2918 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743742 2025-07-10 23:57:51,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743743_2919 src: /192.168.158.1:46846 dest: /192.168.158.4:9866 2025-07-10 23:57:51,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46846, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1357614747_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743743_2919, duration(ns): 22177673 2025-07-10 23:57:51,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743743_2919, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-10 23:57:54,065 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743743_2919 replica FinalizedReplica, blk_1073743743_2919, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743743 for deletion 2025-07-10 23:57:54,066 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743743_2919 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743743 2025-07-11 00:01:01,817 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743746_2922 src: /192.168.158.1:60870 dest: /192.168.158.4:9866 2025-07-11 00:01:01,851 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:60870, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-852190660_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743746_2922, duration(ns): 24204887 2025-07-11 00:01:01,852 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743746_2922, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-11 00:01:06,073 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743746_2922 replica FinalizedReplica, blk_1073743746_2922, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743746 for deletion 2025-07-11 00:01:06,074 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743746_2922 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743746 2025-07-11 00:03:01,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743748_2924 src: /192.168.158.1:46046 dest: /192.168.158.4:9866 2025-07-11 00:03:01,862 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46046, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-145979360_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743748_2924, duration(ns): 22077216 2025-07-11 00:03:01,863 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073743748_2924, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-11 00:03:09,077 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743748_2924 replica FinalizedReplica, blk_1073743748_2924, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743748 for deletion 2025-07-11 00:03:09,078 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743748_2924 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743748 2025-07-11 00:04:01,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743749_2925 src: /192.168.158.5:54670 dest: /192.168.158.4:9866 2025-07-11 00:04:01,863 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54670, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-617866403_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743749_2925, duration(ns): 16592889 2025-07-11 00:04:01,863 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743749_2925, type=LAST_IN_PIPELINE terminating 2025-07-11 00:04:09,078 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743749_2925 replica FinalizedReplica, blk_1073743749_2925, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743749 for deletion 2025-07-11 00:04:09,079 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743749_2925 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743749 2025-07-11 00:05:01,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743750_2926 src: /192.168.158.6:48916 dest: /192.168.158.4:9866 2025-07-11 00:05:01,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48916, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_444538028_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743750_2926, duration(ns): 18238972 2025-07-11 00:05:01,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743750_2926, type=LAST_IN_PIPELINE terminating 2025-07-11 00:05:06,080 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743750_2926 replica FinalizedReplica, blk_1073743750_2926, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743750 for deletion 2025-07-11 00:05:06,081 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743750_2926 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743750 2025-07-11 00:07:06,850 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743752_2928 src: /192.168.158.7:59254 dest: /192.168.158.4:9866 2025-07-11 00:07:06,875 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59254, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-323062265_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743752_2928, duration(ns): 20104226 2025-07-11 00:07:06,876 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743752_2928, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-11 00:07:09,083 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743752_2928 replica FinalizedReplica, blk_1073743752_2928, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743752 for deletion 2025-07-11 00:07:09,084 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743752_2928 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743752 2025-07-11 00:10:16,847 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743755_2931 src: /192.168.158.8:33254 dest: /192.168.158.4:9866 2025-07-11 00:10:16,867 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33254, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_731249685_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, 
blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743755_2931, duration(ns): 17058821 2025-07-11 00:10:16,867 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743755_2931, type=LAST_IN_PIPELINE terminating 2025-07-11 00:10:21,087 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743755_2931 replica FinalizedReplica, blk_1073743755_2931, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743755 for deletion 2025-07-11 00:10:21,088 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743755_2931 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743755 2025-07-11 00:11:16,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743756_2932 src: /192.168.158.1:60486 dest: /192.168.158.4:9866 2025-07-11 00:11:16,858 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60486, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_482907227_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743756_2932, duration(ns): 22075609 2025-07-11 00:11:16,858 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743756_2932, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-11 00:11:21,090 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743756_2932 
replica FinalizedReplica, blk_1073743756_2932, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743756 for deletion 2025-07-11 00:11:21,091 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743756_2932 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743756 2025-07-11 00:13:21,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743758_2934 src: /192.168.158.9:45652 dest: /192.168.158.4:9866 2025-07-11 00:13:21,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45652, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2128235039_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743758_2934, duration(ns): 15783058 2025-07-11 00:13:21,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743758_2934, type=LAST_IN_PIPELINE terminating 2025-07-11 00:13:27,093 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743758_2934 replica FinalizedReplica, blk_1073743758_2934, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743758 for deletion 2025-07-11 00:13:27,095 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743758_2934 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743758 2025-07-11 00:14:21,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743759_2935 src: /192.168.158.9:40812 dest: /192.168.158.4:9866 2025-07-11 00:14:21,863 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40812, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-821711051_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743759_2935, duration(ns): 16898829 2025-07-11 00:14:21,863 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743759_2935, type=LAST_IN_PIPELINE terminating 2025-07-11 00:14:24,096 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743759_2935 replica FinalizedReplica, blk_1073743759_2935, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743759 for deletion 2025-07-11 00:14:24,097 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743759_2935 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743759 2025-07-11 00:16:31,837 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743761_2937 src: /192.168.158.5:49402 dest: /192.168.158.4:9866 2025-07-11 00:16:31,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49402, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, 
cliID: DFSClient_NONMAPREDUCE_994521296_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743761_2937, duration(ns): 15591056 2025-07-11 00:16:31,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743761_2937, type=LAST_IN_PIPELINE terminating 2025-07-11 00:16:39,098 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743761_2937 replica FinalizedReplica, blk_1073743761_2937, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743761 for deletion 2025-07-11 00:16:39,099 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743761_2937 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743761 2025-07-11 00:17:31,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743762_2938 src: /192.168.158.1:37846 dest: /192.168.158.4:9866 2025-07-11 00:17:31,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37846, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1307123146_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743762_2938, duration(ns): 20421138 2025-07-11 00:17:31,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743762_2938, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-11 00:17:36,100 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743762_2938 replica FinalizedReplica, blk_1073743762_2938, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743762 for deletion 2025-07-11 00:17:36,101 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743762_2938 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743762 2025-07-11 00:18:36,846 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743763_2939 src: /192.168.158.1:45472 dest: /192.168.158.4:9866 2025-07-11 00:18:36,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1576161712_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743763_2939, duration(ns): 19029629 2025-07-11 00:18:36,875 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743763_2939, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-11 00:18:39,103 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743763_2939 replica FinalizedReplica, blk_1073743763_2939, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743763 for deletion 2025-07-11 
00:18:39,105 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743763_2939 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743763
2025-07-11 00:20:41,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743765_2941 src: /192.168.158.9:46444 dest: /192.168.158.4:9866
2025-07-11 00:20:41,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46444, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1461821071_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743765_2941, duration(ns): 14998699
2025-07-11 00:20:41,867 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743765_2941, type=LAST_IN_PIPELINE terminating
2025-07-11 00:20:45,112 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743765_2941 replica FinalizedReplica, blk_1073743765_2941, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743765 for deletion
2025-07-11 00:20:45,113 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743765_2941 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743765
2025-07-11 00:27:51,865 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743772_2948 src: /192.168.158.9:40652 dest: /192.168.158.4:9866
2025-07-11 00:27:51,883 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40652, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1210025393_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743772_2948, duration(ns): 16401356
2025-07-11 00:27:51,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743772_2948, type=LAST_IN_PIPELINE terminating
2025-07-11 00:27:54,132 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743772_2948 replica FinalizedReplica, blk_1073743772_2948, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743772 for deletion
2025-07-11 00:27:54,133 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743772_2948 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743772
2025-07-11 00:28:56,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743773_2949 src: /192.168.158.1:36320 dest: /192.168.158.4:9866
2025-07-11 00:28:56,883 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36320, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1901428291_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743773_2949, duration(ns): 23855580
2025-07-11 00:28:56,883 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743773_2949, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-11 00:29:00,135 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743773_2949 replica FinalizedReplica, blk_1073743773_2949, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743773 for deletion
2025-07-11 00:29:00,136 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743773_2949 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743773
2025-07-11 00:29:56,867 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743774_2950 src: /192.168.158.1:48012 dest: /192.168.158.4:9866
2025-07-11 00:29:56,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48012, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_951994744_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743774_2950, duration(ns): 21848162
2025-07-11 00:29:56,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743774_2950, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-11 00:30:00,136 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743774_2950 replica FinalizedReplica, blk_1073743774_2950, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743774 for deletion
2025-07-11 00:30:00,138 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743774_2950 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743774
2025-07-11 00:31:56,863 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743776_2952 src: /192.168.158.7:45762 dest: /192.168.158.4:9866
2025-07-11 00:31:56,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45762, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1497190772_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743776_2952, duration(ns): 15686782
2025-07-11 00:31:56,883 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743776_2952, type=LAST_IN_PIPELINE terminating
2025-07-11 00:32:00,140 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743776_2952 replica FinalizedReplica, blk_1073743776_2952, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743776 for deletion
2025-07-11 00:32:00,141 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743776_2952 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743776
2025-07-11 00:34:01,872 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743778_2954 src: /192.168.158.7:57642 dest: /192.168.158.4:9866
2025-07-11 00:34:01,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57642, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1761754529_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743778_2954, duration(ns): 19878483
2025-07-11 00:34:01,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743778_2954, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-11 00:34:09,146 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743778_2954 replica FinalizedReplica, blk_1073743778_2954, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743778 for deletion
2025-07-11 00:34:09,147 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743778_2954 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743778
2025-07-11 00:35:01,876 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743779_2955 src: /192.168.158.9:52004 dest: /192.168.158.4:9866
2025-07-11 00:35:01,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52004, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_438262036_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743779_2955, duration(ns): 18667851
2025-07-11 00:35:01,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743779_2955, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-11 00:35:06,148 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743779_2955 replica FinalizedReplica, blk_1073743779_2955, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743779 for deletion
2025-07-11 00:35:06,149 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743779_2955 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743779
2025-07-11 00:38:16,878 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743782_2958 src: /192.168.158.8:56748 dest: /192.168.158.4:9866
2025-07-11 00:38:16,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56748, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_376453142_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743782_2958, duration(ns): 19249248
2025-07-11 00:38:16,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743782_2958, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 00:38:24,151 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743782_2958 replica FinalizedReplica, blk_1073743782_2958, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743782 for deletion
2025-07-11 00:38:24,152 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743782_2958 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743782
2025-07-11 00:44:21,891 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743788_2964 src: /192.168.158.9:37236 dest: /192.168.158.4:9866
2025-07-11 00:44:21,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37236, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1064085156_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743788_2964, duration(ns): 21065617
2025-07-11 00:44:21,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743788_2964, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-11 00:44:24,158 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743788_2964 replica FinalizedReplica, blk_1073743788_2964, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743788 for deletion
2025-07-11 00:44:24,159 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743788_2964 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743788
2025-07-11 00:45:26,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743789_2965 src: /192.168.158.7:42002 dest: /192.168.158.4:9866
2025-07-11 00:45:26,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42002, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2032678362_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743789_2965, duration(ns): 14058857
2025-07-11 00:45:26,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743789_2965, type=LAST_IN_PIPELINE terminating
2025-07-11 00:45:30,161 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743789_2965 replica FinalizedReplica, blk_1073743789_2965, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743789 for deletion
2025-07-11 00:45:30,163 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743789_2965 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743789
2025-07-11 00:47:31,906 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743791_2967 src: /192.168.158.6:35336 dest: /192.168.158.4:9866
2025-07-11 00:47:31,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1740911253_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743791_2967, duration(ns): 18593396
2025-07-11 00:47:31,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743791_2967, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-11 00:47:39,165 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743791_2967 replica FinalizedReplica, blk_1073743791_2967, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743791 for deletion
2025-07-11 00:47:39,167 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743791_2967 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743791
2025-07-11 00:48:31,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743792_2968 src: /192.168.158.6:40280 dest: /192.168.158.4:9866
2025-07-11 00:48:31,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40280, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-359842886_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743792_2968, duration(ns): 16827521
2025-07-11 00:48:31,931 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743792_2968, type=LAST_IN_PIPELINE terminating
2025-07-11 00:48:36,167 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743792_2968 replica FinalizedReplica, blk_1073743792_2968, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743792 for deletion
2025-07-11 00:48:36,168 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743792_2968 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743792
2025-07-11 00:49:31,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743793_2969 src: /192.168.158.5:41190 dest: /192.168.158.4:9866
2025-07-11 00:49:31,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41190, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_827460532_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743793_2969, duration(ns): 17162776
2025-07-11 00:49:31,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743793_2969, type=LAST_IN_PIPELINE terminating
2025-07-11 00:49:36,170 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743793_2969 replica FinalizedReplica, blk_1073743793_2969, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743793 for deletion
2025-07-11 00:49:36,171 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743793_2969 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743793
2025-07-11 00:50:31,911 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743794_2970 src: /192.168.158.7:53702 dest: /192.168.158.4:9866
2025-07-11 00:50:31,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53702, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1218909351_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743794_2970, duration(ns): 16372141
2025-07-11 00:50:31,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743794_2970, type=LAST_IN_PIPELINE terminating
2025-07-11 00:50:36,173 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743794_2970 replica FinalizedReplica, blk_1073743794_2970, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743794 for deletion
2025-07-11 00:50:36,174 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743794_2970 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743794
2025-07-11 00:52:31,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743796_2972 src: /192.168.158.6:45998 dest: /192.168.158.4:9866
2025-07-11 00:52:31,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45998, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_470243030_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743796_2972, duration(ns): 16297377
2025-07-11 00:52:31,964 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743796_2972, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-11 00:52:39,176 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743796_2972 replica FinalizedReplica, blk_1073743796_2972, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743796 for deletion
2025-07-11 00:52:39,177 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743796_2972 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743796
2025-07-11 00:56:41,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743800_2976 src: /192.168.158.1:54380 dest: /192.168.158.4:9866
2025-07-11 00:56:41,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54380, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-566923453_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743800_2976, duration(ns): 22259257
2025-07-11 00:56:41,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743800_2976, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-11 00:56:45,184 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743800_2976 replica FinalizedReplica, blk_1073743800_2976, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743800 for deletion
2025-07-11 00:56:45,185 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743800_2976 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743800
2025-07-11 00:58:46,915 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743802_2978 src: /192.168.158.1:54450 dest: /192.168.158.4:9866
2025-07-11 00:58:46,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54450, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-926956635_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743802_2978, duration(ns): 22252785
2025-07-11 00:58:46,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743802_2978, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-11 00:58:54,193 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743802_2978 replica FinalizedReplica, blk_1073743802_2978, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743802 for deletion
2025-07-11 00:58:54,194 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743802_2978 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743802
2025-07-11 00:59:46,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743803_2979 src: /192.168.158.8:51392 dest: /192.168.158.4:9866
2025-07-11 00:59:46,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51392, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_85947415_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743803_2979, duration(ns): 19386612
2025-07-11 00:59:46,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743803_2979, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-11 00:59:54,197 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743803_2979 replica FinalizedReplica, blk_1073743803_2979, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743803 for deletion
2025-07-11 00:59:54,198 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743803_2979 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743803
2025-07-11 01:02:46,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743806_2982 src: /192.168.158.7:44082 dest: /192.168.158.4:9866
2025-07-11 01:02:46,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44082, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_113595847_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743806_2982, duration(ns): 14707223
2025-07-11 01:02:46,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743806_2982, type=LAST_IN_PIPELINE terminating
2025-07-11 01:02:51,198 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743806_2982 replica FinalizedReplica, blk_1073743806_2982, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743806 for deletion
2025-07-11 01:02:51,200 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743806_2982 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743806
2025-07-11 01:04:46,942 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743808_2984 src: /192.168.158.7:59066 dest: /192.168.158.4:9866
2025-07-11 01:04:46,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59066, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1701374131_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743808_2984, duration(ns): 39708002
2025-07-11 01:04:46,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743808_2984, type=LAST_IN_PIPELINE terminating
2025-07-11 01:04:54,202 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743808_2984 replica FinalizedReplica, blk_1073743808_2984, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743808 for deletion
2025-07-11 01:04:54,204 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743808_2984 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743808
2025-07-11 01:07:51,940 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743811_2987 src: /192.168.158.1:54480 dest: /192.168.158.4:9866
2025-07-11 01:07:51,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54480, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1460494159_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743811_2987, duration(ns): 21435971
2025-07-11 01:07:51,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743811_2987, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-11 01:07:54,208 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743811_2987 replica FinalizedReplica, blk_1073743811_2987, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743811 for deletion
2025-07-11 01:07:54,209 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743811_2987 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743811
2025-07-11 01:08:51,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743812_2988 src: /192.168.158.1:55504 dest: /192.168.158.4:9866
2025-07-11 01:08:51,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55504, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_655553695_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743812_2988, duration(ns): 20220102
2025-07-11 01:08:51,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743812_2988, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-11 01:08:54,211 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743812_2988 replica FinalizedReplica, blk_1073743812_2988, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743812 for deletion
2025-07-11 01:08:54,212 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743812_2988 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743812
2025-07-11 01:10:51,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743814_2990 src: /192.168.158.5:56952 dest: /192.168.158.4:9866
2025-07-11 01:10:51,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56952, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-931355282_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743814_2990, duration(ns): 19929059
2025-07-11 01:10:51,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743814_2990, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-11 01:10:54,216 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743814_2990 replica FinalizedReplica, blk_1073743814_2990, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743814 for deletion
2025-07-11 01:10:54,217 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743814_2990 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743814
2025-07-11 01:12:51,952 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743816_2992 src: /192.168.158.1:53176 dest: /192.168.158.4:9866
2025-07-11 01:12:51,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53176, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1197373007_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743816_2992, duration(ns): 23316157
2025-07-11 01:12:51,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743816_2992, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-11 01:12:54,219 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743816_2992 replica FinalizedReplica, blk_1073743816_2992, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743816 for deletion
2025-07-11 01:12:54,220 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743816_2992 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743816
2025-07-11 01:14:56,969 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743818_2994 src: /192.168.158.6:47984 dest: /192.168.158.4:9866
2025-07-11 01:14:56,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47984, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1030720135_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743818_2994, duration(ns): 14750041
2025-07-11 01:14:56,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743818_2994, type=LAST_IN_PIPELINE terminating
2025-07-11 01:15:00,223 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743818_2994 replica FinalizedReplica, blk_1073743818_2994, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743818 for deletion
2025-07-11 01:15:00,224 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743818_2994 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743818
2025-07-11 01:18:57,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743822_2998 src: /192.168.158.5:52804 dest: /192.168.158.4:9866
2025-07-11 01:18:57,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52804, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1841402337_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743822_2998, duration(ns): 22835482
2025-07-11 01:18:57,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743822_2998, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-11 01:19:03,229 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743822_2998 replica FinalizedReplica, blk_1073743822_2998, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743822 for deletion
2025-07-11 01:19:03,230 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743822_2998 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743822
2025-07-11 01:20:01,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743823_2999 src: /192.168.158.1:38798 dest: /192.168.158.4:9866
2025-07-11 01:20:02,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38798, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-486093547_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743823_2999, duration(ns): 22248345
2025-07-11 01:20:02,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743823_2999, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-11 01:20:06,231 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743823_2999 replica FinalizedReplica, blk_1073743823_2999, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743823 for deletion 2025-07-11 01:20:06,232 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743823_2999 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743823 2025-07-11 01:22:11,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743825_3001 src: /192.168.158.6:45600 dest: /192.168.158.4:9866 2025-07-11 01:22:11,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45600, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1175699934_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743825_3001, duration(ns): 19653692 2025-07-11 01:22:11,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743825_3001, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-11 01:22:15,234 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743825_3001 replica FinalizedReplica, blk_1073743825_3001, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743825 for deletion 2025-07-11 01:22:15,235 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743825_3001 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743825 
2025-07-11 01:23:16,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743826_3002 src: /192.168.158.8:55094 dest: /192.168.158.4:9866 2025-07-11 01:23:16,981 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55094, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_992015087_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743826_3002, duration(ns): 17045893 2025-07-11 01:23:16,981 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743826_3002, type=LAST_IN_PIPELINE terminating 2025-07-11 01:23:21,237 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743826_3002 replica FinalizedReplica, blk_1073743826_3002, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743826 for deletion 2025-07-11 01:23:21,238 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743826_3002 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743826 2025-07-11 01:26:16,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743829_3005 src: /192.168.158.8:47298 dest: /192.168.158.4:9866 2025-07-11 01:26:16,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47298, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1377984125_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073743829_3005, duration(ns): 17835505 2025-07-11 01:26:16,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743829_3005, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-11 01:26:24,241 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743829_3005 replica FinalizedReplica, blk_1073743829_3005, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743829 for deletion 2025-07-11 01:26:24,242 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743829_3005 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743829 2025-07-11 01:27:16,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743830_3006 src: /192.168.158.1:59674 dest: /192.168.158.4:9866 2025-07-11 01:27:16,988 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59674, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_18241787_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743830_3006, duration(ns): 21487005 2025-07-11 01:27:16,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743830_3006, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-11 01:27:21,247 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743830_3006 replica FinalizedReplica, blk_1073743830_3006, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743830 for deletion 2025-07-11 01:27:21,248 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743830_3006 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743830 2025-07-11 01:30:26,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743833_3009 src: /192.168.158.6:41164 dest: /192.168.158.4:9866 2025-07-11 01:30:26,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41164, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_166172799_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743833_3009, duration(ns): 14043629 2025-07-11 01:30:26,993 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743833_3009, type=LAST_IN_PIPELINE terminating 2025-07-11 01:30:30,258 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743833_3009 replica FinalizedReplica, blk_1073743833_3009, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743833 for deletion 2025-07-11 01:30:30,259 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743833_3009 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743833 2025-07-11 01:35:31,988 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743838_3014 src: /192.168.158.1:57844 dest: /192.168.158.4:9866 2025-07-11 01:35:32,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57844, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-833423429_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743838_3014, duration(ns): 21503356 2025-07-11 01:35:32,021 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743838_3014, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-11 01:35:36,271 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743838_3014 replica FinalizedReplica, blk_1073743838_3014, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743838 for deletion 2025-07-11 01:35:36,273 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743838_3014 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743838 2025-07-11 01:37:36,981 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743840_3016 
src: /192.168.158.1:38542 dest: /192.168.158.4:9866 2025-07-11 01:37:37,013 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38542, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1711093469_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743840_3016, duration(ns): 22991679 2025-07-11 01:37:37,013 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743840_3016, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-11 01:37:39,278 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743840_3016 replica FinalizedReplica, blk_1073743840_3016, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743840 for deletion 2025-07-11 01:37:39,279 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743840_3016 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743840 2025-07-11 01:39:37,005 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743842_3018 src: /192.168.158.8:41160 dest: /192.168.158.4:9866 2025-07-11 01:39:37,029 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41160, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2143115143_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743842_3018, duration(ns): 18523077 
2025-07-11 01:39:37,029 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743842_3018, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-11 01:39:39,282 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743842_3018 replica FinalizedReplica, blk_1073743842_3018, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743842 for deletion 2025-07-11 01:39:39,283 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743842_3018 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743842 2025-07-11 01:40:36,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743843_3019 src: /192.168.158.9:45618 dest: /192.168.158.4:9866 2025-07-11 01:40:37,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45618, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-663668959_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743843_3019, duration(ns): 18804310 2025-07-11 01:40:37,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743843_3019, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-11 01:40:39,282 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743843_3019 replica FinalizedReplica, blk_1073743843_3019, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743843 for deletion 2025-07-11 01:40:39,283 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743843_3019 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743843 2025-07-11 01:47:52,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743850_3026 src: /192.168.158.9:34202 dest: /192.168.158.4:9866 2025-07-11 01:47:52,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34202, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_522671766_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743850_3026, duration(ns): 19018295 2025-07-11 01:47:52,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743850_3026, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-11 01:47:54,297 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743850_3026 replica FinalizedReplica, blk_1073743850_3026, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743850 for deletion 2025-07-11 01:47:54,298 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743850_3026 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743850 2025-07-11 01:48:52,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743851_3027 src: /192.168.158.7:44402 dest: /192.168.158.4:9866 2025-07-11 01:48:52,026 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44402, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-63090276_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743851_3027, duration(ns): 18178377 2025-07-11 01:48:52,027 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743851_3027, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-11 01:48:54,298 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743851_3027 replica FinalizedReplica, blk_1073743851_3027, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743851 for deletion 2025-07-11 01:48:54,300 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743851_3027 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743851 2025-07-11 01:49:52,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743852_3028 src: /192.168.158.6:48738 dest: /192.168.158.4:9866 2025-07-11 01:49:52,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48738, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1646681411_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743852_3028, duration(ns): 20312892 2025-07-11 01:49:52,034 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743852_3028, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-11 01:49:54,300 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743852_3028 replica FinalizedReplica, blk_1073743852_3028, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743852 for deletion 2025-07-11 01:49:54,301 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743852_3028 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743852 2025-07-11 01:50:57,005 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743853_3029 src: /192.168.158.7:39682 dest: /192.168.158.4:9866 2025-07-11 01:50:57,029 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39682, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2045174728_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743853_3029, duration(ns): 18470998 2025-07-11 01:50:57,029 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743853_3029, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=1:[192.168.158.6:9866] terminating 2025-07-11 01:51:03,303 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743853_3029 replica FinalizedReplica, blk_1073743853_3029, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743853 for deletion 2025-07-11 01:51:03,304 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743853_3029 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743853 2025-07-11 01:53:57,005 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743856_3032 src: /192.168.158.7:35836 dest: /192.168.158.4:9866 2025-07-11 01:53:57,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35836, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_869903764_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743856_3032, duration(ns): 17245918 2025-07-11 01:53:57,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743856_3032, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-11 01:54:00,310 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743856_3032 replica FinalizedReplica, blk_1073743856_3032, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743856 for deletion 2025-07-11 01:54:00,312 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743856_3032 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743856 2025-07-11 01:55:02,037 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743857_3033 src: /192.168.158.6:49850 dest: /192.168.158.4:9866 2025-07-11 01:55:02,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49850, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1024783109_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743857_3033, duration(ns): 17885243 2025-07-11 01:55:02,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743857_3033, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-11 01:55:09,314 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743857_3033 replica FinalizedReplica, blk_1073743857_3033, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743857 for deletion 2025-07-11 01:55:09,315 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743857_3033 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743857 
2025-07-11 01:58:07,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743860_3036 src: /192.168.158.1:44460 dest: /192.168.158.4:9866 2025-07-11 01:58:07,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44460, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1571758347_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743860_3036, duration(ns): 25082123 2025-07-11 01:58:07,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743860_3036, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-11 01:58:09,321 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743860_3036 replica FinalizedReplica, blk_1073743860_3036, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743860 for deletion 2025-07-11 01:58:09,322 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743860_3036 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743860 2025-07-11 01:59:07,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743861_3037 src: /192.168.158.1:33168 dest: /192.168.158.4:9866 2025-07-11 01:59:07,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33168, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1740648185_106, 
offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743861_3037, duration(ns): 22918532
2025-07-11 01:59:07,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743861_3037, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-11 01:59:09,321 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743861_3037 replica FinalizedReplica, blk_1073743861_3037, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743861 for deletion
2025-07-11 01:59:09,322 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743861_3037 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743861
2025-07-11 02:02:07,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743864_3040 src: /192.168.158.7:41456 dest: /192.168.158.4:9866
2025-07-11 02:02:07,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41456, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1200615688_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743864_3040, duration(ns): 19715163
2025-07-11 02:02:07,051 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743864_3040, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-11 02:02:12,327 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743864_3040 replica FinalizedReplica, blk_1073743864_3040, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743864 for deletion
2025-07-11 02:02:12,328 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743864_3040 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743864
2025-07-11 02:05:07,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743867_3043 src: /192.168.158.8:59452 dest: /192.168.158.4:9866
2025-07-11 02:05:07,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59452, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2054921907_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743867_3043, duration(ns): 17257332
2025-07-11 02:05:07,053 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743867_3043, type=LAST_IN_PIPELINE terminating
2025-07-11 02:05:09,336 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743867_3043 replica FinalizedReplica, blk_1073743867_3043, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743867 for deletion
2025-07-11 02:05:09,337 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743867_3043 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743867
2025-07-11 02:06:07,043 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743868_3044 src: /192.168.158.7:54780 dest: /192.168.158.4:9866
2025-07-11 02:06:07,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1543030184_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743868_3044, duration(ns): 15209068
2025-07-11 02:06:07,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743868_3044, type=LAST_IN_PIPELINE terminating
2025-07-11 02:06:12,336 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743868_3044 replica FinalizedReplica, blk_1073743868_3044, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743868 for deletion
2025-07-11 02:06:12,337 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743868_3044 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743868
2025-07-11 02:07:07,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743869_3045 src: /192.168.158.1:34598 dest: /192.168.158.4:9866
2025-07-11 02:07:07,065 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34598, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2008446197_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743869_3045, duration(ns): 23047405
2025-07-11 02:07:07,065 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743869_3045, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-11 02:07:09,339 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743869_3045 replica FinalizedReplica, blk_1073743869_3045, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743869 for deletion
2025-07-11 02:07:09,340 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743869_3045 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073743869
2025-07-11 02:10:12,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743872_3048 src: /192.168.158.6:44258 dest: /192.168.158.4:9866
2025-07-11 02:10:12,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44258, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-789396205_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743872_3048, duration(ns): 20540159
2025-07-11 02:10:12,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743872_3048, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-11 02:10:18,346 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743872_3048 replica FinalizedReplica, blk_1073743872_3048, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743872 for deletion
2025-07-11 02:10:18,347 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743872_3048 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743872
2025-07-11 02:12:17,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743874_3050 src: /192.168.158.6:58160 dest: /192.168.158.4:9866
2025-07-11 02:12:17,065 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58160, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1224297110_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743874_3050, duration(ns): 17591475
2025-07-11 02:12:17,065 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743874_3050, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 02:12:24,351 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743874_3050 replica FinalizedReplica, blk_1073743874_3050, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743874 for deletion
2025-07-11 02:12:24,352 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743874_3050 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743874
2025-07-11 02:16:17,051 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743878_3054 src: /192.168.158.1:42944 dest: /192.168.158.4:9866
2025-07-11 02:16:17,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42944, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1017215864_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743878_3054, duration(ns): 21178354
2025-07-11 02:16:17,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743878_3054, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-11 02:16:24,358 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743878_3054 replica FinalizedReplica, blk_1073743878_3054, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743878 for deletion
2025-07-11 02:16:24,359 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743878_3054 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743878
2025-07-11 02:18:17,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743880_3056 src: /192.168.158.9:44578 dest: /192.168.158.4:9866
2025-07-11 02:18:17,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44578, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1464684731_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743880_3056, duration(ns): 19339817
2025-07-11 02:18:17,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743880_3056, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-11 02:18:21,363 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743880_3056 replica FinalizedReplica, blk_1073743880_3056, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743880 for deletion
2025-07-11 02:18:21,364 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743880_3056 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743880
2025-07-11 02:21:22,074 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743883_3059 src: /192.168.158.8:52462 dest: /192.168.158.4:9866
2025-07-11 02:21:22,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52462, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1747033158_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743883_3059, duration(ns): 14193791
2025-07-11 02:21:22,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743883_3059, type=LAST_IN_PIPELINE terminating
2025-07-11 02:21:27,371 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743883_3059 replica FinalizedReplica, blk_1073743883_3059, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743883 for deletion
2025-07-11 02:21:27,372 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743883_3059 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743883
2025-07-11 02:22:22,074 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743884_3060 src: /192.168.158.7:42842 dest: /192.168.158.4:9866
2025-07-11 02:22:22,096 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42842, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-59433475_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743884_3060, duration(ns): 16860206
2025-07-11 02:22:22,096 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743884_3060, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 02:22:24,374 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743884_3060 replica FinalizedReplica, blk_1073743884_3060, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743884 for deletion
2025-07-11 02:22:24,376 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743884_3060 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743884
2025-07-11 02:23:22,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743885_3061 src: /192.168.158.9:37338 dest: /192.168.158.4:9866
2025-07-11 02:23:22,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37338, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_597047382_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743885_3061, duration(ns): 18266333
2025-07-11 02:23:22,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743885_3061, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-11 02:23:24,380 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743885_3061 replica FinalizedReplica, blk_1073743885_3061, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743885 for deletion
2025-07-11 02:23:24,381 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743885_3061 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743885
2025-07-11 02:26:22,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743888_3064 src: /192.168.158.6:43860 dest: /192.168.158.4:9866
2025-07-11 02:26:22,158 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43860, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_449203645_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743888_3064, duration(ns): 19580641
2025-07-11 02:26:22,158 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743888_3064, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-11 02:26:27,384 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743888_3064 replica FinalizedReplica, blk_1073743888_3064, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743888 for deletion
2025-07-11 02:26:27,385 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743888_3064 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743888
2025-07-11 02:27:22,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743889_3065 src: /192.168.158.1:55954 dest: /192.168.158.4:9866
2025-07-11 02:27:22,128 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55954, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_52386463_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743889_3065, duration(ns): 23467159
2025-07-11 02:27:22,129 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743889_3065, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-11 02:27:24,387 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743889_3065 replica FinalizedReplica, blk_1073743889_3065, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743889 for deletion
2025-07-11 02:27:24,388 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743889_3065 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743889
2025-07-11 02:29:22,102 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743891_3067 src: /192.168.158.5:58422 dest: /192.168.158.4:9866
2025-07-11 02:29:22,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58422, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-309417137_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743891_3067, duration(ns): 17437872
2025-07-11 02:29:22,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743891_3067, type=LAST_IN_PIPELINE terminating
2025-07-11 02:29:24,392 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743891_3067 replica FinalizedReplica, blk_1073743891_3067, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743891 for deletion
2025-07-11 02:29:24,393 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743891_3067 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743891
2025-07-11 02:31:22,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743893_3069 src: /192.168.158.6:59690 dest: /192.168.158.4:9866
2025-07-11 02:31:22,117 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59690, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-279648762_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743893_3069, duration(ns): 16274190
2025-07-11 02:31:22,117 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743893_3069, type=LAST_IN_PIPELINE terminating
2025-07-11 02:31:24,395 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743893_3069 replica FinalizedReplica, blk_1073743893_3069, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743893 for deletion
2025-07-11 02:31:24,396 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743893_3069 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743893
2025-07-11 02:32:27,119 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743894_3070 src: /192.168.158.6:46240 dest: /192.168.158.4:9866
2025-07-11 02:32:27,145 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46240, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1197670669_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743894_3070, duration(ns): 19975112
2025-07-11 02:32:27,145 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743894_3070, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 02:32:30,396 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743894_3070 replica FinalizedReplica, blk_1073743894_3070, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743894 for deletion
2025-07-11 02:32:30,397 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743894_3070 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743894
2025-07-11 02:34:27,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743896_3072 src: /192.168.158.1:54762 dest: /192.168.158.4:9866
2025-07-11 02:34:27,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54762, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_339259813_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743896_3072, duration(ns): 21969061
2025-07-11 02:34:27,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743896_3072, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-11 02:34:33,401 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743896_3072 replica FinalizedReplica, blk_1073743896_3072, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743896 for deletion
2025-07-11 02:34:33,402 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743896_3072 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743896
2025-07-11 02:36:37,104 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743898_3074 src: /192.168.158.1:58416 dest: /192.168.158.4:9866
2025-07-11 02:36:37,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58416, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1345960870_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743898_3074, duration(ns): 24233058
2025-07-11 02:36:37,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743898_3074, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-11 02:36:39,405 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743898_3074 replica FinalizedReplica, blk_1073743898_3074, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743898 for deletion
2025-07-11 02:36:39,406 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743898_3074 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743898
2025-07-11 02:37:37,117 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743899_3075 src: /192.168.158.7:58832 dest: /192.168.158.4:9866
2025-07-11 02:37:37,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58832, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1485012382_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743899_3075, duration(ns): 21463226
2025-07-11 02:37:37,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743899_3075, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-11 02:37:39,411 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743899_3075 replica FinalizedReplica, blk_1073743899_3075, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743899 for deletion
2025-07-11 02:37:39,412 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743899_3075 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743899
2025-07-11 02:38:42,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743900_3076 src: /192.168.158.1:42590 dest: /192.168.158.4:9866
2025-07-11 02:38:42,157 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42590, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-350855875_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743900_3076, duration(ns): 22723804
2025-07-11 02:38:42,157 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743900_3076, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-11 02:38:48,413 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743900_3076 replica FinalizedReplica, blk_1073743900_3076, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743900 for deletion
2025-07-11 02:38:48,414 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743900_3076 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743900
2025-07-11 02:39:42,134 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743901_3077 src: /192.168.158.7:42676 dest: /192.168.158.4:9866
2025-07-11 02:39:42,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42676, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_662896969_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743901_3077, duration(ns): 17259732
2025-07-11 02:39:42,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743901_3077, type=LAST_IN_PIPELINE terminating
2025-07-11 02:39:45,416 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743901_3077 replica FinalizedReplica, blk_1073743901_3077, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743901 for deletion
2025-07-11 02:39:45,417 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743901_3077 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743901
2025-07-11 02:41:42,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743903_3079 src: /192.168.158.8:45190 dest: /192.168.158.4:9866
2025-07-11 02:41:42,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45190, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1707322550_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743903_3079, duration(ns): 16251708
2025-07-11 02:41:42,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743903_3079, type=LAST_IN_PIPELINE terminating
2025-07-11 02:41:45,421 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743903_3079 replica FinalizedReplica, blk_1073743903_3079, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743903 for deletion
2025-07-11 02:41:45,422 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743903_3079 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743903
2025-07-11 02:42:42,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743904_3080 src: /192.168.158.1:48898 dest: /192.168.158.4:9866
2025-07-11 02:42:42,162 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1858096646_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743904_3080, duration(ns): 20926841
2025-07-11 02:42:42,163 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743904_3080, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-11 02:42:48,424 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743904_3080 replica FinalizedReplica, blk_1073743904_3080, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743904 for deletion
2025-07-11 02:42:48,425 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743904_3080 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743904
2025-07-11 02:44:47,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743906_3082 src: /192.168.158.1:51946 dest: /192.168.158.4:9866
2025-07-11 02:44:47,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51946, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1174154807_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743906_3082, duration(ns): 20930789
2025-07-11 02:44:47,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743906_3082, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-11 02:44:51,430 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743906_3082 replica FinalizedReplica, blk_1073743906_3082, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743906 for deletion
2025-07-11 02:44:51,431 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743906_3082 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743906
2025-07-11 02:46:52,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743908_3084 src: /192.168.158.6:46696 dest: /192.168.158.4:9866
2025-07-11 02:46:52,159 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46696, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1340552315_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743908_3084, duration(ns): 14735533
2025-07-11 02:46:52,159 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743908_3084, type=LAST_IN_PIPELINE terminating
2025-07-11 02:46:54,436 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743908_3084 replica FinalizedReplica, blk_1073743908_3084, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743908 for deletion
2025-07-11 02:46:54,437 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743908_3084 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743908
2025-07-11 02:47:52,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743909_3085 src: /192.168.158.1:47742 dest: /192.168.158.4:9866
2025-07-11 02:47:52,170 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47742, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-311363482_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743909_3085, duration(ns): 22422431
2025-07-11 02:47:52,170 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743909_3085, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-11 02:47:54,439 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743909_3085 replica FinalizedReplica, blk_1073743909_3085, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743909 for deletion
2025-07-11 02:47:54,440 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743909_3085 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743909
2025-07-11 02:48:52,143 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743910_3086 src: /192.168.158.5:59554 dest: /192.168.158.4:9866
2025-07-11 02:48:52,166 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59554, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_412896203_106, offset: 
0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743910_3086, duration(ns): 19581644 2025-07-11 02:48:52,166 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743910_3086, type=LAST_IN_PIPELINE terminating 2025-07-11 02:48:54,441 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743910_3086 replica FinalizedReplica, blk_1073743910_3086, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743910 for deletion 2025-07-11 02:48:54,442 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743910_3086 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743910 2025-07-11 02:49:57,148 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743911_3087 src: /192.168.158.1:57326 dest: /192.168.158.4:9866 2025-07-11 02:49:57,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57326, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-755075356_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743911_3087, duration(ns): 22984000 2025-07-11 02:49:57,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743911_3087, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-11 02:50:03,444 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743911_3087 replica FinalizedReplica, blk_1073743911_3087, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743911 for deletion 2025-07-11 02:50:03,445 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743911_3087 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743911 2025-07-11 02:51:02,152 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743912_3088 src: /192.168.158.1:39718 dest: /192.168.158.4:9866 2025-07-11 02:51:02,185 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39718, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-449490147_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743912_3088, duration(ns): 23451887 2025-07-11 02:51:02,185 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743912_3088, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-11 02:51:06,447 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743912_3088 replica FinalizedReplica, blk_1073743912_3088, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743912 for deletion 2025-07-11 
02:51:06,448 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743912_3088 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743912 2025-07-11 02:52:02,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743913_3089 src: /192.168.158.1:52550 dest: /192.168.158.4:9866 2025-07-11 02:52:02,183 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52550, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1820970780_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743913_3089, duration(ns): 19051329 2025-07-11 02:52:02,183 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743913_3089, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-11 02:52:06,451 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743913_3089 replica FinalizedReplica, blk_1073743913_3089, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743913 for deletion 2025-07-11 02:52:06,452 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743913_3089 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743913 2025-07-11 02:55:02,153 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073743916_3092 src: /192.168.158.9:41276 dest: /192.168.158.4:9866 2025-07-11 02:55:02,176 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41276, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-22189607_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743916_3092, duration(ns): 17371158 2025-07-11 02:55:02,176 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743916_3092, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-11 02:55:06,460 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743916_3092 replica FinalizedReplica, blk_1073743916_3092, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743916 for deletion 2025-07-11 02:55:06,461 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743916_3092 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743916 2025-07-11 02:57:02,212 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743918_3094 src: /192.168.158.5:54344 dest: /192.168.158.4:9866 2025-07-11 02:57:02,230 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54344, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1774488767_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073743918_3094, duration(ns): 16087944 2025-07-11 02:57:02,230 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743918_3094, type=LAST_IN_PIPELINE terminating 2025-07-11 02:57:06,468 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743918_3094 replica FinalizedReplica, blk_1073743918_3094, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743918 for deletion 2025-07-11 02:57:06,469 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743918_3094 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743918 2025-07-11 02:59:07,157 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743920_3096 src: /192.168.158.8:52268 dest: /192.168.158.4:9866 2025-07-11 02:59:07,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52268, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1718929506_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743920_3096, duration(ns): 18629332 2025-07-11 02:59:07,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743920_3096, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-11 02:59:12,474 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743920_3096 replica FinalizedReplica, 
blk_1073743920_3096, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743920 for deletion 2025-07-11 02:59:12,475 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743920_3096 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743920 2025-07-11 03:02:12,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743923_3099 src: /192.168.158.6:56864 dest: /192.168.158.4:9866 2025-07-11 03:02:12,162 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56864, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1126010207_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743923_3099, duration(ns): 18250018 2025-07-11 03:02:12,162 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743923_3099, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-11 03:02:18,479 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743923_3099 replica FinalizedReplica, blk_1073743923_3099, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743923 for deletion 2025-07-11 03:02:18,480 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 
blk_1073743923_3099 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743923 2025-07-11 03:05:12,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743926_3102 src: /192.168.158.6:39996 dest: /192.168.158.4:9866 2025-07-11 03:05:12,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39996, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_552959992_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743926_3102, duration(ns): 14746623 2025-07-11 03:05:12,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743926_3102, type=LAST_IN_PIPELINE terminating 2025-07-11 03:05:18,482 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743926_3102 replica FinalizedReplica, blk_1073743926_3102, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743926 for deletion 2025-07-11 03:05:18,484 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743926_3102 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743926 2025-07-11 03:09:22,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743930_3106 src: /192.168.158.6:36322 dest: /192.168.158.4:9866 2025-07-11 03:09:22,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36322, dest: /192.168.158.4:9866, 
bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-107885727_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743930_3106, duration(ns): 16695663 2025-07-11 03:09:22,175 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743930_3106, type=LAST_IN_PIPELINE terminating 2025-07-11 03:09:24,491 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743930_3106 replica FinalizedReplica, blk_1073743930_3106, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743930 for deletion 2025-07-11 03:09:24,493 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743930_3106 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743930 2025-07-11 03:11:22,162 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743932_3108 src: /192.168.158.5:38346 dest: /192.168.158.4:9866 2025-07-11 03:11:22,179 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38346, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1545964219_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743932_3108, duration(ns): 14192707 2025-07-11 03:11:22,179 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743932_3108, type=LAST_IN_PIPELINE terminating 2025-07-11 03:11:27,495 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743932_3108 replica FinalizedReplica, blk_1073743932_3108, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743932 for deletion 2025-07-11 03:11:27,496 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743932_3108 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743932 2025-07-11 03:12:22,159 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743933_3109 src: /192.168.158.9:37816 dest: /192.168.158.4:9866 2025-07-11 03:12:22,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37816, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_20883615_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743933_3109, duration(ns): 19241301 2025-07-11 03:12:22,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743933_3109, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-11 03:12:24,498 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743933_3109 replica FinalizedReplica, blk_1073743933_3109, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743933 for deletion 2025-07-11 03:12:24,499 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743933_3109 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743933 2025-07-11 03:20:32,173 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743941_3117 src: /192.168.158.1:52236 dest: /192.168.158.4:9866 2025-07-11 03:20:32,205 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52236, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_718349986_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743941_3117, duration(ns): 23169870 2025-07-11 03:20:32,206 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743941_3117, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-11 03:20:36,508 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743941_3117 replica FinalizedReplica, blk_1073743941_3117, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743941 for deletion 2025-07-11 03:20:36,510 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743941_3117 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743941 2025-07-11 03:21:32,188 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743942_3118 
src: /192.168.158.7:54382 dest: /192.168.158.4:9866 2025-07-11 03:21:32,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54382, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1912647020_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743942_3118, duration(ns): 16485832 2025-07-11 03:21:32,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743942_3118, type=LAST_IN_PIPELINE terminating 2025-07-11 03:21:36,510 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743942_3118 replica FinalizedReplica, blk_1073743942_3118, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743942 for deletion 2025-07-11 03:21:36,512 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743942_3118 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743942 2025-07-11 03:22:32,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743943_3119 src: /192.168.158.8:46148 dest: /192.168.158.4:9866 2025-07-11 03:22:32,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46148, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1366344362_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743943_3119, duration(ns): 15308481 2025-07-11 03:22:32,200 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743943_3119, type=LAST_IN_PIPELINE terminating 2025-07-11 03:22:36,511 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743943_3119 replica FinalizedReplica, blk_1073743943_3119, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743943 for deletion 2025-07-11 03:22:36,513 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743943_3119 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743943 2025-07-11 03:24:32,186 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743945_3121 src: /192.168.158.9:42296 dest: /192.168.158.4:9866 2025-07-11 03:24:32,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42296, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-641746230_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743945_3121, duration(ns): 19602531 2025-07-11 03:24:32,212 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743945_3121, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-11 03:24:36,519 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743945_3121 replica FinalizedReplica, blk_1073743945_3121, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743945 for deletion 2025-07-11 03:24:36,520 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743945_3121 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743945 2025-07-11 03:26:32,225 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743947_3123 src: /192.168.158.5:60748 dest: /192.168.158.4:9866 2025-07-11 03:26:32,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60748, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-686719396_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743947_3123, duration(ns): 17769066 2025-07-11 03:26:32,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743947_3123, type=LAST_IN_PIPELINE terminating 2025-07-11 03:26:36,520 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743947_3123 replica FinalizedReplica, blk_1073743947_3123, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743947 for deletion 2025-07-11 03:26:36,521 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743947_3123 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743947 2025-07-11 03:27:32,192 
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743948_3124 src: /192.168.158.7:37078 dest: /192.168.158.4:9866 2025-07-11 03:27:32,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37078, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_329500816_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743948_3124, duration(ns): 14247712 2025-07-11 03:27:32,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743948_3124, type=LAST_IN_PIPELINE terminating 2025-07-11 03:27:36,522 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743948_3124 replica FinalizedReplica, blk_1073743948_3124, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743948 for deletion 2025-07-11 03:27:36,524 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743948_3124 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743948 2025-07-11 03:28:32,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743949_3125 src: /192.168.158.5:47534 dest: /192.168.158.4:9866 2025-07-11 03:28:32,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47534, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1790327059_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073743949_3125, duration(ns): 17190053
2025-07-11 03:28:32,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743949_3125, type=LAST_IN_PIPELINE terminating
2025-07-11 03:28:36,524 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743949_3125 replica FinalizedReplica, blk_1073743949_3125, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743949 for deletion
2025-07-11 03:28:36,526 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743949_3125 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743949
2025-07-11 03:29:37,196 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743950_3126 src: /192.168.158.1:46146 dest: /192.168.158.4:9866
2025-07-11 03:29:37,230 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46146, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_402791660_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743950_3126, duration(ns): 25298931
2025-07-11 03:29:37,230 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743950_3126, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-11 03:29:39,527 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743950_3126 replica FinalizedReplica, blk_1073743950_3126, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743950 for deletion
2025-07-11 03:29:39,528 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743950_3126 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743950
2025-07-11 03:32:37,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743953_3129 src: /192.168.158.5:38748 dest: /192.168.158.4:9866
2025-07-11 03:32:37,225 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38748, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_622388534_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743953_3129, duration(ns): 15824414
2025-07-11 03:32:37,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743953_3129, type=LAST_IN_PIPELINE terminating
2025-07-11 03:32:42,529 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743953_3129 replica FinalizedReplica, blk_1073743953_3129, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743953 for deletion
2025-07-11 03:32:42,530 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743953_3129 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743953
2025-07-11 03:33:37,217 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743954_3130 src: /192.168.158.9:44502 dest: /192.168.158.4:9866
2025-07-11 03:33:37,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44502, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-269948692_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743954_3130, duration(ns): 14748258
2025-07-11 03:33:37,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743954_3130, type=LAST_IN_PIPELINE terminating
2025-07-11 03:33:39,530 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743954_3130 replica FinalizedReplica, blk_1073743954_3130, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743954 for deletion
2025-07-11 03:33:39,531 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743954_3130 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743954
2025-07-11 03:34:37,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743955_3131 src: /192.168.158.1:49224 dest: /192.168.158.4:9866
2025-07-11 03:34:37,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49224, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1415419840_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743955_3131, duration(ns): 24747142
2025-07-11 03:34:37,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743955_3131, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-11 03:34:39,533 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743955_3131 replica FinalizedReplica, blk_1073743955_3131, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743955 for deletion
2025-07-11 03:34:39,534 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743955_3131 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743955
2025-07-11 03:35:37,224 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743956_3132 src: /192.168.158.9:52674 dest: /192.168.158.4:9866
2025-07-11 03:35:37,241 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52674, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-447348307_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743956_3132, duration(ns): 15033852
2025-07-11 03:35:37,242 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743956_3132, type=LAST_IN_PIPELINE terminating
2025-07-11 03:35:39,533 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743956_3132 replica FinalizedReplica, blk_1073743956_3132, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743956 for deletion
2025-07-11 03:35:39,534 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743956_3132 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743956
2025-07-11 03:37:37,221 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743958_3134 src: /192.168.158.5:39280 dest: /192.168.158.4:9866
2025-07-11 03:37:37,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39280, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1297504824_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743958_3134, duration(ns): 13805307
2025-07-11 03:37:37,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743958_3134, type=LAST_IN_PIPELINE terminating
2025-07-11 03:37:39,534 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743958_3134 replica FinalizedReplica, blk_1073743958_3134, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743958 for deletion
2025-07-11 03:37:39,535 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743958_3134 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743958
2025-07-11 03:38:37,221 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743959_3135 src: /192.168.158.1:41208 dest: /192.168.158.4:9866
2025-07-11 03:38:37,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41208, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1832697553_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743959_3135, duration(ns): 22155154
2025-07-11 03:38:37,253 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743959_3135, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-11 03:38:39,539 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743959_3135 replica FinalizedReplica, blk_1073743959_3135, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743959 for deletion
2025-07-11 03:38:39,540 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743959_3135 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743959
2025-07-11 03:39:37,232 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743960_3136 src: /192.168.158.5:40498 dest: /192.168.158.4:9866
2025-07-11 03:39:37,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40498, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1838446191_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743960_3136, duration(ns): 18528835
2025-07-11 03:39:37,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743960_3136, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 03:39:39,543 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743960_3136 replica FinalizedReplica, blk_1073743960_3136, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743960 for deletion
2025-07-11 03:39:39,544 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743960_3136 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743960
2025-07-11 03:40:37,240 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743961_3137 src: /192.168.158.8:49086 dest: /192.168.158.4:9866
2025-07-11 03:40:37,263 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49086, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2147400963_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743961_3137, duration(ns): 18147099
2025-07-11 03:40:37,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743961_3137, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-11 03:40:39,546 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743961_3137 replica FinalizedReplica, blk_1073743961_3137, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743961 for deletion
2025-07-11 03:40:39,547 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743961_3137 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743961
2025-07-11 03:42:42,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743963_3139 src: /192.168.158.8:43800 dest: /192.168.158.4:9866
2025-07-11 03:42:42,283 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43800, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-930465276_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743963_3139, duration(ns): 19879946
2025-07-11 03:42:42,284 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743963_3139, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-11 03:42:45,548 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743963_3139 replica FinalizedReplica, blk_1073743963_3139, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743963 for deletion
2025-07-11 03:42:45,550 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743963_3139 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743963
2025-07-11 03:47:52,275 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743968_3144 src: /192.168.158.1:45966 dest: /192.168.158.4:9866
2025-07-11 03:47:52,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45966, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_296899952_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743968_3144, duration(ns): 22413370
2025-07-11 03:47:52,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743968_3144, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-11 03:47:57,567 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743968_3144 replica FinalizedReplica, blk_1073743968_3144, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743968 for deletion
2025-07-11 03:47:57,569 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743968_3144 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743968
2025-07-11 03:50:52,276 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743971_3147 src: /192.168.158.1:41872 dest: /192.168.158.4:9866
2025-07-11 03:50:52,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41872, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-21431809_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743971_3147, duration(ns): 24188631
2025-07-11 03:50:52,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743971_3147, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-11 03:50:57,573 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743971_3147 replica FinalizedReplica, blk_1073743971_3147, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743971 for deletion
2025-07-11 03:50:57,574 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743971_3147 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743971
2025-07-11 03:52:57,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743973_3149 src: /192.168.158.7:59786 dest: /192.168.158.4:9866
2025-07-11 03:52:57,295 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59786, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-717839890_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743973_3149, duration(ns): 15254194
2025-07-11 03:52:57,295 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743973_3149, type=LAST_IN_PIPELINE terminating
2025-07-11 03:53:03,573 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743973_3149 replica FinalizedReplica, blk_1073743973_3149, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743973 for deletion
2025-07-11 03:53:03,574 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743973_3149 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743973
2025-07-11 03:54:57,276 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743975_3151 src: /192.168.158.8:46488 dest: /192.168.158.4:9866
2025-07-11 03:54:57,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46488, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1859616667_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743975_3151, duration(ns): 19889579
2025-07-11 03:54:57,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743975_3151, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-11 03:55:00,576 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743975_3151 replica FinalizedReplica, blk_1073743975_3151, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743975 for deletion
2025-07-11 03:55:00,577 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743975_3151 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743975
2025-07-11 04:00:57,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743981_3157 src: /192.168.158.1:47946 dest: /192.168.158.4:9866
2025-07-11 04:00:57,312 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47946, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_352458248_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743981_3157, duration(ns): 21156798
2025-07-11 04:00:57,312 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743981_3157, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-11 04:01:00,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743981_3157 replica FinalizedReplica, blk_1073743981_3157, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743981 for deletion
2025-07-11 04:01:00,584 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743981_3157 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743981
2025-07-11 04:02:02,299 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743982_3158 src: /192.168.158.1:43972 dest: /192.168.158.4:9866
2025-07-11 04:02:02,332 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43972, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_247226279_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743982_3158, duration(ns): 23978503
2025-07-11 04:02:02,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743982_3158, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-11 04:02:09,585 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743982_3158 replica FinalizedReplica, blk_1073743982_3158, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743982 for deletion
2025-07-11 04:02:09,586 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743982_3158 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743982
2025-07-11 04:03:02,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743983_3159 src: /192.168.158.1:45828 dest: /192.168.158.4:9866
2025-07-11 04:03:02,335 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45828, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1269687378_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743983_3159, duration(ns): 23240311
2025-07-11 04:03:02,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743983_3159, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-11 04:03:06,585 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743983_3159 replica FinalizedReplica, blk_1073743983_3159, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743983 for deletion
2025-07-11 04:03:06,587 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743983_3159 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743983
2025-07-11 04:06:02,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743986_3162 src: /192.168.158.7:59528 dest: /192.168.158.4:9866
2025-07-11 04:06:02,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59528, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-10860196_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743986_3162, duration(ns): 17406659
2025-07-11 04:06:02,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743986_3162, type=LAST_IN_PIPELINE terminating
2025-07-11 04:06:09,595 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743986_3162 replica FinalizedReplica, blk_1073743986_3162, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743986 for deletion
2025-07-11 04:06:09,596 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743986_3162 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743986
2025-07-11 04:07:07,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743987_3163 src: /192.168.158.5:57500 dest: /192.168.158.4:9866
2025-07-11 04:07:07,335 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57500, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1388365350_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743987_3163, duration(ns): 21285649
2025-07-11 04:07:07,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743987_3163, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-11 04:07:09,600 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743987_3163 replica FinalizedReplica, blk_1073743987_3163, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743987 for deletion
2025-07-11 04:07:09,601 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743987_3163 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743987
2025-07-11 04:10:07,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743990_3166 src: /192.168.158.7:49572 dest: /192.168.158.4:9866
2025-07-11 04:10:07,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49572, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1956900289_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743990_3166, duration(ns): 21390339
2025-07-11 04:10:07,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743990_3166, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 04:10:09,602 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743990_3166 replica FinalizedReplica, blk_1073743990_3166, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743990 for deletion
2025-07-11 04:10:09,603 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743990_3166 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743990
2025-07-11 04:12:07,377 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743992_3168 src: /192.168.158.1:59246 dest: /192.168.158.4:9866
2025-07-11 04:12:07,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59246, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_372113599_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743992_3168, duration(ns): 23414548
2025-07-11 04:12:07,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743992_3168, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-11 04:12:12,639 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743992_3168 replica FinalizedReplica, blk_1073743992_3168, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743992 for deletion
2025-07-11 04:12:12,640 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743992_3168 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743992
2025-07-11 04:15:07,348 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743995_3171 src: /192.168.158.9:44200 dest: /192.168.158.4:9866
2025-07-11 04:15:07,374 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44200, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_744762356_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743995_3171, duration(ns): 20428700
2025-07-11 04:15:07,374 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743995_3171, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-11 04:15:12,645 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743995_3171 replica FinalizedReplica, blk_1073743995_3171, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743995 for deletion
2025-07-11 04:15:12,646 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743995_3171 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743995
2025-07-11 04:16:07,328 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743996_3172 src: /192.168.158.1:49782 dest: /192.168.158.4:9866
2025-07-11 04:16:07,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49782, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_162513158_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743996_3172, duration(ns): 22144713
2025-07-11 04:16:07,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743996_3172, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-11 04:16:09,647 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743996_3172 replica FinalizedReplica, blk_1073743996_3172, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743996 for deletion
2025-07-11 04:16:09,648 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743996_3172 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743996
2025-07-11 04:17:07,326 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073743997_3173 src: /192.168.158.9:43702 dest: /192.168.158.4:9866
2025-07-11 04:17:07,343 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43702, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1720531587_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073743997_3173, duration(ns): 15057415
2025-07-11 04:17:07,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073743997_3173, type=LAST_IN_PIPELINE terminating
2025-07-11 04:17:12,649 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073743997_3173 replica FinalizedReplica, blk_1073743997_3173, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743997 for deletion
2025-07-11 04:17:12,650 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073743997_3173 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073743997
2025-07-11 04:20:12,348 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744000_3176 src: /192.168.158.8:52896 dest: /192.168.158.4:9866
2025-07-11 04:20:12,373 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52896, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2052215082_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744000_3176, duration(ns): 19032637
2025-07-11 04:20:12,373 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744000_3176, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-11 04:20:15,655 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744000_3176 replica FinalizedReplica, blk_1073744000_3176, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744000 for
deletion 2025-07-11 04:20:15,656 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744000_3176 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744000 2025-07-11 04:22:12,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744002_3178 src: /192.168.158.9:55032 dest: /192.168.158.4:9866 2025-07-11 04:22:12,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55032, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-524228436_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744002_3178, duration(ns): 17829771 2025-07-11 04:22:12,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744002_3178, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-11 04:22:15,662 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744002_3178 replica FinalizedReplica, blk_1073744002_3178, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744002 for deletion 2025-07-11 04:22:15,663 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744002_3178 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744002 2025-07-11 04:26:17,357 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073744006_3182 src: /192.168.158.8:42120 dest: /192.168.158.4:9866 2025-07-11 04:26:17,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42120, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1380860239_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744006_3182, duration(ns): 19903013 2025-07-11 04:26:17,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744006_3182, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-11 04:26:21,674 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744006_3182 replica FinalizedReplica, blk_1073744006_3182, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744006 for deletion 2025-07-11 04:26:21,675 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744006_3182 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744006 2025-07-11 04:31:22,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744011_3187 src: /192.168.158.1:35942 dest: /192.168.158.4:9866 2025-07-11 04:31:22,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35942, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_154754860_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073744011_3187, duration(ns): 21000628 2025-07-11 04:31:22,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744011_3187, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-11 04:31:24,687 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744011_3187 replica FinalizedReplica, blk_1073744011_3187, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744011 for deletion 2025-07-11 04:31:24,688 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744011_3187 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744011 2025-07-11 04:34:22,357 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744014_3190 src: /192.168.158.1:45128 dest: /192.168.158.4:9866 2025-07-11 04:34:22,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45128, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-72264978_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744014_3190, duration(ns): 20203210 2025-07-11 04:34:22,387 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744014_3190, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-11 04:34:27,692 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744014_3190 replica FinalizedReplica, blk_1073744014_3190, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744014 for deletion 2025-07-11 04:34:27,693 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744014_3190 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744014 2025-07-11 04:35:27,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744015_3191 src: /192.168.158.9:54910 dest: /192.168.158.4:9866 2025-07-11 04:35:27,374 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54910, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_210103863_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744015_3191, duration(ns): 14352038 2025-07-11 04:35:27,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744015_3191, type=LAST_IN_PIPELINE terminating 2025-07-11 04:35:33,692 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744015_3191 replica FinalizedReplica, blk_1073744015_3191, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744015 for deletion 2025-07-11 04:35:33,693 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744015_3191 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744015 2025-07-11 04:38:27,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744018_3194 src: /192.168.158.6:47302 dest: /192.168.158.4:9866 2025-07-11 04:38:27,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47302, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1043578885_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744018_3194, duration(ns): 20406296 2025-07-11 04:38:27,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744018_3194, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-11 04:38:33,703 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744018_3194 replica FinalizedReplica, blk_1073744018_3194, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744018 for deletion 2025-07-11 04:38:33,704 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744018_3194 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744018 2025-07-11 04:39:27,369 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744019_3195 src: 
/192.168.158.1:40376 dest: /192.168.158.4:9866 2025-07-11 04:39:27,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40376, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1812449003_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744019_3195, duration(ns): 22448338 2025-07-11 04:39:27,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744019_3195, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-11 04:39:30,705 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744019_3195 replica FinalizedReplica, blk_1073744019_3195, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744019 for deletion 2025-07-11 04:39:30,706 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744019_3195 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744019 2025-07-11 04:40:27,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744020_3196 src: /192.168.158.9:54850 dest: /192.168.158.4:9866 2025-07-11 04:40:27,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54850, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1800043741_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744020_3196, duration(ns): 18103033 2025-07-11 
04:40:27,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744020_3196, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-11 04:40:30,707 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744020_3196 replica FinalizedReplica, blk_1073744020_3196, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744020 for deletion 2025-07-11 04:40:30,708 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744020_3196 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744020 2025-07-11 04:42:32,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744022_3198 src: /192.168.158.9:45674 dest: /192.168.158.4:9866 2025-07-11 04:42:32,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45674, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-512602902_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744022_3198, duration(ns): 18773811 2025-07-11 04:42:32,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744022_3198, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-11 04:42:36,713 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744022_3198 replica FinalizedReplica, blk_1073744022_3198, FINALIZED getNumBytes() = 56 
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744022 for deletion 2025-07-11 04:42:36,714 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744022_3198 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744022 2025-07-11 04:43:32,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744023_3199 src: /192.168.158.1:60600 dest: /192.168.158.4:9866 2025-07-11 04:43:32,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60600, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2141736212_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744023_3199, duration(ns): 20368362 2025-07-11 04:43:32,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744023_3199, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-11 04:43:33,716 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744023_3199 replica FinalizedReplica, blk_1073744023_3199, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744023 for deletion 2025-07-11 04:43:33,717 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744023_3199 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744023 2025-07-11 04:45:32,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744025_3201 src: /192.168.158.7:35706 dest: /192.168.158.4:9866 2025-07-11 04:45:32,426 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35706, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1297271963_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744025_3201, duration(ns): 23414554 2025-07-11 04:45:32,426 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744025_3201, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-11 04:45:39,719 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744025_3201 replica FinalizedReplica, blk_1073744025_3201, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744025 for deletion 2025-07-11 04:45:39,720 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744025_3201 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744025 2025-07-11 04:46:32,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744026_3202 src: /192.168.158.5:41370 dest: /192.168.158.4:9866 2025-07-11 04:46:32,421 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41370, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1400129740_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744026_3202, duration(ns): 17843899 2025-07-11 04:46:32,421 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744026_3202, type=LAST_IN_PIPELINE terminating 2025-07-11 04:46:36,721 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744026_3202 replica FinalizedReplica, blk_1073744026_3202, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744026 for deletion 2025-07-11 04:46:36,722 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744026_3202 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744026 2025-07-11 04:47:32,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744027_3203 src: /192.168.158.7:44144 dest: /192.168.158.4:9866 2025-07-11 04:47:32,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44144, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1936998176_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744027_3203, duration(ns): 18913767 2025-07-11 04:47:32,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744027_3203, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 
2025-07-11 04:47:33,725 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744027_3203 replica FinalizedReplica, blk_1073744027_3203, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744027 for deletion 2025-07-11 04:47:33,727 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744027_3203 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744027 2025-07-11 04:48:32,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744028_3204 src: /192.168.158.8:34158 dest: /192.168.158.4:9866 2025-07-11 04:48:32,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34158, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2098846142_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744028_3204, duration(ns): 14295091 2025-07-11 04:48:32,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744028_3204, type=LAST_IN_PIPELINE terminating 2025-07-11 04:48:33,727 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744028_3204 replica FinalizedReplica, blk_1073744028_3204, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744028 for deletion 2025-07-11 04:48:33,729 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744028_3204 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744028 2025-07-11 04:50:32,387 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744030_3206 src: /192.168.158.9:52714 dest: /192.168.158.4:9866 2025-07-11 04:50:32,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52714, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1870112988_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744030_3206, duration(ns): 15485619 2025-07-11 04:50:32,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744030_3206, type=LAST_IN_PIPELINE terminating 2025-07-11 04:50:33,728 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744030_3206 replica FinalizedReplica, blk_1073744030_3206, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744030 for deletion 2025-07-11 04:50:33,729 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744030_3206 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744030 2025-07-11 04:51:32,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744031_3207 src: /192.168.158.1:57476 dest: /192.168.158.4:9866 2025-07-11 
04:51:32,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57476, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_410513346_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744031_3207, duration(ns): 21530128 2025-07-11 04:51:32,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744031_3207, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-11 04:51:36,734 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744031_3207 replica FinalizedReplica, blk_1073744031_3207, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744031 for deletion 2025-07-11 04:51:36,735 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744031_3207 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744031 2025-07-11 04:54:47,425 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744034_3210 src: /192.168.158.8:60872 dest: /192.168.158.4:9866 2025-07-11 04:54:47,441 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60872, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1518081072_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744034_3210, duration(ns): 14193740 2025-07-11 04:54:47,441 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744034_3210, type=LAST_IN_PIPELINE terminating
2025-07-11 04:54:48,745 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744034_3210 replica FinalizedReplica, blk_1073744034_3210, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744034 for deletion
2025-07-11 04:54:48,746 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744034_3210 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744034
2025-07-11 04:55:47,432 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744035_3211 src: /192.168.158.6:57606 dest: /192.168.158.4:9866
2025-07-11 04:55:47,448 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57606, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1621306227_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744035_3211, duration(ns): 14325006
2025-07-11 04:55:47,448 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744035_3211, type=LAST_IN_PIPELINE terminating
2025-07-11 04:55:54,745 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744035_3211 replica FinalizedReplica, blk_1073744035_3211, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744035 for deletion
2025-07-11 04:55:54,746 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744035_3211 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744035
2025-07-11 04:58:47,412 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744038_3214 src: /192.168.158.9:34304 dest: /192.168.158.4:9866
2025-07-11 04:58:47,430 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34304, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1746151133_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744038_3214, duration(ns): 15932795
2025-07-11 04:58:47,430 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744038_3214, type=LAST_IN_PIPELINE terminating
2025-07-11 04:58:48,750 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744038_3214 replica FinalizedReplica, blk_1073744038_3214, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744038 for deletion
2025-07-11 04:58:48,751 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744038_3214 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744038
2025-07-11 05:00:47,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744040_3216 src: /192.168.158.9:38810 dest: /192.168.158.4:9866
2025-07-11 05:00:47,424 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38810, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1706322745_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744040_3216, duration(ns): 15609603
2025-07-11 05:00:47,425 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744040_3216, type=LAST_IN_PIPELINE terminating
2025-07-11 05:00:48,756 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744040_3216 replica FinalizedReplica, blk_1073744040_3216, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744040 for deletion
2025-07-11 05:00:48,757 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744040_3216 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744040
2025-07-11 05:03:52,419 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744043_3219 src: /192.168.158.7:39516 dest: /192.168.158.4:9866
2025-07-11 05:03:52,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39516, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1127630291_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744043_3219, duration(ns): 17107419
2025-07-11 05:03:52,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744043_3219, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 05:03:54,763 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744043_3219 replica FinalizedReplica, blk_1073744043_3219, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744043 for deletion
2025-07-11 05:03:54,764 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744043_3219 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744043
2025-07-11 05:04:57,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744044_3220 src: /192.168.158.1:33486 dest: /192.168.158.4:9866
2025-07-11 05:04:57,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33486, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-602033229_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744044_3220, duration(ns): 21743535
2025-07-11 05:04:57,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744044_3220, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-11 05:05:00,764 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744044_3220 replica FinalizedReplica, blk_1073744044_3220, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744044 for deletion
2025-07-11 05:05:00,765 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744044_3220 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744044
2025-07-11 05:09:02,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744048_3224 src: /192.168.158.1:33872 dest: /192.168.158.4:9866
2025-07-11 05:09:02,450 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33872, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-239671961_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744048_3224, duration(ns): 22796659
2025-07-11 05:09:02,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744048_3224, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-11 05:09:06,768 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744048_3224 replica FinalizedReplica, blk_1073744048_3224, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744048 for deletion
2025-07-11 05:09:06,769 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744048_3224 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744048
2025-07-11 05:12:07,441 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744051_3227 src: /192.168.158.7:53532 dest: /192.168.158.4:9866
2025-07-11 05:12:07,458 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53532, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1730516601_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744051_3227, duration(ns): 14726163
2025-07-11 05:12:07,458 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744051_3227, type=LAST_IN_PIPELINE terminating
2025-07-11 05:12:09,773 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744051_3227 replica FinalizedReplica, blk_1073744051_3227, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744051 for deletion
2025-07-11 05:12:09,774 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744051_3227 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744051
2025-07-11 05:13:12,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744052_3228 src: /192.168.158.1:45180 dest: /192.168.158.4:9866
2025-07-11 05:13:12,469 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45180, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1887501856_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744052_3228, duration(ns): 21331025
2025-07-11 05:13:12,469 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744052_3228, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-11 05:13:18,775 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744052_3228 replica FinalizedReplica, blk_1073744052_3228, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744052 for deletion
2025-07-11 05:13:18,776 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744052_3228 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744052
2025-07-11 05:14:12,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744053_3229 src: /192.168.158.5:36526 dest: /192.168.158.4:9866
2025-07-11 05:14:12,487 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36526, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1132821914_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744053_3229, duration(ns): 16743419
2025-07-11 05:14:12,487 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744053_3229, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-11 05:14:15,778 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744053_3229 replica FinalizedReplica, blk_1073744053_3229, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744053 for deletion
2025-07-11 05:14:15,779 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744053_3229 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744053
2025-07-11 05:15:12,466 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744054_3230 src: /192.168.158.6:42510 dest: /192.168.158.4:9866
2025-07-11 05:15:12,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42510, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_996822838_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744054_3230, duration(ns): 14902118
2025-07-11 05:15:12,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744054_3230, type=LAST_IN_PIPELINE terminating
2025-07-11 05:15:15,779 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744054_3230 replica FinalizedReplica, blk_1073744054_3230, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744054 for deletion
2025-07-11 05:15:15,781 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744054_3230 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744054
2025-07-11 05:17:22,468 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744056_3232 src: /192.168.158.5:56416 dest: /192.168.158.4:9866
2025-07-11 05:17:22,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56416, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1849983813_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744056_3232, duration(ns): 20794636
2025-07-11 05:17:22,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744056_3232, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-11 05:17:24,785 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744056_3232 replica FinalizedReplica, blk_1073744056_3232, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744056 for deletion
2025-07-11 05:17:24,786 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744056_3232 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744056
2025-07-11 05:21:37,484 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744060_3236 src: /192.168.158.5:57096 dest: /192.168.158.4:9866
2025-07-11 05:21:37,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57096, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2097726515_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744060_3236, duration(ns): 16496252
2025-07-11 05:21:37,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744060_3236, type=LAST_IN_PIPELINE terminating
2025-07-11 05:21:39,796 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744060_3236 replica FinalizedReplica, blk_1073744060_3236, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744060 for deletion
2025-07-11 05:21:39,797 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744060_3236 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744060
2025-07-11 05:22:42,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744061_3237 src: /192.168.158.7:59928 dest: /192.168.158.4:9866
2025-07-11 05:22:42,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59928, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1093467757_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744061_3237, duration(ns): 19471811
2025-07-11 05:22:42,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744061_3237, type=LAST_IN_PIPELINE terminating
2025-07-11 05:22:45,798 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744061_3237 replica FinalizedReplica, blk_1073744061_3237, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744061 for deletion
2025-07-11 05:22:45,799 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744061_3237 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744061
2025-07-11 05:23:42,480 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744062_3238 src: /192.168.158.1:53752 dest: /192.168.158.4:9866
2025-07-11 05:23:42,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53752, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1829888420_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744062_3238, duration(ns): 23110418
2025-07-11 05:23:42,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744062_3238, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-11 05:23:45,802 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744062_3238 replica FinalizedReplica, blk_1073744062_3238, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744062 for deletion
2025-07-11 05:23:45,804 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744062_3238 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744062
2025-07-11 05:26:52,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744065_3241 src: /192.168.158.5:40220 dest: /192.168.158.4:9866
2025-07-11 05:26:52,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40220, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_303327522_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744065_3241, duration(ns): 18567013
2025-07-11 05:26:52,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744065_3241, type=LAST_IN_PIPELINE terminating
2025-07-11 05:26:54,808 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744065_3241 replica FinalizedReplica, blk_1073744065_3241, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744065 for deletion
2025-07-11 05:26:54,809 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744065_3241 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744065
2025-07-11 05:27:52,489 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744066_3242 src: /192.168.158.1:34620 dest: /192.168.158.4:9866
2025-07-11 05:27:52,523 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34620, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1040789189_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744066_3242, duration(ns): 26047999
2025-07-11 05:27:52,524 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744066_3242, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-11 05:27:54,812 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744066_3242 replica FinalizedReplica, blk_1073744066_3242, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744066 for deletion
2025-07-11 05:27:54,813 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744066_3242 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744066
2025-07-11 05:29:57,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744068_3244 src: /192.168.158.6:53138 dest: /192.168.158.4:9866
2025-07-11 05:29:57,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53138, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1233063519_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744068_3244, duration(ns): 16313945
2025-07-11 05:29:57,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744068_3244, type=LAST_IN_PIPELINE terminating
2025-07-11 05:30:00,816 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744068_3244 replica FinalizedReplica, blk_1073744068_3244, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744068 for deletion
2025-07-11 05:30:00,817 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744068_3244 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744068
2025-07-11 05:30:57,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744069_3245 src: /192.168.158.6:45478 dest: /192.168.158.4:9866
2025-07-11 05:30:57,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45478, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_482555776_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744069_3245, duration(ns): 15492857
2025-07-11 05:30:57,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744069_3245, type=LAST_IN_PIPELINE terminating
2025-07-11 05:31:00,819 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744069_3245 replica FinalizedReplica, blk_1073744069_3245, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744069 for deletion
2025-07-11 05:31:00,820 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744069_3245 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744069
2025-07-11 05:32:02,499 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744070_3246 src: /192.168.158.6:47588 dest: /192.168.158.4:9866
2025-07-11 05:32:02,524 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47588, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2097343911_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744070_3246, duration(ns): 20253222
2025-07-11 05:32:02,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744070_3246, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-11 05:32:03,822 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744070_3246 replica FinalizedReplica, blk_1073744070_3246, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744070 for deletion
2025-07-11 05:32:03,823 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744070_3246 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744070
2025-07-11 05:33:07,490 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744071_3247 src: /192.168.158.6:33766 dest: /192.168.158.4:9866
2025-07-11 05:33:07,508 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33766, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-533937178_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744071_3247, duration(ns): 15748325
2025-07-11 05:33:07,508 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744071_3247, type=LAST_IN_PIPELINE terminating
2025-07-11 05:33:12,824 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744071_3247 replica FinalizedReplica, blk_1073744071_3247, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744071 for deletion
2025-07-11 05:33:12,825 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744071_3247 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744071
2025-07-11 05:36:07,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744074_3250 src: /192.168.158.1:34992 dest: /192.168.158.4:9866
2025-07-11 05:36:07,530 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34992, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_64308697_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744074_3250, duration(ns): 23049310
2025-07-11 05:36:07,530 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744074_3250, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-11 05:36:09,830 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744074_3250 replica FinalizedReplica, blk_1073744074_3250, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744074 for deletion
2025-07-11 05:36:09,831 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744074_3250 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744074
2025-07-11 05:36:13,268 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-11 05:37:18,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f2b, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 1 msec to generate and 3 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-11 05:37:18,837 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-11 05:38:07,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744076_3252 src: /192.168.158.5:35926 dest: /192.168.158.4:9866
2025-07-11 05:38:07,530 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35926, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1198989234_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744076_3252, duration(ns): 19836434
2025-07-11 05:38:07,530 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744076_3252, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 05:38:09,835 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744076_3252 replica FinalizedReplica, blk_1073744076_3252, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744076 for deletion
2025-07-11 05:38:09,836 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744076_3252 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744076
2025-07-11 05:40:07,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744078_3254 src: /192.168.158.1:53498 dest: /192.168.158.4:9866
2025-07-11 05:40:07,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53498, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1595382245_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744078_3254, duration(ns): 22544109
2025-07-11 05:40:07,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744078_3254, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-11 05:40:09,838 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744078_3254 replica FinalizedReplica, blk_1073744078_3254, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744078 for deletion
2025-07-11 05:40:09,839 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744078_3254 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744078
2025-07-11 05:43:12,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744081_3257 src: /192.168.158.7:47158 dest: /192.168.158.4:9866
2025-07-11 05:43:12,587 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47158, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-677636293_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744081_3257, duration(ns): 16871237
2025-07-11 05:43:12,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744081_3257, type=LAST_IN_PIPELINE terminating
2025-07-11 05:43:15,845 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744081_3257 replica FinalizedReplica, blk_1073744081_3257, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744081 for deletion
2025-07-11 05:43:15,846 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744081_3257 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744081
2025-07-11 05:44:12,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744082_3258 src: /192.168.158.9:52242 dest: /192.168.158.4:9866
2025-07-11 05:44:12,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52242, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-60318266_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744082_3258, duration(ns): 16302737
2025-07-11 05:44:12,609 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744082_3258, type=LAST_IN_PIPELINE terminating
2025-07-11 05:44:15,848 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744082_3258 replica FinalizedReplica, blk_1073744082_3258, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744082 for deletion
2025-07-11 05:44:15,849 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744082_3258 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744082
2025-07-11 05:49:22,558 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744087_3263 src: /192.168.158.9:55056 dest: /192.168.158.4:9866
2025-07-11 05:49:22,575 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55056, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-650746359_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744087_3263, duration(ns): 14401068
2025-07-11 05:49:22,575 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744087_3263, type=LAST_IN_PIPELINE terminating
2025-07-11 05:49:24,858 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744087_3263 replica FinalizedReplica, blk_1073744087_3263, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744087 for deletion
2025-07-11 05:49:24,859 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744087_3263 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744087 2025-07-11 05:50:27,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744088_3264 src: /192.168.158.1:37526 dest: /192.168.158.4:9866 2025-07-11 05:50:27,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37526, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-503650827_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744088_3264, duration(ns): 26901021 2025-07-11 05:50:27,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744088_3264, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-11 05:50:33,862 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744088_3264 replica FinalizedReplica, blk_1073744088_3264, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744088 for deletion 2025-07-11 05:50:33,864 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744088_3264 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744088 2025-07-11 05:53:27,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744091_3267 src: /192.168.158.1:37586 dest: /192.168.158.4:9866 2025-07-11 05:53:27,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:37586, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1894793102_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744091_3267, duration(ns): 21321272 2025-07-11 05:53:27,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744091_3267, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-11 05:53:33,869 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744091_3267 replica FinalizedReplica, blk_1073744091_3267, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744091 for deletion 2025-07-11 05:53:33,870 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744091_3267 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744091 2025-07-11 05:54:32,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744092_3268 src: /192.168.158.1:46964 dest: /192.168.158.4:9866 2025-07-11 05:54:32,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46964, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1956183378_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744092_3268, duration(ns): 21180994 2025-07-11 05:54:32,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073744092_3268, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-11 05:54:33,872 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744092_3268 replica FinalizedReplica, blk_1073744092_3268, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744092 for deletion 2025-07-11 05:54:33,873 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744092_3268 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744092 2025-07-11 05:55:32,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744093_3269 src: /192.168.158.1:33576 dest: /192.168.158.4:9866 2025-07-11 05:55:32,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33576, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-207051381_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744093_3269, duration(ns): 22359483 2025-07-11 05:55:32,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744093_3269, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-11 05:55:33,875 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744093_3269 replica FinalizedReplica, blk_1073744093_3269, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 
getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744093 for deletion 2025-07-11 05:55:33,876 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744093_3269 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744093 2025-07-11 05:56:32,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744094_3270 src: /192.168.158.6:36838 dest: /192.168.158.4:9866 2025-07-11 05:56:32,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36838, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1186655922_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744094_3270, duration(ns): 14855282 2025-07-11 05:56:32,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744094_3270, type=LAST_IN_PIPELINE terminating 2025-07-11 05:56:36,877 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744094_3270 replica FinalizedReplica, blk_1073744094_3270, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744094 for deletion 2025-07-11 05:56:36,878 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744094_3270 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744094 
2025-07-11 05:58:37,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744096_3272 src: /192.168.158.8:48526 dest: /192.168.158.4:9866 2025-07-11 05:58:37,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48526, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1321981198_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744096_3272, duration(ns): 52705043 2025-07-11 05:58:37,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744096_3272, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-11 05:58:42,881 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744096_3272 replica FinalizedReplica, blk_1073744096_3272, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744096 for deletion 2025-07-11 05:58:42,883 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744096_3272 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744096 2025-07-11 05:59:37,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744097_3273 src: /192.168.158.5:55744 dest: /192.168.158.4:9866 2025-07-11 05:59:37,591 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55744, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1905391684_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744097_3273, duration(ns): 20317034 2025-07-11 05:59:37,591 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744097_3273, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-11 05:59:42,883 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744097_3273 replica FinalizedReplica, blk_1073744097_3273, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744097 for deletion 2025-07-11 05:59:42,885 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744097_3273 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744097 2025-07-11 06:01:42,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744099_3275 src: /192.168.158.8:46954 dest: /192.168.158.4:9866 2025-07-11 06:01:42,586 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46954, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1306029956_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744099_3275, duration(ns): 20143177 2025-07-11 06:01:42,586 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744099_3275, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-11 06:01:45,888 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744099_3275 replica FinalizedReplica, blk_1073744099_3275, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744099 for deletion 2025-07-11 06:01:45,889 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744099_3275 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744099 2025-07-11 06:02:42,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744100_3276 src: /192.168.158.1:38180 dest: /192.168.158.4:9866 2025-07-11 06:02:42,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38180, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_927807366_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744100_3276, duration(ns): 22710266 2025-07-11 06:02:42,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744100_3276, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-11 06:02:45,891 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744100_3276 replica FinalizedReplica, blk_1073744100_3276, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744100 for deletion 2025-07-11 
06:02:45,892 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744100_3276 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744100 2025-07-11 06:04:47,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744102_3278 src: /192.168.158.6:52068 dest: /192.168.158.4:9866 2025-07-11 06:04:47,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52068, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_865832555_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744102_3278, duration(ns): 18452345 2025-07-11 06:04:47,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744102_3278, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-11 06:04:48,895 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744102_3278 replica FinalizedReplica, blk_1073744102_3278, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744102 for deletion 2025-07-11 06:04:48,896 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744102_3278 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744102 2025-07-11 06:05:47,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744103_3279 src: 
/192.168.158.5:53926 dest: /192.168.158.4:9866 2025-07-11 06:05:47,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53926, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1551503174_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744103_3279, duration(ns): 15617387 2025-07-11 06:05:47,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744103_3279, type=LAST_IN_PIPELINE terminating 2025-07-11 06:05:51,898 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744103_3279 replica FinalizedReplica, blk_1073744103_3279, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744103 for deletion 2025-07-11 06:05:51,899 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744103_3279 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744103 2025-07-11 06:12:52,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744110_3286 src: /192.168.158.8:38430 dest: /192.168.158.4:9866 2025-07-11 06:12:52,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38430, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1460175633_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744110_3286, duration(ns): 19354507 2025-07-11 06:12:52,593 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744110_3286, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-11 06:12:54,909 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744110_3286 replica FinalizedReplica, blk_1073744110_3286, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744110 for deletion 2025-07-11 06:12:54,910 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744110_3286 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744110 2025-07-11 06:15:52,575 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744113_3289 src: /192.168.158.5:47074 dest: /192.168.158.4:9866 2025-07-11 06:15:52,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47074, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2125775657_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744113_3289, duration(ns): 23328882 2025-07-11 06:15:52,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744113_3289, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-11 06:15:54,916 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744113_3289 replica FinalizedReplica, blk_1073744113_3289, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 
56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744113 for deletion 2025-07-11 06:15:54,917 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744113_3289 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744113 2025-07-11 06:16:57,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744114_3290 src: /192.168.158.1:58724 dest: /192.168.158.4:9866 2025-07-11 06:16:57,642 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58724, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1206587590_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744114_3290, duration(ns): 21663932 2025-07-11 06:16:57,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744114_3290, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-11 06:17:03,918 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744114_3290 replica FinalizedReplica, blk_1073744114_3290, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744114 for deletion 2025-07-11 06:17:03,920 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744114_3290 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744114 2025-07-11 06:18:57,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744116_3292 src: /192.168.158.7:60738 dest: /192.168.158.4:9866 2025-07-11 06:18:57,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60738, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-965502829_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744116_3292, duration(ns): 14871049 2025-07-11 06:18:57,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744116_3292, type=LAST_IN_PIPELINE terminating 2025-07-11 06:19:00,925 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744116_3292 replica FinalizedReplica, blk_1073744116_3292, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744116 for deletion 2025-07-11 06:19:00,926 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744116_3292 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744116 2025-07-11 06:20:58,376 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744118_3294 src: /192.168.158.7:60974 dest: /192.168.158.4:9866 2025-07-11 06:20:58,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60974, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, 
cliID: DFSClient_NONMAPREDUCE_1769939430_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744118_3294, duration(ns): 14296295 2025-07-11 06:20:58,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744118_3294, type=LAST_IN_PIPELINE terminating 2025-07-11 06:21:00,927 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744118_3294 replica FinalizedReplica, blk_1073744118_3294, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744118 for deletion 2025-07-11 06:21:00,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744118_3294 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744118 2025-07-11 06:21:57,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744119_3295 src: /192.168.158.5:51344 dest: /192.168.158.4:9866 2025-07-11 06:21:57,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51344, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-146181511_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744119_3295, duration(ns): 17846982 2025-07-11 06:21:57,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744119_3295, type=LAST_IN_PIPELINE terminating 2025-07-11 06:22:03,927 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744119_3295 replica FinalizedReplica, blk_1073744119_3295, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744119 for deletion
2025-07-11 06:22:03,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744119_3295 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744119
2025-07-11 06:23:02,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744120_3296 src: /192.168.158.7:39168 dest: /192.168.158.4:9866
2025-07-11 06:23:02,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39168, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-510899811_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744120_3296, duration(ns): 14934515
2025-07-11 06:23:02,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744120_3296, type=LAST_IN_PIPELINE terminating
2025-07-11 06:23:06,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744120_3296 replica FinalizedReplica, blk_1073744120_3296, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744120 for deletion
2025-07-11 06:23:06,929 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744120_3296 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744120
2025-07-11 06:27:07,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744124_3300 src: /192.168.158.1:53264 dest: /192.168.158.4:9866
2025-07-11 06:27:07,624 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53264, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-49769556_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744124_3300, duration(ns): 26233247
2025-07-11 06:27:07,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744124_3300, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-11 06:27:09,935 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744124_3300 replica FinalizedReplica, blk_1073744124_3300, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744124 for deletion
2025-07-11 06:27:09,936 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744124_3300 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744124
2025-07-11 06:29:07,591 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744126_3302 src: /192.168.158.9:46798 dest: /192.168.158.4:9866
2025-07-11 06:29:07,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46798, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951035796_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744126_3302, duration(ns): 15023988
2025-07-11 06:29:07,609 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744126_3302, type=LAST_IN_PIPELINE terminating
2025-07-11 06:29:09,941 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744126_3302 replica FinalizedReplica, blk_1073744126_3302, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744126 for deletion
2025-07-11 06:29:09,942 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744126_3302 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073744126
2025-07-11 06:33:22,594 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744130_3306 src: /192.168.158.1:43206 dest: /192.168.158.4:9866
2025-07-11 06:33:22,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43206, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_232514950_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744130_3306, duration(ns): 22264509
2025-07-11 06:33:22,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744130_3306, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-11 06:33:27,951 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744130_3306 replica FinalizedReplica, blk_1073744130_3306, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744130 for deletion
2025-07-11 06:33:27,952 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744130_3306 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744130
2025-07-11 06:34:22,617 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744131_3307 src: /192.168.158.9:49562 dest: /192.168.158.4:9866
2025-07-11 06:34:22,632 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49562, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1817328292_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744131_3307, duration(ns): 12316086
2025-07-11 06:34:22,632 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744131_3307, type=LAST_IN_PIPELINE terminating
2025-07-11 06:34:24,952 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744131_3307 replica FinalizedReplica, blk_1073744131_3307, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744131 for deletion
2025-07-11 06:34:24,953 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744131_3307 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744131
2025-07-11 06:35:22,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744132_3308 src: /192.168.158.6:35330 dest: /192.168.158.4:9866
2025-07-11 06:35:22,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35330, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_728341527_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744132_3308, duration(ns): 18000449
2025-07-11 06:35:22,638 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744132_3308, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-11 06:35:24,953 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744132_3308 replica FinalizedReplica, blk_1073744132_3308, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744132 for deletion
2025-07-11 06:35:24,954 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744132_3308 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744132
2025-07-11 06:36:22,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744133_3309 src: /192.168.158.1:55990 dest: /192.168.158.4:9866
2025-07-11 06:36:22,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55990, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_850173231_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744133_3309, duration(ns): 22143306
2025-07-11 06:36:22,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744133_3309, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-11 06:36:24,954 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744133_3309 replica FinalizedReplica, blk_1073744133_3309, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744133 for deletion
2025-07-11 06:36:24,955 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744133_3309 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744133
2025-07-11 06:39:32,641 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744136_3312 src: /192.168.158.5:41846 dest: /192.168.158.4:9866
2025-07-11 06:39:32,658 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41846, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_251045143_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744136_3312, duration(ns): 15721209
2025-07-11 06:39:32,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744136_3312, type=LAST_IN_PIPELINE terminating
2025-07-11 06:39:36,962 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744136_3312 replica FinalizedReplica, blk_1073744136_3312, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744136 for deletion
2025-07-11 06:39:36,963 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744136_3312 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744136
2025-07-11 06:40:32,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744137_3313 src: /192.168.158.6:34276 dest: /192.168.158.4:9866
2025-07-11 06:40:32,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34276, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_620120800_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744137_3313, duration(ns): 16782428
2025-07-11 06:40:32,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744137_3313, type=LAST_IN_PIPELINE terminating
2025-07-11 06:40:33,963 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744137_3313 replica FinalizedReplica, blk_1073744137_3313, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744137 for deletion
2025-07-11 06:40:33,964 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744137_3313 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744137
2025-07-11 06:42:32,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744139_3315 src: /192.168.158.6:48754 dest: /192.168.158.4:9866
2025-07-11 06:42:32,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48754, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2084775692_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744139_3315, duration(ns): 17216511
2025-07-11 06:42:32,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744139_3315, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-11 06:42:36,965 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744139_3315 replica FinalizedReplica, blk_1073744139_3315, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744139 for deletion
2025-07-11 06:42:36,966 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744139_3315 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744139
2025-07-11 06:44:32,661 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744141_3317 src: /192.168.158.9:59256 dest: /192.168.158.4:9866
2025-07-11 06:44:32,686 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59256, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-382305057_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744141_3317, duration(ns): 18793208
2025-07-11 06:44:32,686 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744141_3317, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-11 06:44:33,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744141_3317 replica FinalizedReplica, blk_1073744141_3317, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744141 for deletion
2025-07-11 06:44:33,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744141_3317 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744141
2025-07-11 06:45:32,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744142_3318 src: /192.168.158.7:47558 dest: /192.168.158.4:9866
2025-07-11 06:45:32,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47558, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-413081619_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744142_3318, duration(ns): 19013273
2025-07-11 06:45:32,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744142_3318, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-11 06:45:36,982 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744142_3318 replica FinalizedReplica, blk_1073744142_3318, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744142 for deletion
2025-07-11 06:45:36,983 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744142_3318 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744142
2025-07-11 06:47:32,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744144_3320 src: /192.168.158.9:35692 dest: /192.168.158.4:9866
2025-07-11 06:47:32,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35692, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1641954621_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744144_3320, duration(ns): 18080393
2025-07-11 06:47:32,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744144_3320, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-11 06:47:33,986 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744144_3320 replica FinalizedReplica, blk_1073744144_3320, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744144 for deletion
2025-07-11 06:47:33,987 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744144_3320 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744144
2025-07-11 06:48:32,657 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744145_3321 src: /192.168.158.1:39288 dest: /192.168.158.4:9866
2025-07-11 06:48:32,688 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39288, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1399752444_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744145_3321, duration(ns): 20931697
2025-07-11 06:48:32,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744145_3321, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-11 06:48:36,988 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744145_3321 replica FinalizedReplica, blk_1073744145_3321, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744145 for deletion
2025-07-11 06:48:36,989 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744145_3321 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744145
2025-07-11 06:51:37,681 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744148_3324 src: /192.168.158.8:49652 dest: /192.168.158.4:9866
2025-07-11 06:51:37,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49652, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_157234256_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744148_3324, duration(ns): 16951529
2025-07-11 06:51:37,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744148_3324, type=LAST_IN_PIPELINE terminating
2025-07-11 06:51:39,997 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744148_3324 replica FinalizedReplica, blk_1073744148_3324, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744148 for deletion
2025-07-11 06:51:39,998 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744148_3324 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744148
2025-07-11 06:53:37,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744150_3326 src: /192.168.158.5:60926 dest: /192.168.158.4:9866
2025-07-11 06:53:37,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60926, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_188195579_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744150_3326, duration(ns): 19086989
2025-07-11 06:53:37,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744150_3326, type=LAST_IN_PIPELINE terminating
2025-07-11 06:53:40,003 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744150_3326 replica FinalizedReplica, blk_1073744150_3326, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744150 for deletion
2025-07-11 06:53:40,004 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744150_3326 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744150
2025-07-11 06:54:37,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744151_3327 src: /192.168.158.1:52136 dest: /192.168.158.4:9866
2025-07-11 06:54:37,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2118139739_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744151_3327, duration(ns): 21191998
2025-07-11 06:54:37,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744151_3327, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-11 06:54:40,003 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744151_3327 replica FinalizedReplica, blk_1073744151_3327, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744151 for deletion
2025-07-11 06:54:40,004 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744151_3327 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744151
2025-07-11 06:55:37,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744152_3328 src: /192.168.158.1:43500 dest: /192.168.158.4:9866
2025-07-11 06:55:37,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43500, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-139262428_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744152_3328, duration(ns): 22151720
2025-07-11 06:55:37,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744152_3328, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-11 06:55:43,004 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744152_3328 replica FinalizedReplica, blk_1073744152_3328, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744152 for deletion
2025-07-11 06:55:43,005 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744152_3328 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744152
2025-07-11 06:58:42,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744155_3331 src: /192.168.158.9:51276 dest: /192.168.158.4:9866
2025-07-11 06:58:42,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51276, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1203837761_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744155_3331, duration(ns): 18678964
2025-07-11 06:58:42,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744155_3331, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-11 06:58:49,010 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744155_3331 replica FinalizedReplica, blk_1073744155_3331, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744155 for deletion
2025-07-11 06:58:49,011 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744155_3331 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744155
2025-07-11 07:01:47,699 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744158_3334 src: /192.168.158.1:39376 dest: /192.168.158.4:9866
2025-07-11 07:01:47,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39376, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_442325041_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744158_3334, duration(ns): 23516855
2025-07-11 07:01:47,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744158_3334, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-11 07:01:52,017 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744158_3334 replica FinalizedReplica, blk_1073744158_3334, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744158 for deletion
2025-07-11 07:01:52,019 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744158_3334 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744158
2025-07-11 07:07:52,719 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744164_3340 src: /192.168.158.6:42872 dest: /192.168.158.4:9866
2025-07-11 07:07:52,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42872, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_934945321_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744164_3340, duration(ns): 16299555
2025-07-11 07:07:52,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744164_3340, type=LAST_IN_PIPELINE terminating
2025-07-11 07:07:58,034 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744164_3340 replica FinalizedReplica, blk_1073744164_3340, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744164 for deletion
2025-07-11 07:07:58,035 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744164_3340 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744164
2025-07-11 07:09:57,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744166_3342 src: /192.168.158.8:46026 dest: /192.168.158.4:9866
2025-07-11 07:09:57,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46026, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-920381590_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744166_3342, duration(ns): 17990682
2025-07-11 07:09:57,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744166_3342, type=LAST_IN_PIPELINE terminating
2025-07-11 07:10:04,036 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744166_3342 replica FinalizedReplica, blk_1073744166_3342, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744166 for deletion
2025-07-11 07:10:04,037 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744166_3342 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744166
2025-07-11 07:12:02,713 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744168_3344 src: /192.168.158.8:32868 dest: /192.168.158.4:9866
2025-07-11 07:12:02,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:32868, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_773031999_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744168_3344, duration(ns): 18391756
2025-07-11 07:12:02,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744168_3344, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-11 07:12:04,042 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744168_3344 replica FinalizedReplica, blk_1073744168_3344, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744168 for deletion
2025-07-11 07:12:04,043 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744168_3344 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744168
2025-07-11 07:14:02,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744170_3346 src: /192.168.158.1:47254 dest: /192.168.158.4:9866
2025-07-11 07:14:02,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47254, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1860307265_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744170_3346, duration(ns): 23869376
2025-07-11 07:14:02,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744170_3346, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-11 07:14:07,047 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744170_3346 replica FinalizedReplica, blk_1073744170_3346, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744170 for deletion
2025-07-11 07:14:07,048 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744170_3346 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744170
2025-07-11 07:16:02,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744172_3348 src: /192.168.158.1:46104 dest: /192.168.158.4:9866
2025-07-11 07:16:02,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46104, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1165315158_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744172_3348, duration(ns): 21343948
2025-07-11 07:16:02,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744172_3348, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-11 07:16:04,053 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744172_3348 replica FinalizedReplica, blk_1073744172_3348, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744172 for deletion
2025-07-11 07:16:04,054 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744172_3348 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744172
2025-07-11 07:19:02,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744175_3351 src: /192.168.158.1:49820 dest: /192.168.158.4:9866
2025-07-11 07:19:02,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49820, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_36830086_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744175_3351, duration(ns): 18609479
2025-07-11 07:19:02,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744175_3351, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-11 07:19:04,060 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744175_3351 replica FinalizedReplica, blk_1073744175_3351, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744175 for deletion
2025-07-11 07:19:04,061 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744175_3351 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744175
2025-07-11 07:20:02,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744176_3352 src: /192.168.158.9:57550 dest: /192.168.158.4:9866
2025-07-11 07:20:02,760 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57550, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-431812541_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid:
BP-1059995147-192.168.158.1-1752101929360:blk_1073744176_3352, duration(ns): 14947749 2025-07-11 07:20:02,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744176_3352, type=LAST_IN_PIPELINE terminating 2025-07-11 07:20:04,060 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744176_3352 replica FinalizedReplica, blk_1073744176_3352, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744176 for deletion 2025-07-11 07:20:04,061 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744176_3352 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744176 2025-07-11 07:22:02,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744178_3354 src: /192.168.158.8:36580 dest: /192.168.158.4:9866 2025-07-11 07:22:02,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36580, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1327410878_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744178_3354, duration(ns): 17209126 2025-07-11 07:22:02,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744178_3354, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-11 07:22:04,066 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744178_3354 replica FinalizedReplica, 
blk_1073744178_3354, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744178 for deletion 2025-07-11 07:22:04,067 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744178_3354 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744178 2025-07-11 07:23:02,756 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744179_3355 src: /192.168.158.1:57528 dest: /192.168.158.4:9866 2025-07-11 07:23:02,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57528, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_133465961_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744179_3355, duration(ns): 23850216 2025-07-11 07:23:02,789 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744179_3355, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-11 07:23:04,067 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744179_3355 replica FinalizedReplica, blk_1073744179_3355, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744179 for deletion 2025-07-11 07:23:04,069 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073744179_3355 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744179 2025-07-11 07:25:07,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744181_3357 src: /192.168.158.5:42172 dest: /192.168.158.4:9866 2025-07-11 07:25:07,775 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42172, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-590291924_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744181_3357, duration(ns): 20035088 2025-07-11 07:25:07,775 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744181_3357, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-11 07:25:10,073 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744181_3357 replica FinalizedReplica, blk_1073744181_3357, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744181 for deletion 2025-07-11 07:25:10,074 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744181_3357 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744181 2025-07-11 07:26:07,759 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744182_3358 src: /192.168.158.6:60556 dest: /192.168.158.4:9866 2025-07-11 07:26:07,784 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60556, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-354934510_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744182_3358, duration(ns): 19564984 2025-07-11 07:26:07,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744182_3358, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-11 07:26:13,075 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744182_3358 replica FinalizedReplica, blk_1073744182_3358, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744182 for deletion 2025-07-11 07:26:13,077 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744182_3358 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744182 2025-07-11 07:29:07,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744185_3361 src: /192.168.158.1:42750 dest: /192.168.158.4:9866 2025-07-11 07:29:07,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42750, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-274293573_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744185_3361, duration(ns): 20162226 2025-07-11 07:29:07,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073744185_3361, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-11 07:29:10,080 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744185_3361 replica FinalizedReplica, blk_1073744185_3361, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744185 for deletion 2025-07-11 07:29:10,081 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744185_3361 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744185 2025-07-11 07:30:12,734 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744186_3362 src: /192.168.158.8:51712 dest: /192.168.158.4:9866 2025-07-11 07:30:12,760 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1216373070_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744186_3362, duration(ns): 19901124 2025-07-11 07:30:12,760 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744186_3362, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-11 07:30:16,083 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744186_3362 replica FinalizedReplica, blk_1073744186_3362, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744186 for deletion 2025-07-11 07:30:16,084 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744186_3362 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744186 2025-07-11 07:31:12,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744187_3363 src: /192.168.158.1:35972 dest: /192.168.158.4:9866 2025-07-11 07:31:12,795 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35972, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_376787345_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744187_3363, duration(ns): 21017361 2025-07-11 07:31:12,795 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744187_3363, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-11 07:31:19,088 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744187_3363 replica FinalizedReplica, blk_1073744187_3363, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744187 for deletion 2025-07-11 07:31:19,089 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744187_3363 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744187 2025-07-11 07:34:17,778 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744190_3366 src: /192.168.158.8:60044 dest: /192.168.158.4:9866 2025-07-11 07:34:17,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60044, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1930111606_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744190_3366, duration(ns): 18085494 2025-07-11 07:34:17,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744190_3366, type=LAST_IN_PIPELINE terminating 2025-07-11 07:34:22,098 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744190_3366 replica FinalizedReplica, blk_1073744190_3366, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744190 for deletion 2025-07-11 07:34:22,099 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744190_3366 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744190 2025-07-11 07:35:17,759 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744191_3367 src: /192.168.158.9:60498 dest: /192.168.158.4:9866 2025-07-11 07:35:17,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60498, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, 
cliID: DFSClient_NONMAPREDUCE_1629175116_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744191_3367, duration(ns): 18880142 2025-07-11 07:35:17,784 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744191_3367, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-11 07:35:19,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744191_3367 replica FinalizedReplica, blk_1073744191_3367, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744191 for deletion 2025-07-11 07:35:19,103 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744191_3367 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744191 2025-07-11 07:36:17,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744192_3368 src: /192.168.158.8:33410 dest: /192.168.158.4:9866 2025-07-11 07:36:17,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33410, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2063405765_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744192_3368, duration(ns): 19632778 2025-07-11 07:36:17,784 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744192_3368, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-11 
07:36:22,104 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744192_3368 replica FinalizedReplica, blk_1073744192_3368, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744192 for deletion 2025-07-11 07:36:22,105 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744192_3368 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744192 2025-07-11 07:37:22,770 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744193_3369 src: /192.168.158.7:43554 dest: /192.168.158.4:9866 2025-07-11 07:37:22,790 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43554, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-147251379_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744193_3369, duration(ns): 18333009 2025-07-11 07:37:22,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744193_3369, type=LAST_IN_PIPELINE terminating 2025-07-11 07:37:25,105 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744193_3369 replica FinalizedReplica, blk_1073744193_3369, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744193 for deletion 2025-07-11 07:37:25,106 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744193_3369 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744193 2025-07-11 07:42:22,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744198_3374 src: /192.168.158.1:60144 dest: /192.168.158.4:9866 2025-07-11 07:42:22,784 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60144, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_646065006_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744198_3374, duration(ns): 21029283 2025-07-11 07:42:22,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744198_3374, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-11 07:42:25,113 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744198_3374 replica FinalizedReplica, blk_1073744198_3374, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744198 for deletion 2025-07-11 07:42:25,115 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744198_3374 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744198 2025-07-11 07:45:27,762 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744201_3377 
src: /192.168.158.1:58750 dest: /192.168.158.4:9866 2025-07-11 07:45:27,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58750, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_767633111_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744201_3377, duration(ns): 22102077 2025-07-11 07:45:27,795 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744201_3377, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-11 07:45:31,121 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744201_3377 replica FinalizedReplica, blk_1073744201_3377, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744201 for deletion 2025-07-11 07:45:31,123 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744201_3377 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744201 2025-07-11 07:47:27,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744203_3379 src: /192.168.158.9:38016 dest: /192.168.158.4:9866 2025-07-11 07:47:27,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38016, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1939971129_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744203_3379, duration(ns): 19147925 
2025-07-11 07:47:27,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744203_3379, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-11 07:47:34,125 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744203_3379 replica FinalizedReplica, blk_1073744203_3379, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744203 for deletion 2025-07-11 07:47:34,126 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744203_3379 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744203 2025-07-11 07:49:32,782 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744205_3381 src: /192.168.158.8:56208 dest: /192.168.158.4:9866 2025-07-11 07:49:32,801 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56208, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1941518169_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744205_3381, duration(ns): 16212996 2025-07-11 07:49:32,801 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744205_3381, type=LAST_IN_PIPELINE terminating 2025-07-11 07:49:34,131 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744205_3381 replica FinalizedReplica, blk_1073744205_3381, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744205 for deletion 2025-07-11 07:49:34,132 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744205_3381 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744205 2025-07-11 07:50:32,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744206_3382 src: /192.168.158.8:52040 dest: /192.168.158.4:9866 2025-07-11 07:50:32,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52040, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1508062328_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744206_3382, duration(ns): 16740528 2025-07-11 07:50:32,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744206_3382, type=LAST_IN_PIPELINE terminating 2025-07-11 07:50:34,132 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744206_3382 replica FinalizedReplica, blk_1073744206_3382, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744206 for deletion 2025-07-11 07:50:34,134 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744206_3382 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744206
2025-07-11 07:51:32,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744207_3383 src: /192.168.158.1:55908 dest: /192.168.158.4:9866
2025-07-11 07:51:32,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55908, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1578050276_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744207_3383, duration(ns): 19746903
2025-07-11 07:51:32,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744207_3383, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-11 07:51:34,134 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744207_3383 replica FinalizedReplica, blk_1073744207_3383, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744207 for deletion
2025-07-11 07:51:34,136 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744207_3383 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744207
2025-07-11 07:53:32,774 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744209_3385 src: /192.168.158.6:43348 dest: /192.168.158.4:9866
2025-07-11 07:53:32,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43348, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1384250368_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744209_3385, duration(ns): 15512209
2025-07-11 07:53:32,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744209_3385, type=LAST_IN_PIPELINE terminating
2025-07-11 07:53:34,138 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744209_3385 replica FinalizedReplica, blk_1073744209_3385, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744209 for deletion
2025-07-11 07:53:34,140 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744209_3385 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744209
2025-07-11 07:54:32,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744210_3386 src: /192.168.158.7:52788 dest: /192.168.158.4:9866
2025-07-11 07:54:32,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52788, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1372281246_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744210_3386, duration(ns): 19643950
2025-07-11 07:54:32,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744210_3386, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 07:54:34,139 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744210_3386 replica FinalizedReplica, blk_1073744210_3386, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744210 for deletion
2025-07-11 07:54:34,140 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744210_3386 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744210
2025-07-11 07:58:42,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744214_3390 src: /192.168.158.1:58358 dest: /192.168.158.4:9866
2025-07-11 07:58:42,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58358, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-906303822_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744214_3390, duration(ns): 23097920
2025-07-11 07:58:42,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744214_3390, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-11 07:58:46,143 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744214_3390 replica FinalizedReplica, blk_1073744214_3390, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744214 for deletion
2025-07-11 07:58:46,144 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744214_3390 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744214
2025-07-11 07:59:42,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744215_3391 src: /192.168.158.8:36698 dest: /192.168.158.4:9866
2025-07-11 07:59:42,821 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36698, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_660296893_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744215_3391, duration(ns): 16762433
2025-07-11 07:59:42,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744215_3391, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-11 07:59:49,145 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744215_3391 replica FinalizedReplica, blk_1073744215_3391, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744215 for deletion
2025-07-11 07:59:49,147 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744215_3391 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744215
2025-07-11 08:00:47,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744216_3392 src: /192.168.158.7:36906 dest: /192.168.158.4:9866
2025-07-11 08:00:47,823 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36906, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-807613095_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744216_3392, duration(ns): 19513587
2025-07-11 08:00:47,823 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744216_3392, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-11 08:00:49,148 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744216_3392 replica FinalizedReplica, blk_1073744216_3392, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744216 for deletion
2025-07-11 08:00:49,149 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744216_3392 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744216
2025-07-11 08:01:52,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744217_3393 src: /192.168.158.1:38874 dest: /192.168.158.4:9866
2025-07-11 08:01:52,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38874, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_695806152_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744217_3393, duration(ns): 21877412
2025-07-11 08:01:52,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744217_3393, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-11 08:01:58,149 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744217_3393 replica FinalizedReplica, blk_1073744217_3393, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744217 for deletion
2025-07-11 08:01:58,150 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744217_3393 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744217
2025-07-11 08:02:52,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744218_3394 src: /192.168.158.6:55358 dest: /192.168.158.4:9866
2025-07-11 08:02:52,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55358, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1500197668_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744218_3394, duration(ns): 20800993
2025-07-11 08:02:52,820 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744218_3394, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-11 08:02:58,149 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744218_3394 replica FinalizedReplica, blk_1073744218_3394, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744218 for deletion
2025-07-11 08:02:58,150 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744218_3394 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744218
2025-07-11 08:05:52,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744221_3397 src: /192.168.158.1:41132 dest: /192.168.158.4:9866
2025-07-11 08:05:52,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41132, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-876186815_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744221_3397, duration(ns): 21899334
2025-07-11 08:05:52,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744221_3397, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-11 08:05:58,158 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744221_3397 replica FinalizedReplica, blk_1073744221_3397, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744221 for deletion
2025-07-11 08:05:58,159 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744221_3397 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744221
2025-07-11 08:07:57,801 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744223_3399 src: /192.168.158.1:53358 dest: /192.168.158.4:9866
2025-07-11 08:07:57,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53358, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1231692937_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744223_3399, duration(ns): 22174746
2025-07-11 08:07:57,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744223_3399, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-11 08:08:01,166 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744223_3399 replica FinalizedReplica, blk_1073744223_3399, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744223 for deletion
2025-07-11 08:08:01,167 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744223_3399 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744223
2025-07-11 08:11:57,820 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744227_3403 src: /192.168.158.5:53960 dest: /192.168.158.4:9866
2025-07-11 08:11:57,838 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53960, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1144706304_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744227_3403, duration(ns): 15745813
2025-07-11 08:11:57,838 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744227_3403, type=LAST_IN_PIPELINE terminating
2025-07-11 08:12:04,179 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744227_3403 replica FinalizedReplica, blk_1073744227_3403, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744227 for deletion
2025-07-11 08:12:04,180 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744227_3403 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744227
2025-07-11 08:12:57,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744228_3404 src: /192.168.158.1:49658 dest: /192.168.158.4:9866
2025-07-11 08:12:57,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49658, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-920263265_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744228_3404, duration(ns): 22128649
2025-07-11 08:12:57,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744228_3404, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-11 08:13:01,183 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744228_3404 replica FinalizedReplica, blk_1073744228_3404, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744228 for deletion
2025-07-11 08:13:01,185 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744228_3404 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744228
2025-07-11 08:13:57,858 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744229_3405 src: /192.168.158.5:47974 dest: /192.168.158.4:9866
2025-07-11 08:13:57,875 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47974, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-336585009_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744229_3405, duration(ns): 15291014
2025-07-11 08:13:57,876 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744229_3405, type=LAST_IN_PIPELINE terminating
2025-07-11 08:14:01,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744229_3405 replica FinalizedReplica, blk_1073744229_3405, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744229 for deletion
2025-07-11 08:14:01,188 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744229_3405 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744229
2025-07-11 08:16:57,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744232_3408 src: /192.168.158.1:54914 dest: /192.168.158.4:9866
2025-07-11 08:16:57,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54914, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1503927509_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744232_3408, duration(ns): 21679030
2025-07-11 08:16:57,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744232_3408, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-11 08:17:04,198 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744232_3408 replica FinalizedReplica, blk_1073744232_3408, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744232 for deletion
2025-07-11 08:17:04,199 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744232_3408 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744232
2025-07-11 08:18:57,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744234_3410 src: /192.168.158.6:49740 dest: /192.168.158.4:9866
2025-07-11 08:18:57,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49740, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-731359366_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744234_3410, duration(ns): 18086737
2025-07-11 08:18:57,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744234_3410, type=LAST_IN_PIPELINE terminating
2025-07-11 08:19:04,203 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744234_3410 replica FinalizedReplica, blk_1073744234_3410, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744234 for deletion
2025-07-11 08:19:04,204 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744234_3410 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744234
2025-07-11 08:20:02,824 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744235_3411 src: /192.168.158.5:52308 dest: /192.168.158.4:9866
2025-07-11 08:20:02,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52308, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1602105286_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744235_3411, duration(ns): 15999647
2025-07-11 08:20:02,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744235_3411, type=LAST_IN_PIPELINE terminating
2025-07-11 08:20:04,204 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744235_3411 replica FinalizedReplica, blk_1073744235_3411, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744235 for deletion
2025-07-11 08:20:04,207 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744235_3411 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744235
2025-07-11 08:22:02,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744237_3413 src: /192.168.158.7:37650 dest: /192.168.158.4:9866
2025-07-11 08:22:02,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37650, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-41489057_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744237_3413, duration(ns): 19643655
2025-07-11 08:22:02,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744237_3413, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-11 08:22:04,206 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744237_3413 replica FinalizedReplica, blk_1073744237_3413, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744237 for deletion
2025-07-11 08:22:04,207 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744237_3413 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744237
2025-07-11 08:23:07,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744238_3414 src: /192.168.158.9:49328 dest: /192.168.158.4:9866
2025-07-11 08:23:07,863 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49328, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1088062557_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744238_3414, duration(ns): 17429678
2025-07-11 08:23:07,863 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744238_3414, type=LAST_IN_PIPELINE terminating
2025-07-11 08:23:13,209 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744238_3414 replica FinalizedReplica, blk_1073744238_3414, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744238 for deletion
2025-07-11 08:23:13,210 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744238_3414 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744238
2025-07-11 08:27:12,835 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744242_3418 src: /192.168.158.9:53506 dest: /192.168.158.4:9866
2025-07-11 08:27:12,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53506, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1836026749_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744242_3418, duration(ns): 19309548
2025-07-11 08:27:12,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744242_3418, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-11 08:27:16,217 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744242_3418 replica FinalizedReplica, blk_1073744242_3418, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744242 for deletion
2025-07-11 08:27:16,219 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744242_3418 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744242
2025-07-11 08:30:12,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744245_3421 src: /192.168.158.6:55388 dest: /192.168.158.4:9866
2025-07-11 08:30:12,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55388, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1857632241_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744245_3421, duration(ns): 16213848
2025-07-11 08:30:12,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744245_3421, type=LAST_IN_PIPELINE terminating
2025-07-11 08:30:16,225 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744245_3421 replica FinalizedReplica, blk_1073744245_3421, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744245 for deletion
2025-07-11 08:30:16,226 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744245_3421 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744245
2025-07-11 08:31:12,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744246_3422 src: /192.168.158.1:32820 dest: /192.168.158.4:9866
2025-07-11 08:31:12,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:32820, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1907374489_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744246_3422, duration(ns): 21433620
2025-07-11 08:31:12,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744246_3422, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-11 08:31:16,228 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744246_3422 replica FinalizedReplica, blk_1073744246_3422, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744246 for deletion
2025-07-11 08:31:16,229 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744246_3422 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744246
2025-07-11 08:32:12,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744247_3423 src: /192.168.158.9:40542 dest: /192.168.158.4:9866
2025-07-11 08:32:12,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40542, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1177553962_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744247_3423, duration(ns): 18045374
2025-07-11 08:32:12,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744247_3423, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-11 08:32:16,232 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744247_3423 replica FinalizedReplica, blk_1073744247_3423, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744247 for deletion
2025-07-11 08:32:16,233 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744247_3423 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744247
2025-07-11 08:33:12,847 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744248_3424 src: /192.168.158.9:35692 dest: /192.168.158.4:9866
2025-07-11 08:33:12,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35692, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1918239840_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744248_3424, duration(ns): 16273474
2025-07-11 08:33:12,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744248_3424, type=LAST_IN_PIPELINE terminating
2025-07-11 08:33:19,234 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744248_3424 replica FinalizedReplica, blk_1073744248_3424, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744248 for deletion
2025-07-11 08:33:19,236 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744248_3424 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744248
2025-07-11 08:34:12,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744249_3425 src: /192.168.158.5:56620 dest: /192.168.158.4:9866
2025-07-11 08:34:12,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56620, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-119779470_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744249_3425, duration(ns): 18110906
2025-07-11 08:34:12,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744249_3425, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-11 08:34:16,236 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744249_3425 replica FinalizedReplica, blk_1073744249_3425, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744249 for deletion
2025-07-11 08:34:16,238 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744249_3425 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744249
2025-07-11 08:35:12,852 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744250_3426 src: /192.168.158.7:51754 dest: /192.168.158.4:9866
2025-07-11 08:35:12,879 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51754, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_798762351_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744250_3426, duration(ns): 20626639
2025-07-11 08:35:12,879 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744250_3426, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-11 08:35:19,241 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744250_3426 replica FinalizedReplica, blk_1073744250_3426, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744250 for deletion
2025-07-11 08:35:19,242 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744250_3426 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744250
2025-07-11 08:36:12,857 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744251_3427 src: /192.168.158.1:51442 dest: /192.168.158.4:9866
2025-07-11 08:36:12,888 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51442, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2021442769_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744251_3427, duration(ns): 21780962
2025-07-11 08:36:12,888 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744251_3427, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-11 08:36:16,242 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744251_3427 replica FinalizedReplica, blk_1073744251_3427, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744251 for deletion
2025-07-11 08:36:16,243 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744251_3427 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744251
2025-07-11 08:37:12,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744252_3428 src: /192.168.158.1:57740 dest: /192.168.158.4:9866
2025-07-11 08:37:12,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57740, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-75055488_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744252_3428, duration(ns): 21769341
2025-07-11 08:37:12,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744252_3428, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-11 08:37:16,245 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744252_3428 replica FinalizedReplica, blk_1073744252_3428, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() =
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744252 for deletion 2025-07-11 08:37:16,246 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744252_3428 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744252 2025-07-11 08:38:12,856 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744253_3429 src: /192.168.158.7:36990 dest: /192.168.158.4:9866 2025-07-11 08:38:12,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36990, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1202001856_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744253_3429, duration(ns): 22761552 2025-07-11 08:38:12,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744253_3429, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-11 08:38:19,247 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744253_3429 replica FinalizedReplica, blk_1073744253_3429, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744253 for deletion 2025-07-11 08:38:19,248 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744253_3429 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744253 
2025-07-11 08:39:17,865 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744254_3430 src: /192.168.158.6:35974 dest: /192.168.158.4:9866
2025-07-11 08:39:17,891 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35974, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_843202517_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744254_3430, duration(ns): 21403317
2025-07-11 08:39:17,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744254_3430, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-11 08:39:19,252 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744254_3430 replica FinalizedReplica, blk_1073744254_3430, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744254 for deletion
2025-07-11 08:39:19,253 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744254_3430 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744254
2025-07-11 08:40:17,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744255_3431 src: /192.168.158.1:43606 dest: /192.168.158.4:9866
2025-07-11 08:40:17,897 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43606, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_873554023_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744255_3431, duration(ns): 21737184
2025-07-11 08:40:17,897 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744255_3431, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-11 08:40:19,254 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744255_3431 replica FinalizedReplica, blk_1073744255_3431, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744255 for deletion
2025-07-11 08:40:19,255 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744255_3431 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744255
2025-07-11 08:45:22,872 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744260_3436 src: /192.168.158.1:50484 dest: /192.168.158.4:9866
2025-07-11 08:45:22,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50484, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-56026483_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744260_3436, duration(ns): 21249956
2025-07-11 08:45:22,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744260_3436, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-11 08:45:25,262 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744260_3436 replica FinalizedReplica, blk_1073744260_3436, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744260 for deletion
2025-07-11 08:45:25,264 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744260_3436 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744260
2025-07-11 08:49:27,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744264_3440 src: /192.168.158.5:54594 dest: /192.168.158.4:9866
2025-07-11 08:49:27,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54594, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-247832147_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744264_3440, duration(ns): 14110833
2025-07-11 08:49:27,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744264_3440, type=LAST_IN_PIPELINE terminating
2025-07-11 08:49:31,274 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744264_3440 replica FinalizedReplica, blk_1073744264_3440, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744264 for deletion
2025-07-11 08:49:31,275 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744264_3440 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744264
2025-07-11 08:54:37,891 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744269_3445 src: /192.168.158.6:46648 dest: /192.168.158.4:9866
2025-07-11 08:54:37,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46648, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_622043822_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744269_3445, duration(ns): 15998460
2025-07-11 08:54:37,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744269_3445, type=LAST_IN_PIPELINE terminating
2025-07-11 08:54:43,285 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744269_3445 replica FinalizedReplica, blk_1073744269_3445, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744269 for deletion
2025-07-11 08:54:43,286 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744269_3445 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744269
2025-07-11 08:57:42,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744272_3448 src: /192.168.158.5:49338 dest: /192.168.158.4:9866
2025-07-11 08:57:42,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49338, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1890734768_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744272_3448, duration(ns): 18253334
2025-07-11 08:57:42,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744272_3448, type=LAST_IN_PIPELINE terminating
2025-07-11 08:57:46,294 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744272_3448 replica FinalizedReplica, blk_1073744272_3448, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744272 for deletion
2025-07-11 08:57:46,295 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744272_3448 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744272
2025-07-11 08:58:42,887 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744273_3449 src: /192.168.158.1:50646 dest: /192.168.158.4:9866
2025-07-11 08:58:42,920 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50646, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-892788922_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744273_3449, duration(ns): 24113816
2025-07-11 08:58:42,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744273_3449, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-11 08:58:49,296 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744273_3449 replica FinalizedReplica, blk_1073744273_3449, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744273 for deletion
2025-07-11 08:58:49,297 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744273_3449 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744273
2025-07-11 08:59:47,896 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744274_3450 src: /192.168.158.7:48094 dest: /192.168.158.4:9866
2025-07-11 08:59:47,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48094, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1768777491_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744274_3450, duration(ns): 15193113
2025-07-11 08:59:47,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744274_3450, type=LAST_IN_PIPELINE terminating
2025-07-11 08:59:49,303 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744274_3450 replica FinalizedReplica, blk_1073744274_3450, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744274 for deletion
2025-07-11 08:59:49,305 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744274_3450 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744274
2025-07-11 09:00:47,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744275_3451 src: /192.168.158.1:35216 dest: /192.168.158.4:9866
2025-07-11 09:00:47,928 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35216, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1089088189_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744275_3451, duration(ns): 20582241
2025-07-11 09:00:47,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744275_3451, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-11 09:00:52,305 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744275_3451 replica FinalizedReplica, blk_1073744275_3451, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744275 for deletion
2025-07-11 09:00:52,306 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744275_3451 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744275
2025-07-11 09:02:57,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744277_3453 src: /192.168.158.7:58148 dest: /192.168.158.4:9866
2025-07-11 09:02:57,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58148, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1405738651_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744277_3453, duration(ns): 20009122
2025-07-11 09:02:57,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744277_3453, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 09:03:01,312 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744277_3453 replica FinalizedReplica, blk_1073744277_3453, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744277 for deletion
2025-07-11 09:03:01,313 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744277_3453 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744277
2025-07-11 09:05:07,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744279_3455 src: /192.168.158.6:41488 dest: /192.168.158.4:9866
2025-07-11 09:05:07,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41488, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_251950881_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744279_3455, duration(ns): 17666257
2025-07-11 09:05:07,922 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744279_3455, type=LAST_IN_PIPELINE terminating
2025-07-11 09:05:13,317 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744279_3455 replica FinalizedReplica, blk_1073744279_3455, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744279 for deletion
2025-07-11 09:05:13,319 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744279_3455 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744279
2025-07-11 09:06:07,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744280_3456 src: /192.168.158.1:37508 dest: /192.168.158.4:9866
2025-07-11 09:06:07,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37508, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_583550663_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744280_3456, duration(ns): 22624900
2025-07-11 09:06:07,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744280_3456, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-11 09:06:10,319 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744280_3456 replica FinalizedReplica, blk_1073744280_3456, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744280 for deletion
2025-07-11 09:06:10,320 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744280_3456 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744280
2025-07-11 09:07:07,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744281_3457 src: /192.168.158.6:41146 dest: /192.168.158.4:9866
2025-07-11 09:07:07,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41146, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1511173670_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744281_3457, duration(ns): 12980923
2025-07-11 09:07:07,949 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744281_3457, type=LAST_IN_PIPELINE terminating
2025-07-11 09:07:13,323 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744281_3457 replica FinalizedReplica, blk_1073744281_3457, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744281 for deletion
2025-07-11 09:07:13,324 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744281_3457 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744281
2025-07-11 09:08:07,905 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744282_3458 src: /192.168.158.6:54910 dest: /192.168.158.4:9866
2025-07-11 09:08:07,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54910, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1178925628_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744282_3458, duration(ns): 17214834
2025-07-11 09:08:07,928 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744282_3458, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 09:08:13,323 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744282_3458 replica FinalizedReplica, blk_1073744282_3458, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744282 for deletion
2025-07-11 09:08:13,324 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744282_3458 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744282
2025-07-11 09:11:12,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744285_3461 src: /192.168.158.1:56282 dest: /192.168.158.4:9866
2025-07-11 09:11:12,988 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56282, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2069188656_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744285_3461, duration(ns): 21757985
2025-07-11 09:11:12,988 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744285_3461, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-11 09:11:19,332 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744285_3461 replica FinalizedReplica, blk_1073744285_3461, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744285 for deletion
2025-07-11 09:11:19,333 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744285_3461 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744285
2025-07-11 09:12:17,969 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744286_3462 src: /192.168.158.5:40940 dest: /192.168.158.4:9866
2025-07-11 09:12:17,990 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40940, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_629493237_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744286_3462, duration(ns): 18442578
2025-07-11 09:12:17,990 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744286_3462, type=LAST_IN_PIPELINE terminating
2025-07-11 09:12:22,336 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744286_3462 replica FinalizedReplica, blk_1073744286_3462, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744286 for deletion
2025-07-11 09:12:22,337 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744286_3462 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744286
2025-07-11 09:13:17,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744287_3463 src: /192.168.158.1:46696 dest: /192.168.158.4:9866
2025-07-11 09:13:17,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46696, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_775298312_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744287_3463, duration(ns): 21482543
2025-07-11 09:13:17,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744287_3463, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-11 09:13:19,338 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744287_3463 replica FinalizedReplica, blk_1073744287_3463, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744287 for deletion
2025-07-11 09:13:19,339 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744287_3463 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744287
2025-07-11 09:15:17,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744289_3465 src: /192.168.158.9:42248 dest: /192.168.158.4:9866
2025-07-11 09:15:17,966 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42248, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-613968765_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744289_3465, duration(ns): 18901442
2025-07-11 09:15:17,966 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744289_3465, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-11 09:15:19,343 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744289_3465 replica FinalizedReplica, blk_1073744289_3465, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744289 for deletion
2025-07-11 09:15:19,344 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744289_3465 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744289
2025-07-11 09:16:22,931 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744290_3466 src: /192.168.158.8:36672 dest: /192.168.158.4:9866
2025-07-11 09:16:22,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_632403086_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744290_3466, duration(ns): 18849160
2025-07-11 09:16:22,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744290_3466, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-11 09:16:25,346 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744290_3466 replica FinalizedReplica, blk_1073744290_3466, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744290 for deletion
2025-07-11 09:16:25,347 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744290_3466 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744290
2025-07-11 09:17:22,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744291_3467 src: /192.168.158.9:42696 dest: /192.168.158.4:9866
2025-07-11 09:17:22,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42696, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1280388726_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744291_3467, duration(ns): 14089111
2025-07-11 09:17:22,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744291_3467, type=LAST_IN_PIPELINE terminating
2025-07-11 09:17:25,349 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744291_3467 replica FinalizedReplica, blk_1073744291_3467, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744291 for deletion
2025-07-11 09:17:25,350 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744291_3467 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744291
2025-07-11 09:18:22,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744292_3468 src: /192.168.158.1:49216 dest: /192.168.158.4:9866
2025-07-11 09:18:22,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49216, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-902967081_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744292_3468, duration(ns): 23163268
2025-07-11 09:18:22,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744292_3468, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866]
terminating 2025-07-11 09:18:28,353 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744292_3468 replica FinalizedReplica, blk_1073744292_3468, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744292 for deletion 2025-07-11 09:18:28,354 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744292_3468 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744292 2025-07-11 09:23:22,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744297_3473 src: /192.168.158.5:46448 dest: /192.168.158.4:9866 2025-07-11 09:23:22,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46448, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1825717242_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744297_3473, duration(ns): 17451958 2025-07-11 09:23:22,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744297_3473, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-11 09:23:28,365 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744297_3473 replica FinalizedReplica, blk_1073744297_3473, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744297 for 
deletion 2025-07-11 09:23:28,367 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744297_3473 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744297 2025-07-11 09:27:22,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744301_3477 src: /192.168.158.7:59700 dest: /192.168.158.4:9866 2025-07-11 09:27:22,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59700, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_135983471_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744301_3477, duration(ns): 15111666 2025-07-11 09:27:22,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744301_3477, type=LAST_IN_PIPELINE terminating 2025-07-11 09:27:25,377 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744301_3477 replica FinalizedReplica, blk_1073744301_3477, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744301 for deletion 2025-07-11 09:27:25,378 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744301_3477 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744301 2025-07-11 09:28:22,950 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744302_3478 src: /192.168.158.1:52412 
dest: /192.168.158.4:9866 2025-07-11 09:28:22,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52412, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-85631479_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744302_3478, duration(ns): 19306998 2025-07-11 09:28:22,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744302_3478, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-11 09:28:25,380 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744302_3478 replica FinalizedReplica, blk_1073744302_3478, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744302 for deletion 2025-07-11 09:28:25,381 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744302_3478 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744302 2025-07-11 09:31:22,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744305_3481 src: /192.168.158.1:42506 dest: /192.168.158.4:9866 2025-07-11 09:31:22,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42506, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1073629403_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744305_3481, duration(ns): 20939855 2025-07-11 09:31:22,988 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744305_3481, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-11 09:31:25,388 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744305_3481 replica FinalizedReplica, blk_1073744305_3481, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744305 for deletion 2025-07-11 09:31:25,389 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744305_3481 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744305 2025-07-11 09:33:27,965 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744307_3483 src: /192.168.158.7:59738 dest: /192.168.158.4:9866 2025-07-11 09:33:27,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59738, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1981480681_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744307_3483, duration(ns): 15786608 2025-07-11 09:33:27,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744307_3483, type=LAST_IN_PIPELINE terminating 2025-07-11 09:33:31,389 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744307_3483 replica FinalizedReplica, blk_1073744307_3483, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 
getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744307 for deletion 2025-07-11 09:33:31,390 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744307_3483 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744307 2025-07-11 09:34:32,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744308_3484 src: /192.168.158.7:47104 dest: /192.168.158.4:9866 2025-07-11 09:34:32,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47104, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_309502273_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744308_3484, duration(ns): 14801968 2025-07-11 09:34:32,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744308_3484, type=LAST_IN_PIPELINE terminating 2025-07-11 09:34:37,394 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744308_3484 replica FinalizedReplica, blk_1073744308_3484, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744308 for deletion 2025-07-11 09:34:37,395 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744308_3484 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744308 2025-07-11 
09:36:37,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744310_3486 src: /192.168.158.9:37380 dest: /192.168.158.4:9866 2025-07-11 09:36:38,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37380, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-454658040_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744310_3486, duration(ns): 20762192 2025-07-11 09:36:38,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744310_3486, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-11 09:36:40,398 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744310_3486 replica FinalizedReplica, blk_1073744310_3486, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744310 for deletion 2025-07-11 09:36:40,399 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744310_3486 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744310 2025-07-11 09:38:42,988 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744312_3488 src: /192.168.158.1:50206 dest: /192.168.158.4:9866 2025-07-11 09:38:43,018 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50206, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_516078362_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744312_3488, duration(ns): 21695174 2025-07-11 09:38:43,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744312_3488, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-11 09:38:46,400 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744312_3488 replica FinalizedReplica, blk_1073744312_3488, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744312 for deletion 2025-07-11 09:38:46,401 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744312_3488 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744312 2025-07-11 09:39:42,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744313_3489 src: /192.168.158.1:40870 dest: /192.168.158.4:9866 2025-07-11 09:39:43,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40870, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1428227386_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744313_3489, duration(ns): 23477079 2025-07-11 09:39:43,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744313_3489, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-11 09:39:46,403 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744313_3489 replica FinalizedReplica, blk_1073744313_3489, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744313 for deletion 2025-07-11 09:39:46,404 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744313_3489 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744313 2025-07-11 09:40:43,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744314_3490 src: /192.168.158.7:54736 dest: /192.168.158.4:9866 2025-07-11 09:40:43,026 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54736, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1416699318_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744314_3490, duration(ns): 17724304 2025-07-11 09:40:43,026 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744314_3490, type=LAST_IN_PIPELINE terminating 2025-07-11 09:40:46,405 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744314_3490 replica FinalizedReplica, blk_1073744314_3490, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744314 for deletion 2025-07-11 09:40:46,406 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744314_3490 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744314 2025-07-11 09:43:47,993 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744317_3493 src: /192.168.158.1:47288 dest: /192.168.158.4:9866 2025-07-11 09:43:48,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47288, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1771175779_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744317_3493, duration(ns): 22470010 2025-07-11 09:43:48,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744317_3493, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-11 09:43:52,414 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744317_3493 replica FinalizedReplica, blk_1073744317_3493, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744317 for deletion 2025-07-11 09:43:52,415 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744317_3493 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744317 2025-07-11 09:46:53,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744320_3496 
src: /192.168.158.7:53812 dest: /192.168.158.4:9866 2025-07-11 09:46:53,040 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53812, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-16934602_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744320_3496, duration(ns): 19441409 2025-07-11 09:46:53,040 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744320_3496, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-11 09:46:55,420 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744320_3496 replica FinalizedReplica, blk_1073744320_3496, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744320 for deletion 2025-07-11 09:46:55,421 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744320_3496 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744320 2025-07-11 09:48:53,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744322_3498 src: /192.168.158.1:57234 dest: /192.168.158.4:9866 2025-07-11 09:48:53,037 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57234, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_810701808_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744322_3498, duration(ns): 21787078 2025-07-11 09:48:53,037 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744322_3498, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-11 09:48:55,423 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744322_3498 replica FinalizedReplica, blk_1073744322_3498, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744322 for deletion 2025-07-11 09:48:55,424 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744322_3498 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744322 2025-07-11 09:49:58,017 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744323_3499 src: /192.168.158.1:59512 dest: /192.168.158.4:9866 2025-07-11 09:49:58,051 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59512, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-605958634_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744323_3499, duration(ns): 24917561 2025-07-11 09:49:58,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744323_3499, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-11 09:50:01,425 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744323_3499 replica FinalizedReplica, blk_1073744323_3499, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744323 for deletion 2025-07-11 09:50:01,426 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744323_3499 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744323 2025-07-11 09:54:02,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744327_3503 src: /192.168.158.1:45470 dest: /192.168.158.4:9866 2025-07-11 09:54:03,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45470, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_464430331_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744327_3503, duration(ns): 22070848 2025-07-11 09:54:03,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744327_3503, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-11 09:54:07,432 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744327_3503 replica FinalizedReplica, blk_1073744327_3503, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744327 for deletion 2025-07-11 09:54:07,433 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744327_3503 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744327 2025-07-11 09:57:08,009 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744330_3506 src: /192.168.158.9:59880 dest: /192.168.158.4:9866 2025-07-11 09:57:08,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59880, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1967220347_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744330_3506, duration(ns): 21072760 2025-07-11 09:57:08,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744330_3506, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-11 09:57:10,438 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744330_3506 replica FinalizedReplica, blk_1073744330_3506, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744330 for deletion 2025-07-11 09:57:10,439 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744330_3506 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744330 2025-07-11 10:02:13,008 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744335_3511 src: /192.168.158.1:45026 dest: /192.168.158.4:9866 2025-07-11 10:02:13,039 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45026, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1748405664_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744335_3511, duration(ns): 21079214
2025-07-11 10:02:13,039 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744335_3511, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-11 10:02:16,448 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744335_3511 replica FinalizedReplica, blk_1073744335_3511, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744335 for deletion
2025-07-11 10:02:16,449 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744335_3511 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744335
2025-07-11 10:03:13,026 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744336_3512 src: /192.168.158.7:55668 dest: /192.168.158.4:9866
2025-07-11 10:03:13,045 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55668, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-606758586_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744336_3512, duration(ns): 16514657
2025-07-11 10:03:13,045 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744336_3512, type=LAST_IN_PIPELINE terminating
2025-07-11 10:03:16,451 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744336_3512 replica FinalizedReplica, blk_1073744336_3512, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744336 for deletion
2025-07-11 10:03:16,452 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744336_3512 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744336
2025-07-11 10:05:18,044 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744338_3514 src: /192.168.158.7:50406 dest: /192.168.158.4:9866
2025-07-11 10:05:18,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50406, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1120647901_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744338_3514, duration(ns): 19333158
2025-07-11 10:05:18,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744338_3514, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 10:05:19,455 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744338_3514 replica FinalizedReplica, blk_1073744338_3514, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744338 for deletion
2025-07-11 10:05:19,456 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744338_3514 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744338
2025-07-11 10:07:18,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744340_3516 src: /192.168.158.9:47768 dest: /192.168.158.4:9866
2025-07-11 10:07:18,044 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47768, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1948002854_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744340_3516, duration(ns): 17456575
2025-07-11 10:07:18,045 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744340_3516, type=LAST_IN_PIPELINE terminating
2025-07-11 10:07:19,459 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744340_3516 replica FinalizedReplica, blk_1073744340_3516, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744340 for deletion
2025-07-11 10:07:19,460 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744340_3516 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744340
2025-07-11 10:09:18,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744342_3518 src: /192.168.158.6:42436 dest: /192.168.158.4:9866
2025-07-11 10:09:18,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42436, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1204858293_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744342_3518, duration(ns): 15690679
2025-07-11 10:09:18,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744342_3518, type=LAST_IN_PIPELINE terminating
2025-07-11 10:09:22,464 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744342_3518 replica FinalizedReplica, blk_1073744342_3518, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744342 for deletion
2025-07-11 10:09:22,465 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744342_3518 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744342
2025-07-11 10:10:23,027 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744343_3519 src: /192.168.158.8:60126 dest: /192.168.158.4:9866
2025-07-11 10:10:23,051 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60126, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_198629090_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744343_3519, duration(ns): 18445437
2025-07-11 10:10:23,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744343_3519, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-11 10:10:25,465 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744343_3519 replica FinalizedReplica, blk_1073744343_3519, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744343 for deletion
2025-07-11 10:10:25,466 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744343_3519 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744343
2025-07-11 10:12:28,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744345_3521 src: /192.168.158.9:56128 dest: /192.168.158.4:9866
2025-07-11 10:12:28,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56128, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1261805452_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744345_3521, duration(ns): 15611046
2025-07-11 10:12:28,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744345_3521, type=LAST_IN_PIPELINE terminating
2025-07-11 10:12:31,471 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744345_3521 replica FinalizedReplica, blk_1073744345_3521, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744345 for deletion
2025-07-11 10:12:31,472 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744345_3521 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744345
2025-07-11 10:13:28,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744346_3522 src: /192.168.158.8:47392 dest: /192.168.158.4:9866
2025-07-11 10:13:28,047 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47392, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-821991229_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744346_3522, duration(ns): 14410788
2025-07-11 10:13:28,047 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744346_3522, type=LAST_IN_PIPELINE terminating
2025-07-11 10:13:34,472 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744346_3522 replica FinalizedReplica, blk_1073744346_3522, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744346 for deletion
2025-07-11 10:13:34,473 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744346_3522 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744346
2025-07-11 10:14:28,027 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744347_3523 src: /192.168.158.1:52964 dest: /192.168.158.4:9866
2025-07-11 10:14:28,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52964, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-858308584_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744347_3523, duration(ns): 23925744
2025-07-11 10:14:28,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744347_3523, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-11 10:14:31,475 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744347_3523 replica FinalizedReplica, blk_1073744347_3523, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744347 for deletion
2025-07-11 10:14:31,476 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744347_3523 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744347
2025-07-11 10:15:28,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744348_3524 src: /192.168.158.1:58560 dest: /192.168.158.4:9866
2025-07-11 10:15:28,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58560, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_506836328_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744348_3524, duration(ns): 20903717
2025-07-11 10:15:28,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744348_3524, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-11 10:15:34,476 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744348_3524 replica FinalizedReplica, blk_1073744348_3524, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744348 for deletion
2025-07-11 10:15:34,478 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744348_3524 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744348
2025-07-11 10:16:28,044 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744349_3525 src: /192.168.158.1:43802 dest: /192.168.158.4:9866
2025-07-11 10:16:28,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43802, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1973775616_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744349_3525, duration(ns): 23524225
2025-07-11 10:16:28,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744349_3525, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-11 10:16:31,477 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744349_3525 replica FinalizedReplica, blk_1073744349_3525, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744349 for deletion
2025-07-11 10:16:31,478 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744349_3525 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744349
2025-07-11 10:18:28,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744351_3527 src: /192.168.158.7:48002 dest: /192.168.158.4:9866
2025-07-11 10:18:28,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48002, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_8516787_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744351_3527, duration(ns): 15484887
2025-07-11 10:18:28,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744351_3527, type=LAST_IN_PIPELINE terminating
2025-07-11 10:18:31,480 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744351_3527 replica FinalizedReplica, blk_1073744351_3527, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744351 for deletion
2025-07-11 10:18:31,481 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744351_3527 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744351
2025-07-11 10:20:33,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744353_3529 src: /192.168.158.6:36082 dest: /192.168.158.4:9866
2025-07-11 10:20:33,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36082, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_301012253_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744353_3529, duration(ns): 17119949
2025-07-11 10:20:33,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744353_3529, type=LAST_IN_PIPELINE terminating
2025-07-11 10:20:34,482 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744353_3529 replica FinalizedReplica, blk_1073744353_3529, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744353 for deletion
2025-07-11 10:20:34,483 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744353_3529 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744353
2025-07-11 10:24:38,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744357_3533 src: /192.168.158.1:54466 dest: /192.168.158.4:9866
2025-07-11 10:24:38,081 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54466, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2110768379_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744357_3533, duration(ns): 22718265
2025-07-11 10:24:38,081 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744357_3533, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-11 10:24:43,491 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744357_3533 replica FinalizedReplica, blk_1073744357_3533, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744357 for deletion
2025-07-11 10:24:43,492 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744357_3533 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744357
2025-07-11 10:25:38,065 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744358_3534 src: /192.168.158.1:51260 dest: /192.168.158.4:9866
2025-07-11 10:25:38,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51260, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1692521347_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744358_3534, duration(ns): 20650175
2025-07-11 10:25:38,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744358_3534, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-11 10:25:43,495 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744358_3534 replica FinalizedReplica, blk_1073744358_3534, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744358 for deletion
2025-07-11 10:25:43,496 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744358_3534 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744358
2025-07-11 10:27:38,045 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744360_3536 src: /192.168.158.7:42240 dest: /192.168.158.4:9866
2025-07-11 10:27:38,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42240, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1146853080_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744360_3536, duration(ns): 17419470
2025-07-11 10:27:38,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744360_3536, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 10:27:40,499 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744360_3536 replica FinalizedReplica, blk_1073744360_3536, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744360 for deletion
2025-07-11 10:27:40,501 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744360_3536 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744360
2025-07-11 10:28:38,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744361_3537 src: /192.168.158.1:58558 dest: /192.168.158.4:9866
2025-07-11 10:28:38,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58558, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1860632725_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744361_3537, duration(ns): 22004453
2025-07-11 10:28:38,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744361_3537, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-11 10:28:40,504 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744361_3537 replica FinalizedReplica, blk_1073744361_3537, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744361 for deletion
2025-07-11 10:28:40,505 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744361_3537 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744361
2025-07-11 10:29:43,058 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744362_3538 src: /192.168.158.8:43628 dest: /192.168.158.4:9866
2025-07-11 10:29:43,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43628, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1848525522_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744362_3538, duration(ns): 14648558
2025-07-11 10:29:43,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744362_3538, type=LAST_IN_PIPELINE terminating
2025-07-11 10:29:46,504 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744362_3538 replica FinalizedReplica, blk_1073744362_3538, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744362 for deletion
2025-07-11 10:29:46,505 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744362_3538 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744362
2025-07-11 10:30:48,058 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744363_3539 src: /192.168.158.1:49528 dest: /192.168.158.4:9866
2025-07-11 10:30:48,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49528, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_814560254_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744363_3539, duration(ns): 20579177
2025-07-11 10:30:48,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744363_3539, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-11 10:30:49,507 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744363_3539 replica FinalizedReplica, blk_1073744363_3539, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744363 for deletion
2025-07-11 10:30:49,508 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744363_3539 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744363
2025-07-11 10:31:53,076 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744364_3540 src: /192.168.158.1:60998 dest: /192.168.158.4:9866
2025-07-11 10:31:53,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60998, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1013532141_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744364_3540, duration(ns): 19490333
2025-07-11 10:31:53,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744364_3540, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-11 10:31:55,510 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744364_3540 replica FinalizedReplica, blk_1073744364_3540, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744364 for deletion
2025-07-11 10:31:55,511 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744364_3540 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744364
2025-07-11 10:32:58,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744365_3541 src: /192.168.158.5:37744 dest: /192.168.158.4:9866
2025-07-11 10:32:58,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37744, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-548706906_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744365_3541, duration(ns): 18424881
2025-07-11 10:32:58,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744365_3541, type=LAST_IN_PIPELINE terminating
2025-07-11 10:33:01,513 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744365_3541 replica FinalizedReplica, blk_1073744365_3541, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744365 for deletion
2025-07-11 10:33:01,514 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744365_3541 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744365
2025-07-11 10:36:03,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744368_3544 src: /192.168.158.6:33200 dest: /192.168.158.4:9866
2025-07-11 10:36:03,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33200, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1063543253_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744368_3544, duration(ns): 16011315
2025-07-11 10:36:03,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744368_3544, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-11 10:36:04,523 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744368_3544 replica FinalizedReplica, blk_1073744368_3544, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744368 for deletion
2025-07-11 10:36:04,524 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744368_3544 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744368
2025-07-11 10:37:03,065 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744369_3545 src: /192.168.158.1:44000 dest: /192.168.158.4:9866
2025-07-11 10:37:03,098 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44000, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1996136320_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744369_3545, duration(ns): 23967152
2025-07-11 10:37:03,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744369_3545, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-11 10:37:04,529 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744369_3545 replica FinalizedReplica, blk_1073744369_3545, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744369 for deletion
2025-07-11 10:37:04,530 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744369_3545 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744369
2025-07-11 10:38:08,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744370_3546 src: /192.168.158.6:43508 dest: /192.168.158.4:9866
2025-07-11 10:38:08,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43508, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_801895546_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744370_3546, duration(ns): 17526867
2025-07-11 10:38:08,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744370_3546, type=LAST_IN_PIPELINE terminating
2025-07-11 10:38:13,532 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744370_3546 replica FinalizedReplica, blk_1073744370_3546, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744370 for deletion
2025-07-11 10:38:13,533 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744370_3546 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744370
2025-07-11 10:40:18,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744372_3548 src: /192.168.158.8:36394 dest: /192.168.158.4:9866
2025-07-11 10:40:18,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36394, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-308274244_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744372_3548, duration(ns): 17738241
2025-07-11 10:40:18,104 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744372_3548, type=LAST_IN_PIPELINE terminating
2025-07-11 10:40:19,536 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744372_3548 replica FinalizedReplica, blk_1073744372_3548, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744372 for deletion
2025-07-11 10:40:19,538 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744372_3548 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744372
2025-07-11 10:41:18,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744373_3549 src: /192.168.158.1:49410 dest: /192.168.158.4:9866
2025-07-11 10:41:18,102 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49410, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_179359089_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744373_3549, duration(ns): 23005857
2025-07-11 10:41:18,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744373_3549, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-11 10:41:19,539 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744373_3549 replica FinalizedReplica, blk_1073744373_3549, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744373 for deletion
2025-07-11 10:41:19,540 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744373_3549 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744373
2025-07-11 10:43:18,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744375_3551 src: /192.168.158.1:36082 dest: /192.168.158.4:9866
2025-07-11 10:43:18,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36082, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1760892034_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744375_3551, duration(ns): 22459086
2025-07-11 10:43:18,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744375_3551, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-11 10:43:19,545 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744375_3551 replica FinalizedReplica, blk_1073744375_3551, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744375 for deletion
2025-07-11 10:43:19,546 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744375_3551 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744375
2025-07-11 10:46:23,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744378_3554 src: /192.168.158.7:48130 dest: /192.168.158.4:9866
2025-07-11 10:46:23,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.7:48130, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-795240226_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744378_3554, duration(ns): 15124335 2025-07-11 10:46:23,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744378_3554, type=LAST_IN_PIPELINE terminating 2025-07-11 10:46:25,557 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744378_3554 replica FinalizedReplica, blk_1073744378_3554, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744378 for deletion 2025-07-11 10:46:25,558 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744378_3554 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744378 2025-07-11 10:47:23,096 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744379_3555 src: /192.168.158.1:43408 dest: /192.168.158.4:9866 2025-07-11 10:47:23,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43408, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_456468896_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744379_3555, duration(ns): 22924928 2025-07-11 10:47:23,128 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744379_3555, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-11 10:47:25,559 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744379_3555 replica FinalizedReplica, blk_1073744379_3555, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744379 for deletion 2025-07-11 10:47:25,560 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744379_3555 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744379 2025-07-11 10:50:33,113 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744382_3558 src: /192.168.158.9:35244 dest: /192.168.158.4:9866 2025-07-11 10:50:33,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35244, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1478809893_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744382_3558, duration(ns): 14989761 2025-07-11 10:50:33,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744382_3558, type=LAST_IN_PIPELINE terminating 2025-07-11 10:50:37,574 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744382_3558 replica FinalizedReplica, blk_1073744382_3558, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744382 for deletion 2025-07-11 10:50:37,576 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744382_3558 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744382 2025-07-11 10:51:33,112 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744383_3559 src: /192.168.158.5:59160 dest: /192.168.158.4:9866 2025-07-11 10:51:33,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59160, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2116970579_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744383_3559, duration(ns): 16196988 2025-07-11 10:51:33,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744383_3559, type=LAST_IN_PIPELINE terminating 2025-07-11 10:51:34,577 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744383_3559 replica FinalizedReplica, blk_1073744383_3559, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744383 for deletion 2025-07-11 10:51:34,578 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744383_3559 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073744383 2025-07-11 10:54:33,105 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744386_3562 src: /192.168.158.1:33660 dest: /192.168.158.4:9866 2025-07-11 10:54:33,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33660, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1674658751_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744386_3562, duration(ns): 22373978 2025-07-11 10:54:33,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744386_3562, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-11 10:54:34,582 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744386_3562 replica FinalizedReplica, blk_1073744386_3562, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744386 for deletion 2025-07-11 10:54:34,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744386_3562 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744386 2025-07-11 10:55:33,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744387_3563 src: /192.168.158.6:35572 dest: /192.168.158.4:9866 2025-07-11 10:55:33,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35572, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-736403625_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744387_3563, duration(ns): 19905804 2025-07-11 10:55:33,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744387_3563, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-11 10:55:34,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744387_3563 replica FinalizedReplica, blk_1073744387_3563, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744387 for deletion 2025-07-11 10:55:34,585 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744387_3563 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744387 2025-07-11 10:56:33,145 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744388_3564 src: /192.168.158.7:37710 dest: /192.168.158.4:9866 2025-07-11 10:56:33,172 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37710, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_28487211_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744388_3564, duration(ns): 21734460 2025-07-11 10:56:33,173 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744388_3564, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-11 10:56:34,584 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744388_3564 replica FinalizedReplica, blk_1073744388_3564, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744388 for deletion 2025-07-11 10:56:34,585 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744388_3564 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744388 2025-07-11 10:57:33,123 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744389_3565 src: /192.168.158.5:53046 dest: /192.168.158.4:9866 2025-07-11 10:57:33,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53046, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_685054844_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744389_3565, duration(ns): 18601036 2025-07-11 10:57:33,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744389_3565, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-11 10:57:34,588 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744389_3565 replica FinalizedReplica, blk_1073744389_3565, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744389 for deletion 2025-07-11 10:57:34,589 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744389_3565 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744389 2025-07-11 10:58:33,129 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744390_3566 src: /192.168.158.7:57140 dest: /192.168.158.4:9866 2025-07-11 10:58:33,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57140, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-392023859_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744390_3566, duration(ns): 20534327 2025-07-11 10:58:33,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744390_3566, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-11 10:58:34,588 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744390_3566 replica FinalizedReplica, blk_1073744390_3566, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744390 for deletion 2025-07-11 10:58:34,589 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744390_3566 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744390 2025-07-11 11:00:33,132 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744392_3568 src: 
/192.168.158.1:41974 dest: /192.168.158.4:9866 2025-07-11 11:00:33,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41974, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-32514998_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744392_3568, duration(ns): 23805437 2025-07-11 11:00:33,165 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744392_3568, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-11 11:00:34,590 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744392_3568 replica FinalizedReplica, blk_1073744392_3568, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744392 for deletion 2025-07-11 11:00:34,592 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744392_3568 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744392 2025-07-11 11:02:38,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744394_3570 src: /192.168.158.6:53030 dest: /192.168.158.4:9866 2025-07-11 11:02:38,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53030, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-201634487_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744394_3570, duration(ns): 17829040 2025-07-11 
11:02:38,152 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744394_3570, type=LAST_IN_PIPELINE terminating 2025-07-11 11:02:43,597 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744394_3570 replica FinalizedReplica, blk_1073744394_3570, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744394 for deletion 2025-07-11 11:02:43,598 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744394_3570 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744394 2025-07-11 11:04:43,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744396_3572 src: /192.168.158.9:43246 dest: /192.168.158.4:9866 2025-07-11 11:04:43,140 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43246, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_788096234_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744396_3572, duration(ns): 17469876 2025-07-11 11:04:43,140 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744396_3572, type=LAST_IN_PIPELINE terminating 2025-07-11 11:04:49,601 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744396_3572 replica FinalizedReplica, blk_1073744396_3572, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744396 for deletion 2025-07-11 11:04:49,602 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744396_3572 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744396 2025-07-11 11:05:43,136 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744397_3573 src: /192.168.158.9:40366 dest: /192.168.158.4:9866 2025-07-11 11:05:43,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40366, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1292382137_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744397_3573, duration(ns): 15596127 2025-07-11 11:05:43,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744397_3573, type=LAST_IN_PIPELINE terminating 2025-07-11 11:05:46,603 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744397_3573 replica FinalizedReplica, blk_1073744397_3573, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744397 for deletion 2025-07-11 11:05:46,604 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744397_3573 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744397 2025-07-11 11:06:43,141 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744398_3574 src: /192.168.158.6:57182 dest: /192.168.158.4:9866 2025-07-11 11:06:43,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57182, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1246129635_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744398_3574, duration(ns): 17671309 2025-07-11 11:06:43,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744398_3574, type=LAST_IN_PIPELINE terminating 2025-07-11 11:06:49,608 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744398_3574 replica FinalizedReplica, blk_1073744398_3574, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744398 for deletion 2025-07-11 11:06:49,609 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744398_3574 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744398 2025-07-11 11:07:43,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744399_3575 src: /192.168.158.1:33310 dest: /192.168.158.4:9866 2025-07-11 11:07:43,168 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33310, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_159608268_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073744399_3575, duration(ns): 24843944 2025-07-11 11:07:43,168 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744399_3575, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-11 11:07:46,611 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744399_3575 replica FinalizedReplica, blk_1073744399_3575, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744399 for deletion 2025-07-11 11:07:46,612 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744399_3575 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744399 2025-07-11 11:08:43,141 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744400_3576 src: /192.168.158.5:55038 dest: /192.168.158.4:9866 2025-07-11 11:08:43,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55038, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-779654117_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744400_3576, duration(ns): 15613555 2025-07-11 11:08:43,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744400_3576, type=LAST_IN_PIPELINE terminating 2025-07-11 11:08:46,613 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744400_3576 
replica FinalizedReplica, blk_1073744400_3576, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744400 for deletion 2025-07-11 11:08:46,614 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744400_3576 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744400 2025-07-11 11:14:53,148 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744406_3582 src: /192.168.158.5:49602 dest: /192.168.158.4:9866 2025-07-11 11:14:53,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49602, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-287026986_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744406_3582, duration(ns): 18087491 2025-07-11 11:14:53,172 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744406_3582, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-11 11:14:55,630 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744406_3582 replica FinalizedReplica, blk_1073744406_3582, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744406 for deletion 2025-07-11 11:14:55,632 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073744406_3582 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744406
2025-07-11 11:17:58,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744409_3585 src: /192.168.158.9:36940 dest: /192.168.158.4:9866
2025-07-11 11:17:58,169 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36940, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_350403472_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744409_3585, duration(ns): 19088570
2025-07-11 11:17:58,169 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744409_3585, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-11 11:18:01,634 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744409_3585 replica FinalizedReplica, blk_1073744409_3585, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744409 for deletion
2025-07-11 11:18:01,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744409_3585 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744409
2025-07-11 11:21:03,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744412_3588 src: /192.168.158.1:42866 dest: /192.168.158.4:9866
2025-07-11 11:21:03,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42866, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_302491551_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744412_3588, duration(ns): 22419167
2025-07-11 11:21:03,188 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744412_3588, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-11 11:21:04,640 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744412_3588 replica FinalizedReplica, blk_1073744412_3588, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744412 for deletion
2025-07-11 11:21:04,641 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744412_3588 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744412
2025-07-11 11:22:03,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744413_3589 src: /192.168.158.7:57846 dest: /192.168.158.4:9866
2025-07-11 11:22:03,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57846, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-535477984_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744413_3589, duration(ns): 16051079
2025-07-11 11:22:03,179 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744413_3589, type=LAST_IN_PIPELINE terminating
2025-07-11 11:22:07,640 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744413_3589 replica FinalizedReplica, blk_1073744413_3589, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744413 for deletion
2025-07-11 11:22:07,641 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744413_3589 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744413
2025-07-11 11:23:03,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744414_3590 src: /192.168.158.1:50772 dest: /192.168.158.4:9866
2025-07-11 11:23:03,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50772, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1210298725_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744414_3590, duration(ns): 22713994
2025-07-11 11:23:03,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744414_3590, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-11 11:23:04,641 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744414_3590 replica FinalizedReplica, blk_1073744414_3590, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744414 for deletion
2025-07-11 11:23:04,643 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744414_3590 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744414
2025-07-11 11:27:08,165 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744418_3594 src: /192.168.158.5:60150 dest: /192.168.158.4:9866
2025-07-11 11:27:08,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60150, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1447748245_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744418_3594, duration(ns): 14102279
2025-07-11 11:27:08,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744418_3594, type=LAST_IN_PIPELINE terminating
2025-07-11 11:27:13,645 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744418_3594 replica FinalizedReplica, blk_1073744418_3594, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744418 for deletion
2025-07-11 11:27:13,646 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744418_3594 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744418
2025-07-11 11:28:08,170 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744419_3595 src: /192.168.158.5:33554 dest: /192.168.158.4:9866
2025-07-11 11:28:08,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33554, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_744499614_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744419_3595, duration(ns): 16969199
2025-07-11 11:28:08,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744419_3595, type=LAST_IN_PIPELINE terminating
2025-07-11 11:28:13,647 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744419_3595 replica FinalizedReplica, blk_1073744419_3595, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744419 for deletion
2025-07-11 11:28:13,649 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744419_3595 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744419
2025-07-11 11:30:08,179 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744421_3597 src: /192.168.158.8:56952 dest: /192.168.158.4:9866
2025-07-11 11:30:08,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56952, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-488477032_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744421_3597, duration(ns): 17510093
2025-07-11 11:30:08,199 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744421_3597, type=LAST_IN_PIPELINE terminating
2025-07-11 11:30:10,651 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744421_3597 replica FinalizedReplica, blk_1073744421_3597, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744421 for deletion
2025-07-11 11:30:10,652 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744421_3597 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744421
2025-07-11 11:32:08,157 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744423_3599 src: /192.168.158.9:57422 dest: /192.168.158.4:9866
2025-07-11 11:32:08,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57422, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_179999709_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744423_3599, duration(ns): 15048340
2025-07-11 11:32:08,175 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744423_3599, type=LAST_IN_PIPELINE terminating
2025-07-11 11:32:10,654 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744423_3599 replica FinalizedReplica, blk_1073744423_3599, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744423 for deletion
2025-07-11 11:32:10,655 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744423_3599 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744423
2025-07-11 11:33:13,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744424_3600 src: /192.168.158.1:60654 dest: /192.168.158.4:9866
2025-07-11 11:33:13,185 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60654, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_928589512_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744424_3600, duration(ns): 21461592
2025-07-11 11:33:13,186 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744424_3600, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-11 11:33:16,655 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744424_3600 replica FinalizedReplica, blk_1073744424_3600, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744424 for deletion
2025-07-11 11:33:16,656 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744424_3600 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744424
2025-07-11 11:36:13,170 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744427_3603 src: /192.168.158.1:50596 dest: /192.168.158.4:9866
2025-07-11 11:36:13,202 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50596, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1872700675_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744427_3603, duration(ns): 23274109
2025-07-11 11:36:13,203 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744427_3603, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-11 11:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 9, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-11 11:36:16,665 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744427_3603 replica FinalizedReplica, blk_1073744427_3603, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744427 for deletion
2025-07-11 11:36:16,666 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744427_3603 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744427
2025-07-11 11:37:18,173 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744428_3604 src: /192.168.158.7:54646 dest: /192.168.158.4:9866
2025-07-11 11:37:18,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54646, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-846457012_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744428_3604, duration(ns): 18891576
2025-07-11 11:37:18,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744428_3604, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-11 11:37:19,669 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744428_3604 replica FinalizedReplica, blk_1073744428_3604, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744428 for deletion
2025-07-11 11:37:19,670 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744428_3604 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744428
2025-07-11 11:37:19,672 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f2c, containing 4 storage report(s), of which we sent 4. The reports had 9 total blocks and used 1 RPC(s). This took 0 msec to generate and 3 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-11 11:37:19,673 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-11 11:37:22,668 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Failed to delete replica blk_1073744428_3604: ReplicaInfo not found.
2025-07-11 11:39:18,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744430_3606 src: /192.168.158.1:39776 dest: /192.168.158.4:9866
2025-07-11 11:39:18,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39776, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_774005307_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744430_3606, duration(ns): 24208788
2025-07-11 11:39:18,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744430_3606, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-11 11:39:22,670 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744430_3606 replica FinalizedReplica, blk_1073744430_3606, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744430 for deletion
2025-07-11 11:39:22,671 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744430_3606 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744430
2025-07-11 11:40:18,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744431_3607 src: /192.168.158.6:33686 dest: /192.168.158.4:9866
2025-07-11 11:40:18,208 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33686, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-218618736_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744431_3607, duration(ns): 20463824
2025-07-11 11:40:18,208 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744431_3607, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-11 11:40:22,674 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744431_3607 replica FinalizedReplica, blk_1073744431_3607, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744431 for deletion
2025-07-11 11:40:22,675 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744431_3607 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744431
2025-07-11 11:44:23,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744435_3611 src: /192.168.158.5:34370 dest: /192.168.158.4:9866
2025-07-11 11:44:23,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34370, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-172992624_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744435_3611, duration(ns): 18596791
2025-07-11 11:44:23,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744435_3611, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-11 11:44:25,685 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744435_3611 replica FinalizedReplica, blk_1073744435_3611, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744435 for deletion
2025-07-11 11:44:25,686 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744435_3611 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744435
2025-07-11 11:45:23,194 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744436_3612 src: /192.168.158.1:48968 dest: /192.168.158.4:9866
2025-07-11 11:45:23,225 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48968, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1705551071_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744436_3612, duration(ns): 21340272
2025-07-11 11:45:23,225 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744436_3612, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-11 11:45:25,688 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744436_3612 replica FinalizedReplica, blk_1073744436_3612, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744436 for deletion
2025-07-11 11:45:25,689 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744436_3612 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744436
2025-07-11 11:46:23,194 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744437_3613 src: /192.168.158.1:45314 dest: /192.168.158.4:9866
2025-07-11 11:46:23,222 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45314, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-162408708_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744437_3613, duration(ns): 19475448
2025-07-11 11:46:23,222 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744437_3613, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-11 11:46:25,691 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744437_3613 replica FinalizedReplica, blk_1073744437_3613, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744437 for deletion
2025-07-11 11:46:25,692 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744437_3613 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744437
2025-07-11 11:49:23,215 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744440_3616 src: /192.168.158.8:34942 dest: /192.168.158.4:9866
2025-07-11 11:49:23,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34942, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-610793715_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744440_3616, duration(ns): 16691799
2025-07-11 11:49:23,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744440_3616, type=LAST_IN_PIPELINE terminating
2025-07-11 11:49:25,698 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744440_3616 replica FinalizedReplica, blk_1073744440_3616, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744440 for deletion
2025-07-11 11:49:25,699 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744440_3616 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744440
2025-07-11 11:51:33,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744442_3618 src: /192.168.158.1:59710 dest: /192.168.158.4:9866
2025-07-11 11:51:33,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59710, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1304091131_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744442_3618, duration(ns): 22346964
2025-07-11 11:51:33,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744442_3618, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-11 11:51:34,705 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744442_3618 replica FinalizedReplica, blk_1073744442_3618, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744442 for deletion
2025-07-11 11:51:34,706 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744442_3618 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744442
2025-07-11 11:52:33,221 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744443_3619 src: /192.168.158.6:46780 dest: /192.168.158.4:9866
2025-07-11 11:52:33,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_594093523_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744443_3619, duration(ns): 14286992
2025-07-11 11:52:33,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744443_3619, type=LAST_IN_PIPELINE terminating
2025-07-11 11:52:34,709 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744443_3619 replica FinalizedReplica, blk_1073744443_3619, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744443 for deletion
2025-07-11 11:52:34,710 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744443_3619 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744443
2025-07-11 11:54:33,212 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744445_3621 src: /192.168.158.1:33692 dest: /192.168.158.4:9866
2025-07-11 11:54:33,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33692, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1949168044_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744445_3621, duration(ns): 21631249
2025-07-11 11:54:33,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744445_3621, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-11 11:54:34,712 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744445_3621 replica FinalizedReplica, blk_1073744445_3621, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744445 for deletion
2025-07-11 11:54:34,713 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744445_3621 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744445
2025-07-11 11:57:43,215 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744448_3624 src: /192.168.158.1:43576 dest: /192.168.158.4:9866
2025-07-11 11:57:43,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43576, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1549173946_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744448_3624, duration(ns): 20806140
2025-07-11 11:57:43,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744448_3624, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-11 11:57:49,720 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744448_3624 replica FinalizedReplica, blk_1073744448_3624, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744448 for deletion
2025-07-11 11:57:49,721 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744448_3624 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744448
2025-07-11 11:59:48,217 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744450_3626 src: /192.168.158.6:33246 dest: /192.168.158.4:9866
2025-07-11 11:59:48,242 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33246, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-775766843_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744450_3626, duration(ns): 20173263
2025-07-11 11:59:48,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744450_3626, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 11:59:52,722 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744450_3626 replica FinalizedReplica, blk_1073744450_3626, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744450 for deletion
2025-07-11 11:59:52,723 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744450_3626 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744450
2025-07-11 12:00:48,221 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744451_3627 src: /192.168.158.9:35804 dest: /192.168.158.4:9866
2025-07-11 12:00:48,244 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35804, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-488813482_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744451_3627, duration(ns): 17386808
2025-07-11 12:00:48,244 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744451_3627, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-11 12:00:49,725 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744451_3627 replica FinalizedReplica, blk_1073744451_3627, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744451 for deletion
2025-07-11 12:00:49,726 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744451_3627 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744451
2025-07-11 12:02:53,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744453_3629 src: /192.168.158.9:40132 dest: /192.168.158.4:9866
2025-07-11 12:02:53,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40132, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1223580015_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744453_3629, duration(ns): 17562864
2025-07-11 12:02:53,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744453_3629, type=LAST_IN_PIPELINE terminating
2025-07-11 12:02:55,729 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744453_3629 replica FinalizedReplica, blk_1073744453_3629, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744453 for deletion
2025-07-11 12:02:55,731 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744453_3629 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744453
2025-07-11 12:03:53,232 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744454_3630 src: /192.168.158.1:52560 dest: /192.168.158.4:9866
2025-07-11 12:03:53,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52560, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_97880592_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744454_3630, duration(ns): 21900020
2025-07-11 12:03:53,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744454_3630, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-11 12:03:55,732 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744454_3630 replica FinalizedReplica, blk_1073744454_3630, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744454 for deletion
2025-07-11 12:03:55,733 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744454_3630 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744454
2025-07-11 12:05:58,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744456_3632 src: /192.168.158.8:37012 dest: /192.168.158.4:9866
2025-07-11 12:05:58,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37012, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1131253443_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744456_3632, duration(ns): 15037548
2025-07-11 12:05:58,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744456_3632, type=LAST_IN_PIPELINE terminating
2025-07-11 12:06:04,734 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744456_3632 replica FinalizedReplica, blk_1073744456_3632, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744456 for deletion
2025-07-11 12:06:04,736 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744456_3632 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744456
2025-07-11 12:08:58,240 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744459_3635 src: /192.168.158.9:60322 dest: /192.168.158.4:9866 2025-07-11 12:08:58,263 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60322, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1957643353_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744459_3635, duration(ns): 17325640 2025-07-11 12:08:58,263 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744459_3635, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-11 12:09:01,742 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744459_3635 replica FinalizedReplica, blk_1073744459_3635, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744459 for deletion 2025-07-11 12:09:01,743 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744459_3635 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744459 2025-07-11 12:11:58,253 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744462_3638 src: /192.168.158.9:59100 dest: /192.168.158.4:9866 2025-07-11 12:11:58,278 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59100, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1720594608_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744462_3638, duration(ns): 19263579 2025-07-11 12:11:58,278 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744462_3638, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-11 12:12:01,749 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744462_3638 replica FinalizedReplica, blk_1073744462_3638, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744462 for deletion 2025-07-11 12:12:01,750 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744462_3638 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744462 2025-07-11 12:12:58,253 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744463_3639 src: /192.168.158.5:52470 dest: /192.168.158.4:9866 2025-07-11 12:12:58,270 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52470, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1670180076_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744463_3639, duration(ns): 15636383 2025-07-11 12:12:58,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744463_3639, type=LAST_IN_PIPELINE terminating 2025-07-11 12:13:04,752 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073744463_3639 replica FinalizedReplica, blk_1073744463_3639, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744463 for deletion 2025-07-11 12:13:04,753 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744463_3639 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744463 2025-07-11 12:14:58,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744465_3641 src: /192.168.158.6:50452 dest: /192.168.158.4:9866 2025-07-11 12:14:58,275 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50452, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-891973902_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744465_3641, duration(ns): 15890888 2025-07-11 12:14:58,275 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744465_3641, type=LAST_IN_PIPELINE terminating 2025-07-11 12:15:01,763 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744465_3641 replica FinalizedReplica, blk_1073744465_3641, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744465 for deletion 2025-07-11 12:15:01,764 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073744465_3641 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744465 2025-07-11 12:16:03,265 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744466_3642 src: /192.168.158.9:45310 dest: /192.168.158.4:9866 2025-07-11 12:16:03,283 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45310, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_20770749_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744466_3642, duration(ns): 16224239 2025-07-11 12:16:03,283 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744466_3642, type=LAST_IN_PIPELINE terminating 2025-07-11 12:16:04,764 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744466_3642 replica FinalizedReplica, blk_1073744466_3642, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744466 for deletion 2025-07-11 12:16:04,765 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744466_3642 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744466 2025-07-11 12:18:08,261 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744468_3644 src: /192.168.158.9:57978 dest: /192.168.158.4:9866 2025-07-11 12:18:08,279 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.9:57978, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1966546692_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744468_3644, duration(ns): 15469143 2025-07-11 12:18:08,279 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744468_3644, type=LAST_IN_PIPELINE terminating 2025-07-11 12:18:10,770 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744468_3644 replica FinalizedReplica, blk_1073744468_3644, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744468 for deletion 2025-07-11 12:18:10,772 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744468_3644 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744468 2025-07-11 12:20:13,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744470_3646 src: /192.168.158.9:38418 dest: /192.168.158.4:9866 2025-07-11 12:20:13,278 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38418, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_641094857_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744470_3646, duration(ns): 16034412 2025-07-11 12:20:13,278 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744470_3646, type=LAST_IN_PIPELINE terminating 2025-07-11 12:20:16,773 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744470_3646 replica FinalizedReplica, blk_1073744470_3646, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744470 for deletion 2025-07-11 12:20:16,774 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744470_3646 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744470 2025-07-11 12:22:13,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744472_3648 src: /192.168.158.6:35694 dest: /192.168.158.4:9866 2025-07-11 12:22:13,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35694, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-535479439_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744472_3648, duration(ns): 16567117 2025-07-11 12:22:13,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744472_3648, type=LAST_IN_PIPELINE terminating 2025-07-11 12:22:16,779 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744472_3648 replica FinalizedReplica, blk_1073744472_3648, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744472 for deletion 2025-07-11 12:22:16,780 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744472_3648 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744472 2025-07-11 12:25:13,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744475_3651 src: /192.168.158.7:35352 dest: /192.168.158.4:9866 2025-07-11 12:25:13,297 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35352, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1063117867_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744475_3651, duration(ns): 18412946 2025-07-11 12:25:13,297 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744475_3651, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-11 12:25:16,782 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744475_3651 replica FinalizedReplica, blk_1073744475_3651, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744475 for deletion 2025-07-11 12:25:16,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744475_3651 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744475 2025-07-11 12:26:13,300 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744476_3652 src: 
/192.168.158.1:57192 dest: /192.168.158.4:9866 2025-07-11 12:26:13,331 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57192, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1737376837_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744476_3652, duration(ns): 21853169 2025-07-11 12:26:13,331 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744476_3652, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-11 12:26:16,785 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744476_3652 replica FinalizedReplica, blk_1073744476_3652, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744476 for deletion 2025-07-11 12:26:16,786 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744476_3652 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744476 2025-07-11 12:27:13,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744477_3653 src: /192.168.158.1:45012 dest: /192.168.158.4:9866 2025-07-11 12:27:13,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45012, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1499681672_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744477_3653, duration(ns): 22443283 
2025-07-11 12:27:13,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744477_3653, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-11 12:27:19,785 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744477_3653 replica FinalizedReplica, blk_1073744477_3653, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744477 for deletion 2025-07-11 12:27:19,786 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744477_3653 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744477 2025-07-11 12:28:13,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744478_3654 src: /192.168.158.6:42926 dest: /192.168.158.4:9866 2025-07-11 12:28:13,301 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42926, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_319979798_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744478_3654, duration(ns): 18390207 2025-07-11 12:28:13,301 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744478_3654, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-11 12:28:16,789 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744478_3654 replica FinalizedReplica, blk_1073744478_3654, 
FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744478 for deletion 2025-07-11 12:28:16,790 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744478_3654 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744478 2025-07-11 12:29:13,284 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744479_3655 src: /192.168.158.1:32918 dest: /192.168.158.4:9866 2025-07-11 12:29:13,320 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:32918, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_541667038_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744479_3655, duration(ns): 24298690 2025-07-11 12:29:13,320 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744479_3655, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-11 12:29:16,791 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744479_3655 replica FinalizedReplica, blk_1073744479_3655, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744479 for deletion 2025-07-11 12:29:16,792 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 
blk_1073744479_3655 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744479 2025-07-11 12:32:18,276 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744482_3658 src: /192.168.158.1:46450 dest: /192.168.158.4:9866 2025-07-11 12:32:18,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46450, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1385457772_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744482_3658, duration(ns): 22263184 2025-07-11 12:32:18,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744482_3658, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-11 12:32:22,792 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744482_3658 replica FinalizedReplica, blk_1073744482_3658, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744482 for deletion 2025-07-11 12:32:22,793 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744482_3658 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744482 2025-07-11 12:33:18,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744483_3659 src: /192.168.158.7:59966 dest: /192.168.158.4:9866 2025-07-11 12:33:18,308 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59966, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1731778568_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744483_3659, duration(ns): 14421682 2025-07-11 12:33:18,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744483_3659, type=LAST_IN_PIPELINE terminating 2025-07-11 12:33:22,793 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744483_3659 replica FinalizedReplica, blk_1073744483_3659, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744483 for deletion 2025-07-11 12:33:22,796 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744483_3659 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744483 2025-07-11 12:37:23,290 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744487_3663 src: /192.168.158.5:34554 dest: /192.168.158.4:9866 2025-07-11 12:37:23,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34554, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-93960326_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744487_3663, duration(ns): 14153965 2025-07-11 12:37:23,307 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073744487_3663, type=LAST_IN_PIPELINE terminating 2025-07-11 12:37:28,804 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744487_3663 replica FinalizedReplica, blk_1073744487_3663, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744487 for deletion 2025-07-11 12:37:28,805 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744487_3663 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744487 2025-07-11 12:39:28,288 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744489_3665 src: /192.168.158.1:60710 dest: /192.168.158.4:9866 2025-07-11 12:39:28,318 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60710, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1077342843_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744489_3665, duration(ns): 21400831 2025-07-11 12:39:28,318 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744489_3665, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-11 12:39:31,809 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744489_3665 replica FinalizedReplica, blk_1073744489_3665, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744489 for deletion 2025-07-11 12:39:31,810 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744489_3665 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744489 2025-07-11 12:40:33,300 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744490_3666 src: /192.168.158.7:48460 dest: /192.168.158.4:9866 2025-07-11 12:40:33,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48460, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-416519822_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744490_3666, duration(ns): 19042046 2025-07-11 12:40:33,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744490_3666, type=LAST_IN_PIPELINE terminating 2025-07-11 12:40:37,810 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744490_3666 replica FinalizedReplica, blk_1073744490_3666, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744490 for deletion 2025-07-11 12:40:37,812 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744490_3666 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744490 2025-07-11 12:43:33,302 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744493_3669 src: /192.168.158.6:53134 dest: /192.168.158.4:9866
2025-07-11 12:43:33,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53134, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-808422558_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744493_3669, duration(ns): 19083444
2025-07-11 12:43:33,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744493_3669, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-11 12:43:37,818 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744493_3669 replica FinalizedReplica, blk_1073744493_3669, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744493 for deletion
2025-07-11 12:43:37,819 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744493_3669 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744493
2025-07-11 12:44:33,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744494_3670 src: /192.168.158.1:51532 dest: /192.168.158.4:9866
2025-07-11 12:44:33,325 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51532, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_743551952_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744494_3670, duration(ns): 21361586
2025-07-11 12:44:33,325 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744494_3670, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-11 12:44:34,820 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744494_3670 replica FinalizedReplica, blk_1073744494_3670, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744494 for deletion
2025-07-11 12:44:34,821 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744494_3670 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744494
2025-07-11 12:45:33,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744495_3671 src: /192.168.158.1:57754 dest: /192.168.158.4:9866
2025-07-11 12:45:33,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57754, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-171626100_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744495_3671, duration(ns): 21630371
2025-07-11 12:45:33,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744495_3671, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-11 12:45:37,820 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744495_3671 replica FinalizedReplica, blk_1073744495_3671, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744495 for deletion
2025-07-11 12:45:37,821 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744495_3671 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744495
2025-07-11 12:46:33,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744496_3672 src: /192.168.158.6:35930 dest: /192.168.158.4:9866
2025-07-11 12:46:33,323 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35930, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1982750220_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744496_3672, duration(ns): 14885580
2025-07-11 12:46:33,324 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744496_3672, type=LAST_IN_PIPELINE terminating
2025-07-11 12:46:34,821 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744496_3672 replica FinalizedReplica, blk_1073744496_3672, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744496 for deletion
2025-07-11 12:46:34,822 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744496_3672 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744496
2025-07-11 12:47:33,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744497_3673 src: /192.168.158.1:44646 dest: /192.168.158.4:9866
2025-07-11 12:47:33,337 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44646, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_84773937_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744497_3673, duration(ns): 24182119
2025-07-11 12:47:33,338 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744497_3673, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-11 12:47:34,822 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744497_3673 replica FinalizedReplica, blk_1073744497_3673, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744497 for deletion
2025-07-11 12:47:34,824 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744497_3673 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744497
2025-07-11 12:48:33,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744498_3674 src: /192.168.158.8:36470 dest: /192.168.158.4:9866
2025-07-11 12:48:33,335 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36470, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-755858408_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744498_3674, duration(ns): 20997856
2025-07-11 12:48:33,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744498_3674, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 12:48:34,822 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744498_3674 replica FinalizedReplica, blk_1073744498_3674, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744498 for deletion
2025-07-11 12:48:34,824 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744498_3674 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744498
2025-07-11 12:50:43,307 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744500_3676 src: /192.168.158.5:35092 dest: /192.168.158.4:9866
2025-07-11 12:50:43,331 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35092, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1157745943_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744500_3676, duration(ns): 18512188
2025-07-11 12:50:43,331 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744500_3676, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-11 12:50:46,828 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744500_3676 replica FinalizedReplica, blk_1073744500_3676, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744500 for deletion
2025-07-11 12:50:46,829 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744500_3676 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744500
2025-07-11 12:54:48,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744504_3680 src: /192.168.158.6:36976 dest: /192.168.158.4:9866
2025-07-11 12:54:48,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36976, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1656982667_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744504_3680, duration(ns): 17207998
2025-07-11 12:54:48,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744504_3680, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 12:54:52,840 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744504_3680 replica FinalizedReplica, blk_1073744504_3680, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744504 for deletion
2025-07-11 12:54:52,842 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744504_3680 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744504
2025-07-11 12:55:48,319 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744505_3681 src: /192.168.158.1:60028 dest: /192.168.158.4:9866
2025-07-11 12:55:48,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60028, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1892850278_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744505_3681, duration(ns): 21110673
2025-07-11 12:55:48,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744505_3681, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-11 12:55:49,842 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744505_3681 replica FinalizedReplica, blk_1073744505_3681, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744505 for deletion
2025-07-11 12:55:49,843 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744505_3681 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744505
2025-07-11 12:56:48,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744506_3682 src: /192.168.158.8:59552 dest: /192.168.158.4:9866
2025-07-11 12:56:48,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59552, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1364673367_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744506_3682, duration(ns): 19623321
2025-07-11 12:56:48,353 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744506_3682, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-11 12:56:49,843 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744506_3682 replica FinalizedReplica, blk_1073744506_3682, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744506 for deletion
2025-07-11 12:56:49,845 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744506_3682 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744506
2025-07-11 12:58:58,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744508_3684 src: /192.168.158.8:43340 dest: /192.168.158.4:9866
2025-07-11 12:58:58,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43340, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1092842101_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744508_3684, duration(ns): 15453881
2025-07-11 12:58:58,347 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744508_3684, type=LAST_IN_PIPELINE terminating
2025-07-11 12:59:04,848 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744508_3684 replica FinalizedReplica, blk_1073744508_3684, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744508 for deletion
2025-07-11 12:59:04,849 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744508_3684 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744508
2025-07-11 13:01:58,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744511_3687 src: /192.168.158.1:43210 dest: /192.168.158.4:9866
2025-07-11 13:01:58,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43210, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1134210066_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744511_3687, duration(ns): 20338078
2025-07-11 13:01:58,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744511_3687, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-11 13:02:04,855 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744511_3687 replica FinalizedReplica, blk_1073744511_3687, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744511 for deletion
2025-07-11 13:02:04,856 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744511_3687 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744511
2025-07-11 13:02:58,337 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744512_3688 src: /192.168.158.1:41420 dest: /192.168.158.4:9866
2025-07-11 13:02:58,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41420, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-554607898_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744512_3688, duration(ns): 23751738
2025-07-11 13:02:58,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744512_3688, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-11 13:03:04,857 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744512_3688 replica FinalizedReplica, blk_1073744512_3688, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744512 for deletion
2025-07-11 13:03:04,858 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744512_3688 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744512
2025-07-11 13:03:58,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744513_3689 src: /192.168.158.1:48192 dest: /192.168.158.4:9866
2025-07-11 13:03:58,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48192, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1394818656_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744513_3689, duration(ns): 21934446
2025-07-11 13:03:58,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744513_3689, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-11 13:04:01,858 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744513_3689 replica FinalizedReplica, blk_1073744513_3689, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744513 for deletion
2025-07-11 13:04:01,859 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744513_3689 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744513
2025-07-11 13:08:58,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744518_3694 src: /192.168.158.1:55368 dest: /192.168.158.4:9866
2025-07-11 13:08:58,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55368, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2026012907_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744518_3694, duration(ns): 23107033
2025-07-11 13:08:58,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744518_3694, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-11 13:09:04,868 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744518_3694 replica FinalizedReplica, blk_1073744518_3694, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744518 for deletion
2025-07-11 13:09:04,869 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744518_3694 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744518
2025-07-11 13:09:58,343 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744519_3695 src: /192.168.158.1:43510 dest: /192.168.158.4:9866
2025-07-11 13:09:58,376 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43510, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2109953149_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744519_3695, duration(ns): 23580450
2025-07-11 13:09:58,376 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744519_3695, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-11 13:10:01,870 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744519_3695 replica FinalizedReplica, blk_1073744519_3695, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744519 for deletion
2025-07-11 13:10:01,871 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744519_3695 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744519
2025-07-11 13:14:08,389 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744523_3699 src: /192.168.158.7:49912 dest: /192.168.158.4:9866
2025-07-11 13:14:08,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49912, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1263317670_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744523_3699, duration(ns): 18426744
2025-07-11 13:14:08,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744523_3699, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-11 13:14:10,882 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744523_3699 replica FinalizedReplica, blk_1073744523_3699, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744523 for deletion
2025-07-11 13:14:10,884 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744523_3699 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744523
2025-07-11 13:16:08,367 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744525_3701 src: /192.168.158.5:33936 dest: /192.168.158.4:9866
2025-07-11 13:16:08,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33936, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1310940643_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744525_3701, duration(ns): 15789287
2025-07-11 13:16:08,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744525_3701, type=LAST_IN_PIPELINE terminating
2025-07-11 13:16:13,884 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744525_3701 replica FinalizedReplica, blk_1073744525_3701, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744525 for deletion
2025-07-11 13:16:13,885 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744525_3701 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744525
2025-07-11 13:19:08,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744528_3704 src: /192.168.158.6:55332 dest: /192.168.158.4:9866
2025-07-11 13:19:08,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55332, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_382270971_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744528_3704, duration(ns): 18769901
2025-07-11 13:19:08,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744528_3704, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-11 13:19:10,887 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744528_3704 replica FinalizedReplica, blk_1073744528_3704, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744528 for deletion
2025-07-11 13:19:10,888 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744528_3704 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744528
2025-07-11 13:21:13,367 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744530_3706 src: /192.168.158.6:43518 dest: /192.168.158.4:9866
2025-07-11 13:21:13,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43518, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2062331607_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744530_3706, duration(ns): 18466078
2025-07-11 13:21:13,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744530_3706, type=LAST_IN_PIPELINE terminating
2025-07-11 13:21:16,888 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744530_3706 replica FinalizedReplica, blk_1073744530_3706, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744530 for deletion
2025-07-11 13:21:16,889 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744530_3706 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744530
2025-07-11 13:23:18,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744532_3708 src: /192.168.158.9:47954 dest: /192.168.158.4:9866
2025-07-11 13:23:18,417 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47954, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1146636669_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744532_3708, duration(ns): 15838424
2025-07-11 13:23:18,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744532_3708, type=LAST_IN_PIPELINE terminating
2025-07-11 13:23:19,892 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744532_3708 replica FinalizedReplica, blk_1073744532_3708, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744532 for deletion
2025-07-11 13:23:19,893 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744532_3708 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744532
2025-07-11 13:24:18,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744533_3709 src: /192.168.158.1:55334 dest: /192.168.158.4:9866
2025-07-11 13:24:18,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55334, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1690127670_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744533_3709, duration(ns): 24048673
2025-07-11 13:24:18,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744533_3709, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-11 13:24:19,895 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744533_3709 replica FinalizedReplica, blk_1073744533_3709, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744533 for deletion
2025-07-11 13:24:19,896 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744533_3709 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744533
2025-07-11 13:25:18,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744534_3710 src: /192.168.158.7:37090 dest: /192.168.158.4:9866
2025-07-11 13:25:18,422 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37090, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_773097076_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744534_3710, duration(ns): 16805736
2025-07-11 13:25:18,422 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744534_3710, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-11 13:25:19,898 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744534_3710 replica FinalizedReplica, blk_1073744534_3710, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744534 for deletion
2025-07-11 13:25:19,899 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744534_3710 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744534
2025-07-11 13:26:18,404 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744535_3711 src: /192.168.158.7:47216 dest: /192.168.158.4:9866
2025-07-11 13:26:18,422 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47216, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_351577234_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744535_3711, duration(ns): 16001657
2025-07-11 13:26:18,422 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744535_3711, type=LAST_IN_PIPELINE terminating
2025-07-11 13:26:19,903 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744535_3711 replica FinalizedReplica, blk_1073744535_3711, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744535 for deletion
2025-07-11 13:26:19,904 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744535_3711 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744535
2025-07-11 13:27:18,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744536_3712 src: /192.168.158.8:42110 dest: /192.168.158.4:9866
2025-07-11 13:27:18,423 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42110, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1297248941_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744536_3712, duration(ns): 17265433
2025-07-11 13:27:18,423 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744536_3712, type=LAST_IN_PIPELINE terminating
2025-07-11 13:27:19,904 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744536_3712 replica FinalizedReplica, blk_1073744536_3712, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744536 for deletion
2025-07-11 13:27:19,906 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744536_3712 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744536
2025-07-11 13:32:33,430 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744541_3717 src: /192.168.158.1:39234 dest: /192.168.158.4:9866
2025-07-11 13:32:33,461 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39234, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_620751165_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744541_3717, duration(ns): 22018488
2025-07-11 13:32:33,461 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744541_3717, type=HAS_DOWNSTREAM_IN_PIPELINE,
downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-11 13:32:34,912 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744541_3717 replica FinalizedReplica, blk_1073744541_3717, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744541 for deletion 2025-07-11 13:32:34,913 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744541_3717 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744541 2025-07-11 13:35:33,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744544_3720 src: /192.168.158.1:52986 dest: /192.168.158.4:9866 2025-07-11 13:35:33,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52986, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1968541615_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744544_3720, duration(ns): 20026338 2025-07-11 13:35:33,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744544_3720, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-11 13:35:37,919 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744544_3720 replica FinalizedReplica, blk_1073744544_3720, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744544 for deletion 2025-07-11 13:35:37,920 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744544_3720 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744544 2025-07-11 13:37:38,407 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744546_3722 src: /192.168.158.8:56122 dest: /192.168.158.4:9866 2025-07-11 13:37:38,431 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56122, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1817827802_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744546_3722, duration(ns): 18561574 2025-07-11 13:37:38,432 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744546_3722, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-11 13:37:40,924 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744546_3722 replica FinalizedReplica, blk_1073744546_3722, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744546 for deletion 2025-07-11 13:37:40,925 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744546_3722 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744546 
2025-07-11 13:38:38,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744547_3723 src: /192.168.158.5:51562 dest: /192.168.158.4:9866 2025-07-11 13:38:38,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51562, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_479341438_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744547_3723, duration(ns): 15237719 2025-07-11 13:38:38,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744547_3723, type=LAST_IN_PIPELINE terminating 2025-07-11 13:38:40,925 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744547_3723 replica FinalizedReplica, blk_1073744547_3723, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744547 for deletion 2025-07-11 13:38:40,927 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744547_3723 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744547 2025-07-11 13:39:38,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744548_3724 src: /192.168.158.9:45628 dest: /192.168.158.4:9866 2025-07-11 13:39:38,431 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45628, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-770165076_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073744548_3724, duration(ns): 19926288 2025-07-11 13:39:38,431 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744548_3724, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-11 13:39:40,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744548_3724 replica FinalizedReplica, blk_1073744548_3724, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744548 for deletion 2025-07-11 13:39:40,929 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744548_3724 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744548 2025-07-11 13:45:43,434 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744554_3730 src: /192.168.158.1:52602 dest: /192.168.158.4:9866 2025-07-11 13:45:43,466 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52602, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-345552677_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744554_3730, duration(ns): 22817284 2025-07-11 13:45:43,466 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744554_3730, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-11 13:45:46,934 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744554_3730 replica FinalizedReplica, blk_1073744554_3730, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744554 for deletion 2025-07-11 13:45:46,935 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744554_3730 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744554 2025-07-11 13:48:43,428 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744557_3733 src: /192.168.158.9:36398 dest: /192.168.158.4:9866 2025-07-11 13:48:43,453 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36398, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1079639206_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744557_3733, duration(ns): 19021646 2025-07-11 13:48:43,453 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744557_3733, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-11 13:48:49,936 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744557_3733 replica FinalizedReplica, blk_1073744557_3733, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744557 for deletion 2025-07-11 13:48:49,938 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744557_3733 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744557 2025-07-11 13:49:43,432 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744558_3734 src: /192.168.158.1:56806 dest: /192.168.158.4:9866 2025-07-11 13:49:43,463 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56806, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1936218604_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744558_3734, duration(ns): 22516412 2025-07-11 13:49:43,463 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744558_3734, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-11 13:49:49,938 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744558_3734 replica FinalizedReplica, blk_1073744558_3734, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744558 for deletion 2025-07-11 13:49:49,939 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744558_3734 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744558 2025-07-11 13:50:43,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073744559_3735 src: /192.168.158.1:37708 dest: /192.168.158.4:9866 2025-07-11 13:50:43,468 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37708, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-829295601_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744559_3735, duration(ns): 20102401 2025-07-11 13:50:43,468 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744559_3735, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-11 13:50:46,939 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744559_3735 replica FinalizedReplica, blk_1073744559_3735, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744559 for deletion 2025-07-11 13:50:46,940 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744559_3735 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744559 2025-07-11 13:52:48,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744561_3737 src: /192.168.158.9:49594 dest: /192.168.158.4:9866 2025-07-11 13:52:48,480 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49594, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_668548796_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073744561_3737, duration(ns): 22474281 2025-07-11 13:52:48,480 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744561_3737, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-11 13:52:49,940 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744561_3737 replica FinalizedReplica, blk_1073744561_3737, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744561 for deletion 2025-07-11 13:52:49,941 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744561_3737 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744561 2025-07-11 13:54:58,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744563_3739 src: /192.168.158.9:44810 dest: /192.168.158.4:9866 2025-07-11 13:54:58,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44810, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1092813980_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744563_3739, duration(ns): 19697471 2025-07-11 13:54:58,468 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744563_3739, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-11 13:55:01,944 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling 
blk_1073744563_3739 replica FinalizedReplica, blk_1073744563_3739, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744563 for deletion 2025-07-11 13:55:01,945 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744563_3739 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744563 2025-07-11 13:57:03,441 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744565_3741 src: /192.168.158.7:53222 dest: /192.168.158.4:9866 2025-07-11 13:57:03,460 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53222, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2121334262_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744565_3741, duration(ns): 16928619 2025-07-11 13:57:03,461 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744565_3741, type=LAST_IN_PIPELINE terminating 2025-07-11 13:57:04,949 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744565_3741 replica FinalizedReplica, blk_1073744565_3741, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744565 for deletion 2025-07-11 13:57:04,950 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 
blk_1073744565_3741 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744565 2025-07-11 14:00:13,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744568_3744 src: /192.168.158.6:46922 dest: /192.168.158.4:9866 2025-07-11 14:00:13,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46922, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-882542324_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744568_3744, duration(ns): 16917417 2025-07-11 14:00:13,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744568_3744, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-11 14:00:16,953 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744568_3744 replica FinalizedReplica, blk_1073744568_3744, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744568 for deletion 2025-07-11 14:00:16,954 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744568_3744 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744568 2025-07-11 14:02:18,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744570_3746 src: /192.168.158.9:47126 dest: /192.168.158.4:9866 2025-07-11 14:02:18,492 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.9:47126, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-447039554_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744570_3746, duration(ns): 20875121 2025-07-11 14:02:18,492 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744570_3746, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-11 14:02:19,957 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744570_3746 replica FinalizedReplica, blk_1073744570_3746, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744570 for deletion 2025-07-11 14:02:19,958 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744570_3746 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744570 2025-07-11 14:04:23,475 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744572_3748 src: /192.168.158.8:35564 dest: /192.168.158.4:9866 2025-07-11 14:04:23,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35564, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1726642900_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744572_3748, duration(ns): 16803687 2025-07-11 14:04:23,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744572_3748, 
type=LAST_IN_PIPELINE terminating 2025-07-11 14:04:28,964 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744572_3748 replica FinalizedReplica, blk_1073744572_3748, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744572 for deletion 2025-07-11 14:04:28,965 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744572_3748 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744572 2025-07-11 14:06:23,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744574_3750 src: /192.168.158.1:40048 dest: /192.168.158.4:9866 2025-07-11 14:06:23,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40048, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1401002645_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744574_3750, duration(ns): 23637076 2025-07-11 14:06:23,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744574_3750, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-11 14:06:28,970 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744574_3750 replica FinalizedReplica, blk_1073744574_3750, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744574 for deletion 2025-07-11 14:06:28,971 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744574_3750 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744574 2025-07-11 14:08:28,496 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744576_3752 src: /192.168.158.5:33532 dest: /192.168.158.4:9866 2025-07-11 14:08:28,515 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33532, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2020023030_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744576_3752, duration(ns): 16113054 2025-07-11 14:08:28,515 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744576_3752, type=LAST_IN_PIPELINE terminating 2025-07-11 14:08:31,975 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744576_3752 replica FinalizedReplica, blk_1073744576_3752, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744576 for deletion 2025-07-11 14:08:31,976 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744576_3752 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744576 2025-07-11 14:09:28,541 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744577_3753 src: /192.168.158.8:32800 dest: /192.168.158.4:9866
2025-07-11 14:09:28,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:32800, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-593255621_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744577_3753, duration(ns): 19222492
2025-07-11 14:09:28,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744577_3753, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 14:09:34,979 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744577_3753 replica FinalizedReplica, blk_1073744577_3753, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744577 for deletion
2025-07-11 14:09:34,980 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744577_3753 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744577
2025-07-11 14:12:33,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744580_3756 src: /192.168.158.1:54130 dest: /192.168.158.4:9866
2025-07-11 14:12:33,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54130, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1265703988_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744580_3756, duration(ns): 22601414
2025-07-11 14:12:33,526 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744580_3756, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-11 14:12:34,986 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744580_3756 replica FinalizedReplica, blk_1073744580_3756, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744580 for deletion
2025-07-11 14:12:34,987 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744580_3756 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744580
2025-07-11 14:16:43,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744584_3760 src: /192.168.158.7:59966 dest: /192.168.158.4:9866
2025-07-11 14:16:43,536 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59966, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1433182513_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744584_3760, duration(ns): 15482975
2025-07-11 14:16:43,536 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744584_3760, type=LAST_IN_PIPELINE terminating
2025-07-11 14:16:46,995 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744584_3760 replica FinalizedReplica, blk_1073744584_3760, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744584 for deletion
2025-07-11 14:16:46,996 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744584_3760 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744584
2025-07-11 14:18:48,506 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744586_3762 src: /192.168.158.1:38022 dest: /192.168.158.4:9866
2025-07-11 14:18:48,539 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38022, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1215638257_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744586_3762, duration(ns): 23939528
2025-07-11 14:18:48,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744586_3762, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-11 14:18:49,997 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744586_3762 replica FinalizedReplica, blk_1073744586_3762, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744586 for deletion
2025-07-11 14:18:49,998 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744586_3762 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744586
2025-07-11 14:20:48,509 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744588_3764 src: /192.168.158.1:52012 dest: /192.168.158.4:9866
2025-07-11 14:20:48,539 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52012, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-276035737_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744588_3764, duration(ns): 21063227
2025-07-11 14:20:48,539 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744588_3764, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-11 14:20:50,002 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744588_3764 replica FinalizedReplica, blk_1073744588_3764, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744588 for deletion
2025-07-11 14:20:50,004 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744588_3764 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744588
2025-07-11 14:23:48,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744591_3767 src: /192.168.158.8:49284 dest: /192.168.158.4:9866
2025-07-11 14:23:48,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49284, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-141096569_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744591_3767, duration(ns): 19246266
2025-07-11 14:23:48,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744591_3767, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-11 14:23:50,009 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744591_3767 replica FinalizedReplica, blk_1073744591_3767, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744591 for deletion
2025-07-11 14:23:50,010 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744591_3767 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744591
2025-07-11 14:24:48,540 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744592_3768 src: /192.168.158.5:45576 dest: /192.168.158.4:9866
2025-07-11 14:24:48,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45576, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2002215286_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744592_3768, duration(ns): 21044413
2025-07-11 14:24:48,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744592_3768, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-11 14:24:50,012 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744592_3768 replica FinalizedReplica, blk_1073744592_3768, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744592 for deletion
2025-07-11 14:24:50,013 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744592_3768 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744592
2025-07-11 14:26:53,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744594_3770 src: /192.168.158.1:57370 dest: /192.168.158.4:9866
2025-07-11 14:26:53,553 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57370, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2133925341_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744594_3770, duration(ns): 23794593
2025-07-11 14:26:53,553 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744594_3770, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-11 14:26:56,018 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744594_3770 replica FinalizedReplica, blk_1073744594_3770, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744594 for deletion
2025-07-11 14:26:56,019 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744594_3770 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744594
2025-07-11 14:33:58,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744601_3777 src: /192.168.158.1:51870 dest: /192.168.158.4:9866
2025-07-11 14:33:58,574 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51870, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1264923360_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744601_3777, duration(ns): 24278223
2025-07-11 14:33:58,574 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744601_3777, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-11 14:34:02,032 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744601_3777 replica FinalizedReplica, blk_1073744601_3777, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744601 for deletion
2025-07-11 14:34:02,033 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744601_3777 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744601
2025-07-11 14:34:58,538 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744602_3778 src: /192.168.158.1:48966 dest: /192.168.158.4:9866
2025-07-11 14:34:58,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48966, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2076126831_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744602_3778, duration(ns): 21945894
2025-07-11 14:34:58,569 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744602_3778, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-11 14:35:02,032 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744602_3778 replica FinalizedReplica, blk_1073744602_3778, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744602 for deletion
2025-07-11 14:35:02,033 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744602_3778 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744602
2025-07-11 14:40:58,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744608_3784 src: /192.168.158.8:57764 dest: /192.168.158.4:9866
2025-07-11 14:40:58,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57764, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_199352857_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744608_3784, duration(ns): 19735463
2025-07-11 14:40:58,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744608_3784, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-11 14:41:05,052 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744608_3784 replica FinalizedReplica, blk_1073744608_3784, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744608 for deletion
2025-07-11 14:41:05,053 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744608_3784 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744608
2025-07-11 14:41:58,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744609_3785 src: /192.168.158.7:59328 dest: /192.168.158.4:9866
2025-07-11 14:41:58,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59328, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1612112794_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744609_3785, duration(ns): 16719516
2025-07-11 14:41:58,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744609_3785, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-11 14:41:59,054 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744609_3785 replica FinalizedReplica, blk_1073744609_3785, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744609 for deletion
2025-07-11 14:41:59,055 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744609_3785 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744609
2025-07-11 14:45:03,560 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744612_3788 src: /192.168.158.7:45294 dest: /192.168.158.4:9866
2025-07-11 14:45:03,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45294, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2048447372_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744612_3788, duration(ns): 22369940
2025-07-11 14:45:03,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744612_3788, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-11 14:45:05,062 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744612_3788 replica FinalizedReplica, blk_1073744612_3788, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744612 for deletion
2025-07-11 14:45:05,064 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744612_3788 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744612
2025-07-11 14:47:03,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744614_3790 src: /192.168.158.1:46424 dest: /192.168.158.4:9866
2025-07-11 14:47:03,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46424, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-453050554_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744614_3790, duration(ns): 21836795
2025-07-11 14:47:03,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744614_3790, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-11 14:47:08,066 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744614_3790 replica FinalizedReplica, blk_1073744614_3790, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744614 for deletion
2025-07-11 14:47:08,068 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744614_3790 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744614
2025-07-11 14:48:08,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744615_3791 src: /192.168.158.1:56104 dest: /192.168.158.4:9866
2025-07-11 14:48:08,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56104, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1425446022_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744615_3791, duration(ns): 21747833
2025-07-11 14:48:08,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744615_3791, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-11 14:48:11,071 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744615_3791 replica FinalizedReplica, blk_1073744615_3791, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744615 for deletion
2025-07-11 14:48:11,072 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744615_3791 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744615
2025-07-11 14:49:13,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744616_3792 src: /192.168.158.9:36682 dest: /192.168.158.4:9866
2025-07-11 14:49:13,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36682, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-430346588_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744616_3792, duration(ns): 17880731
2025-07-11 14:49:13,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744616_3792, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-11 14:49:20,075 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744616_3792 replica FinalizedReplica, blk_1073744616_3792, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744616 for deletion
2025-07-11 14:49:20,077 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744616_3792 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744616
2025-07-11 14:52:13,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744619_3795 src: /192.168.158.1:49280 dest: /192.168.158.4:9866
2025-07-11 14:52:13,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49280, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1554047184_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744619_3795, duration(ns): 21322426
2025-07-11 14:52:13,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744619_3795, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-11 14:52:14,084 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744619_3795 replica FinalizedReplica, blk_1073744619_3795, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744619 for deletion
2025-07-11 14:52:14,085 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744619_3795 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744619
2025-07-11 14:53:13,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744620_3796 src: /192.168.158.6:60782 dest: /192.168.158.4:9866
2025-07-11 14:53:13,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60782, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1610493802_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744620_3796, duration(ns): 17443706
2025-07-11 14:53:13,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744620_3796, type=LAST_IN_PIPELINE terminating
2025-07-11 14:53:20,085 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744620_3796 replica FinalizedReplica, blk_1073744620_3796, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744620 for deletion
2025-07-11 14:53:20,086 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744620_3796 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744620
2025-07-11 14:54:13,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744621_3797 src: /192.168.158.1:47936 dest: /192.168.158.4:9866
2025-07-11 14:54:13,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47936, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1737101393_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744621_3797, duration(ns): 20829981
2025-07-11 14:54:13,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744621_3797, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-11 14:54:14,089 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744621_3797 replica FinalizedReplica, blk_1073744621_3797, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744621 for deletion
2025-07-11 14:54:14,090 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744621_3797 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744621
2025-07-11 14:56:13,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744623_3799 src: /192.168.158.5:46256 dest: /192.168.158.4:9866
2025-07-11 14:56:13,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46256, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1288737022_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744623_3799, duration(ns): 21481901
2025-07-11 14:56:13,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744623_3799, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-11 14:56:14,096 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744623_3799 replica FinalizedReplica, blk_1073744623_3799, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744623 for deletion
2025-07-11 14:56:14,097 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744623_3799 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744623
2025-07-11 14:57:13,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744624_3800 src: /192.168.158.5:40252 dest: /192.168.158.4:9866
2025-07-11 14:57:13,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40252, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1787677139_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744624_3800, duration(ns): 15974405
2025-07-11 14:57:13,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744624_3800, type=LAST_IN_PIPELINE terminating
2025-07-11 14:57:17,099 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744624_3800 replica FinalizedReplica, blk_1073744624_3800, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744624 for deletion
2025-07-11 14:57:17,100 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744624_3800 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744624
2025-07-11 14:58:13,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744625_3801 src: /192.168.158.8:40716 dest: /192.168.158.4:9866
2025-07-11 14:58:13,623 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40716, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-25721834_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744625_3801, duration(ns): 21449606
2025-07-11 14:58:13,623 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744625_3801, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 14:58:17,100 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744625_3801 replica FinalizedReplica, blk_1073744625_3801, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744625 for deletion
2025-07-11 14:58:17,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744625_3801 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744625
2025-07-11 14:59:13,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744626_3802 src: /192.168.158.1:59410 dest: /192.168.158.4:9866
2025-07-11 14:59:13,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59410, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1993881494_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744626_3802, duration(ns): 20717720
2025-07-11 14:59:13,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744626_3802, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-11 14:59:17,100 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744626_3802 replica FinalizedReplica, blk_1073744626_3802, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744626 for deletion
2025-07-11 14:59:17,101 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744626_3802 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744626
2025-07-11 15:01:13,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744628_3804 src: /192.168.158.1:39460 dest: /192.168.158.4:9866
2025-07-11 15:01:13,617 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39460, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1041684931_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744628_3804, duration(ns): 22893271
2025-07-11 15:01:13,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744628_3804, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-11 15:01:17,107 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744628_3804 replica FinalizedReplica, blk_1073744628_3804, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744628 for deletion
2025-07-11 15:01:17,109 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744628_3804 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744628
2025-07-11 15:06:18,611 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744633_3809 src: /192.168.158.6:53060 dest: /192.168.158.4:9866
2025-07-11 15:06:18,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53060, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1268573888_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744633_3809, duration(ns): 16324710
2025-07-11 15:06:18,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744633_3809, type=LAST_IN_PIPELINE terminating
2025-07-11 15:06:23,116 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744633_3809 replica FinalizedReplica, blk_1073744633_3809, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744633 for deletion
2025-07-11 15:06:23,117 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744633_3809 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744633
2025-07-11 15:07:18,604 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744634_3810 src: /192.168.158.5:58938 dest: /192.168.158.4:9866
2025-07-11 15:07:18,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58938, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1779869496_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744634_3810, duration(ns): 20176654
2025-07-11 15:07:18,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744634_3810, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-11 15:07:20,119 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744634_3810 replica FinalizedReplica, blk_1073744634_3810, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744634 for deletion
2025-07-11 15:07:20,120 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744634_3810 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744634
2025-07-11 15:09:18,611 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744636_3812 src: /192.168.158.8:36468 dest: /192.168.158.4:9866
2025-07-11 15:09:18,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36468, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_483585308_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744636_3812, duration(ns): 17969256
2025-07-11 15:09:18,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744636_3812, type=LAST_IN_PIPELINE terminating
2025-07-11 15:09:20,128 INFO
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744636_3812 replica FinalizedReplica, blk_1073744636_3812, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744636 for deletion 2025-07-11 15:09:20,129 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744636_3812 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744636 2025-07-11 15:10:18,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744637_3813 src: /192.168.158.7:55008 dest: /192.168.158.4:9866 2025-07-11 15:10:18,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55008, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1569914595_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744637_3813, duration(ns): 22351903 2025-07-11 15:10:18,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744637_3813, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-11 15:10:20,131 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744637_3813 replica FinalizedReplica, blk_1073744637_3813, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744637 for deletion 2025-07-11 15:10:20,132 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744637_3813 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744637 2025-07-11 15:11:18,617 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744638_3814 src: /192.168.158.1:37984 dest: /192.168.158.4:9866 2025-07-11 15:11:18,647 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37984, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-129140069_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744638_3814, duration(ns): 21199959 2025-07-11 15:11:18,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744638_3814, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-11 15:11:20,131 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744638_3814 replica FinalizedReplica, blk_1073744638_3814, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744638 for deletion 2025-07-11 15:11:20,132 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744638_3814 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744638 2025-07-11 15:12:18,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073744639_3815 src: /192.168.158.5:37162 dest: /192.168.158.4:9866 2025-07-11 15:12:18,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37162, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_921337121_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744639_3815, duration(ns): 21057007 2025-07-11 15:12:18,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744639_3815, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-11 15:12:20,135 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744639_3815 replica FinalizedReplica, blk_1073744639_3815, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744639 for deletion 2025-07-11 15:12:20,136 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744639_3815 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073744639 2025-07-11 15:14:23,617 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744641_3817 src: /192.168.158.1:38078 dest: /192.168.158.4:9866 2025-07-11 15:14:23,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38078, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_99544341_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073744641_3817, duration(ns): 25272294 2025-07-11 15:14:23,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744641_3817, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-11 15:14:29,138 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744641_3817 replica FinalizedReplica, blk_1073744641_3817, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744641 for deletion 2025-07-11 15:14:29,139 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744641_3817 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744641 2025-07-11 15:17:23,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744644_3820 src: /192.168.158.5:50468 dest: /192.168.158.4:9866 2025-07-11 15:17:23,661 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50468, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1135793406_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744644_3820, duration(ns): 19773350 2025-07-11 15:17:23,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744644_3820, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-11 15:17:26,146 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744644_3820 replica FinalizedReplica, blk_1073744644_3820, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744644 for deletion 2025-07-11 15:17:26,147 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744644_3820 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744644 2025-07-11 15:21:28,658 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744648_3824 src: /192.168.158.7:57904 dest: /192.168.158.4:9866 2025-07-11 15:21:28,678 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57904, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_524469140_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744648_3824, duration(ns): 16729039 2025-07-11 15:21:28,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744648_3824, type=LAST_IN_PIPELINE terminating 2025-07-11 15:21:29,151 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744648_3824 replica FinalizedReplica, blk_1073744648_3824, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744648 for deletion 2025-07-11 15:21:29,152 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744648_3824 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744648 2025-07-11 15:23:28,678 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744650_3826 src: /192.168.158.6:33424 dest: /192.168.158.4:9866 2025-07-11 15:23:28,698 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33424, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_46822685_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744650_3826, duration(ns): 17949631 2025-07-11 15:23:28,698 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744650_3826, type=LAST_IN_PIPELINE terminating 2025-07-11 15:23:32,155 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744650_3826 replica FinalizedReplica, blk_1073744650_3826, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744650 for deletion 2025-07-11 15:23:32,156 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744650_3826 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744650 2025-07-11 15:25:28,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744652_3828 src: /192.168.158.1:41100 dest: /192.168.158.4:9866 2025-07-11 
15:25:28,697 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41100, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1322219189_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744652_3828, duration(ns): 24330502 2025-07-11 15:25:28,697 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744652_3828, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-11 15:25:32,158 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744652_3828 replica FinalizedReplica, blk_1073744652_3828, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744652 for deletion 2025-07-11 15:25:32,159 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744652_3828 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744652 2025-07-11 15:27:33,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744654_3830 src: /192.168.158.1:60204 dest: /192.168.158.4:9866 2025-07-11 15:27:33,690 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60204, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-811044965_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744654_3830, duration(ns): 22908078 2025-07-11 15:27:33,691 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744654_3830, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-11 15:27:35,159 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744654_3830 replica FinalizedReplica, blk_1073744654_3830, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744654 for deletion 2025-07-11 15:27:35,160 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744654_3830 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744654 2025-07-11 15:31:43,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744658_3834 src: /192.168.158.8:59578 dest: /192.168.158.4:9866 2025-07-11 15:31:43,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59578, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1622731424_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744658_3834, duration(ns): 20235243 2025-07-11 15:31:43,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744658_3834, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-11 15:31:44,166 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744658_3834 replica FinalizedReplica, blk_1073744658_3834, FINALIZED getNumBytes() = 
56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744658 for deletion 2025-07-11 15:31:44,168 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744658_3834 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744658 2025-07-11 15:33:43,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744660_3836 src: /192.168.158.1:52882 dest: /192.168.158.4:9866 2025-07-11 15:33:43,704 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52882, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_650000387_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744660_3836, duration(ns): 24113682 2025-07-11 15:33:43,704 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744660_3836, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-11 15:33:44,172 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744660_3836 replica FinalizedReplica, blk_1073744660_3836, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744660 for deletion 2025-07-11 15:33:44,174 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744660_3836 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744660 2025-07-11 15:34:48,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744661_3837 src: /192.168.158.1:42802 dest: /192.168.158.4:9866 2025-07-11 15:34:48,698 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42802, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1175437037_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744661_3837, duration(ns): 21602866 2025-07-11 15:34:48,698 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744661_3837, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-11 15:34:50,175 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744661_3837 replica FinalizedReplica, blk_1073744661_3837, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744661 for deletion 2025-07-11 15:34:50,176 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744661_3837 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744661 2025-07-11 15:35:48,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744662_3838 src: /192.168.158.1:52318 dest: /192.168.158.4:9866 2025-07-11 15:35:48,713 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:52318, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-106295467_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744662_3838, duration(ns): 22223961 2025-07-11 15:35:48,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744662_3838, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-11 15:35:53,176 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744662_3838 replica FinalizedReplica, blk_1073744662_3838, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744662 for deletion 2025-07-11 15:35:53,177 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744662_3838 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744662 2025-07-11 15:36:53,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744663_3839 src: /192.168.158.6:56672 dest: /192.168.158.4:9866 2025-07-11 15:36:53,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1106083669_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744663_3839, duration(ns): 17990090 2025-07-11 15:36:53,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073744663_3839, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-11 15:36:56,177 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744663_3839 replica FinalizedReplica, blk_1073744663_3839, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744663 for deletion 2025-07-11 15:36:56,178 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744663_3839 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744663 2025-07-11 15:38:58,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744665_3841 src: /192.168.158.9:40934 dest: /192.168.158.4:9866 2025-07-11 15:38:58,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40934, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_978585094_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744665_3841, duration(ns): 15845104 2025-07-11 15:38:58,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744665_3841, type=LAST_IN_PIPELINE terminating 2025-07-11 15:38:59,182 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744665_3841 replica FinalizedReplica, blk_1073744665_3841, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744665 for deletion 2025-07-11 15:38:59,183 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744665_3841 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744665 2025-07-11 15:42:03,712 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744668_3844 src: /192.168.158.7:52438 dest: /192.168.158.4:9866 2025-07-11 15:42:03,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52438, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_533860303_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744668_3844, duration(ns): 14291090 2025-07-11 15:42:03,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744668_3844, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-11 15:42:05,188 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744668_3844 replica FinalizedReplica, blk_1073744668_3844, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744668 for deletion 2025-07-11 15:42:05,189 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744668_3844 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744668 
2025-07-11 15:43:08,688 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744669_3845 src: /192.168.158.1:51878 dest: /192.168.158.4:9866
2025-07-11 15:43:08,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51878, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-256198591_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744669_3845, duration(ns): 25453988
2025-07-11 15:43:08,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744669_3845, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-11 15:43:14,188 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744669_3845 replica FinalizedReplica, blk_1073744669_3845, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744669 for deletion
2025-07-11 15:43:14,189 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744669_3845 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744669
2025-07-11 15:46:18,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744672_3848 src: /192.168.158.5:54110 dest: /192.168.158.4:9866
2025-07-11 15:46:18,705 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54110, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-492502036_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744672_3848, duration(ns): 16338732
2025-07-11 15:46:18,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744672_3848, type=LAST_IN_PIPELINE terminating
2025-07-11 15:46:20,191 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744672_3848 replica FinalizedReplica, blk_1073744672_3848, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744672 for deletion
2025-07-11 15:46:20,192 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744672_3848 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744672
2025-07-11 15:47:23,673 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744673_3849 src: /192.168.158.1:32902 dest: /192.168.158.4:9866
2025-07-11 15:47:23,703 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:32902, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2137862076_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744673_3849, duration(ns): 21511397
2025-07-11 15:47:23,704 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744673_3849, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-11 15:47:26,193 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744673_3849 replica FinalizedReplica, blk_1073744673_3849, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744673 for deletion
2025-07-11 15:47:26,194 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744673_3849 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744673
2025-07-11 15:48:23,688 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744674_3850 src: /192.168.158.5:41618 dest: /192.168.158.4:9866
2025-07-11 15:48:23,712 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41618, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-990349820_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744674_3850, duration(ns): 18726647
2025-07-11 15:48:23,713 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744674_3850, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-11 15:48:26,196 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744674_3850 replica FinalizedReplica, blk_1073744674_3850, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744674 for deletion
2025-07-11 15:48:26,197 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744674_3850 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744674
2025-07-11 15:50:28,673 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744676_3852 src: /192.168.158.6:39782 dest: /192.168.158.4:9866
2025-07-11 15:50:28,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39782, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-990556118_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744676_3852, duration(ns): 23074701
2025-07-11 15:50:28,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744676_3852, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 15:50:32,201 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744676_3852 replica FinalizedReplica, blk_1073744676_3852, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744676 for deletion
2025-07-11 15:50:32,202 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744676_3852 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744676
2025-07-11 15:54:28,697 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744680_3856 src: /192.168.158.6:55886 dest: /192.168.158.4:9866
2025-07-11 15:54:28,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55886, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-568771025_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744680_3856, duration(ns): 20596060
2025-07-11 15:54:28,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744680_3856, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-11 15:54:29,210 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744680_3856 replica FinalizedReplica, blk_1073744680_3856, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744680 for deletion
2025-07-11 15:54:29,211 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744680_3856 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744680
2025-07-11 15:55:28,673 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744681_3857 src: /192.168.158.1:48130 dest: /192.168.158.4:9866
2025-07-11 15:55:28,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48130, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1803024943_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744681_3857, duration(ns): 23301702
2025-07-11 15:55:28,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744681_3857, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-11 15:55:29,212 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744681_3857 replica FinalizedReplica, blk_1073744681_3857, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744681 for deletion
2025-07-11 15:55:29,213 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744681_3857 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744681
2025-07-11 15:59:28,704 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744685_3861 src: /192.168.158.1:47524 dest: /192.168.158.4:9866
2025-07-11 15:59:28,734 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47524, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1728219553_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744685_3861, duration(ns): 21997484
2025-07-11 15:59:28,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744685_3861, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-11 15:59:32,217 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744685_3861 replica FinalizedReplica, blk_1073744685_3861, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744685 for deletion
2025-07-11 15:59:32,218 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744685_3861 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744685
2025-07-11 16:02:38,704 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744688_3864 src: /192.168.158.5:59986 dest: /192.168.158.4:9866
2025-07-11 16:02:38,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59986, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1474720032_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744688_3864, duration(ns): 15939233
2025-07-11 16:02:38,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744688_3864, type=LAST_IN_PIPELINE terminating
2025-07-11 16:02:41,224 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744688_3864 replica FinalizedReplica, blk_1073744688_3864, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744688 for deletion
2025-07-11 16:02:41,226 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744688_3864 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744688
2025-07-11 16:03:38,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744689_3865 src: /192.168.158.1:33262 dest: /192.168.158.4:9866
2025-07-11 16:03:38,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33262, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-814018599_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744689_3865, duration(ns): 24787464
2025-07-11 16:03:38,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744689_3865, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-11 16:03:44,226 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744689_3865 replica FinalizedReplica, blk_1073744689_3865, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744689 for deletion
2025-07-11 16:03:44,227 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744689_3865 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744689
2025-07-11 16:05:48,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744691_3867 src: /192.168.158.5:44178 dest: /192.168.158.4:9866
2025-07-11 16:05:48,720 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44178, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_561455778_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744691_3867, duration(ns): 16660060
2025-07-11 16:05:48,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744691_3867, type=LAST_IN_PIPELINE terminating
2025-07-11 16:05:53,230 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744691_3867 replica FinalizedReplica, blk_1073744691_3867, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744691 for deletion
2025-07-11 16:05:53,231 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744691_3867 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744691
2025-07-11 16:07:48,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744693_3869 src: /192.168.158.1:45098 dest: /192.168.158.4:9866
2025-07-11 16:07:48,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45098, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_244535238_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744693_3869, duration(ns): 23004711
2025-07-11 16:07:48,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744693_3869, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-11 16:07:50,236 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744693_3869 replica FinalizedReplica, blk_1073744693_3869, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744693 for deletion
2025-07-11 16:07:50,237 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744693_3869 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744693
2025-07-11 16:09:48,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744695_3871 src: /192.168.158.5:37732 dest: /192.168.158.4:9866
2025-07-11 16:09:48,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37732, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-610703155_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744695_3871, duration(ns): 18242996
2025-07-11 16:09:48,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744695_3871, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-11 16:09:50,242 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744695_3871 replica FinalizedReplica, blk_1073744695_3871, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744695 for deletion
2025-07-11 16:09:50,244 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744695_3871 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744695
2025-07-11 16:10:48,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744696_3872 src: /192.168.158.1:47172 dest: /192.168.158.4:9866
2025-07-11 16:10:48,755 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47172, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1267892677_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744696_3872, duration(ns): 20778678
2025-07-11 16:10:48,755 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744696_3872, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-11 16:10:50,247 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744696_3872 replica FinalizedReplica, blk_1073744696_3872, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744696 for deletion
2025-07-11 16:10:50,248 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744696_3872 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744696
2025-07-11 16:11:48,720 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744697_3873 src: /192.168.158.1:53268 dest: /192.168.158.4:9866
2025-07-11 16:11:48,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53268, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_914698768_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744697_3873, duration(ns): 22382044
2025-07-11 16:11:48,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744697_3873, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-11 16:11:50,249 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744697_3873 replica FinalizedReplica, blk_1073744697_3873, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744697 for deletion
2025-07-11 16:11:50,251 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744697_3873 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744697
2025-07-11 16:12:48,762 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744698_3874 src: /192.168.158.6:57130 dest: /192.168.158.4:9866
2025-07-11 16:12:48,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57130, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1050340088_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744698_3874, duration(ns): 21040707
2025-07-11 16:12:48,789 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744698_3874, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-11 16:12:53,249 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744698_3874 replica FinalizedReplica, blk_1073744698_3874, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744698 for deletion
2025-07-11 16:12:53,251 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744698_3874 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744698
2025-07-11 16:13:48,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744699_3875 src: /192.168.158.9:48308 dest: /192.168.158.4:9866
2025-07-11 16:13:48,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48308, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1482570930_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744699_3875, duration(ns): 17530026
2025-07-11 16:13:48,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744699_3875, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-11 16:13:50,251 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744699_3875 replica FinalizedReplica, blk_1073744699_3875, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744699 for deletion
2025-07-11 16:13:50,252 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744699_3875 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744699
2025-07-11 16:15:48,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744701_3877 src: /192.168.158.9:38092 dest: /192.168.158.4:9866
2025-07-11 16:15:48,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38092, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1380704103_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744701_3877, duration(ns): 15303922
2025-07-11 16:15:48,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744701_3877, type=LAST_IN_PIPELINE terminating
2025-07-11 16:15:50,257 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744701_3877 replica FinalizedReplica, blk_1073744701_3877, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744701 for deletion
2025-07-11 16:15:50,258 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744701_3877 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744701
2025-07-11 16:17:53,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744703_3879 src: /192.168.158.9:43384 dest: /192.168.158.4:9866
2025-07-11 16:17:53,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43384, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1691709182_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744703_3879, duration(ns): 18875933
2025-07-11 16:17:53,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744703_3879, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-11 16:17:59,263 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744703_3879 replica FinalizedReplica, blk_1073744703_3879, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744703 for deletion
2025-07-11 16:17:59,264 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744703_3879 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744703
2025-07-11 16:21:59,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744707_3883 src: /192.168.158.9:47492 dest: /192.168.158.4:9866
2025-07-11 16:21:59,586 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47492, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_598004136_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744707_3883, duration(ns): 19087056
2025-07-11 16:21:59,587 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744707_3883, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-11 16:22:05,268 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744707_3883 replica FinalizedReplica, blk_1073744707_3883, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744707 for deletion
2025-07-11 16:22:05,270 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744707_3883 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744707
2025-07-11 16:24:58,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744710_3886 src: /192.168.158.5:52398 dest: /192.168.158.4:9866
2025-07-11 16:24:58,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52398, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1845336719_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744710_3886, duration(ns): 24047456
2025-07-11 16:24:58,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744710_3886, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 16:24:59,273 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744710_3886 replica FinalizedReplica, blk_1073744710_3886, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744710 for deletion
2025-07-11 16:24:59,274 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744710_3886 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744710
2025-07-11 16:25:58,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744711_3887 src: /192.168.158.6:36974 dest: /192.168.158.4:9866
2025-07-11 16:25:58,759 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36974, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_836569353_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744711_3887, duration(ns): 13978180
2025-07-11 16:25:58,760 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744711_3887, type=LAST_IN_PIPELINE terminating
2025-07-11 16:25:59,273 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744711_3887 replica FinalizedReplica, blk_1073744711_3887, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744711 for deletion
2025-07-11 16:25:59,274 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744711_3887 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744711
2025-07-11 16:26:58,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744712_3888 src: /192.168.158.6:34200 dest: /192.168.158.4:9866
2025-07-11 16:26:58,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34200, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-322829025_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744712_3888, duration(ns): 17314443
2025-07-11 16:26:58,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744712_3888, type=LAST_IN_PIPELINE terminating
2025-07-11 16:26:59,274 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744712_3888 replica FinalizedReplica, blk_1073744712_3888, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744712 for deletion
2025-07-11 16:26:59,275 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744712_3888 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744712
2025-07-11 16:28:03,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744713_3889 src: /192.168.158.5:57766 dest: /192.168.158.4:9866
2025-07-11 16:28:03,762 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57766, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1486968444_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744713_3889, duration(ns): 21616876
2025-07-11 16:28:03,762 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744713_3889, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 16:28:08,275 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744713_3889 replica FinalizedReplica, blk_1073744713_3889, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744713 for deletion
2025-07-11 16:28:08,277 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744713_3889 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744713
2025-07-11 16:29:03,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744714_3890 src: /192.168.158.1:51508 dest: /192.168.158.4:9866
2025-07-11 16:29:03,772 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51508, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_357159128_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744714_3890, duration(ns): 22870846
2025-07-11 16:29:03,772 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744714_3890, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-11 16:29:05,276 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744714_3890 replica FinalizedReplica, blk_1073744714_3890, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744714 for deletion
2025-07-11 16:29:05,277 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744714_3890 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744714
2025-07-11 16:34:03,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744719_3895 src: /192.168.158.5:55664 dest: /192.168.158.4:9866
2025-07-11 16:34:03,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55664, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1198138887_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744719_3895, duration(ns): 18352732
2025-07-11 16:34:03,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744719_3895, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 16:34:08,289 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744719_3895 replica FinalizedReplica, blk_1073744719_3895, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744719 for deletion
2025-07-11 16:34:08,290 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744719_3895 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744719
2025-07-11 16:37:03,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744722_3898 src: /192.168.158.8:35206 dest: /192.168.158.4:9866
2025-07-11 16:37:03,782 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35206, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2117285251_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744722_3898, duration(ns): 16911380
2025-07-11 16:37:03,782 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744722_3898, type=LAST_IN_PIPELINE terminating
2025-07-11 16:37:05,297 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744722_3898 replica FinalizedReplica, blk_1073744722_3898, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744722 for deletion 2025-07-11 16:37:05,299 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744722_3898 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744722 2025-07-11 16:38:03,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744723_3899 src: /192.168.158.6:34362 dest: /192.168.158.4:9866 2025-07-11 16:38:03,790 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34362, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1998996467_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744723_3899, duration(ns): 19905002 2025-07-11 16:38:03,790 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744723_3899, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-11 16:38:05,300 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744723_3899 replica FinalizedReplica, blk_1073744723_3899, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744723 for deletion 2025-07-11 16:38:05,301 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744723_3899 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744723 
2025-07-11 16:41:03,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744726_3902 src: /192.168.158.8:46196 dest: /192.168.158.4:9866 2025-07-11 16:41:03,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46196, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-956796314_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744726_3902, duration(ns): 20077627 2025-07-11 16:41:03,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744726_3902, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-11 16:41:05,306 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744726_3902 replica FinalizedReplica, blk_1073744726_3902, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744726 for deletion 2025-07-11 16:41:05,307 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744726_3902 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744726 2025-07-11 16:42:08,773 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744727_3903 src: /192.168.158.5:47320 dest: /192.168.158.4:9866 2025-07-11 16:42:08,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47320, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_306727849_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744727_3903, duration(ns): 15832773 2025-07-11 16:42:08,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744727_3903, type=LAST_IN_PIPELINE terminating 2025-07-11 16:42:11,307 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744727_3903 replica FinalizedReplica, blk_1073744727_3903, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744727 for deletion 2025-07-11 16:42:11,308 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744727_3903 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744727 2025-07-11 16:45:18,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744730_3906 src: /192.168.158.1:53130 dest: /192.168.158.4:9866 2025-07-11 16:45:18,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53130, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1173847713_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744730_3906, duration(ns): 26086732 2025-07-11 16:45:18,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744730_3906, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-11 16:45:20,321 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744730_3906 replica FinalizedReplica, blk_1073744730_3906, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744730 for deletion 2025-07-11 16:45:20,322 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744730_3906 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744730 2025-07-11 16:46:18,768 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744731_3907 src: /192.168.158.1:43130 dest: /192.168.158.4:9866 2025-07-11 16:46:18,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43130, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1461994578_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744731_3907, duration(ns): 21606436 2025-07-11 16:46:18,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744731_3907, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-11 16:46:20,325 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744731_3907 replica FinalizedReplica, blk_1073744731_3907, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744731 for deletion 2025-07-11 
16:46:20,327 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744731_3907 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744731 2025-07-11 16:49:18,793 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744734_3910 src: /192.168.158.6:58188 dest: /192.168.158.4:9866 2025-07-11 16:49:18,817 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58188, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_770129502_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744734_3910, duration(ns): 18717772 2025-07-11 16:49:18,817 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744734_3910, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-11 16:49:20,334 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744734_3910 replica FinalizedReplica, blk_1073744734_3910, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744734 for deletion 2025-07-11 16:49:20,335 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744734_3910 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744734 2025-07-11 16:50:18,790 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744735_3911 
src: /192.168.158.9:46772 dest: /192.168.158.4:9866 2025-07-11 16:50:18,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46772, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2037127820_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744735_3911, duration(ns): 16538767 2025-07-11 16:50:18,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744735_3911, type=LAST_IN_PIPELINE terminating 2025-07-11 16:50:20,336 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744735_3911 replica FinalizedReplica, blk_1073744735_3911, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744735 for deletion 2025-07-11 16:50:20,337 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744735_3911 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744735 2025-07-11 16:53:23,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744738_3914 src: /192.168.158.8:56796 dest: /192.168.158.4:9866 2025-07-11 16:53:23,835 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56796, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_971228523_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744738_3914, duration(ns): 18653557 2025-07-11 16:53:23,835 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744738_3914, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-11 16:53:29,346 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744738_3914 replica FinalizedReplica, blk_1073744738_3914, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744738 for deletion 2025-07-11 16:53:29,347 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744738_3914 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744738 2025-07-11 16:54:23,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744739_3915 src: /192.168.158.6:52336 dest: /192.168.158.4:9866 2025-07-11 16:54:23,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1866205599_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744739_3915, duration(ns): 17446624 2025-07-11 16:54:23,826 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744739_3915, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-11 16:54:29,345 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744739_3915 replica FinalizedReplica, blk_1073744739_3915, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 
56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744739 for deletion 2025-07-11 16:54:29,347 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744739_3915 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744739 2025-07-11 16:55:23,781 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744740_3916 src: /192.168.158.5:45778 dest: /192.168.158.4:9866 2025-07-11 16:55:23,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45778, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_539924996_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744740_3916, duration(ns): 15404674 2025-07-11 16:55:23,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744740_3916, type=LAST_IN_PIPELINE terminating 2025-07-11 16:55:29,347 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744740_3916 replica FinalizedReplica, blk_1073744740_3916, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744740 for deletion 2025-07-11 16:55:29,349 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744740_3916 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744740 2025-07-11 16:56:23,781 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744741_3917 src: /192.168.158.1:48582 dest: /192.168.158.4:9866 2025-07-11 16:56:23,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48582, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_241981669_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744741_3917, duration(ns): 21373381 2025-07-11 16:56:23,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744741_3917, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-11 16:56:26,348 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744741_3917 replica FinalizedReplica, blk_1073744741_3917, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744741 for deletion 2025-07-11 16:56:26,349 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744741_3917 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744741 2025-07-11 16:57:23,784 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744742_3918 src: /192.168.158.7:48286 dest: /192.168.158.4:9866 2025-07-11 16:57:23,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.7:48286, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_446291618_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744742_3918, duration(ns): 17876354 2025-07-11 16:57:23,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744742_3918, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-11 16:57:26,351 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744742_3918 replica FinalizedReplica, blk_1073744742_3918, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744742 for deletion 2025-07-11 16:57:26,352 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744742_3918 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744742 2025-07-11 16:58:23,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744743_3919 src: /192.168.158.7:56354 dest: /192.168.158.4:9866 2025-07-11 16:58:23,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56354, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1501511983_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744743_3919, duration(ns): 15290528 2025-07-11 16:58:23,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744743_3919, 
type=LAST_IN_PIPELINE terminating 2025-07-11 16:58:26,351 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744743_3919 replica FinalizedReplica, blk_1073744743_3919, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744743 for deletion 2025-07-11 16:58:26,352 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744743_3919 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744743 2025-07-11 17:00:23,793 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744745_3921 src: /192.168.158.1:36874 dest: /192.168.158.4:9866 2025-07-11 17:00:23,824 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36874, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2107127377_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744745_3921, duration(ns): 22512096 2025-07-11 17:00:23,824 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744745_3921, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-11 17:00:29,352 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744745_3921 replica FinalizedReplica, blk_1073744745_3921, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744745 for deletion 2025-07-11 17:00:29,353 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744745_3921 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744745 2025-07-11 17:01:23,793 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744746_3922 src: /192.168.158.5:41672 dest: /192.168.158.4:9866 2025-07-11 17:01:23,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-740125064_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744746_3922, duration(ns): 18244856 2025-07-11 17:01:23,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744746_3922, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-11 17:01:26,355 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744746_3922 replica FinalizedReplica, blk_1073744746_3922, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744746 for deletion 2025-07-11 17:01:26,356 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744746_3922 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744746 
2025-07-11 17:04:23,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744749_3925 src: /192.168.158.1:51436 dest: /192.168.158.4:9866 2025-07-11 17:04:23,820 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51436, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-592359822_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744749_3925, duration(ns): 20191680 2025-07-11 17:04:23,820 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744749_3925, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-11 17:04:29,360 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744749_3925 replica FinalizedReplica, blk_1073744749_3925, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744749 for deletion 2025-07-11 17:04:29,361 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744749_3925 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744749 2025-07-11 17:07:23,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744752_3928 src: /192.168.158.1:38698 dest: /192.168.158.4:9866 2025-07-11 17:07:23,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38698, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1241755045_106, 
offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744752_3928, duration(ns): 24365925
2025-07-11 17:07:23,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744752_3928, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-11 17:07:26,368 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744752_3928 replica FinalizedReplica, blk_1073744752_3928, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744752 for deletion
2025-07-11 17:07:26,369 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744752_3928 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744752
2025-07-11 17:09:23,814 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744754_3930 src: /192.168.158.1:42592 dest: /192.168.158.4:9866
2025-07-11 17:09:23,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42592, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-708109052_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744754_3930, duration(ns): 22230474
2025-07-11 17:09:23,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744754_3930, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-11 17:09:26,372 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744754_3930 replica FinalizedReplica, blk_1073744754_3930, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744754 for deletion
2025-07-11 17:09:26,373 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744754_3930 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744754
2025-07-11 17:12:23,823 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744757_3933 src: /192.168.158.9:40742 dest: /192.168.158.4:9866
2025-07-11 17:12:23,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40742, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-827903189_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744757_3933, duration(ns): 16250566
2025-07-11 17:12:23,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744757_3933, type=LAST_IN_PIPELINE terminating
2025-07-11 17:12:26,387 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744757_3933 replica FinalizedReplica, blk_1073744757_3933, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744757 for deletion
2025-07-11 17:12:26,388 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744757_3933 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744757
2025-07-11 17:14:23,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744759_3935 src: /192.168.158.1:41716 dest: /192.168.158.4:9866
2025-07-11 17:14:23,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41716, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1035159905_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744759_3935, duration(ns): 24037765
2025-07-11 17:14:23,862 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744759_3935, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-11 17:14:29,392 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744759_3935 replica FinalizedReplica, blk_1073744759_3935, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744759 for deletion
2025-07-11 17:14:29,393 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744759_3935 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744759
2025-07-11 17:15:23,838 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744760_3936 src: /192.168.158.1:40790 dest: /192.168.158.4:9866
2025-07-11 17:15:23,870 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40790, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_571787997_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744760_3936, duration(ns): 23053823
2025-07-11 17:15:23,870 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744760_3936, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-11 17:15:29,393 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744760_3936 replica FinalizedReplica, blk_1073744760_3936, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744760 for deletion
2025-07-11 17:15:29,395 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744760_3936 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744760
2025-07-11 17:16:28,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744761_3937 src: /192.168.158.9:47614 dest: /192.168.158.4:9866
2025-07-11 17:16:28,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47614, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1087397912_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744761_3937, duration(ns): 19731140
2025-07-11 17:16:28,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744761_3937, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-11 17:16:32,394 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744761_3937 replica FinalizedReplica, blk_1073744761_3937, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744761 for deletion
2025-07-11 17:16:32,395 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744761_3937 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744761
2025-07-11 17:19:28,821 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744764_3940 src: /192.168.158.1:40172 dest: /192.168.158.4:9866
2025-07-11 17:19:28,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40172, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1775881484_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744764_3940, duration(ns): 23090598
2025-07-11 17:19:28,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744764_3940, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-11 17:19:29,400 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744764_3940 replica FinalizedReplica, blk_1073744764_3940, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744764 for deletion
2025-07-11 17:19:29,401 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744764_3940 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744764
2025-07-11 17:20:28,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744765_3941 src: /192.168.158.5:58898 dest: /192.168.158.4:9866
2025-07-11 17:20:28,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1969305467_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744765_3941, duration(ns): 17288981
2025-07-11 17:20:28,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744765_3941, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 17:20:32,401 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744765_3941 replica FinalizedReplica, blk_1073744765_3941, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744765 for deletion
2025-07-11 17:20:32,403 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744765_3941 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744765
2025-07-11 17:22:28,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744767_3943 src: /192.168.158.7:55262 dest: /192.168.158.4:9866
2025-07-11 17:22:28,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55262, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_657307801_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744767_3943, duration(ns): 16619378
2025-07-11 17:22:28,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744767_3943, type=LAST_IN_PIPELINE terminating
2025-07-11 17:22:29,407 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744767_3943 replica FinalizedReplica, blk_1073744767_3943, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744767 for deletion
2025-07-11 17:22:29,408 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744767_3943 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744767
2025-07-11 17:24:33,841 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744769_3945 src: /192.168.158.8:49148 dest: /192.168.158.4:9866
2025-07-11 17:24:33,865 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49148, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-309247885_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744769_3945, duration(ns): 18471300
2025-07-11 17:24:33,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744769_3945, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 17:24:35,411 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744769_3945 replica FinalizedReplica, blk_1073744769_3945, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744769 for deletion
2025-07-11 17:24:35,412 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744769_3945 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744769
2025-07-11 17:25:38,846 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744770_3946 src: /192.168.158.8:52002 dest: /192.168.158.4:9866
2025-07-11 17:25:38,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52002, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_305977859_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744770_3946, duration(ns): 17701860
2025-07-11 17:25:38,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744770_3946, type=LAST_IN_PIPELINE terminating
2025-07-11 17:25:41,415 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744770_3946 replica FinalizedReplica, blk_1073744770_3946, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744770 for deletion
2025-07-11 17:25:41,416 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744770_3946 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744770
2025-07-11 17:27:43,848 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744772_3948 src: /192.168.158.7:50676 dest: /192.168.158.4:9866
2025-07-11 17:27:43,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50676, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-404254581_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744772_3948, duration(ns): 18938322
2025-07-11 17:27:43,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744772_3948, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-11 17:27:44,421 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744772_3948 replica FinalizedReplica, blk_1073744772_3948, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744772 for deletion
2025-07-11 17:27:44,422 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744772_3948 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744772
2025-07-11 17:28:48,848 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744773_3949 src: /192.168.158.1:52258 dest: /192.168.158.4:9866
2025-07-11 17:28:48,875 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52258, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-76106376_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744773_3949, duration(ns): 18795999
2025-07-11 17:28:48,876 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744773_3949, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-11 17:28:50,425 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744773_3949 replica FinalizedReplica, blk_1073744773_3949, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744773 for deletion
2025-07-11 17:28:50,426 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744773_3949 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744773
2025-07-11 17:29:48,857 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744774_3950 src: /192.168.158.9:41790 dest: /192.168.158.4:9866
2025-07-11 17:29:48,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41790, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1711932761_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744774_3950, duration(ns): 14645289
2025-07-11 17:29:48,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744774_3950, type=LAST_IN_PIPELINE terminating
2025-07-11 17:29:53,428 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744774_3950 replica FinalizedReplica, blk_1073744774_3950, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744774 for deletion
2025-07-11 17:29:53,429 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744774_3950 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744774
2025-07-11 17:30:53,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744775_3951 src: /192.168.158.9:35514 dest: /192.168.158.4:9866
2025-07-11 17:30:53,880 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35514, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-16755967_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744775_3951, duration(ns): 12924056
2025-07-11 17:30:53,880 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744775_3951, type=LAST_IN_PIPELINE terminating
2025-07-11 17:30:56,430 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744775_3951 replica FinalizedReplica, blk_1073744775_3951, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744775 for deletion
2025-07-11 17:30:56,431 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744775_3951 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744775
2025-07-11 17:31:53,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744776_3952 src: /192.168.158.5:53416 dest: /192.168.158.4:9866
2025-07-11 17:31:53,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53416, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_12023760_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744776_3952, duration(ns): 21745175
2025-07-11 17:31:53,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744776_3952, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 17:31:56,433 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744776_3952 replica FinalizedReplica, blk_1073744776_3952, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744776 for deletion
2025-07-11 17:31:56,434 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744776_3952 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744776
2025-07-11 17:32:58,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744777_3953 src: /192.168.158.1:41374 dest: /192.168.158.4:9866
2025-07-11 17:32:58,896 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41374, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-823147437_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744777_3953, duration(ns): 23166164
2025-07-11 17:32:58,896 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744777_3953, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-11 17:32:59,438 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744777_3953 replica FinalizedReplica, blk_1073744777_3953, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744777 for deletion
2025-07-11 17:32:59,439 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744777_3953 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744777
2025-07-11 17:34:03,875 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744778_3954 src: /192.168.158.7:42840 dest: /192.168.158.4:9866
2025-07-11 17:34:03,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42840, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1491184327_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744778_3954, duration(ns): 19185899
2025-07-11 17:34:03,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744778_3954, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-11 17:34:08,439 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744778_3954 replica FinalizedReplica, blk_1073744778_3954, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744778 for deletion
2025-07-11 17:34:08,440 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744778_3954 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744778
2025-07-11 17:36:08,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744780_3956 src: /192.168.158.7:42946 dest: /192.168.158.4:9866
2025-07-11 17:36:08,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42946, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1619252312_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744780_3956, duration(ns): 15183279
2025-07-11 17:36:08,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744780_3956, type=LAST_IN_PIPELINE terminating
2025-07-11 17:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 9, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-11 17:36:14,445 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744780_3956 replica FinalizedReplica, blk_1073744780_3956, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744780 for deletion
2025-07-11 17:36:14,446 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744780_3956 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744780
2025-07-11 17:37:20,456 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f2d, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 5 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-11 17:37:20,456 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-11 17:41:13,889 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744785_3961 src: /192.168.158.8:44808 dest: /192.168.158.4:9866
2025-07-11 17:41:13,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44808, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1729161261_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744785_3961, duration(ns): 18758560
2025-07-11 17:41:13,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744785_3961, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 17:41:14,460 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744785_3961 replica FinalizedReplica, blk_1073744785_3961, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744785 for deletion
2025-07-11 17:41:14,461 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744785_3961 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744785
2025-07-11 17:45:13,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744789_3965 src: /192.168.158.9:34326 dest: /192.168.158.4:9866
2025-07-11 17:45:13,950 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34326, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1520667635_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744789_3965, duration(ns): 16474750
2025-07-11 17:45:13,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744789_3965, type=LAST_IN_PIPELINE terminating
2025-07-11 17:45:14,471 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744789_3965 replica FinalizedReplica, blk_1073744789_3965, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744789 for deletion
2025-07-11 17:45:14,472 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744789_3965 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744789
2025-07-11 17:46:13,907 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744790_3966 src: /192.168.158.6:49360 dest: /192.168.158.4:9866
2025-07-11 17:46:13,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49360, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_816333266_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744790_3966, duration(ns): 16669174
2025-07-11 17:46:13,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744790_3966, type=LAST_IN_PIPELINE terminating
2025-07-11 17:46:17,471 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744790_3966 replica FinalizedReplica, blk_1073744790_3966, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744790 for deletion
2025-07-11 17:46:17,473 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744790_3966 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744790
2025-07-11 17:47:13,887 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744791_3967 src: /192.168.158.1:47006 dest: /192.168.158.4:9866
2025-07-11 17:47:13,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47006, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1682294613_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744791_3967, duration(ns): 20795156
2025-07-11 17:47:13,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744791_3967, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-11 17:47:14,474 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744791_3967 replica FinalizedReplica, blk_1073744791_3967, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744791 for deletion
2025-07-11 17:47:14,475 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744791_3967 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744791
2025-07-11 17:52:28,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744796_3972 src: /192.168.158.5:44610 dest: /192.168.158.4:9866
2025-07-11 17:52:28,953 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44610, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-937364238_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744796_3972, duration(ns): 14913716
2025-07-11 17:52:28,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744796_3972, type=LAST_IN_PIPELINE terminating
2025-07-11 17:52:32,484 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744796_3972 replica FinalizedReplica, blk_1073744796_3972, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744796 for deletion
2025-07-11 17:52:32,485 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744796_3972 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744796
2025-07-11 17:53:28,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744797_3973 src: /192.168.158.1:54410 dest: /192.168.158.4:9866
2025-07-11 17:53:28,931 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54410, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1043122269_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744797_3973, duration(ns): 24849241
2025-07-11 17:53:28,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744797_3973, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-11 17:53:29,485 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744797_3973 replica FinalizedReplica, blk_1073744797_3973, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744797 for deletion
2025-07-11 17:53:29,486 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744797_3973 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744797
2025-07-11 17:54:33,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744798_3974 src: /192.168.158.1:58590 dest: /192.168.158.4:9866
2025-07-11 17:54:33,931 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58590, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1220251821_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744798_3974, duration(ns): 22008569
2025-07-11 17:54:33,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744798_3974, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-11 17:54:35,486 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744798_3974 replica FinalizedReplica, blk_1073744798_3974, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744798 for deletion
2025-07-11 17:54:35,487 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744798_3974 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744798
2025-07-11 17:57:38,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744801_3977 src: /192.168.158.1:38448 dest: /192.168.158.4:9866
2025-07-11 17:57:38,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38448, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-783435166_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744801_3977, duration(ns): 26132169
2025-07-11 17:57:38,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744801_3977, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-11 17:57:44,499 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744801_3977 replica FinalizedReplica, blk_1073744801_3977, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744801 for deletion 2025-07-11 17:57:44,500 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744801_3977 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744801 2025-07-11 17:59:38,907 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744803_3979 src: /192.168.158.1:47088 dest: /192.168.158.4:9866 2025-07-11 17:59:38,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47088, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-964089847_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744803_3979, duration(ns): 22655304 2025-07-11 17:59:38,939 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744803_3979, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-11 17:59:41,503 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744803_3979 replica FinalizedReplica, blk_1073744803_3979, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744803 for deletion 2025-07-11 
17:59:41,504 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744803_3979 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744803 2025-07-11 18:00:38,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744804_3980 src: /192.168.158.7:51742 dest: /192.168.158.4:9866 2025-07-11 18:00:38,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51742, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-487505916_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744804_3980, duration(ns): 15716834 2025-07-11 18:00:38,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744804_3980, type=LAST_IN_PIPELINE terminating 2025-07-11 18:00:44,505 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744804_3980 replica FinalizedReplica, blk_1073744804_3980, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744804 for deletion 2025-07-11 18:00:44,506 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744804_3980 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744804 2025-07-11 18:01:38,909 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744805_3981 src: /192.168.158.1:49010 dest: 
/192.168.158.4:9866 2025-07-11 18:01:38,939 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49010, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_807358769_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744805_3981, duration(ns): 20689468 2025-07-11 18:01:38,940 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744805_3981, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-11 18:01:44,508 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744805_3981 replica FinalizedReplica, blk_1073744805_3981, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744805 for deletion 2025-07-11 18:01:44,509 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744805_3981 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744805 2025-07-11 18:07:48,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744811_3987 src: /192.168.158.8:53826 dest: /192.168.158.4:9866 2025-07-11 18:07:48,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53826, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1337022865_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744811_3987, duration(ns): 14814991 2025-07-11 18:07:48,934 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744811_3987, type=LAST_IN_PIPELINE terminating 2025-07-11 18:07:53,520 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744811_3987 replica FinalizedReplica, blk_1073744811_3987, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744811 for deletion 2025-07-11 18:07:53,521 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744811_3987 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744811 2025-07-11 18:10:53,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744814_3990 src: /192.168.158.1:46336 dest: /192.168.158.4:9866 2025-07-11 18:10:53,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2108301488_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744814_3990, duration(ns): 18580871 2025-07-11 18:10:53,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744814_3990, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-11 18:10:56,531 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744814_3990 replica FinalizedReplica, blk_1073744814_3990, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 
56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744814 for deletion 2025-07-11 18:10:56,532 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744814_3990 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744814 2025-07-11 18:14:58,931 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744818_3994 src: /192.168.158.8:44524 dest: /192.168.158.4:9866 2025-07-11 18:14:58,953 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44524, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-380014940_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744818_3994, duration(ns): 16777486 2025-07-11 18:14:58,953 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744818_3994, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-11 18:15:02,544 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744818_3994 replica FinalizedReplica, blk_1073744818_3994, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744818 for deletion 2025-07-11 18:15:02,545 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744818_3994 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744818 2025-07-11 18:17:03,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744820_3996 src: /192.168.158.9:47732 dest: /192.168.158.4:9866 2025-07-11 18:17:03,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47732, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-68301495_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744820_3996, duration(ns): 17325576 2025-07-11 18:17:03,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744820_3996, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-11 18:17:05,549 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744820_3996 replica FinalizedReplica, blk_1073744820_3996, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744820 for deletion 2025-07-11 18:17:05,550 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744820_3996 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744820 2025-07-11 18:18:03,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744821_3997 src: /192.168.158.1:57782 dest: /192.168.158.4:9866 2025-07-11 18:18:03,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57782, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_200520093_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744821_3997, duration(ns): 20560148 2025-07-11 18:18:03,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744821_3997, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-11 18:18:08,552 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744821_3997 replica FinalizedReplica, blk_1073744821_3997, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744821 for deletion 2025-07-11 18:18:08,553 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744821_3997 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744821 2025-07-11 18:20:03,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744823_3999 src: /192.168.158.1:38794 dest: /192.168.158.4:9866 2025-07-11 18:20:03,966 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38794, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-844369427_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744823_3999, duration(ns): 21298728 2025-07-11 18:20:03,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744823_3999, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-11 18:20:08,557 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744823_3999 replica FinalizedReplica, blk_1073744823_3999, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744823 for deletion 2025-07-11 18:20:08,558 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744823_3999 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744823 2025-07-11 18:23:13,939 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744826_4002 src: /192.168.158.1:52066 dest: /192.168.158.4:9866 2025-07-11 18:23:13,969 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52066, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1176382240_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744826_4002, duration(ns): 21672166 2025-07-11 18:23:13,969 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744826_4002, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-11 18:23:14,567 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744826_4002 replica FinalizedReplica, blk_1073744826_4002, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744826 for deletion 2025-07-11 18:23:14,569 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744826_4002 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744826 2025-07-11 18:24:18,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744827_4003 src: /192.168.158.8:55776 dest: /192.168.158.4:9866 2025-07-11 18:24:18,969 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55776, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2018471863_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744827_4003, duration(ns): 19986092 2025-07-11 18:24:18,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744827_4003, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-11 18:24:23,574 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744827_4003 replica FinalizedReplica, blk_1073744827_4003, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744827 for deletion 2025-07-11 18:24:23,575 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744827_4003 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744827 
2025-07-11 18:25:23,942 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744828_4004 src: /192.168.158.8:35956 dest: /192.168.158.4:9866 2025-07-11 18:25:23,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35956, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1055060203_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744828_4004, duration(ns): 19323692 2025-07-11 18:25:23,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744828_4004, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-11 18:25:26,575 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744828_4004 replica FinalizedReplica, blk_1073744828_4004, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744828 for deletion 2025-07-11 18:25:26,576 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744828_4004 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744828 2025-07-11 18:26:23,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744829_4005 src: /192.168.158.8:44188 dest: /192.168.158.4:9866 2025-07-11 18:26:23,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44188, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1688091889_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744829_4005, duration(ns): 18611619 2025-07-11 18:26:23,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744829_4005, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-11 18:26:26,579 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744829_4005 replica FinalizedReplica, blk_1073744829_4005, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744829 for deletion 2025-07-11 18:26:26,580 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744829_4005 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744829 2025-07-11 18:27:23,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744830_4006 src: /192.168.158.7:53762 dest: /192.168.158.4:9866 2025-07-11 18:27:23,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53762, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_758496303_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744830_4006, duration(ns): 19423916 2025-07-11 18:27:23,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744830_4006, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-11 18:27:26,581 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744830_4006 replica FinalizedReplica, blk_1073744830_4006, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744830 for deletion 2025-07-11 18:27:26,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744830_4006 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744830 2025-07-11 18:30:33,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744833_4009 src: /192.168.158.5:54424 dest: /192.168.158.4:9866 2025-07-11 18:30:33,970 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54424, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1752367976_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744833_4009, duration(ns): 17250617 2025-07-11 18:30:33,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744833_4009, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-11 18:30:35,592 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744833_4009 replica FinalizedReplica, blk_1073744833_4009, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744833 for deletion 2025-07-11 18:30:35,593 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744833_4009 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744833 2025-07-11 18:31:33,949 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744834_4010 src: /192.168.158.1:48336 dest: /192.168.158.4:9866 2025-07-11 18:31:33,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1595564249_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744834_4010, duration(ns): 24007262 2025-07-11 18:31:33,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744834_4010, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-11 18:31:35,592 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744834_4010 replica FinalizedReplica, blk_1073744834_4010, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744834 for deletion 2025-07-11 18:31:35,594 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744834_4010 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744834 2025-07-11 18:35:38,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073744838_4014 src: /192.168.158.5:44434 dest: /192.168.158.4:9866
2025-07-11 18:35:38,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44434, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1450045071_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744838_4014, duration(ns): 15459434
2025-07-11 18:35:38,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744838_4014, type=LAST_IN_PIPELINE terminating
2025-07-11 18:35:41,602 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744838_4014 replica FinalizedReplica, blk_1073744838_4014, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744838 for deletion
2025-07-11 18:35:41,603 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744838_4014 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744838
2025-07-11 18:36:43,965 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744839_4015 src: /192.168.158.9:52254 dest: /192.168.158.4:9866
2025-07-11 18:36:43,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52254, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-260592589_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744839_4015, duration(ns): 21217925
2025-07-11 18:36:43,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744839_4015, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-11 18:36:47,607 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744839_4015 replica FinalizedReplica, blk_1073744839_4015, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744839 for deletion
2025-07-11 18:36:47,608 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744839_4015 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744839
2025-07-11 18:37:43,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744840_4016 src: /192.168.158.9:55204 dest: /192.168.158.4:9866
2025-07-11 18:37:43,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55204, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1304658090_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744840_4016, duration(ns): 14460981
2025-07-11 18:37:43,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744840_4016, type=LAST_IN_PIPELINE terminating
2025-07-11 18:37:44,609 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744840_4016 replica FinalizedReplica, blk_1073744840_4016, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744840 for deletion
2025-07-11 18:37:44,610 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744840_4016 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744840
2025-07-11 18:44:53,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744847_4023 src: /192.168.158.1:50032 dest: /192.168.158.4:9866
2025-07-11 18:44:54,010 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50032, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_987769811_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744847_4023, duration(ns): 23411513
2025-07-11 18:44:54,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744847_4023, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-11 18:44:59,626 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744847_4023 replica FinalizedReplica, blk_1073744847_4023, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744847 for deletion
2025-07-11 18:44:59,627 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744847_4023 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744847
2025-07-11 18:47:53,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744850_4026 src: /192.168.158.1:36356 dest: /192.168.158.4:9866
2025-07-11 18:47:54,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36356, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-430345243_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744850_4026, duration(ns): 30699867
2025-07-11 18:47:54,029 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744850_4026, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-11 18:47:56,634 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744850_4026 replica FinalizedReplica, blk_1073744850_4026, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744850 for deletion
2025-07-11 18:47:56,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744850_4026 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744850
2025-07-11 18:48:54,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744851_4027 src: /192.168.158.7:46678 dest: /192.168.158.4:9866
2025-07-11 18:48:54,045 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46678, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-595841980_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744851_4027, duration(ns): 19413239
2025-07-11 18:48:54,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744851_4027, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-11 18:48:56,635 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744851_4027 replica FinalizedReplica, blk_1073744851_4027, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744851 for deletion
2025-07-11 18:48:56,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744851_4027 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744851
2025-07-11 18:51:54,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744854_4030 src: /192.168.158.6:54572 dest: /192.168.158.4:9866
2025-07-11 18:51:54,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54572, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1216456531_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744854_4030, duration(ns): 19505914
2025-07-11 18:51:54,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744854_4030, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-11 18:51:56,644 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744854_4030 replica FinalizedReplica, blk_1073744854_4030, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744854 for deletion
2025-07-11 18:51:56,645 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744854_4030 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744854
2025-07-11 18:54:59,008 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744857_4033 src: /192.168.158.8:51040 dest: /192.168.158.4:9866
2025-07-11 18:54:59,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51040, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-49490594_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744857_4033, duration(ns): 15308929
2025-07-11 18:54:59,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744857_4033, type=LAST_IN_PIPELINE terminating
2025-07-11 18:55:02,653 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744857_4033 replica FinalizedReplica, blk_1073744857_4033, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744857 for deletion
2025-07-11 18:55:02,654 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744857_4033 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744857
2025-07-11 18:55:59,000 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744858_4034 src: /192.168.158.1:53280 dest: /192.168.158.4:9866
2025-07-11 18:55:59,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53280, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1950538304_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744858_4034, duration(ns): 22330877
2025-07-11 18:55:59,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744858_4034, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-11 18:55:59,657 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744858_4034 replica FinalizedReplica, blk_1073744858_4034, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744858 for deletion
2025-07-11 18:55:59,658 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744858_4034 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744858
2025-07-11 18:59:04,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744861_4037 src: /192.168.158.9:41724 dest: /192.168.158.4:9866
2025-07-11 18:59:04,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41724, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_497830634_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744861_4037, duration(ns): 15510586
2025-07-11 18:59:04,034 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744861_4037, type=LAST_IN_PIPELINE terminating
2025-07-11 18:59:05,664 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744861_4037 replica FinalizedReplica, blk_1073744861_4037, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744861 for deletion
2025-07-11 18:59:05,665 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744861_4037 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744861
2025-07-11 19:01:04,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744863_4039 src: /192.168.158.1:57154 dest: /192.168.158.4:9866
2025-07-11 19:01:04,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57154, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-959817027_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744863_4039, duration(ns): 22609618
2025-07-11 19:01:04,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744863_4039, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-11 19:01:05,670 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744863_4039 replica FinalizedReplica, blk_1073744863_4039, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744863 for deletion
2025-07-11 19:01:05,672 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744863_4039 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744863
2025-07-11 19:02:04,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744864_4040 src: /192.168.158.1:55468 dest: /192.168.158.4:9866
2025-07-11 19:02:04,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55468, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1054779281_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744864_4040, duration(ns): 20093859
2025-07-11 19:02:04,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744864_4040, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-11 19:02:05,672 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744864_4040 replica FinalizedReplica, blk_1073744864_4040, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744864 for deletion
2025-07-11 19:02:05,674 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744864_4040 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744864
2025-07-11 19:04:04,045 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744866_4042 src: /192.168.158.5:58208 dest: /192.168.158.4:9866
2025-07-11 19:04:04,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58208, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1684776242_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744866_4042, duration(ns): 18673758
2025-07-11 19:04:04,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744866_4042, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-11 19:04:05,675 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744866_4042 replica FinalizedReplica, blk_1073744866_4042, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744866 for deletion
2025-07-11 19:04:05,677 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744866_4042 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744866
2025-07-11 19:05:04,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744867_4043 src: /192.168.158.6:45040 dest: /192.168.158.4:9866
2025-07-11 19:05:04,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45040, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1224091180_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744867_4043, duration(ns): 19003480
2025-07-11 19:05:04,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744867_4043, type=LAST_IN_PIPELINE terminating
2025-07-11 19:05:05,677 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744867_4043 replica FinalizedReplica, blk_1073744867_4043, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744867 for deletion
2025-07-11 19:05:05,679 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744867_4043 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744867
2025-07-11 19:07:04,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744869_4045 src: /192.168.158.8:43346 dest: /192.168.158.4:9866
2025-07-11 19:07:04,058 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43346, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2008611618_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744869_4045, duration(ns): 22097219
2025-07-11 19:07:04,059 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744869_4045, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-11 19:07:05,683 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744869_4045 replica FinalizedReplica, blk_1073744869_4045, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744869 for deletion
2025-07-11 19:07:05,684 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744869_4045 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744869
2025-07-11 19:09:04,054 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744871_4047 src: /192.168.158.5:53148 dest: /192.168.158.4:9866
2025-07-11 19:09:04,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53148, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-883641708_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744871_4047, duration(ns): 15396851
2025-07-11 19:09:04,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744871_4047, type=LAST_IN_PIPELINE terminating
2025-07-11 19:09:05,692 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744871_4047 replica FinalizedReplica, blk_1073744871_4047, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744871 for deletion
2025-07-11 19:09:05,693 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744871_4047 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744871
2025-07-11 19:12:14,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744874_4050 src: /192.168.158.1:45996 dest: /192.168.158.4:9866
2025-07-11 19:12:14,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45996, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1893687361_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744874_4050, duration(ns): 21383556
2025-07-11 19:12:14,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744874_4050, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-11 19:12:14,702 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744874_4050 replica FinalizedReplica, blk_1073744874_4050, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744874 for deletion
2025-07-11 19:12:14,703 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744874_4050 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744874
2025-07-11 19:17:14,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744879_4055 src: /192.168.158.5:51964 dest: /192.168.158.4:9866
2025-07-11 19:17:14,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51964, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-380694800_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744879_4055, duration(ns): 21375161
2025-07-11 19:17:14,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744879_4055, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-11 19:17:14,714 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744879_4055 replica FinalizedReplica, blk_1073744879_4055, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744879 for deletion
2025-07-11 19:17:14,715 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744879_4055 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744879
2025-07-11 19:21:24,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744883_4059 src: /192.168.158.9:49190 dest: /192.168.158.4:9866
2025-07-11 19:21:24,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49190, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_464453820_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744883_4059, duration(ns): 17239238
2025-07-11 19:21:24,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744883_4059, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-11 19:21:26,724 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744883_4059 replica FinalizedReplica, blk_1073744883_4059, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744883 for deletion
2025-07-11 19:21:26,725 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744883_4059 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744883
2025-07-11 19:23:24,056 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744885_4061 src: /192.168.158.8:58124 dest: /192.168.158.4:9866
2025-07-11 19:23:24,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58124, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-338813333_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744885_4061, duration(ns): 17675351
2025-07-11 19:23:24,076 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744885_4061, type=LAST_IN_PIPELINE terminating
2025-07-11 19:23:29,730 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744885_4061 replica FinalizedReplica, blk_1073744885_4061, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744885 for deletion
2025-07-11 19:23:29,731 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744885_4061 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744885
2025-07-11 19:24:24,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744886_4062 src: /192.168.158.6:32912 dest: /192.168.158.4:9866
2025-07-11 19:24:24,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:32912, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2062209832_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744886_4062, duration(ns): 17464503
2025-07-11 19:24:24,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744886_4062, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-11 19:24:29,734 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744886_4062 replica FinalizedReplica, blk_1073744886_4062, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744886 for deletion
2025-07-11 19:24:29,735 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744886_4062 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744886
2025-07-11 19:25:24,073 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744887_4063 src: /192.168.158.6:37664 dest: /192.168.158.4:9866
2025-07-11 19:25:24,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37664, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1735799492_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744887_4063, duration(ns): 16351310
2025-07-11 19:25:24,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744887_4063, type=LAST_IN_PIPELINE terminating
2025-07-11 19:25:29,737 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744887_4063 replica FinalizedReplica, blk_1073744887_4063, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744887 for deletion
2025-07-11 19:25:29,738 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744887_4063 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744887
2025-07-11 19:26:24,065 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744888_4064 src: /192.168.158.5:52114 dest: /192.168.158.4:9866
2025-07-11 19:26:24,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52114, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_822148758_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744888_4064, duration(ns): 18924102
2025-07-11 19:26:24,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744888_4064, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 19:26:26,738 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744888_4064 replica FinalizedReplica, blk_1073744888_4064, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744888 for deletion
2025-07-11 19:26:26,739 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744888_4064 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744888
2025-07-11 19:28:24,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744890_4066 src: /192.168.158.1:51940 dest: /192.168.158.4:9866
2025-07-11 19:28:24,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51940, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1579797465_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744890_4066, duration(ns): 26816668
2025-07-11 19:28:24,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744890_4066, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-11 19:28:29,742 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744890_4066 replica FinalizedReplica, blk_1073744890_4066, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744890 for deletion
2025-07-11 19:28:29,743 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744890_4066 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744890
2025-07-11 19:29:24,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744891_4067 src: /192.168.158.5:34364 dest: /192.168.158.4:9866
2025-07-11 19:29:24,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34364, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_629946015_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744891_4067, duration(ns): 20390554
2025-07-11 19:29:24,110 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744891_4067, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-11 19:29:26,742 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744891_4067 replica FinalizedReplica, blk_1073744891_4067, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744891 for deletion
2025-07-11 19:29:26,743 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744891_4067 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744891
2025-07-11 19:30:24,074 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744892_4068 src: /192.168.158.8:45508 dest: /192.168.158.4:9866
2025-07-11 19:30:24,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45508, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1390126968_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744892_4068, duration(ns): 19610666
2025-07-11 19:30:24,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744892_4068, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-11 19:30:29,744 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744892_4068 replica FinalizedReplica, blk_1073744892_4068, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744892 for deletion
2025-07-11 19:30:29,746 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744892_4068 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744892
2025-07-11 19:31:24,058 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744893_4069 src: /192.168.158.1:50308 dest: /192.168.158.4:9866
2025-07-11 19:31:24,087 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50308, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1230162465_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744893_4069, duration(ns): 20257439
2025-07-11 19:31:24,087 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744893_4069, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-11 19:31:26,748 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744893_4069 replica FinalizedReplica, blk_1073744893_4069, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() =
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744893 for deletion 2025-07-11 19:31:26,750 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744893_4069 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744893 2025-07-11 19:33:24,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744895_4071 src: /192.168.158.8:41924 dest: /192.168.158.4:9866 2025-07-11 19:33:24,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41924, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1901674254_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744895_4071, duration(ns): 14628697 2025-07-11 19:33:24,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744895_4071, type=LAST_IN_PIPELINE terminating 2025-07-11 19:33:29,753 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744895_4071 replica FinalizedReplica, blk_1073744895_4071, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744895 for deletion 2025-07-11 19:33:29,754 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744895_4071 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073744895 2025-07-11 19:34:29,066 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744896_4072 src: /192.168.158.9:51360 dest: /192.168.158.4:9866 2025-07-11 19:34:29,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51360, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1667384279_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744896_4072, duration(ns): 19601575 2025-07-11 19:34:29,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744896_4072, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-11 19:34:32,755 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744896_4072 replica FinalizedReplica, blk_1073744896_4072, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744896 for deletion 2025-07-11 19:34:32,756 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744896_4072 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744896 2025-07-11 19:35:29,074 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744897_4073 src: /192.168.158.6:39222 dest: /192.168.158.4:9866 2025-07-11 19:35:29,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39222, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_138218342_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744897_4073, duration(ns): 15838210 2025-07-11 19:35:29,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744897_4073, type=LAST_IN_PIPELINE terminating 2025-07-11 19:35:29,757 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744897_4073 replica FinalizedReplica, blk_1073744897_4073, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744897 for deletion 2025-07-11 19:35:29,758 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744897_4073 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744897 2025-07-11 19:37:34,045 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744899_4075 src: /192.168.158.1:59794 dest: /192.168.158.4:9866 2025-07-11 19:37:34,076 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59794, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1751203939_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744899_4075, duration(ns): 22551701 2025-07-11 19:37:34,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744899_4075, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-11 19:37:35,761 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744899_4075 replica FinalizedReplica, blk_1073744899_4075, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744899 for deletion 2025-07-11 19:37:35,763 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744899_4075 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744899 2025-07-11 19:42:44,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744904_4080 src: /192.168.158.1:56690 dest: /192.168.158.4:9866 2025-07-11 19:42:44,119 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56690, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1588574791_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744904_4080, duration(ns): 25235713 2025-07-11 19:42:44,119 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744904_4080, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-11 19:42:44,778 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744904_4080 replica FinalizedReplica, blk_1073744904_4080, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744904 for deletion 2025-07-11 
19:42:44,779 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744904_4080 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744904 2025-07-11 19:43:44,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744905_4081 src: /192.168.158.7:55220 dest: /192.168.158.4:9866 2025-07-11 19:43:44,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55220, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1062448628_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744905_4081, duration(ns): 15721390 2025-07-11 19:43:44,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744905_4081, type=LAST_IN_PIPELINE terminating 2025-07-11 19:43:44,780 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744905_4081 replica FinalizedReplica, blk_1073744905_4081, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744905 for deletion 2025-07-11 19:43:44,781 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744905_4081 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744905 2025-07-11 19:45:44,111 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744907_4083 src: /192.168.158.6:39320 dest: 
/192.168.158.4:9866 2025-07-11 19:45:44,129 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39320, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1513263614_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744907_4083, duration(ns): 15636299 2025-07-11 19:45:44,129 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744907_4083, type=LAST_IN_PIPELINE terminating 2025-07-11 19:45:44,789 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744907_4083 replica FinalizedReplica, blk_1073744907_4083, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744907 for deletion 2025-07-11 19:45:44,790 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744907_4083 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744907 2025-07-11 19:46:44,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744908_4084 src: /192.168.158.5:54828 dest: /192.168.158.4:9866 2025-07-11 19:46:44,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54828, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1435601365_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744908_4084, duration(ns): 19517587 2025-07-11 19:46:44,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073744908_4084, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-11 19:46:47,792 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744908_4084 replica FinalizedReplica, blk_1073744908_4084, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744908 for deletion 2025-07-11 19:46:47,793 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744908_4084 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744908 2025-07-11 19:47:44,087 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744909_4085 src: /192.168.158.7:54474 dest: /192.168.158.4:9866 2025-07-11 19:47:44,112 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54474, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-314296114_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744909_4085, duration(ns): 17897417 2025-07-11 19:47:44,112 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744909_4085, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-11 19:47:44,796 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744909_4085 replica FinalizedReplica, blk_1073744909_4085, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn 
getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744909 for deletion 2025-07-11 19:47:44,797 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744909_4085 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744909 2025-07-11 19:48:44,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744910_4086 src: /192.168.158.7:55264 dest: /192.168.158.4:9866 2025-07-11 19:48:44,117 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55264, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-271990004_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744910_4086, duration(ns): 19342709 2025-07-11 19:48:44,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744910_4086, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-11 19:48:44,800 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744910_4086 replica FinalizedReplica, blk_1073744910_4086, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744910 for deletion 2025-07-11 19:48:44,801 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744910_4086 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744910 2025-07-11 19:49:44,098 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744911_4087 src: /192.168.158.8:58956 dest: /192.168.158.4:9866 2025-07-11 19:49:44,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58956, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1007150508_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744911_4087, duration(ns): 14638600 2025-07-11 19:49:44,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744911_4087, type=LAST_IN_PIPELINE terminating 2025-07-11 19:49:47,802 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744911_4087 replica FinalizedReplica, blk_1073744911_4087, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744911 for deletion 2025-07-11 19:49:47,803 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744911_4087 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744911 2025-07-11 19:52:49,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744914_4090 src: /192.168.158.6:53582 dest: /192.168.158.4:9866 2025-07-11 19:52:49,123 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53582, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2039741789_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744914_4090, duration(ns): 15964347 2025-07-11 19:52:49,123 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744914_4090, type=LAST_IN_PIPELINE terminating 2025-07-11 19:52:53,810 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744914_4090 replica FinalizedReplica, blk_1073744914_4090, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744914 for deletion 2025-07-11 19:52:53,811 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744914_4090 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744914 2025-07-11 19:54:54,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744916_4092 src: /192.168.158.6:59386 dest: /192.168.158.4:9866 2025-07-11 19:54:54,152 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59386, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_543705570_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744916_4092, duration(ns): 19456834 2025-07-11 19:54:54,152 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744916_4092, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-11 19:54:56,814 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744916_4092 replica FinalizedReplica, blk_1073744916_4092, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744916 for deletion 2025-07-11 19:54:56,815 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744916_4092 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744916 2025-07-11 19:59:59,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744921_4097 src: /192.168.158.1:34272 dest: /192.168.158.4:9866 2025-07-11 19:59:59,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34272, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-341976237_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744921_4097, duration(ns): 22892804 2025-07-11 19:59:59,148 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744921_4097, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-11 19:59:59,827 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744921_4097 replica FinalizedReplica, blk_1073744921_4097, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744921 for deletion 2025-07-11 
19:59:59,828 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744921_4097 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744921 2025-07-11 20:01:04,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744922_4098 src: /192.168.158.8:45632 dest: /192.168.158.4:9866 2025-07-11 20:01:04,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45632, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-801067544_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744922_4098, duration(ns): 19485328 2025-07-11 20:01:04,152 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744922_4098, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-11 20:01:05,829 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744922_4098 replica FinalizedReplica, blk_1073744922_4098, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744922 for deletion 2025-07-11 20:01:05,830 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744922_4098 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744922 2025-07-11 20:03:09,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744924_4100 
src: /192.168.158.9:58364 dest: /192.168.158.4:9866 2025-07-11 20:03:09,136 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58364, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_112012570_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744924_4100, duration(ns): 15762198 2025-07-11 20:03:09,136 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744924_4100, type=LAST_IN_PIPELINE terminating 2025-07-11 20:03:11,830 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744924_4100 replica FinalizedReplica, blk_1073744924_4100, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744924 for deletion 2025-07-11 20:03:11,832 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744924_4100 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744924 2025-07-11 20:04:09,119 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744925_4101 src: /192.168.158.9:39844 dest: /192.168.158.4:9866 2025-07-11 20:04:09,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39844, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-875432138_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744925_4101, duration(ns): 16322716 2025-07-11 20:04:09,138 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744925_4101, type=LAST_IN_PIPELINE terminating
2025-07-11 20:04:11,835 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744925_4101 replica FinalizedReplica, blk_1073744925_4101, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744925 for deletion
2025-07-11 20:04:11,837 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744925_4101 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744925
2025-07-11 20:05:09,135 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744926_4102 src: /192.168.158.8:39022 dest: /192.168.158.4:9866
2025-07-11 20:05:09,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39022, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-158696410_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744926_4102, duration(ns): 19867880
2025-07-11 20:05:09,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744926_4102, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-11 20:05:14,840 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744926_4102 replica FinalizedReplica, blk_1073744926_4102, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744926 for deletion
2025-07-11 20:05:14,841 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744926_4102 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744926
2025-07-11 20:06:09,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744927_4103 src: /192.168.158.1:49418 dest: /192.168.158.4:9866
2025-07-11 20:06:09,162 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49418, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_565874500_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744927_4103, duration(ns): 23325335
2025-07-11 20:06:09,162 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744927_4103, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-11 20:06:11,843 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744927_4103 replica FinalizedReplica, blk_1073744927_4103, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744927 for deletion
2025-07-11 20:06:11,844 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744927_4103 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744927
2025-07-11 20:07:09,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744928_4104 src: /192.168.158.5:55774 dest: /192.168.158.4:9866
2025-07-11 20:07:09,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55774, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-204066266_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744928_4104, duration(ns): 15440743
2025-07-11 20:07:09,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744928_4104, type=LAST_IN_PIPELINE terminating
2025-07-11 20:07:11,846 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744928_4104 replica FinalizedReplica, blk_1073744928_4104, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744928 for deletion
2025-07-11 20:07:11,847 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744928_4104 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744928
2025-07-11 20:08:14,149 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744929_4105 src: /192.168.158.8:48276 dest: /192.168.158.4:9866
2025-07-11 20:08:14,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48276, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1825110706_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744929_4105, duration(ns): 15639692
2025-07-11 20:08:14,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744929_4105, type=LAST_IN_PIPELINE terminating
2025-07-11 20:08:14,849 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744929_4105 replica FinalizedReplica, blk_1073744929_4105, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744929 for deletion
2025-07-11 20:08:14,850 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744929_4105 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744929
2025-07-11 20:09:14,148 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744930_4106 src: /192.168.158.5:54294 dest: /192.168.158.4:9866
2025-07-11 20:09:14,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54294, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-395780797_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744930_4106, duration(ns): 20791427
2025-07-11 20:09:14,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744930_4106, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 20:09:17,850 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744930_4106 replica FinalizedReplica, blk_1073744930_4106, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744930 for deletion
2025-07-11 20:09:17,851 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744930_4106 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744930
2025-07-11 20:10:14,153 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744931_4107 src: /192.168.158.1:38654 dest: /192.168.158.4:9866
2025-07-11 20:10:14,183 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38654, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_552069555_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744931_4107, duration(ns): 21861600
2025-07-11 20:10:14,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744931_4107, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-11 20:10:14,855 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744931_4107 replica FinalizedReplica, blk_1073744931_4107, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744931 for deletion
2025-07-11 20:10:14,856 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744931_4107 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744931
2025-07-11 20:12:14,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744933_4109 src: /192.168.158.8:42510 dest: /192.168.158.4:9866
2025-07-11 20:12:14,163 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42510, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1823671095_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744933_4109, duration(ns): 14744524
2025-07-11 20:12:14,163 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744933_4109, type=LAST_IN_PIPELINE terminating
2025-07-11 20:12:17,856 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744933_4109 replica FinalizedReplica, blk_1073744933_4109, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744933 for deletion
2025-07-11 20:12:17,858 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744933_4109 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744933
2025-07-11 20:15:14,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744936_4112 src: /192.168.158.1:41630 dest: /192.168.158.4:9866
2025-07-11 20:15:14,165 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41630, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_778271857_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744936_4112, duration(ns): 23025264
2025-07-11 20:15:14,165 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744936_4112, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-11 20:15:17,865 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744936_4112 replica FinalizedReplica, blk_1073744936_4112, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744936 for deletion
2025-07-11 20:15:17,866 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744936_4112 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744936
2025-07-11 20:16:14,140 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744937_4113 src: /192.168.158.7:41626 dest: /192.168.158.4:9866
2025-07-11 20:16:14,163 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41626, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_165990558_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744937_4113, duration(ns): 18172039
2025-07-11 20:16:14,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744937_4113, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 20:16:14,868 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744937_4113 replica FinalizedReplica, blk_1073744937_4113, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744937 for deletion
2025-07-11 20:16:14,869 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744937_4113 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744937
2025-07-11 20:17:19,162 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744938_4114 src: /192.168.158.5:44984 dest: /192.168.158.4:9866
2025-07-11 20:17:19,186 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44984, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1244467798_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744938_4114, duration(ns): 19090138
2025-07-11 20:17:19,186 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744938_4114, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 20:17:23,869 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744938_4114 replica FinalizedReplica, blk_1073744938_4114, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744938 for deletion
2025-07-11 20:17:23,870 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744938_4114 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744938
2025-07-11 20:18:19,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744939_4115 src: /192.168.158.1:56578 dest: /192.168.158.4:9866
2025-07-11 20:18:19,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56578, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1111572832_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744939_4115, duration(ns): 21732748
2025-07-11 20:18:19,188 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744939_4115, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-11 20:18:23,871 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744939_4115 replica FinalizedReplica, blk_1073744939_4115, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744939 for deletion
2025-07-11 20:18:23,873 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744939_4115 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744939
2025-07-11 20:20:19,170 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744941_4117 src: /192.168.158.9:39112 dest: /192.168.158.4:9866
2025-07-11 20:20:19,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39112, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2014764275_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744941_4117, duration(ns): 16542797
2025-07-11 20:20:19,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744941_4117, type=LAST_IN_PIPELINE terminating
2025-07-11 20:20:20,874 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744941_4117 replica FinalizedReplica, blk_1073744941_4117, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744941 for deletion
2025-07-11 20:20:20,875 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744941_4117 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744941
2025-07-11 20:21:19,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744942_4118 src: /192.168.158.1:38274 dest: /192.168.158.4:9866
2025-07-11 20:21:19,195 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38274, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-998574813_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744942_4118, duration(ns): 21974766
2025-07-11 20:21:19,195 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744942_4118, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-11 20:21:20,878 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744942_4118 replica FinalizedReplica, blk_1073744942_4118, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744942 for deletion
2025-07-11 20:21:20,879 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744942_4118 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744942
2025-07-11 20:22:19,170 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744943_4119 src: /192.168.158.9:38936 dest: /192.168.158.4:9866
2025-07-11 20:22:19,195 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38936, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_756424773_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744943_4119, duration(ns): 19537285
2025-07-11 20:22:19,196 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744943_4119, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-11 20:22:23,879 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744943_4119 replica FinalizedReplica, blk_1073744943_4119, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744943 for deletion
2025-07-11 20:22:23,880 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744943_4119 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744943
2025-07-11 20:23:19,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744944_4120 src: /192.168.158.1:50918 dest: /192.168.158.4:9866
2025-07-11 20:23:19,208 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50918, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1251381335_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744944_4120, duration(ns): 23236158
2025-07-11 20:23:19,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744944_4120, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-11 20:23:20,879 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744944_4120 replica FinalizedReplica, blk_1073744944_4120, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744944 for deletion
2025-07-11 20:23:20,880 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744944_4120 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744944
2025-07-11 20:25:19,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744946_4122 src: /192.168.158.1:57046 dest: /192.168.158.4:9866
2025-07-11 20:25:19,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57046, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1034289737_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744946_4122, duration(ns): 20628648
2025-07-11 20:25:19,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744946_4122, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-11 20:25:20,883 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744946_4122 replica FinalizedReplica, blk_1073744946_4122, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744946 for deletion
2025-07-11 20:25:20,884 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744946_4122 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744946
2025-07-11 20:26:24,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744947_4123 src: /192.168.158.1:47474 dest: /192.168.158.4:9866
2025-07-11 20:26:24,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47474, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1627887056_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744947_4123, duration(ns): 26826354
2025-07-11 20:26:24,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744947_4123, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-11 20:26:26,885 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744947_4123 replica FinalizedReplica, blk_1073744947_4123, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744947 for deletion
2025-07-11 20:26:26,886 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744947_4123 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744947
2025-07-11 20:28:34,158 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744949_4125 src: /192.168.158.1:50688 dest: /192.168.158.4:9866
2025-07-11 20:28:34,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50688, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-514847785_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744949_4125, duration(ns): 21223620
2025-07-11 20:28:34,188 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744949_4125, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-11 20:28:35,888 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744949_4125 replica FinalizedReplica, blk_1073744949_4125, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744949 for deletion
2025-07-11 20:28:35,889 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744949_4125 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744949
2025-07-11 20:29:39,163 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744950_4126 src: /192.168.158.1:37384 dest: /192.168.158.4:9866
2025-07-11 20:29:39,194 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37384, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1139696429_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744950_4126, duration(ns): 22428095
2025-07-11 20:29:39,195 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744950_4126, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-11 20:29:41,888 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744950_4126 replica FinalizedReplica, blk_1073744950_4126, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744950 for deletion
2025-07-11 20:29:41,890 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744950_4126 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744950
2025-07-11 20:31:44,169 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744952_4128 src: /192.168.158.8:50830 dest: /192.168.158.4:9866
2025-07-11 20:31:44,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50830, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-629931782_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744952_4128, duration(ns): 21196818
2025-07-11 20:31:44,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744952_4128, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 20:31:47,892 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744952_4128 replica FinalizedReplica, blk_1073744952_4128, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744952 for deletion
2025-07-11 20:31:47,893 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744952_4128 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744952
2025-07-11 20:33:49,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744954_4130 src: /192.168.158.7:55002 dest: /192.168.158.4:9866
2025-07-11 20:33:49,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55002, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_713810932_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744954_4130, duration(ns): 18041173
2025-07-11 20:33:49,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744954_4130, type=LAST_IN_PIPELINE terminating
2025-07-11 20:33:53,895 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744954_4130 replica FinalizedReplica, blk_1073744954_4130, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744954 for deletion
2025-07-11 20:33:53,896 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744954_4130 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744954
2025-07-11 20:34:49,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744955_4131 src: /192.168.158.1:58672 dest: /192.168.158.4:9866
2025-07-11 20:34:49,205 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_326432223_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744955_4131, duration(ns): 22328042
2025-07-11 20:34:49,205 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744955_4131, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-11 20:34:53,899 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744955_4131 replica FinalizedReplica, blk_1073744955_4131, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744955 for deletion
2025-07-11 20:34:53,901 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744955_4131 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744955
2025-07-11 20:37:54,183 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744958_4134 src: /192.168.158.1:54094 dest: /192.168.158.4:9866
2025-07-11 20:37:54,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54094, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1902344859_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744958_4134, duration(ns): 19525829
2025-07-11 20:37:54,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744958_4134, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-11 20:38:02,900 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744958_4134 replica FinalizedReplica, blk_1073744958_4134, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744958 for deletion
2025-07-11 20:38:02,901 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744958_4134 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744958
2025-07-11 20:38:59,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744959_4135 src: /192.168.158.5:51470 dest: /192.168.158.4:9866
2025-07-11 20:38:59,203 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51470, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_95795462_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744959_4135, duration(ns): 17868275
2025-07-11 20:38:59,204 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744959_4135, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-11 20:39:02,901 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744959_4135 replica FinalizedReplica, blk_1073744959_4135, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744959 for deletion
2025-07-11 20:39:02,903 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744959_4135 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744959
2025-07-11 20:39:59,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744960_4136 src: /192.168.158.6:36766 dest: /192.168.158.4:9866
2025-07-11 20:39:59,206 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36766, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-532051184_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744960_4136, duration(ns): 15897714
2025-07-11 20:39:59,206 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744960_4136, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-11 20:40:02,903 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744960_4136 replica FinalizedReplica, blk_1073744960_4136, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744960 for deletion
2025-07-11 20:40:02,905 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744960_4136 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744960
2025-07-11 20:44:14,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744964_4140 src: /192.168.158.5:40554 dest: /192.168.158.4:9866
2025-07-11 20:44:14,206 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40554, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2121354140_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744964_4140, duration(ns): 14946241
2025-07-11 20:44:14,206 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744964_4140, type=LAST_IN_PIPELINE terminating
2025-07-11 20:44:17,906 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744964_4140 replica FinalizedReplica, blk_1073744964_4140, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744964 for deletion
2025-07-11 20:44:17,908 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744964_4140 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744964
2025-07-11 20:49:29,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744969_4145 src: /192.168.158.5:44698 dest: /192.168.158.4:9866
2025-07-11
20:49:29,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44698, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1154387696_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744969_4145, duration(ns): 14189058 2025-07-11 20:49:29,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744969_4145, type=LAST_IN_PIPELINE terminating 2025-07-11 20:49:32,917 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744969_4145 replica FinalizedReplica, blk_1073744969_4145, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744969 for deletion 2025-07-11 20:49:32,918 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744969_4145 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744969 2025-07-11 20:51:29,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744971_4147 src: /192.168.158.8:45370 dest: /192.168.158.4:9866 2025-07-11 20:51:29,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45370, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-535799332_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744971_4147, duration(ns): 15836007 2025-07-11 20:51:29,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073744971_4147, type=LAST_IN_PIPELINE terminating 2025-07-11 20:51:35,924 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744971_4147 replica FinalizedReplica, blk_1073744971_4147, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744971 for deletion 2025-07-11 20:51:35,925 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744971_4147 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744971 2025-07-11 20:54:39,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744974_4150 src: /192.168.158.1:54920 dest: /192.168.158.4:9866 2025-07-11 20:54:39,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54920, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_482063879_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744974_4150, duration(ns): 20526005 2025-07-11 20:54:39,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744974_4150, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-11 20:54:47,927 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744974_4150 replica FinalizedReplica, blk_1073744974_4150, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744974 for deletion 2025-07-11 20:54:47,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744974_4150 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744974 2025-07-11 20:56:39,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744976_4152 src: /192.168.158.6:37096 dest: /192.168.158.4:9866 2025-07-11 20:56:39,223 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37096, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1620767375_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744976_4152, duration(ns): 19949863 2025-07-11 20:56:39,223 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744976_4152, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-11 20:56:44,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744976_4152 replica FinalizedReplica, blk_1073744976_4152, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744976 for deletion 2025-07-11 20:56:44,929 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744976_4152 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744976 
2025-07-11 20:57:39,206 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744977_4153 src: /192.168.158.6:38466 dest: /192.168.158.4:9866 2025-07-11 20:57:39,225 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38466, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2106236176_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744977_4153, duration(ns): 17875618 2025-07-11 20:57:39,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744977_4153, type=LAST_IN_PIPELINE terminating 2025-07-11 20:57:44,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744977_4153 replica FinalizedReplica, blk_1073744977_4153, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744977 for deletion 2025-07-11 20:57:44,929 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744977_4153 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744977 2025-07-11 20:58:44,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744978_4154 src: /192.168.158.5:44918 dest: /192.168.158.4:9866 2025-07-11 20:58:44,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44918, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_470684241_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073744978_4154, duration(ns): 19908891 2025-07-11 20:58:44,235 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744978_4154, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-11 20:58:50,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744978_4154 replica FinalizedReplica, blk_1073744978_4154, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744978 for deletion 2025-07-11 20:58:50,929 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744978_4154 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744978 2025-07-11 20:59:44,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744979_4155 src: /192.168.158.1:43754 dest: /192.168.158.4:9866 2025-07-11 20:59:44,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43754, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1791976983_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744979_4155, duration(ns): 21707453 2025-07-11 20:59:44,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744979_4155, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-11 20:59:47,929 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744979_4155 replica FinalizedReplica, blk_1073744979_4155, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744979 for deletion 2025-07-11 20:59:47,930 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744979_4155 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744979 2025-07-11 21:01:44,205 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744981_4157 src: /192.168.158.9:45436 dest: /192.168.158.4:9866 2025-07-11 21:01:44,222 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45436, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-732060906_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744981_4157, duration(ns): 15320381 2025-07-11 21:01:44,223 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744981_4157, type=LAST_IN_PIPELINE terminating 2025-07-11 21:01:47,937 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744981_4157 replica FinalizedReplica, blk_1073744981_4157, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744981 for deletion 2025-07-11 21:01:47,938 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744981_4157 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744981 2025-07-11 21:03:44,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744983_4159 src: /192.168.158.5:35110 dest: /192.168.158.4:9866 2025-07-11 21:03:44,228 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35110, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1717761567_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744983_4159, duration(ns): 15134513 2025-07-11 21:03:44,228 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744983_4159, type=LAST_IN_PIPELINE terminating 2025-07-11 21:03:47,944 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744983_4159 replica FinalizedReplica, blk_1073744983_4159, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744983 for deletion 2025-07-11 21:03:47,945 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744983_4159 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744983 2025-07-11 21:06:44,212 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744986_4162 src: /192.168.158.1:37852 dest: /192.168.158.4:9866 2025-07-11 
21:06:44,241 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37852, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1044756693_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744986_4162, duration(ns): 20710437 2025-07-11 21:06:44,242 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744986_4162, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-11 21:06:50,952 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744986_4162 replica FinalizedReplica, blk_1073744986_4162, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744986 for deletion 2025-07-11 21:06:50,953 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744986_4162 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744986 2025-07-11 21:07:44,217 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744987_4163 src: /192.168.158.7:53202 dest: /192.168.158.4:9866 2025-07-11 21:07:44,238 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53202, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2125360815_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744987_4163, duration(ns): 18237566 2025-07-11 21:07:44,238 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744987_4163, type=LAST_IN_PIPELINE terminating 2025-07-11 21:07:47,955 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744987_4163 replica FinalizedReplica, blk_1073744987_4163, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744987 for deletion 2025-07-11 21:07:47,956 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744987_4163 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744987 2025-07-11 21:09:44,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744989_4165 src: /192.168.158.8:59434 dest: /192.168.158.4:9866 2025-07-11 21:09:44,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59434, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1401101239_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744989_4165, duration(ns): 16684027 2025-07-11 21:09:44,238 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744989_4165, type=LAST_IN_PIPELINE terminating 2025-07-11 21:09:50,959 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744989_4165 replica FinalizedReplica, blk_1073744989_4165, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744989 for deletion 2025-07-11 21:09:50,960 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744989_4165 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744989 2025-07-11 21:10:44,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744990_4166 src: /192.168.158.6:41280 dest: /192.168.158.4:9866 2025-07-11 21:10:44,253 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41280, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1889648949_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744990_4166, duration(ns): 20573271 2025-07-11 21:10:44,253 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744990_4166, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-11 21:10:47,959 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744990_4166 replica FinalizedReplica, blk_1073744990_4166, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744990 for deletion 2025-07-11 21:10:47,960 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744990_4166 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744990 
2025-07-11 21:13:44,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744993_4169 src: /192.168.158.1:45694 dest: /192.168.158.4:9866 2025-07-11 21:13:44,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45694, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_570534173_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744993_4169, duration(ns): 26138953 2025-07-11 21:13:44,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744993_4169, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-11 21:13:47,962 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744993_4169 replica FinalizedReplica, blk_1073744993_4169, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744993 for deletion 2025-07-11 21:13:47,963 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744993_4169 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744993 2025-07-11 21:15:49,230 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744995_4171 src: /192.168.158.6:58590 dest: /192.168.158.4:9866 2025-07-11 21:15:49,249 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58590, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1573558025_106, 
offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744995_4171, duration(ns): 16186283 2025-07-11 21:15:49,249 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744995_4171, type=LAST_IN_PIPELINE terminating 2025-07-11 21:15:53,966 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744995_4171 replica FinalizedReplica, blk_1073744995_4171, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744995 for deletion 2025-07-11 21:15:53,967 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744995_4171 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744995 2025-07-11 21:17:49,238 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744997_4173 src: /192.168.158.9:59916 dest: /192.168.158.4:9866 2025-07-11 21:17:49,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59916, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1479850506_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744997_4173, duration(ns): 15740459 2025-07-11 21:17:49,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744997_4173, type=LAST_IN_PIPELINE terminating 2025-07-11 21:17:53,973 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744997_4173 
replica FinalizedReplica, blk_1073744997_4173, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744997 for deletion 2025-07-11 21:17:53,974 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073744997_4173 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744997 2025-07-11 21:18:49,235 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073744998_4174 src: /192.168.158.5:51494 dest: /192.168.158.4:9866 2025-07-11 21:18:49,259 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51494, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1873242451_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073744998_4174, duration(ns): 19123892 2025-07-11 21:18:49,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073744998_4174, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-11 21:18:53,976 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073744998_4174 replica FinalizedReplica, blk_1073744998_4174, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744998 for deletion 2025-07-11 21:18:53,977 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073744998_4174 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073744998
2025-07-11 21:25:54,261 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745005_4181 src: /192.168.158.6:44910 dest: /192.168.158.4:9866
2025-07-11 21:25:54,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44910, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1607819056_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745005_4181, duration(ns): 20299440
2025-07-11 21:25:54,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745005_4181, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-11 21:26:02,988 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745005_4181 replica FinalizedReplica, blk_1073745005_4181, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745005 for deletion
2025-07-11 21:26:02,989 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745005_4181 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745005
2025-07-11 21:31:04,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745010_4186 src: /192.168.158.6:57438 dest: /192.168.158.4:9866
2025-07-11 21:31:04,293 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57438, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1159250565_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745010_4186, duration(ns): 18697235
2025-07-11 21:31:04,293 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745010_4186, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-11 21:31:09,001 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745010_4186 replica FinalizedReplica, blk_1073745010_4186, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745010 for deletion
2025-07-11 21:31:09,002 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745010_4186 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745010
2025-07-11 21:32:04,286 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745011_4187 src: /192.168.158.7:37382 dest: /192.168.158.4:9866
2025-07-11 21:32:04,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37382, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1185236763_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745011_4187, duration(ns): 18118330
2025-07-11 21:32:04,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745011_4187, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-11 21:32:12,006 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745011_4187 replica FinalizedReplica, blk_1073745011_4187, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745011 for deletion
2025-07-11 21:32:12,007 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745011_4187 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745011
2025-07-11 21:35:04,278 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745014_4190 src: /192.168.158.7:42284 dest: /192.168.158.4:9866
2025-07-11 21:35:04,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42284, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-593160175_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745014_4190, duration(ns): 18572096
2025-07-11 21:35:04,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745014_4190, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-11 21:35:09,016 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745014_4190 replica FinalizedReplica, blk_1073745014_4190, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745014 for deletion
2025-07-11 21:35:09,017 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745014_4190 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745014
2025-07-11 21:36:09,274 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745015_4191 src: /192.168.158.1:56170 dest: /192.168.158.4:9866
2025-07-11 21:36:09,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56170, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1794813957_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745015_4191, duration(ns): 24970900
2025-07-11 21:36:09,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745015_4191, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-11 21:36:15,018 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745015_4191 replica FinalizedReplica, blk_1073745015_4191, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745015 for deletion
2025-07-11 21:36:15,019 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745015_4191 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745015
2025-07-11 21:38:14,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745017_4193 src: /192.168.158.6:40226 dest: /192.168.158.4:9866
2025-07-11 21:38:14,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40226, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-656050773_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745017_4193, duration(ns): 15318789
2025-07-11 21:38:14,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745017_4193, type=LAST_IN_PIPELINE terminating
2025-07-11 21:38:18,022 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745017_4193 replica FinalizedReplica, blk_1073745017_4193, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745017 for deletion
2025-07-11 21:38:18,023 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745017_4193 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745017
2025-07-11 21:40:14,283 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745019_4195 src: /192.168.158.1:38952 dest: /192.168.158.4:9866
2025-07-11 21:40:14,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38952, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-620947670_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745019_4195, duration(ns): 21704314
2025-07-11 21:40:14,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745019_4195, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-11 21:40:18,027 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745019_4195 replica FinalizedReplica, blk_1073745019_4195, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745019 for deletion
2025-07-11 21:40:18,028 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745019_4195 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745019
2025-07-11 21:42:19,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745021_4197 src: /192.168.158.1:33192 dest: /192.168.158.4:9866
2025-07-11 21:42:19,326 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33192, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1394952786_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745021_4197, duration(ns): 23680587
2025-07-11 21:42:19,328 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745021_4197, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-11 21:42:27,030 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745021_4197 replica FinalizedReplica, blk_1073745021_4197, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745021 for deletion
2025-07-11 21:42:27,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745021_4197 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745021
2025-07-11 21:44:19,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745023_4199 src: /192.168.158.1:46086 dest: /192.168.158.4:9866
2025-07-11 21:44:19,325 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46086, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1569230113_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745023_4199, duration(ns): 22030233
2025-07-11 21:44:19,325 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745023_4199, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-11 21:44:24,038 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745023_4199 replica FinalizedReplica, blk_1073745023_4199, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745023 for deletion
2025-07-11 21:44:24,039 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745023_4199 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745023
2025-07-11 21:48:29,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745027_4203 src: /192.168.158.9:42014 dest: /192.168.158.4:9866
2025-07-11 21:48:29,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42014, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1617671978_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745027_4203, duration(ns): 16825044
2025-07-11 21:48:29,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745027_4203, type=LAST_IN_PIPELINE terminating
2025-07-11 21:48:33,045 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745027_4203 replica FinalizedReplica, blk_1073745027_4203, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745027 for deletion
2025-07-11 21:48:33,046 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745027_4203 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745027
2025-07-11 21:49:29,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745028_4204 src: /192.168.158.5:44010 dest: /192.168.158.4:9866
2025-07-11 21:49:29,335 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44010, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-864111507_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745028_4204, duration(ns): 21634289
2025-07-11 21:49:29,335 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745028_4204, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-11 21:49:36,048 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745028_4204 replica FinalizedReplica, blk_1073745028_4204, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745028 for deletion
2025-07-11 21:49:36,049 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745028_4204 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745028
2025-07-11 21:53:29,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745032_4208 src: /192.168.158.1:44030 dest: /192.168.158.4:9866
2025-07-11 21:53:29,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44030, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1862439204_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745032_4208, duration(ns): 26846943
2025-07-11 21:53:29,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745032_4208, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-11 21:53:33,056 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745032_4208 replica FinalizedReplica, blk_1073745032_4208, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745032 for deletion
2025-07-11 21:53:33,057 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745032_4208 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745032
2025-07-11 21:54:29,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745033_4209 src: /192.168.158.5:52242 dest: /192.168.158.4:9866
2025-07-11 21:54:29,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52242, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1077041821_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745033_4209, duration(ns): 15679768
2025-07-11 21:54:29,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745033_4209, type=LAST_IN_PIPELINE terminating
2025-07-11 21:54:36,057 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745033_4209 replica FinalizedReplica, blk_1073745033_4209, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745033 for deletion
2025-07-11 21:54:36,058 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745033_4209 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745033
2025-07-11 21:59:34,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745038_4214 src: /192.168.158.7:58476 dest: /192.168.158.4:9866
2025-07-11 21:59:34,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58476, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-126629329_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745038_4214, duration(ns): 18227324
2025-07-11 21:59:34,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745038_4214, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-11 21:59:42,069 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745038_4214 replica FinalizedReplica, blk_1073745038_4214, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745038 for deletion
2025-07-11 21:59:42,070 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745038_4214 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745038
2025-07-11 22:07:49,348 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745046_4222 src: /192.168.158.9:36412 dest: /192.168.158.4:9866
2025-07-11 22:07:49,367 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36412, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1547990712_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745046_4222, duration(ns): 16341230
2025-07-11 22:07:49,367 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745046_4222, type=LAST_IN_PIPELINE terminating
2025-07-11 22:07:54,091 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745046_4222 replica FinalizedReplica, blk_1073745046_4222, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745046 for deletion
2025-07-11 22:07:54,092 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745046_4222 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745046
2025-07-11 22:10:49,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745049_4225 src: /192.168.158.8:51486 dest: /192.168.158.4:9866
2025-07-11 22:10:49,374 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51486, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1974233346_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745049_4225, duration(ns): 16438370
2025-07-11 22:10:49,374 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745049_4225, type=LAST_IN_PIPELINE terminating
2025-07-11 22:10:54,099 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745049_4225 replica FinalizedReplica, blk_1073745049_4225, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745049 for deletion
2025-07-11 22:10:54,100 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745049_4225 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745049
2025-07-11 22:11:54,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745050_4226 src: /192.168.158.1:33666 dest: /192.168.158.4:9866
2025-07-11 22:11:54,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33666, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-396726036_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745050_4226, duration(ns): 24906568
2025-07-11 22:11:54,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745050_4226, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-11 22:12:00,103 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745050_4226 replica FinalizedReplica, blk_1073745050_4226, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745050 for deletion
2025-07-11 22:12:00,104 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745050_4226 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745050
2025-07-11 22:13:54,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745052_4228 src: /192.168.158.8:46754 dest: /192.168.158.4:9866
2025-07-11 22:13:54,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46754, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_683449817_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745052_4228, duration(ns): 16144638
2025-07-11 22:13:54,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745052_4228, type=LAST_IN_PIPELINE terminating
2025-07-11 22:14:00,105 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745052_4228 replica FinalizedReplica, blk_1073745052_4228, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745052 for deletion
2025-07-11 22:14:00,107 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745052_4228 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745052
2025-07-11 22:14:54,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745053_4229 src: /192.168.158.6:60106 dest: /192.168.158.4:9866
2025-07-11 22:14:54,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60106, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1970466625_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745053_4229, duration(ns): 18642837
2025-07-11 22:14:54,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745053_4229, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-11 22:15:03,108 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745053_4229 replica FinalizedReplica, blk_1073745053_4229, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745053 for deletion
2025-07-11 22:15:03,109 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745053_4229 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745053
2025-07-11 22:15:54,357 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745054_4230 src: /192.168.158.5:55712 dest: /192.168.158.4:9866
2025-07-11 22:15:54,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1544443998_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745054_4230, duration(ns): 13069554
2025-07-11 22:15:54,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745054_4230, type=LAST_IN_PIPELINE terminating
2025-07-11 22:16:00,109 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745054_4230 replica FinalizedReplica, blk_1073745054_4230, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745054 for deletion
2025-07-11 22:16:00,110 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745054_4230 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745054
2025-07-11 22:19:54,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745058_4234 src: /192.168.158.1:38206 dest: /192.168.158.4:9866
2025-07-11 22:19:54,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38206, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_225915099_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745058_4234, duration(ns): 21537907
2025-07-11 22:19:54,380 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745058_4234, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-11 22:20:03,119 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745058_4234 replica FinalizedReplica, blk_1073745058_4234, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745058 for deletion
2025-07-11 22:20:03,120 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745058_4234 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745058
2025-07-11 22:21:59,374 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745060_4236 src: /192.168.158.5:52442 dest: /192.168.158.4:9866
2025-07-11 22:21:59,424 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52442, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2125081601_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745060_4236, duration(ns): 47006113
2025-07-11 22:21:59,424 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745060_4236, type=LAST_IN_PIPELINE terminating
2025-07-11 22:22:06,125 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745060_4236 replica FinalizedReplica, blk_1073745060_4236, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745060 for deletion
2025-07-11 22:22:06,126 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745060_4236 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745060
2025-07-11 22:22:59,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745061_4237 src: /192.168.158.9:56652 dest: /192.168.158.4:9866
2025-07-11 22:22:59,376 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56652, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1258389397_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745061_4237, duration(ns): 15059656
2025-07-11 22:22:59,376 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745061_4237, type=LAST_IN_PIPELINE terminating
2025-07-11 22:23:03,129 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745061_4237 replica FinalizedReplica, blk_1073745061_4237, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745061 for deletion
2025-07-11 22:23:03,130 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745061_4237 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745061
2025-07-11 22:28:04,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745066_4242 src: /192.168.158.1:50170 dest: /192.168.158.4:9866
2025-07-11 22:28:04,479 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50170, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1711569321_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745066_4242, duration(ns): 21824450
2025-07-11 22:28:04,480 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745066_4242, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-11 22:28:09,142 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745066_4242 replica FinalizedReplica, blk_1073745066_4242, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745066 for deletion
2025-07-11 22:28:09,143 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745066_4242 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745066
2025-07-11 22:31:04,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745069_4245 src: /192.168.158.1:60604 dest: /192.168.158.4:9866
2025-07-11 22:31:04,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60604, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-341164869_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745069_4245, duration(ns): 22648993
2025-07-11 22:31:04,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745069_4245, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-11 22:31:09,148 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745069_4245 replica FinalizedReplica, blk_1073745069_4245, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745069 for deletion
2025-07-11 22:31:09,149 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745069_4245 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745069
2025-07-11 22:33:04,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745071_4247 src: /192.168.158.7:40984 dest: /192.168.158.4:9866
2025-07-11 22:33:04,434 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40984, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2114019906_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745071_4247, duration(ns): 18463553
2025-07-11 22:33:04,434 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745071_4247, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-11 22:33:09,149 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745071_4247 replica FinalizedReplica, blk_1073745071_4247, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745071 for deletion
2025-07-11 22:33:09,150 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745071_4247 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745071
2025-07-11 22:35:04,389 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745073_4249 src: /192.168.158.5:36432 dest: /192.168.158.4:9866
2025-07-11 22:35:04,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36432, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1030100980_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745073_4249, duration(ns): 19124581
2025-07-11 22:35:04,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745073_4249, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-11 22:35:12,154 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745073_4249 replica FinalizedReplica, blk_1073745073_4249, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn
getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745073 for deletion 2025-07-11 22:35:12,156 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745073_4249 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745073 2025-07-11 22:39:09,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745077_4253 src: /192.168.158.1:35208 dest: /192.168.158.4:9866 2025-07-11 22:39:09,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35208, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1192016055_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745077_4253, duration(ns): 23540813 2025-07-11 22:39:09,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745077_4253, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-11 22:39:15,167 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745077_4253 replica FinalizedReplica, blk_1073745077_4253, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745077 for deletion 2025-07-11 22:39:15,169 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745077_4253 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745077 2025-07-11 22:44:09,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745082_4258 src: /192.168.158.8:34586 dest: /192.168.158.4:9866 2025-07-11 22:44:09,431 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34586, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1283647356_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745082_4258, duration(ns): 20361338 2025-07-11 22:44:09,431 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745082_4258, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-11 22:44:15,180 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745082_4258 replica FinalizedReplica, blk_1073745082_4258, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745082 for deletion 2025-07-11 22:44:15,181 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745082_4258 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745082 2025-07-11 22:48:09,422 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745086_4262 src: /192.168.158.8:57048 dest: /192.168.158.4:9866 2025-07-11 22:48:09,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57048, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_599371954_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745086_4262, duration(ns): 14637095 2025-07-11 22:48:09,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745086_4262, type=LAST_IN_PIPELINE terminating 2025-07-11 22:48:15,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745086_4262 replica FinalizedReplica, blk_1073745086_4262, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745086 for deletion 2025-07-11 22:48:15,187 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745086_4262 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745086 2025-07-11 22:50:09,422 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745088_4264 src: /192.168.158.6:36808 dest: /192.168.158.4:9866 2025-07-11 22:50:09,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36808, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_256534015_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745088_4264, duration(ns): 18271151 2025-07-11 22:50:09,446 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745088_4264, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 
2025-07-11 22:50:15,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745088_4264 replica FinalizedReplica, blk_1073745088_4264, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745088 for deletion 2025-07-11 22:50:15,187 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745088_4264 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745088 2025-07-11 22:51:09,430 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745089_4265 src: /192.168.158.7:55370 dest: /192.168.158.4:9866 2025-07-11 22:51:09,448 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55370, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-946315266_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745089_4265, duration(ns): 15328996 2025-07-11 22:51:09,448 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745089_4265, type=LAST_IN_PIPELINE terminating 2025-07-11 22:51:18,187 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745089_4265 replica FinalizedReplica, blk_1073745089_4265, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745089 for deletion 2025-07-11 22:51:18,189 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745089_4265 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745089 2025-07-11 22:55:19,420 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745093_4269 src: /192.168.158.1:51356 dest: /192.168.158.4:9866 2025-07-11 22:55:19,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51356, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1880089726_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745093_4269, duration(ns): 21917845 2025-07-11 22:55:19,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745093_4269, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-11 22:55:27,195 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745093_4269 replica FinalizedReplica, blk_1073745093_4269, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745093 for deletion 2025-07-11 22:55:27,196 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745093_4269 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745093 2025-07-11 22:58:24,437 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073745096_4272 src: /192.168.158.7:48588 dest: /192.168.158.4:9866 2025-07-11 22:58:24,455 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48588, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1486920938_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745096_4272, duration(ns): 16340961 2025-07-11 22:58:24,456 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745096_4272, type=LAST_IN_PIPELINE terminating 2025-07-11 22:58:33,201 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745096_4272 replica FinalizedReplica, blk_1073745096_4272, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745096 for deletion 2025-07-11 22:58:33,202 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745096_4272 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745096 2025-07-11 23:00:29,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745098_4274 src: /192.168.158.7:33284 dest: /192.168.158.4:9866 2025-07-11 23:00:29,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33284, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_829816190_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745098_4274, duration(ns): 20196528 
2025-07-11 23:00:29,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745098_4274, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-11 23:00:33,208 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745098_4274 replica FinalizedReplica, blk_1073745098_4274, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745098 for deletion 2025-07-11 23:00:33,210 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745098_4274 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745098 2025-07-11 23:01:29,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745099_4275 src: /192.168.158.8:60794 dest: /192.168.158.4:9866 2025-07-11 23:01:29,457 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60794, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1758161372_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745099_4275, duration(ns): 16128075 2025-07-11 23:01:29,458 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745099_4275, type=LAST_IN_PIPELINE terminating 2025-07-11 23:01:36,212 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745099_4275 replica FinalizedReplica, blk_1073745099_4275, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745099 for deletion 2025-07-11 23:01:36,213 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745099_4275 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745099 2025-07-11 23:04:29,447 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745102_4278 src: /192.168.158.7:34490 dest: /192.168.158.4:9866 2025-07-11 23:04:29,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34490, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_852604143_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745102_4278, duration(ns): 15381053 2025-07-11 23:04:29,466 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745102_4278, type=LAST_IN_PIPELINE terminating 2025-07-11 23:04:36,220 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745102_4278 replica FinalizedReplica, blk_1073745102_4278, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745102 for deletion 2025-07-11 23:04:36,221 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745102_4278 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745102 2025-07-11 23:05:29,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745103_4279 src: /192.168.158.6:38718 dest: /192.168.158.4:9866 2025-07-11 23:05:29,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38718, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_139102714_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745103_4279, duration(ns): 16335382 2025-07-11 23:05:29,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745103_4279, type=LAST_IN_PIPELINE terminating 2025-07-11 23:05:33,220 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745103_4279 replica FinalizedReplica, blk_1073745103_4279, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745103 for deletion 2025-07-11 23:05:33,221 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745103_4279 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745103 2025-07-11 23:06:29,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745104_4280 src: /192.168.158.5:59174 dest: /192.168.158.4:9866 2025-07-11 23:06:29,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59174, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_21451099_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745104_4280, duration(ns): 15922861 2025-07-11 23:06:29,471 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745104_4280, type=LAST_IN_PIPELINE terminating 2025-07-11 23:06:33,225 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745104_4280 replica FinalizedReplica, blk_1073745104_4280, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745104 for deletion 2025-07-11 23:06:33,226 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745104_4280 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745104 2025-07-11 23:07:29,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745105_4281 src: /192.168.158.1:51186 dest: /192.168.158.4:9866 2025-07-11 23:07:29,472 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51186, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2141011580_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745105_4281, duration(ns): 21357178 2025-07-11 23:07:29,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745105_4281, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-11 23:07:33,228 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745105_4281 replica FinalizedReplica, blk_1073745105_4281, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745105 for deletion 2025-07-11 23:07:33,229 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745105_4281 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745105 2025-07-11 23:08:34,453 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745106_4282 src: /192.168.158.8:53800 dest: /192.168.158.4:9866 2025-07-11 23:08:34,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53800, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1923594236_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745106_4282, duration(ns): 15092888 2025-07-11 23:08:34,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745106_4282, type=LAST_IN_PIPELINE terminating 2025-07-11 23:08:42,229 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745106_4282 replica FinalizedReplica, blk_1073745106_4282, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745106 for deletion 2025-07-11 23:08:42,231 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745106_4282 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745106 2025-07-11 23:16:39,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745114_4290 src: /192.168.158.8:39912 dest: /192.168.158.4:9866 2025-07-11 23:16:39,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39912, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-394044636_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745114_4290, duration(ns): 17384088 2025-07-11 23:16:39,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745114_4290, type=LAST_IN_PIPELINE terminating 2025-07-11 23:16:45,250 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745114_4290 replica FinalizedReplica, blk_1073745114_4290, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745114 for deletion 2025-07-11 23:16:45,251 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745114_4290 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745114 2025-07-11 23:18:39,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745116_4292 src: /192.168.158.8:47496 dest: /192.168.158.4:9866 2025-07-11 
23:18:39,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47496, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1884439016_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745116_4292, duration(ns): 17617887 2025-07-11 23:18:39,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745116_4292, type=LAST_IN_PIPELINE terminating 2025-07-11 23:18:48,254 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745116_4292 replica FinalizedReplica, blk_1073745116_4292, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745116 for deletion 2025-07-11 23:18:48,255 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745116_4292 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745116 2025-07-11 23:21:39,501 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745119_4295 src: /192.168.158.6:37316 dest: /192.168.158.4:9866 2025-07-11 23:21:39,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37316, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_344075505_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745119_4295, duration(ns): 19208685 2025-07-11 23:21:39,526 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073745119_4295, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-11 23:21:45,257 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745119_4295 replica FinalizedReplica, blk_1073745119_4295, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745119 for deletion 2025-07-11 23:21:45,258 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745119_4295 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745119 2025-07-11 23:23:44,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745121_4297 src: /192.168.158.7:57940 dest: /192.168.158.4:9866 2025-07-11 23:23:44,511 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57940, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1161874871_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745121_4297, duration(ns): 15264627 2025-07-11 23:23:44,511 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745121_4297, type=LAST_IN_PIPELINE terminating 2025-07-11 23:23:48,261 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745121_4297 replica FinalizedReplica, blk_1073745121_4297, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745121 for deletion 2025-07-11 23:23:48,262 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745121_4297 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745121 2025-07-11 23:24:49,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745122_4298 src: /192.168.158.5:38868 dest: /192.168.158.4:9866 2025-07-11 23:24:49,509 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38868, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1491154265_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745122_4298, duration(ns): 19034573 2025-07-11 23:24:49,509 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745122_4298, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-11 23:24:54,261 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745122_4298 replica FinalizedReplica, blk_1073745122_4298, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745122 for deletion 2025-07-11 23:24:54,262 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745122_4298 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745122 
2025-07-11 23:27:49,513 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745125_4301 src: /192.168.158.1:55574 dest: /192.168.158.4:9866
2025-07-11 23:27:49,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55574, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-847521456_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745125_4301, duration(ns): 20498981
2025-07-11 23:27:49,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745125_4301, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-11 23:27:54,268 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745125_4301 replica FinalizedReplica, blk_1073745125_4301, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745125 for deletion
2025-07-11 23:27:54,269 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745125_4301 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745125
2025-07-11 23:28:49,515 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745126_4302 src: /192.168.158.7:58082 dest: /192.168.158.4:9866
2025-07-11 23:28:49,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58082, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1636535190_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745126_4302, duration(ns): 16402185
2025-07-11 23:28:49,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745126_4302, type=LAST_IN_PIPELINE terminating
2025-07-11 23:28:57,271 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745126_4302 replica FinalizedReplica, blk_1073745126_4302, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745126 for deletion
2025-07-11 23:28:57,272 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745126_4302 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745126
2025-07-11 23:29:49,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745127_4303 src: /192.168.158.7:58378 dest: /192.168.158.4:9866
2025-07-11 23:29:49,539 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58378, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-232371235_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745127_4303, duration(ns): 17651706
2025-07-11 23:29:49,539 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745127_4303, type=LAST_IN_PIPELINE terminating
2025-07-11 23:29:57,272 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745127_4303 replica FinalizedReplica, blk_1073745127_4303, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745127 for deletion
2025-07-11 23:29:57,274 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745127_4303 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745127
2025-07-11 23:30:49,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745128_4304 src: /192.168.158.9:57498 dest: /192.168.158.4:9866
2025-07-11 23:30:49,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57498, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1773294004_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745128_4304, duration(ns): 18324755
2025-07-11 23:30:49,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745128_4304, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-11 23:30:54,274 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745128_4304 replica FinalizedReplica, blk_1073745128_4304, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745128 for deletion
2025-07-11 23:30:54,275 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745128_4304 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745128
2025-07-11 23:32:54,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745130_4306 src: /192.168.158.7:33476 dest: /192.168.158.4:9866
2025-07-11 23:32:54,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33476, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-390312305_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745130_4306, duration(ns): 16223442
2025-07-11 23:32:54,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745130_4306, type=LAST_IN_PIPELINE terminating
2025-07-11 23:33:03,280 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745130_4306 replica FinalizedReplica, blk_1073745130_4306, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745130 for deletion
2025-07-11 23:33:03,281 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745130_4306 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745130
2025-07-11 23:34:59,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745132_4308 src: /192.168.158.6:58222 dest: /192.168.158.4:9866
2025-07-11 23:34:59,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58222, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-800042573_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745132_4308, duration(ns): 20429668
2025-07-11 23:34:59,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745132_4308, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-11 23:35:06,281 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745132_4308 replica FinalizedReplica, blk_1073745132_4308, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745132 for deletion
2025-07-11 23:35:06,282 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745132_4308 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745132
2025-07-11 23:35:59,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745133_4309 src: /192.168.158.1:57078 dest: /192.168.158.4:9866
2025-07-11 23:35:59,547 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57078, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_625089022_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745133_4309, duration(ns): 20370841
2025-07-11 23:35:59,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745133_4309, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-11 23:36:03,282 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745133_4309 replica FinalizedReplica, blk_1073745133_4309, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745133 for deletion
2025-07-11 23:36:03,283 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745133_4309 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745133
2025-07-11 23:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-11 23:37:21,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f2e, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 1 msec to generate and 5 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-11 23:37:21,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-11 23:43:09,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745140_4316 src: /192.168.158.1:37664 dest: /192.168.158.4:9866
2025-07-11 23:43:09,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37664, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1713959146_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745140_4316, duration(ns): 21477736
2025-07-11 23:43:09,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745140_4316, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-11 23:43:12,293 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745140_4316 replica FinalizedReplica, blk_1073745140_4316, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745140 for deletion
2025-07-11 23:43:12,295 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745140_4316 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745140
2025-07-11 23:44:14,539 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745141_4317 src: /192.168.158.1:37460 dest: /192.168.158.4:9866
2025-07-11 23:44:14,570 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37460, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-522544007_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745141_4317, duration(ns): 22610514
2025-07-11 23:44:14,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745141_4317, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-11 23:44:18,297 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745141_4317 replica FinalizedReplica, blk_1073745141_4317, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745141 for deletion
2025-07-11 23:44:18,298 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745141_4317 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745141
2025-07-11 23:45:19,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745142_4318 src: /192.168.158.6:59106 dest: /192.168.158.4:9866
2025-07-11 23:45:19,558 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59106, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_242019561_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745142_4318, duration(ns): 15393140
2025-07-11 23:45:19,558 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745142_4318, type=LAST_IN_PIPELINE terminating
2025-07-11 23:45:27,299 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745142_4318 replica FinalizedReplica, blk_1073745142_4318, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745142 for deletion
2025-07-11 23:45:27,300 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745142_4318 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745142
2025-07-11 23:48:29,530 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745145_4321 src: /192.168.158.1:46924 dest: /192.168.158.4:9866
2025-07-11 23:48:29,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46924, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-737258379_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745145_4321, duration(ns): 25807283
2025-07-11 23:48:29,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745145_4321, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-11 23:48:36,305 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745145_4321 replica FinalizedReplica, blk_1073745145_4321, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745145 for deletion
2025-07-11 23:48:36,306 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745145_4321 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745145
2025-07-11 23:51:34,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745148_4324 src: /192.168.158.6:53772 dest: /192.168.158.4:9866
2025-07-11 23:51:34,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53772, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1046181535_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745148_4324, duration(ns): 16807986
2025-07-11 23:51:34,560 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745148_4324, type=LAST_IN_PIPELINE terminating
2025-07-11 23:51:39,309 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745148_4324 replica FinalizedReplica, blk_1073745148_4324, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745148 for deletion
2025-07-11 23:51:39,310 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745148_4324 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745148
2025-07-11 23:52:34,544 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745149_4325 src: /192.168.158.6:55724 dest: /192.168.158.4:9866
2025-07-11 23:52:34,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55724, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-858503696_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745149_4325, duration(ns): 15115919
2025-07-11 23:52:34,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745149_4325, type=LAST_IN_PIPELINE terminating
2025-07-11 23:52:39,309 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745149_4325 replica FinalizedReplica, blk_1073745149_4325, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745149 for deletion
2025-07-11 23:52:39,310 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745149_4325 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745149
2025-07-11 23:54:39,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745151_4327 src: /192.168.158.5:35156 dest: /192.168.158.4:9866
2025-07-11 23:54:39,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35156, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1031686683_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745151_4327, duration(ns): 17374790
2025-07-11 23:54:39,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745151_4327, type=LAST_IN_PIPELINE terminating
2025-07-11 23:54:45,315 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745151_4327 replica FinalizedReplica, blk_1073745151_4327, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745151 for deletion
2025-07-11 23:54:45,316 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745151_4327 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073745151
2025-07-11 23:56:39,540 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745153_4329 src: /192.168.158.8:48800 dest: /192.168.158.4:9866
2025-07-11 23:56:39,558 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48800, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_809546927_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745153_4329, duration(ns): 15102764
2025-07-11 23:56:39,558 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745153_4329, type=LAST_IN_PIPELINE terminating
2025-07-11 23:56:42,319 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745153_4329 replica FinalizedReplica, blk_1073745153_4329, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745153 for deletion
2025-07-11 23:56:42,320 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745153_4329 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745153
2025-07-11 23:57:39,544 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745154_4330 src: /192.168.158.6:50378 dest: /192.168.158.4:9866
2025-07-11 23:57:39,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50378, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1556889223_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745154_4330, duration(ns): 14346726
2025-07-11 23:57:39,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745154_4330, type=LAST_IN_PIPELINE terminating
2025-07-11 23:57:42,319 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745154_4330 replica FinalizedReplica, blk_1073745154_4330, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745154 for deletion
2025-07-11 23:57:42,321 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745154_4330 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745154
2025-07-11 23:59:39,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745156_4332 src: /192.168.158.7:37758 dest: /192.168.158.4:9866
2025-07-11 23:59:39,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37758, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-117209321_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745156_4332, duration(ns): 15706136
2025-07-11 23:59:39,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745156_4332, type=LAST_IN_PIPELINE terminating
2025-07-11 23:59:45,324 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745156_4332 replica FinalizedReplica, blk_1073745156_4332, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745156 for deletion
2025-07-11 23:59:45,325 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745156_4332 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745156
2025-07-12 00:00:39,552 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745157_4333 src: /192.168.158.9:51260 dest: /192.168.158.4:9866
2025-07-12 00:00:39,570 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51260, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-250650129_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745157_4333, duration(ns): 15630931
2025-07-12 00:00:39,570 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745157_4333, type=LAST_IN_PIPELINE terminating
2025-07-12 00:00:42,327 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745157_4333 replica FinalizedReplica, blk_1073745157_4333, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745157 for deletion
2025-07-12 00:00:42,328 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745157_4333 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745157
2025-07-12 00:01:44,548 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745158_4334 src: /192.168.158.8:54324 dest: /192.168.158.4:9866
2025-07-12 00:01:44,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54324, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-80161095_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745158_4334, duration(ns): 18632479
2025-07-12 00:01:44,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745158_4334, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 00:01:51,331 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745158_4334 replica FinalizedReplica, blk_1073745158_4334, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745158 for deletion
2025-07-12 00:01:51,332 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745158_4334 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745158
2025-07-12 00:02:44,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745159_4335 src: /192.168.158.9:48494 dest: /192.168.158.4:9866
2025-07-12 00:02:44,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48494, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_899712265_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745159_4335, duration(ns): 19505242
2025-07-12 00:02:44,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745159_4335, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 00:02:48,335 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745159_4335 replica FinalizedReplica, blk_1073745159_4335, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745159 for deletion
2025-07-12 00:02:48,336 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745159_4335 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745159
2025-07-12 00:04:54,554 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745161_4337 src: /192.168.158.8:35794 dest: /192.168.158.4:9866
2025-07-12 00:04:54,574 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35794, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_535173161_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745161_4337, duration(ns): 18523315
2025-07-12 00:04:54,575 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745161_4337, type=LAST_IN_PIPELINE terminating
2025-07-12 00:05:00,339 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745161_4337 replica FinalizedReplica, blk_1073745161_4337, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745161 for deletion
2025-07-12 00:05:00,341 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745161_4337 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745161
2025-07-12 00:05:54,552 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745162_4338 src: /192.168.158.1:50168 dest: /192.168.158.4:9866
2025-07-12 00:05:54,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50168, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_683321355_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745162_4338, duration(ns): 22933161
2025-07-12 00:05:54,584 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745162_4338, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-12 00:05:57,340 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745162_4338 replica FinalizedReplica, blk_1073745162_4338, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745162 for deletion
2025-07-12 00:05:57,341 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745162_4338 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745162
2025-07-12 00:06:54,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745163_4339 src: /192.168.158.1:54568 dest: /192.168.158.4:9866
2025-07-12 00:06:54,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54568, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_995677145_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745163_4339, duration(ns): 23366397
2025-07-12 00:06:54,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745163_4339, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-12 00:06:57,343 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745163_4339 replica FinalizedReplica, blk_1073745163_4339, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745163 for deletion
2025-07-12 00:06:57,344 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745163_4339 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745163
2025-07-12 00:07:54,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745164_4340 src: /192.168.158.1:41070 dest: /192.168.158.4:9866
2025-07-12 00:07:54,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41070, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1984521940_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745164_4340, duration(ns): 22196754
2025-07-12 00:07:54,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745164_4340, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-12 00:08:00,345 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745164_4340 replica FinalizedReplica, blk_1073745164_4340,
FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745164 for deletion 2025-07-12 00:08:00,346 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745164_4340 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745164 2025-07-12 00:11:54,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745168_4344 src: /192.168.158.1:46984 dest: /192.168.158.4:9866 2025-07-12 00:11:54,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46984, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-832085487_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745168_4344, duration(ns): 19514902 2025-07-12 00:11:54,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745168_4344, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-12 00:11:57,353 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745168_4344 replica FinalizedReplica, blk_1073745168_4344, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745168 for deletion 2025-07-12 00:11:57,354 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 
blk_1073745168_4344 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745168 2025-07-12 00:16:59,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745173_4349 src: /192.168.158.1:43780 dest: /192.168.158.4:9866 2025-07-12 00:16:59,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1145603416_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745173_4349, duration(ns): 21638784 2025-07-12 00:16:59,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745173_4349, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-12 00:17:03,366 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745173_4349 replica FinalizedReplica, blk_1073745173_4349, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745173 for deletion 2025-07-12 00:17:03,367 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745173_4349 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745173 2025-07-12 00:17:59,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745174_4350 src: /192.168.158.5:48496 dest: /192.168.158.4:9866 2025-07-12 00:17:59,612 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48496, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-548607993_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745174_4350, duration(ns): 14572506 2025-07-12 00:17:59,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745174_4350, type=LAST_IN_PIPELINE terminating 2025-07-12 00:18:03,367 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745174_4350 replica FinalizedReplica, blk_1073745174_4350, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745174 for deletion 2025-07-12 00:18:03,369 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745174_4350 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745174 2025-07-12 00:18:59,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745175_4351 src: /192.168.158.1:59936 dest: /192.168.158.4:9866 2025-07-12 00:18:59,620 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59936, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-866480944_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745175_4351, duration(ns): 22085535 2025-07-12 00:18:59,620 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073745175_4351, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-12 00:19:03,369 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745175_4351 replica FinalizedReplica, blk_1073745175_4351, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745175 for deletion 2025-07-12 00:19:03,371 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745175_4351 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745175 2025-07-12 00:20:59,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745177_4353 src: /192.168.158.8:49832 dest: /192.168.158.4:9866 2025-07-12 00:20:59,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49832, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_557240403_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745177_4353, duration(ns): 19115522 2025-07-12 00:20:59,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745177_4353, type=LAST_IN_PIPELINE terminating 2025-07-12 00:21:06,372 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745177_4353 replica FinalizedReplica, blk_1073745177_4353, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745177 for deletion 2025-07-12 00:21:06,373 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745177_4353 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745177 2025-07-12 00:21:59,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745178_4354 src: /192.168.158.6:48014 dest: /192.168.158.4:9866 2025-07-12 00:21:59,624 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48014, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1447719960_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745178_4354, duration(ns): 19482233 2025-07-12 00:21:59,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745178_4354, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-12 00:22:03,375 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745178_4354 replica FinalizedReplica, blk_1073745178_4354, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745178 for deletion 2025-07-12 00:22:03,376 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745178_4354 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745178 
2025-07-12 00:24:04,609 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745180_4356 src: /192.168.158.6:53974 dest: /192.168.158.4:9866 2025-07-12 00:24:04,628 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53974, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1925867806_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745180_4356, duration(ns): 15690854 2025-07-12 00:24:04,628 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745180_4356, type=LAST_IN_PIPELINE terminating 2025-07-12 00:24:12,382 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745180_4356 replica FinalizedReplica, blk_1073745180_4356, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745180 for deletion 2025-07-12 00:24:12,383 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745180_4356 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745180 2025-07-12 00:27:04,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745183_4359 src: /192.168.158.8:49116 dest: /192.168.158.4:9866 2025-07-12 00:27:04,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49116, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2067470171_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073745183_4359, duration(ns): 15570402 2025-07-12 00:27:04,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745183_4359, type=LAST_IN_PIPELINE terminating 2025-07-12 00:27:09,386 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745183_4359 replica FinalizedReplica, blk_1073745183_4359, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745183 for deletion 2025-07-12 00:27:09,387 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745183_4359 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745183 2025-07-12 00:28:04,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745184_4360 src: /192.168.158.8:41916 dest: /192.168.158.4:9866 2025-07-12 00:28:04,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41916, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1154125316_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745184_4360, duration(ns): 18474157 2025-07-12 00:28:04,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745184_4360, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-12 00:28:12,387 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745184_4360 replica FinalizedReplica, 
blk_1073745184_4360, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745184 for deletion 2025-07-12 00:28:12,388 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745184_4360 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745184 2025-07-12 00:30:14,619 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745186_4362 src: /192.168.158.8:39236 dest: /192.168.158.4:9866 2025-07-12 00:30:14,642 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39236, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_346417008_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745186_4362, duration(ns): 17869515 2025-07-12 00:30:14,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745186_4362, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-12 00:30:21,392 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745186_4362 replica FinalizedReplica, blk_1073745186_4362, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745186 for deletion 2025-07-12 00:30:21,393 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 
blk_1073745186_4362 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745186 2025-07-12 00:32:14,620 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745188_4364 src: /192.168.158.9:56716 dest: /192.168.158.4:9866 2025-07-12 00:32:14,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56716, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1561181221_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745188_4364, duration(ns): 18518608 2025-07-12 00:32:14,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745188_4364, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-12 00:32:18,397 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745188_4364 replica FinalizedReplica, blk_1073745188_4364, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745188 for deletion 2025-07-12 00:32:18,398 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745188_4364 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745188 2025-07-12 00:34:14,620 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745190_4366 src: /192.168.158.1:40326 dest: /192.168.158.4:9866 2025-07-12 00:34:14,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:40326, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_592834874_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745190_4366, duration(ns): 21470126 2025-07-12 00:34:14,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745190_4366, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-12 00:34:18,404 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745190_4366 replica FinalizedReplica, blk_1073745190_4366, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745190 for deletion 2025-07-12 00:34:18,406 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745190_4366 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745190 2025-07-12 00:36:19,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745192_4368 src: /192.168.158.7:58452 dest: /192.168.158.4:9866 2025-07-12 00:36:19,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58452, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_121238534_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745192_4368, duration(ns): 19156713 2025-07-12 00:36:19,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073745192_4368, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-12 00:36:27,408 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745192_4368 replica FinalizedReplica, blk_1073745192_4368, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745192 for deletion 2025-07-12 00:36:27,409 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745192_4368 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745192 2025-07-12 00:38:24,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745194_4370 src: /192.168.158.8:48080 dest: /192.168.158.4:9866 2025-07-12 00:38:24,678 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48080, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-116168284_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745194_4370, duration(ns): 16493696 2025-07-12 00:38:24,678 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745194_4370, type=LAST_IN_PIPELINE terminating 2025-07-12 00:38:30,413 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745194_4370 replica FinalizedReplica, blk_1073745194_4370, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745194 for deletion 2025-07-12 00:38:30,414 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745194_4370 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745194 2025-07-12 00:39:24,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745195_4371 src: /192.168.158.9:51478 dest: /192.168.158.4:9866 2025-07-12 00:39:24,675 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51478, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-987495086_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745195_4371, duration(ns): 13721487 2025-07-12 00:39:24,676 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745195_4371, type=LAST_IN_PIPELINE terminating 2025-07-12 00:39:27,415 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745195_4371 replica FinalizedReplica, blk_1073745195_4371, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745195 for deletion 2025-07-12 00:39:27,417 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745195_4371 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745195 2025-07-12 00:43:34,667 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745199_4375 src: /192.168.158.1:56774 dest: /192.168.158.4:9866 2025-07-12 00:43:34,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56774, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_259599320_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745199_4375, duration(ns): 23583416 2025-07-12 00:43:34,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745199_4375, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-12 00:43:39,429 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745199_4375 replica FinalizedReplica, blk_1073745199_4375, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745199 for deletion 2025-07-12 00:43:39,430 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745199_4375 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745199 2025-07-12 00:44:39,673 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745200_4376 src: /192.168.158.1:38934 dest: /192.168.158.4:9866 2025-07-12 00:44:39,704 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38934, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1346474224_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745200_4376, duration(ns): 23040485 2025-07-12 00:44:39,705 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745200_4376, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-12 00:44:42,432 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745200_4376 replica FinalizedReplica, blk_1073745200_4376, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745200 for deletion 2025-07-12 00:44:42,433 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745200_4376 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745200 2025-07-12 00:45:39,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745201_4377 src: /192.168.158.5:54390 dest: /192.168.158.4:9866 2025-07-12 00:45:39,697 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54390, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_976721971_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745201_4377, duration(ns): 15844942 2025-07-12 00:45:39,697 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745201_4377, type=LAST_IN_PIPELINE terminating 2025-07-12 00:45:42,434 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745201_4377 replica FinalizedReplica, blk_1073745201_4377, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745201 for deletion
2025-07-12 00:45:42,436 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745201_4377 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745201
2025-07-12 00:46:44,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745202_4378 src: /192.168.158.1:49954 dest: /192.168.158.4:9866
2025-07-12 00:46:44,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49954, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1841707197_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745202_4378, duration(ns): 26041854
2025-07-12 00:46:44,697 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745202_4378, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-12 00:46:51,437 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745202_4378 replica FinalizedReplica, blk_1073745202_4378, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745202 for deletion
2025-07-12 00:46:51,438 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745202_4378 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745202
2025-07-12 00:47:49,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745203_4379 src: /192.168.158.1:38796 dest: /192.168.158.4:9866
2025-07-12 00:47:49,713 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38796, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1218482904_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745203_4379, duration(ns): 22918081
2025-07-12 00:47:49,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745203_4379, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-12 00:47:54,438 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745203_4379 replica FinalizedReplica, blk_1073745203_4379, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745203 for deletion
2025-07-12 00:47:54,439 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745203_4379 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745203
2025-07-12 00:49:54,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745205_4381 src: /192.168.158.1:57228 dest: /192.168.158.4:9866
2025-07-12 00:49:54,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57228, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1501882088_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745205_4381, duration(ns): 20092126
2025-07-12 00:49:54,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745205_4381, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-12 00:50:00,442 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745205_4381 replica FinalizedReplica, blk_1073745205_4381, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745205 for deletion
2025-07-12 00:50:00,443 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745205_4381 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745205
2025-07-12 00:52:54,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745208_4384 src: /192.168.158.5:59148 dest: /192.168.158.4:9866
2025-07-12 00:52:54,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59148, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_28461995_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745208_4384, duration(ns): 14393207
2025-07-12 00:52:54,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745208_4384, type=LAST_IN_PIPELINE terminating
2025-07-12 00:52:57,447 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745208_4384 replica FinalizedReplica, blk_1073745208_4384, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745208 for deletion
2025-07-12 00:52:57,448 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745208_4384 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745208
2025-07-12 00:54:54,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745210_4386 src: /192.168.158.9:50062 dest: /192.168.158.4:9866
2025-07-12 00:54:54,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50062, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1813208967_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745210_4386, duration(ns): 19568447
2025-07-12 00:54:54,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745210_4386, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 00:55:00,452 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745210_4386 replica FinalizedReplica, blk_1073745210_4386, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745210 for deletion
2025-07-12 00:55:00,453 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745210_4386 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745210
2025-07-12 00:59:04,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745214_4390 src: /192.168.158.1:35680 dest: /192.168.158.4:9866
2025-07-12 00:59:04,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35680, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1870308886_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745214_4390, duration(ns): 24544817
2025-07-12 00:59:04,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745214_4390, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-12 00:59:09,462 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745214_4390 replica FinalizedReplica, blk_1073745214_4390, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745214 for deletion
2025-07-12 00:59:09,463 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745214_4390 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745214
2025-07-12 01:04:04,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745219_4395 src: /192.168.158.1:36240 dest: /192.168.158.4:9866
2025-07-12 01:04:04,699 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36240, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1038482004_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745219_4395, duration(ns): 22372565
2025-07-12 01:04:04,699 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745219_4395, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-12 01:04:09,480 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745219_4395 replica FinalizedReplica, blk_1073745219_4395, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745219 for deletion
2025-07-12 01:04:09,481 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745219_4395 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745219
2025-07-12 01:05:09,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745220_4396 src: /192.168.158.9:41692 dest: /192.168.158.4:9866
2025-07-12 01:05:09,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41692, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-106588967_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745220_4396, duration(ns): 16317390
2025-07-12 01:05:09,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745220_4396, type=LAST_IN_PIPELINE terminating
2025-07-12 01:05:15,484 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745220_4396 replica FinalizedReplica, blk_1073745220_4396, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745220 for deletion
2025-07-12 01:05:15,485 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745220_4396 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745220
2025-07-12 01:06:09,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745221_4397 src: /192.168.158.5:40800 dest: /192.168.158.4:9866
2025-07-12 01:06:09,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40800, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1386828523_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745221_4397, duration(ns): 21103765
2025-07-12 01:06:09,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745221_4397, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 01:06:15,485 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745221_4397 replica FinalizedReplica, blk_1073745221_4397, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745221 for deletion
2025-07-12 01:06:15,487 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745221_4397 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745221
2025-07-12 01:08:09,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745223_4399 src: /192.168.158.5:50592 dest: /192.168.158.4:9866
2025-07-12 01:08:09,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50592, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1555041405_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745223_4399, duration(ns): 19867745
2025-07-12 01:08:09,708 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745223_4399, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 01:08:15,489 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745223_4399 replica FinalizedReplica, blk_1073745223_4399, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745223 for deletion
2025-07-12 01:08:15,490 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745223_4399 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745223
2025-07-12 01:09:09,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745224_4400 src: /192.168.158.1:59850 dest: /192.168.158.4:9866
2025-07-12 01:09:09,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59850, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1705234904_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745224_4400, duration(ns): 22876835
2025-07-12 01:09:09,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745224_4400, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-12 01:09:15,492 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745224_4400 replica FinalizedReplica, blk_1073745224_4400, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745224 for deletion
2025-07-12 01:09:15,493 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745224_4400 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745224
2025-07-12 01:11:09,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745226_4402 src: /192.168.158.5:39366 dest: /192.168.158.4:9866
2025-07-12 01:11:09,708 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39366, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-955457155_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745226_4402, duration(ns): 18968195
2025-07-12 01:11:09,708 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745226_4402, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 01:11:15,496 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745226_4402 replica FinalizedReplica, blk_1073745226_4402, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745226 for deletion
2025-07-12 01:11:15,497 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745226_4402 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745226
2025-07-12 01:17:14,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745232_4408 src: /192.168.158.1:41696 dest: /192.168.158.4:9866
2025-07-12 01:17:14,734 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41696, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_845505277_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745232_4408, duration(ns): 24072651
2025-07-12 01:17:14,734 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745232_4408, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-12 01:17:18,512 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745232_4408 replica FinalizedReplica, blk_1073745232_4408, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745232 for deletion
2025-07-12 01:17:18,513 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745232_4408 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745232
2025-07-12 01:18:14,733 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745233_4409 src: /192.168.158.1:54844 dest: /192.168.158.4:9866
2025-07-12 01:18:14,765 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54844, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_661382910_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745233_4409, duration(ns): 23063919
2025-07-12 01:18:14,765 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745233_4409, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-12 01:18:21,515 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745233_4409 replica FinalizedReplica, blk_1073745233_4409, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745233 for deletion
2025-07-12 01:18:21,516 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745233_4409 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745233
2025-07-12 01:19:14,708 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745234_4410 src: /192.168.158.9:36266 dest: /192.168.158.4:9866
2025-07-12 01:19:14,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36266, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-250973471_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745234_4410, duration(ns): 16081918
2025-07-12 01:19:14,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745234_4410, type=LAST_IN_PIPELINE terminating
2025-07-12 01:19:18,517 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745234_4410 replica FinalizedReplica, blk_1073745234_4410, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745234 for deletion
2025-07-12 01:19:18,518 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745234_4410 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745234
2025-07-12 01:20:14,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745235_4411 src: /192.168.158.5:56960 dest: /192.168.158.4:9866
2025-07-12 01:20:14,713 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56960, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_884956656_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745235_4411, duration(ns): 18361133
2025-07-12 01:20:14,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745235_4411, type=LAST_IN_PIPELINE terminating
2025-07-12 01:20:18,520 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745235_4411 replica FinalizedReplica, blk_1073745235_4411, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745235 for deletion
2025-07-12 01:20:18,521 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745235_4411 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745235
2025-07-12 01:21:14,708 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745236_4412 src: /192.168.158.5:39054 dest: /192.168.158.4:9866
2025-07-12 01:21:14,733 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39054, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_94586160_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745236_4412, duration(ns): 20317903
2025-07-12 01:21:14,734 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745236_4412, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 01:21:21,524 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745236_4412 replica FinalizedReplica, blk_1073745236_4412, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745236 for deletion
2025-07-12 01:21:21,525 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745236_4412 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745236
2025-07-12 01:23:19,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745238_4414 src: /192.168.158.1:39344 dest: /192.168.158.4:9866
2025-07-12 01:23:19,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39344, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-628261870_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745238_4414, duration(ns): 27697818
2025-07-12 01:23:19,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745238_4414, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-12 01:23:24,533 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745238_4414 replica FinalizedReplica, blk_1073745238_4414, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745238 for deletion
2025-07-12 01:23:24,534 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745238_4414 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745238
2025-07-12 01:25:24,716 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745240_4416 src: /192.168.158.5:52784 dest: /192.168.158.4:9866
2025-07-12 01:25:24,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52784, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-387812355_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745240_4416, duration(ns): 18617630
2025-07-12 01:25:24,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745240_4416, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 01:25:30,542 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745240_4416 replica FinalizedReplica, blk_1073745240_4416, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745240 for deletion
2025-07-12 01:25:30,543 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745240_4416 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745240
2025-07-12 01:31:29,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745246_4422 src: /192.168.158.8:41304 dest: /192.168.158.4:9866
2025-07-12 01:31:29,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41304, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1963592265_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745246_4422, duration(ns): 15964356
2025-07-12 01:31:29,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745246_4422, type=LAST_IN_PIPELINE terminating
2025-07-12 01:31:33,553 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745246_4422 replica FinalizedReplica, blk_1073745246_4422, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745246 for deletion
2025-07-12 01:31:33,555 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745246_4422 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745246
2025-07-12 01:33:29,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745248_4424 src: /192.168.158.5:36736 dest: /192.168.158.4:9866
2025-07-12 01:33:29,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36736, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1029263582_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745248_4424, duration(ns): 15649202
2025-07-12 01:33:29,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745248_4424, type=LAST_IN_PIPELINE terminating
2025-07-12 01:33:33,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745248_4424 replica FinalizedReplica, blk_1073745248_4424, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745248 for deletion
2025-07-12 01:33:33,564 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745248_4424 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745248
2025-07-12 01:34:29,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745249_4425 src: /192.168.158.9:43022 dest: /192.168.158.4:9866
2025-07-12 01:34:29,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43022, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1423949556_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745249_4425, duration(ns): 15411978
2025-07-12 01:34:29,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745249_4425, type=LAST_IN_PIPELINE terminating
2025-07-12 01:34:33,568 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745249_4425 replica FinalizedReplica, blk_1073745249_4425, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745249 for deletion
2025-07-12 01:34:33,569 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745249_4425 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745249
2025-07-12 01:36:34,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745251_4427 src: /192.168.158.6:34390 dest: /192.168.158.4:9866
2025-07-12 01:36:34,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34390, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_930362819_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745251_4427, duration(ns): 18014040
2025-07-12 01:36:34,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745251_4427, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 01:36:42,576 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745251_4427 replica FinalizedReplica, blk_1073745251_4427, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745251 for deletion
2025-07-12 01:36:42,577 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745251_4427 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745251
2025-07-12 01:40:44,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745255_4431 src: /192.168.158.5:60338 dest: /192.168.158.4:9866
2025-07-12 01:40:44,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60338, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_476910955_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745255_4431, duration(ns): 16743092
2025-07-12 01:40:44,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745255_4431, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 01:40:48,581 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745255_4431 replica FinalizedReplica, blk_1073745255_4431, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745255 for deletion
2025-07-12 01:40:48,582 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745255_4431 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745255
2025-07-12 01:48:04,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745262_4438 src: /192.168.158.6:33432 dest: /192.168.158.4:9866
2025-07-12 01:48:04,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33432, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_597880363_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745262_4438, duration(ns): 19321420
2025-07-12 01:48:04,768 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745262_4438, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 01:48:12,600 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745262_4438 replica FinalizedReplica, blk_1073745262_4438, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745262 for deletion
2025-07-12 01:48:12,601 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745262_4438 URI
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745262 2025-07-12 01:49:04,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745263_4439 src: /192.168.158.1:49248 dest: /192.168.158.4:9866 2025-07-12 01:49:04,774 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49248, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_334997896_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745263_4439, duration(ns): 23241442 2025-07-12 01:49:04,774 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745263_4439, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-12 01:49:12,600 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745263_4439 replica FinalizedReplica, blk_1073745263_4439, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745263 for deletion 2025-07-12 01:49:12,601 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745263_4439 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745263 2025-07-12 01:50:04,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745264_4440 src: /192.168.158.1:52618 dest: /192.168.158.4:9866 2025-07-12 01:50:04,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:52618, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1510109400_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745264_4440, duration(ns): 23954183 2025-07-12 01:50:04,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745264_4440, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-12 01:50:12,601 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745264_4440 replica FinalizedReplica, blk_1073745264_4440, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745264 for deletion 2025-07-12 01:50:12,603 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745264_4440 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745264 2025-07-12 01:51:04,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745265_4441 src: /192.168.158.6:59604 dest: /192.168.158.4:9866 2025-07-12 01:51:04,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59604, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_356494419_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745265_4441, duration(ns): 19292902 2025-07-12 01:51:04,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073745265_4441, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-12 01:51:12,601 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745265_4441 replica FinalizedReplica, blk_1073745265_4441, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745265 for deletion 2025-07-12 01:51:12,602 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745265_4441 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745265 2025-07-12 01:52:04,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745266_4442 src: /192.168.158.1:59458 dest: /192.168.158.4:9866 2025-07-12 01:52:04,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59458, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_377464794_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745266_4442, duration(ns): 23962740 2025-07-12 01:52:04,789 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745266_4442, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-12 01:52:09,602 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745266_4442 replica FinalizedReplica, blk_1073745266_4442, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745266 for deletion 2025-07-12 01:52:09,604 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745266_4442 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745266 2025-07-12 01:53:04,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745267_4443 src: /192.168.158.1:44486 dest: /192.168.158.4:9866 2025-07-12 01:53:04,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44486, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-804052903_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745267_4443, duration(ns): 21044580 2025-07-12 01:53:04,778 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745267_4443, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-12 01:53:09,602 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745267_4443 replica FinalizedReplica, blk_1073745267_4443, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745267 for deletion 2025-07-12 01:53:09,604 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745267_4443 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745267 2025-07-12 01:55:04,765 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745269_4445 src: /192.168.158.9:38336 dest: /192.168.158.4:9866 2025-07-12 01:55:04,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_146180910_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745269_4445, duration(ns): 32635866 2025-07-12 01:55:04,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745269_4445, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-12 01:55:09,606 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745269_4445 replica FinalizedReplica, blk_1073745269_4445, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745269 for deletion 2025-07-12 01:55:09,607 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745269_4445 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745269 2025-07-12 01:56:04,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745270_4446 src: /192.168.158.1:50932 dest: /192.168.158.4:9866 2025-07-12 01:56:04,782 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50932, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1970652257_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745270_4446, duration(ns): 22691141 2025-07-12 01:56:04,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745270_4446, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-12 01:56:12,607 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745270_4446 replica FinalizedReplica, blk_1073745270_4446, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745270 for deletion 2025-07-12 01:56:12,608 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745270_4446 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745270 2025-07-12 01:59:04,760 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745273_4449 src: /192.168.158.7:47808 dest: /192.168.158.4:9866 2025-07-12 01:59:04,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47808, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_121530112_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745273_4449, duration(ns): 16589138 2025-07-12 01:59:04,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745273_4449, type=LAST_IN_PIPELINE 
terminating 2025-07-12 01:59:12,614 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745273_4449 replica FinalizedReplica, blk_1073745273_4449, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745273 for deletion 2025-07-12 01:59:12,616 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745273_4449 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745273 2025-07-12 02:00:04,760 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745274_4450 src: /192.168.158.5:51258 dest: /192.168.158.4:9866 2025-07-12 02:00:04,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51258, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1752728684_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745274_4450, duration(ns): 19750847 2025-07-12 02:00:04,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745274_4450, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-12 02:00:12,618 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745274_4450 replica FinalizedReplica, blk_1073745274_4450, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745274 for 
deletion 2025-07-12 02:00:12,620 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745274_4450 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745274 2025-07-12 02:03:09,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745277_4453 src: /192.168.158.1:39772 dest: /192.168.158.4:9866 2025-07-12 02:03:09,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39772, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1188947062_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745277_4453, duration(ns): 21850720 2025-07-12 02:03:09,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745277_4453, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-12 02:03:12,627 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745277_4453 replica FinalizedReplica, blk_1073745277_4453, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745277 for deletion 2025-07-12 02:03:12,628 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745277_4453 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745277 2025-07-12 02:04:09,775 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073745278_4454 src: /192.168.158.5:58232 dest: /192.168.158.4:9866 2025-07-12 02:04:09,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58232, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1342949740_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745278_4454, duration(ns): 15845696 2025-07-12 02:04:09,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745278_4454, type=LAST_IN_PIPELINE terminating 2025-07-12 02:04:12,628 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745278_4454 replica FinalizedReplica, blk_1073745278_4454, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745278 for deletion 2025-07-12 02:04:12,630 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745278_4454 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745278 2025-07-12 02:05:09,778 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745279_4455 src: /192.168.158.8:52378 dest: /192.168.158.4:9866 2025-07-12 02:05:09,795 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52378, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2022255478_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745279_4455, duration(ns): 14914212 
2025-07-12 02:05:09,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745279_4455, type=LAST_IN_PIPELINE terminating 2025-07-12 02:05:12,630 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745279_4455 replica FinalizedReplica, blk_1073745279_4455, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745279 for deletion 2025-07-12 02:05:12,631 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745279_4455 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745279 2025-07-12 02:06:14,774 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745280_4456 src: /192.168.158.1:57378 dest: /192.168.158.4:9866 2025-07-12 02:06:14,807 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57378, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1619106292_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745280_4456, duration(ns): 23666720 2025-07-12 02:06:14,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745280_4456, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-12 02:06:21,635 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745280_4456 replica FinalizedReplica, blk_1073745280_4456, FINALIZED getNumBytes() = 56 
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745280 for deletion 2025-07-12 02:06:21,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745280_4456 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745280 2025-07-12 02:08:14,778 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745282_4458 src: /192.168.158.1:52870 dest: /192.168.158.4:9866 2025-07-12 02:08:14,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52870, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_374743503_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745282_4458, duration(ns): 22942013 2025-07-12 02:08:14,810 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745282_4458, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-12 02:08:21,638 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745282_4458 replica FinalizedReplica, blk_1073745282_4458, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745282 for deletion 2025-07-12 02:08:21,639 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745282_4458 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745282 2025-07-12 02:09:19,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745283_4459 src: /192.168.158.7:35292 dest: /192.168.158.4:9866 2025-07-12 02:09:19,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35292, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-576558245_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745283_4459, duration(ns): 19870530 2025-07-12 02:09:19,805 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745283_4459, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-12 02:09:27,641 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745283_4459 replica FinalizedReplica, blk_1073745283_4459, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745283 for deletion 2025-07-12 02:09:27,642 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745283_4459 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745283 2025-07-12 02:10:19,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745284_4460 src: /192.168.158.5:52322 dest: /192.168.158.4:9866 2025-07-12 02:10:19,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52322, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-468239691_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745284_4460, duration(ns): 17983721 2025-07-12 02:10:19,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745284_4460, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-12 02:10:27,645 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745284_4460 replica FinalizedReplica, blk_1073745284_4460, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745284 for deletion 2025-07-12 02:10:27,646 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745284_4460 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745284 2025-07-12 02:12:24,784 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745286_4462 src: /192.168.158.1:33538 dest: /192.168.158.4:9866 2025-07-12 02:12:24,814 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33538, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1420511911_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745286_4462, duration(ns): 21379119 2025-07-12 02:12:24,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745286_4462, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-12 02:12:27,651 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745286_4462 replica FinalizedReplica, blk_1073745286_4462, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745286 for deletion 2025-07-12 02:12:27,652 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745286_4462 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745286 2025-07-12 02:17:24,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745291_4467 src: /192.168.158.1:41372 dest: /192.168.158.4:9866 2025-07-12 02:17:24,824 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41372, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1717729077_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745291_4467, duration(ns): 19423789 2025-07-12 02:17:24,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745291_4467, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-12 02:17:27,665 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745291_4467 replica FinalizedReplica, blk_1073745291_4467, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745291 for deletion 2025-07-12 02:17:27,667 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745291_4467 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745291 2025-07-12 02:19:34,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745293_4469 src: /192.168.158.5:44390 dest: /192.168.158.4:9866 2025-07-12 02:19:34,826 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44390, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-938027728_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745293_4469, duration(ns): 18790984 2025-07-12 02:19:34,826 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745293_4469, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-12 02:19:39,671 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745293_4469 replica FinalizedReplica, blk_1073745293_4469, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745293 for deletion 2025-07-12 02:19:39,672 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745293_4469 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745293 
2025-07-12 02:24:39,812 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745298_4474 src: /192.168.158.1:53646 dest: /192.168.158.4:9866
2025-07-12 02:24:39,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53646, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1332775167_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745298_4474, duration(ns): 21553244
2025-07-12 02:24:39,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745298_4474, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-12 02:24:45,682 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745298_4474 replica FinalizedReplica, blk_1073745298_4474, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745298 for deletion
2025-07-12 02:24:45,684 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745298_4474 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745298
2025-07-12 02:27:44,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745301_4477 src: /192.168.158.6:35010 dest: /192.168.158.4:9866
2025-07-12 02:27:44,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35010, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1083529028_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745301_4477, duration(ns): 16519031
2025-07-12 02:27:44,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745301_4477, type=LAST_IN_PIPELINE terminating
2025-07-12 02:27:48,687 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745301_4477 replica FinalizedReplica, blk_1073745301_4477, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745301 for deletion
2025-07-12 02:27:48,688 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745301_4477 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745301
2025-07-12 02:30:44,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745304_4480 src: /192.168.158.8:35380 dest: /192.168.158.4:9866
2025-07-12 02:30:44,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35380, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1051306768_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745304_4480, duration(ns): 20273094
2025-07-12 02:30:44,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745304_4480, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 02:30:48,700 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745304_4480 replica FinalizedReplica, blk_1073745304_4480, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745304 for deletion
2025-07-12 02:30:48,701 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745304_4480 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745304
2025-07-12 02:34:44,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745308_4484 src: /192.168.158.6:36332 dest: /192.168.158.4:9866
2025-07-12 02:34:44,841 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36332, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1589185863_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745308_4484, duration(ns): 19421550
2025-07-12 02:34:44,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745308_4484, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 02:34:48,709 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745308_4484 replica FinalizedReplica, blk_1073745308_4484, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745308 for deletion
2025-07-12 02:34:48,710 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745308_4484 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745308
2025-07-12 02:35:44,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745309_4485 src: /192.168.158.1:53232 dest: /192.168.158.4:9866
2025-07-12 02:35:44,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53232, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_534628830_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745309_4485, duration(ns): 26174134
2025-07-12 02:35:44,867 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745309_4485, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-12 02:35:51,714 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745309_4485 replica FinalizedReplica, blk_1073745309_4485, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745309 for deletion
2025-07-12 02:35:51,715 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745309_4485 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745309
2025-07-12 02:37:49,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745311_4487 src: /192.168.158.8:37460 dest: /192.168.158.4:9866
2025-07-12 02:37:49,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37460, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-266160485_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745311_4487, duration(ns): 14203439
2025-07-12 02:37:49,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745311_4487, type=LAST_IN_PIPELINE terminating
2025-07-12 02:37:54,718 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745311_4487 replica FinalizedReplica, blk_1073745311_4487, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745311 for deletion
2025-07-12 02:37:54,719 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745311_4487 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745311
2025-07-12 02:39:54,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745313_4489 src: /192.168.158.1:45282 dest: /192.168.158.4:9866
2025-07-12 02:39:54,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45282, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_257624109_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745313_4489, duration(ns): 22837064
2025-07-12 02:39:54,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745313_4489, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-12 02:39:57,723 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745313_4489 replica FinalizedReplica, blk_1073745313_4489, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745313 for deletion
2025-07-12 02:39:57,724 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745313_4489 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745313
2025-07-12 02:40:59,840 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745314_4490 src: /192.168.158.1:34940 dest: /192.168.158.4:9866
2025-07-12 02:40:59,870 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34940, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_615073278_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745314_4490, duration(ns): 21986747
2025-07-12 02:40:59,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745314_4490, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-12 02:41:03,725 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745314_4490 replica FinalizedReplica, blk_1073745314_4490, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745314 for deletion
2025-07-12 02:41:03,726 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745314_4490 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745314
2025-07-12 02:42:04,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745315_4491 src: /192.168.158.9:50986 dest: /192.168.158.4:9866
2025-07-12 02:42:04,872 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50986, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_784561386_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745315_4491, duration(ns): 14378345
2025-07-12 02:42:04,872 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745315_4491, type=LAST_IN_PIPELINE terminating
2025-07-12 02:42:09,728 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745315_4491 replica FinalizedReplica, blk_1073745315_4491, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745315 for deletion
2025-07-12 02:42:09,729 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745315_4491 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745315
2025-07-12 02:43:09,851 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745316_4492 src: /192.168.158.8:50492 dest: /192.168.158.4:9866
2025-07-12 02:43:09,885 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50492, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2067848905_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745316_4492, duration(ns): 29003175
2025-07-12 02:43:09,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745316_4492, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 02:43:12,728 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745316_4492 replica FinalizedReplica, blk_1073745316_4492, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745316 for deletion
2025-07-12 02:43:12,730 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745316_4492 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745316
2025-07-12 02:45:19,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745318_4494 src: /192.168.158.1:52424 dest: /192.168.158.4:9866
2025-07-12 02:45:19,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52424, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-400547374_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745318_4494, duration(ns): 22432085
2025-07-12 02:45:19,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745318_4494, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-12 02:45:27,733 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745318_4494 replica FinalizedReplica, blk_1073745318_4494, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745318 for deletion
2025-07-12 02:45:27,734 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745318_4494 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745318
2025-07-12 02:46:19,856 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745319_4495 src: /192.168.158.7:53014 dest: /192.168.158.4:9866
2025-07-12 02:46:19,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53014, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1567728215_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745319_4495, duration(ns): 15051850
2025-07-12 02:46:19,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745319_4495, type=LAST_IN_PIPELINE terminating
2025-07-12 02:46:24,736 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745319_4495 replica FinalizedReplica, blk_1073745319_4495, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745319 for deletion
2025-07-12 02:46:24,737 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745319_4495 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745319
2025-07-12 02:47:19,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745320_4496 src: /192.168.158.1:55778 dest: /192.168.158.4:9866
2025-07-12 02:47:19,878 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55778, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1556586439_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745320_4496, duration(ns): 24588265
2025-07-12 02:47:19,878 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745320_4496, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-12 02:47:27,737 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745320_4496 replica FinalizedReplica, blk_1073745320_4496, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745320 for deletion
2025-07-12 02:47:27,738 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745320_4496 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745320
2025-07-12 02:50:19,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745323_4499 src: /192.168.158.6:39796 dest: /192.168.158.4:9866
2025-07-12 02:50:19,868 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39796, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1715713469_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745323_4499, duration(ns): 16667151
2025-07-12 02:50:19,868 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745323_4499, type=LAST_IN_PIPELINE terminating
2025-07-12 02:50:24,744 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745323_4499 replica FinalizedReplica, blk_1073745323_4499, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745323 for deletion
2025-07-12 02:50:24,745 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745323_4499 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745323
2025-07-12 02:52:19,851 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745325_4501 src: /192.168.158.8:40582 dest: /192.168.158.4:9866
2025-07-12 02:52:19,876 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40582, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1990979814_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745325_4501, duration(ns): 19396383
2025-07-12 02:52:19,876 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745325_4501, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 02:52:24,747 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745325_4501 replica FinalizedReplica, blk_1073745325_4501, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745325 for deletion
2025-07-12 02:52:24,749 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745325_4501 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745325
2025-07-12 02:55:24,883 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745328_4504 src: /192.168.158.8:34232 dest: /192.168.158.4:9866
2025-07-12 02:55:24,909 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34232, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1926307645_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745328_4504, duration(ns): 20591030
2025-07-12 02:55:24,909 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745328_4504, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 02:55:27,759 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745328_4504 replica FinalizedReplica, blk_1073745328_4504, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745328 for deletion
2025-07-12 02:55:27,760 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745328_4504 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745328
2025-07-12 02:56:29,859 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745329_4505 src: /192.168.158.8:38090 dest: /192.168.158.4:9866
2025-07-12 02:56:29,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38090, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1460228189_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745329_4505, duration(ns): 16515294
2025-07-12 02:56:29,878 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745329_4505, type=LAST_IN_PIPELINE terminating
2025-07-12 02:56:33,761 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745329_4505 replica FinalizedReplica, blk_1073745329_4505, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745329 for deletion
2025-07-12 02:56:33,762 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745329_4505 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745329
2025-07-12 02:57:29,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745330_4506 src: /192.168.158.6:48996 dest: /192.168.158.4:9866
2025-07-12 02:57:29,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48996, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_38349642_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745330_4506, duration(ns): 20081982
2025-07-12 02:57:29,887 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745330_4506, type=LAST_IN_PIPELINE terminating
2025-07-12 02:57:36,762 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745330_4506 replica FinalizedReplica, blk_1073745330_4506, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745330 for deletion
2025-07-12 02:57:36,763 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745330_4506 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745330
2025-07-12 02:58:29,879 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745331_4507 src: /192.168.158.1:50316 dest: /192.168.158.4:9866
2025-07-12 02:58:29,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50316, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1591974584_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745331_4507, duration(ns): 22217722
2025-07-12 02:58:29,911 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745331_4507, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-12 02:58:33,764 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745331_4507 replica FinalizedReplica, blk_1073745331_4507, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745331 for deletion
2025-07-12 02:58:33,766 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745331_4507 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745331
2025-07-12 03:02:29,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745335_4511 src: /192.168.158.1:40140 dest: /192.168.158.4:9866
2025-07-12 03:02:29,885 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40140, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-823867661_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745335_4511, duration(ns): 22030264
2025-07-12 03:02:29,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745335_4511, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-12 03:02:33,774 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745335_4511 replica FinalizedReplica, blk_1073745335_4511, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745335 for deletion
2025-07-12 03:02:33,775 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745335_4511 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745335
2025-07-12 03:04:29,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745337_4513 src: /192.168.158.7:38850 dest: /192.168.158.4:9866
2025-07-12 03:04:29,888 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38850, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-684520978_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745337_4513, duration(ns): 18883569
2025-07-12 03:04:29,889 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745337_4513, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 03:04:33,780 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745337_4513 replica FinalizedReplica, blk_1073745337_4513, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745337 for deletion
2025-07-12 03:04:33,781 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745337_4513 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745337
2025-07-12 03:09:34,880 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745342_4518 src: /192.168.158.1:43766 dest: /192.168.158.4:9866
2025-07-12 03:09:34,911 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43766, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_787233638_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745342_4518, duration(ns): 21723061
2025-07-12 03:09:34,911 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745342_4518, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-12 03:09:39,795 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745342_4518 replica FinalizedReplica, blk_1073745342_4518, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745342 for deletion
2025-07-12 03:09:39,796 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745342_4518 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745342
2025-07-12 03:10:39,875 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745343_4519 src: /192.168.158.1:38084 dest: /192.168.158.4:9866
2025-07-12 03:10:39,907 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38084, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-511067162_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745343_4519, duration(ns): 23308874
2025-07-12 03:10:39,907 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745343_4519, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-12 03:10:42,798 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745343_4519 replica FinalizedReplica, blk_1073745343_4519, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745343 for deletion
2025-07-12 03:10:42,799 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745343_4519 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745343
2025-07-12 03:11:44,881 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745344_4520 src: /192.168.158.1:37958 dest: /192.168.158.4:9866
2025-07-12 03:11:44,911 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37958, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1755029040_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745344_4520, duration(ns): 21632009
2025-07-12 03:11:44,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745344_4520, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-12 03:11:48,803 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745344_4520 replica FinalizedReplica, blk_1073745344_4520, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745344 for deletion
2025-07-12 03:11:48,804 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745344_4520 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745344
2025-07-12 03:14:49,896 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745347_4523 src: /192.168.158.5:54070 dest: /192.168.158.4:9866
2025-07-12 03:14:49,923 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.5:54070, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_486665501_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745347_4523, duration(ns): 20158243 2025-07-12 03:14:49,923 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745347_4523, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-12 03:14:57,808 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745347_4523 replica FinalizedReplica, blk_1073745347_4523, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745347 for deletion 2025-07-12 03:14:57,809 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745347_4523 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745347 2025-07-12 03:15:49,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745348_4524 src: /192.168.158.9:59848 dest: /192.168.158.4:9866 2025-07-12 03:15:49,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59848, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1825139351_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745348_4524, duration(ns): 19559255 2025-07-12 03:15:49,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745348_4524, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-12 03:15:54,809 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745348_4524 replica FinalizedReplica, blk_1073745348_4524, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745348 for deletion 2025-07-12 03:15:54,810 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745348_4524 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745348 2025-07-12 03:18:59,913 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745351_4527 src: /192.168.158.5:49406 dest: /192.168.158.4:9866 2025-07-12 03:18:59,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49406, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1213835070_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745351_4527, duration(ns): 15047143 2025-07-12 03:18:59,931 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745351_4527, type=LAST_IN_PIPELINE terminating 2025-07-12 03:19:03,816 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745351_4527 replica FinalizedReplica, blk_1073745351_4527, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745351 for deletion 2025-07-12 03:19:03,817 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745351_4527 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745351 2025-07-12 03:21:04,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745353_4529 src: /192.168.158.1:57302 dest: /192.168.158.4:9866 2025-07-12 03:21:04,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57302, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1530938441_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745353_4529, duration(ns): 23071627 2025-07-12 03:21:04,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745353_4529, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-12 03:21:12,824 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745353_4529 replica FinalizedReplica, blk_1073745353_4529, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745353 for deletion 2025-07-12 03:21:12,825 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745353_4529 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745353 2025-07-12 03:22:04,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745354_4530 src: /192.168.158.9:54440 dest: /192.168.158.4:9866 2025-07-12 03:22:04,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54440, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1678930876_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745354_4530, duration(ns): 18972808 2025-07-12 03:22:04,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745354_4530, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-12 03:22:12,828 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745354_4530 replica FinalizedReplica, blk_1073745354_4530, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745354 for deletion 2025-07-12 03:22:12,829 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745354_4530 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745354 2025-07-12 03:24:04,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745356_4532 src: /192.168.158.1:51036 dest: /192.168.158.4:9866 2025-07-12 03:24:04,942 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51036, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_155598292_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745356_4532, duration(ns): 22156192 2025-07-12 03:24:04,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745356_4532, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-12 03:24:09,833 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745356_4532 replica FinalizedReplica, blk_1073745356_4532, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745356 for deletion 2025-07-12 03:24:09,834 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745356_4532 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745356 2025-07-12 03:25:09,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745357_4533 src: /192.168.158.6:51710 dest: /192.168.158.4:9866 2025-07-12 03:25:09,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51710, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1327320691_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745357_4533, duration(ns): 16071005 2025-07-12 03:25:09,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745357_4533, 
type=LAST_IN_PIPELINE terminating 2025-07-12 03:25:12,836 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745357_4533 replica FinalizedReplica, blk_1073745357_4533, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745357 for deletion 2025-07-12 03:25:12,837 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745357_4533 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745357 2025-07-12 03:27:14,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745359_4535 src: /192.168.158.1:44064 dest: /192.168.158.4:9866 2025-07-12 03:27:14,965 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44064, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1438266083_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745359_4535, duration(ns): 22350987 2025-07-12 03:27:14,965 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745359_4535, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-12 03:27:18,841 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745359_4535 replica FinalizedReplica, blk_1073745359_4535, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745359 for deletion 2025-07-12 03:27:18,842 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745359_4535 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745359 2025-07-12 03:28:14,920 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745360_4536 src: /192.168.158.1:40888 dest: /192.168.158.4:9866 2025-07-12 03:28:14,950 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40888, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-953005459_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745360_4536, duration(ns): 20479198 2025-07-12 03:28:14,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745360_4536, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-12 03:28:18,844 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745360_4536 replica FinalizedReplica, blk_1073745360_4536, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745360 for deletion 2025-07-12 03:28:18,845 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745360_4536 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745360 2025-07-12 03:34:29,924 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745366_4542 src: /192.168.158.5:47302 dest: /192.168.158.4:9866 2025-07-12 03:34:29,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47302, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1676924217_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745366_4542, duration(ns): 19151171 2025-07-12 03:34:29,949 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745366_4542, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-12 03:34:36,860 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745366_4542 replica FinalizedReplica, blk_1073745366_4542, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745366 for deletion 2025-07-12 03:34:36,861 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745366_4542 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745366 2025-07-12 03:36:29,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745368_4544 src: /192.168.158.9:47470 dest: /192.168.158.4:9866 2025-07-12 03:36:29,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47470, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_113097573_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745368_4544, duration(ns): 16089660 2025-07-12 03:36:29,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745368_4544, type=LAST_IN_PIPELINE terminating 2025-07-12 03:36:33,864 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745368_4544 replica FinalizedReplica, blk_1073745368_4544, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745368 for deletion 2025-07-12 03:36:33,865 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745368_4544 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745368 2025-07-12 03:39:34,923 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745371_4547 src: /192.168.158.1:34666 dest: /192.168.158.4:9866 2025-07-12 03:39:34,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34666, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-469814614_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745371_4547, duration(ns): 24262137 2025-07-12 03:39:34,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745371_4547, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 
192.168.158.8:9866] terminating 2025-07-12 03:39:39,869 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745371_4547 replica FinalizedReplica, blk_1073745371_4547, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745371 for deletion 2025-07-12 03:39:39,870 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745371_4547 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745371 2025-07-12 03:40:34,942 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745372_4548 src: /192.168.158.9:40034 dest: /192.168.158.4:9866 2025-07-12 03:40:34,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40034, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1533144411_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745372_4548, duration(ns): 15092694 2025-07-12 03:40:34,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745372_4548, type=LAST_IN_PIPELINE terminating 2025-07-12 03:40:39,871 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745372_4548 replica FinalizedReplica, blk_1073745372_4548, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745372 for deletion 2025-07-12 
03:40:39,873 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745372_4548 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745372 2025-07-12 03:42:39,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745374_4550 src: /192.168.158.8:46384 dest: /192.168.158.4:9866 2025-07-12 03:42:39,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46384, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2123144396_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745374_4550, duration(ns): 15244353 2025-07-12 03:42:39,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745374_4550, type=LAST_IN_PIPELINE terminating 2025-07-12 03:42:42,876 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745374_4550 replica FinalizedReplica, blk_1073745374_4550, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745374 for deletion 2025-07-12 03:42:42,877 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745374_4550 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745374 2025-07-12 03:44:44,928 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745376_4552 src: /192.168.158.9:42638 dest: 
/192.168.158.4:9866 2025-07-12 03:44:44,952 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42638, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-39277520_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745376_4552, duration(ns): 18449637 2025-07-12 03:44:44,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745376_4552, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-12 03:44:48,882 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745376_4552 replica FinalizedReplica, blk_1073745376_4552, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745376 for deletion 2025-07-12 03:44:48,883 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745376_4552 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745376 2025-07-12 03:45:44,940 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745377_4553 src: /192.168.158.7:54840 dest: /192.168.158.4:9866 2025-07-12 03:45:44,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54840, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_28519857_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745377_4553, duration(ns): 13801623 2025-07-12 03:45:44,957 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745377_4553, type=LAST_IN_PIPELINE terminating 2025-07-12 03:45:48,885 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745377_4553 replica FinalizedReplica, blk_1073745377_4553, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745377 for deletion 2025-07-12 03:45:48,886 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745377_4553 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745377 2025-07-12 03:47:44,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745379_4555 src: /192.168.158.5:44328 dest: /192.168.158.4:9866 2025-07-12 03:47:44,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44328, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_905172537_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745379_4555, duration(ns): 18738360 2025-07-12 03:47:44,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745379_4555, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-12 03:47:48,890 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745379_4555 replica FinalizedReplica, blk_1073745379_4555, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745379 for deletion
2025-07-12 03:47:48,892 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745379_4555 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745379
2025-07-12 03:48:44,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745380_4556 src: /192.168.158.9:50986 dest: /192.168.158.4:9866
2025-07-12 03:48:44,960 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50986, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_339283122_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745380_4556, duration(ns): 18114841
2025-07-12 03:48:44,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745380_4556, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 03:48:51,893 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745380_4556 replica FinalizedReplica, blk_1073745380_4556, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745380 for deletion
2025-07-12 03:48:51,894 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745380_4556 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745380
2025-07-12 03:49:44,942 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745381_4557 src: /192.168.158.6:38596 dest: /192.168.158.4:9866
2025-07-12 03:49:44,969 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38596, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1680169386_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745381_4557, duration(ns): 20571907
2025-07-12 03:49:44,970 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745381_4557, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 03:49:48,894 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745381_4557 replica FinalizedReplica, blk_1073745381_4557, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745381 for deletion
2025-07-12 03:49:48,895 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745381_4557 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745381
2025-07-12 03:52:54,950 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745384_4560 src: /192.168.158.9:43146 dest: /192.168.158.4:9866
2025-07-12 03:52:54,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43146, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1107897429_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745384_4560, duration(ns): 14944186
2025-07-12 03:52:54,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745384_4560, type=LAST_IN_PIPELINE terminating
2025-07-12 03:52:57,902 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745384_4560 replica FinalizedReplica, blk_1073745384_4560, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745384 for deletion
2025-07-12 03:52:57,904 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745384_4560 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745384
2025-07-12 03:53:54,949 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745385_4561 src: /192.168.158.1:55070 dest: /192.168.158.4:9866
2025-07-12 03:53:54,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55070, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1913459066_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745385_4561, duration(ns): 25277560
2025-07-12 03:53:54,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745385_4561, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-12 03:53:57,905 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745385_4561 replica FinalizedReplica, blk_1073745385_4561, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745385 for deletion
2025-07-12 03:53:57,906 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745385_4561 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745385
2025-07-12 03:54:54,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745386_4562 src: /192.168.158.1:57604 dest: /192.168.158.4:9866
2025-07-12 03:54:54,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57604, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1324141485_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745386_4562, duration(ns): 24111283
2025-07-12 03:54:54,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745386_4562, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-12 03:54:57,907 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745386_4562 replica FinalizedReplica, blk_1073745386_4562, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745386 for deletion
2025-07-12 03:54:57,909 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745386_4562 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745386
2025-07-12 03:55:59,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745387_4563 src: /192.168.158.1:47122 dest: /192.168.158.4:9866
2025-07-12 03:55:59,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47122, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_252160621_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745387_4563, duration(ns): 27759857
2025-07-12 03:55:59,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745387_4563, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-12 03:56:03,910 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745387_4563 replica FinalizedReplica, blk_1073745387_4563, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745387 for deletion
2025-07-12 03:56:03,911 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745387_4563 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745387
2025-07-12 03:56:59,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745388_4564 src: /192.168.158.8:49792 dest: /192.168.158.4:9866
2025-07-12 03:56:59,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49792, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1303278091_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745388_4564, duration(ns): 21838865
2025-07-12 03:56:59,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745388_4564, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 03:57:03,911 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745388_4564 replica FinalizedReplica, blk_1073745388_4564, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745388 for deletion
2025-07-12 03:57:03,912 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745388_4564 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745388
2025-07-12 03:57:59,952 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745389_4565 src: /192.168.158.1:60262 dest: /192.168.158.4:9866
2025-07-12 03:57:59,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60262, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_902285845_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745389_4565, duration(ns): 22297892
2025-07-12 03:57:59,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745389_4565, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-12 03:58:03,916 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745389_4565 replica FinalizedReplica, blk_1073745389_4565, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745389 for deletion
2025-07-12 03:58:03,917 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745389_4565 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745389
2025-07-12 03:58:59,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745390_4566 src: /192.168.158.5:39060 dest: /192.168.158.4:9866
2025-07-12 03:58:59,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39060, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1557127892_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745390_4566, duration(ns): 17054662
2025-07-12 03:58:59,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745390_4566, type=LAST_IN_PIPELINE terminating
2025-07-12 03:59:03,918 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745390_4566 replica FinalizedReplica, blk_1073745390_4566, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745390 for deletion
2025-07-12 03:59:03,920 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745390_4566 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745390
2025-07-12 03:59:59,952 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745391_4567 src: /192.168.158.1:39058 dest: /192.168.158.4:9866
2025-07-12 03:59:59,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39058, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-626162850_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745391_4567, duration(ns): 20974456
2025-07-12 03:59:59,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745391_4567, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-12 04:00:03,919 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745391_4567 replica FinalizedReplica, blk_1073745391_4567, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745391 for deletion
2025-07-12 04:00:03,921 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745391_4567 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745391
2025-07-12 04:02:04,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745393_4569 src: /192.168.158.1:36970 dest: /192.168.158.4:9866
2025-07-12 04:02:05,009 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36970, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_796754348_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745393_4569, duration(ns): 23724376
2025-07-12 04:02:05,010 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745393_4569, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-12 04:02:09,923 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745393_4569 replica FinalizedReplica, blk_1073745393_4569, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745393 for deletion
2025-07-12 04:02:09,924 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745393_4569 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745393
2025-07-12 04:08:19,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745399_4575 src: /192.168.158.5:53228 dest: /192.168.158.4:9866
2025-07-12 04:08:19,988 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53228, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1930359382_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745399_4575, duration(ns): 15087067
2025-07-12 04:08:19,988 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745399_4575, type=LAST_IN_PIPELINE terminating
2025-07-12 04:08:27,934 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745399_4575 replica FinalizedReplica, blk_1073745399_4575, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745399 for deletion
2025-07-12 04:08:27,935 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745399_4575 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745399
2025-07-12 04:10:19,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745401_4577 src: /192.168.158.1:57808 dest: /192.168.158.4:9866
2025-07-12 04:10:19,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57808, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1885855632_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745401_4577, duration(ns): 21927417
2025-07-12 04:10:19,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745401_4577, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-12 04:10:24,939 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745401_4577 replica FinalizedReplica, blk_1073745401_4577, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745401 for deletion
2025-07-12 04:10:24,940 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745401_4577 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745401
2025-07-12 04:11:19,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745402_4578 src: /192.168.158.1:54414 dest: /192.168.158.4:9866
2025-07-12 04:11:19,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54414, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1920576218_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745402_4578, duration(ns): 22461310
2025-07-12 04:11:19,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745402_4578, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-12 04:11:24,941 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745402_4578 replica FinalizedReplica, blk_1073745402_4578, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745402 for deletion
2025-07-12 04:11:24,943 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745402_4578 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745402
2025-07-12 04:12:19,975 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745403_4579 src: /192.168.158.9:41222 dest: /192.168.158.4:9866
2025-07-12 04:12:19,993 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41222, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_653336836_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745403_4579, duration(ns): 16017897
2025-07-12 04:12:19,993 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745403_4579, type=LAST_IN_PIPELINE terminating
2025-07-12 04:12:24,947 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745403_4579 replica FinalizedReplica, blk_1073745403_4579, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745403 for deletion
2025-07-12 04:12:24,948 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745403_4579 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745403
2025-07-12 04:13:19,969 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745404_4580 src: /192.168.158.6:37166 dest: /192.168.158.4:9866
2025-07-12 04:13:19,993 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37166, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1753951355_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745404_4580, duration(ns): 19127043
2025-07-12 04:13:19,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745404_4580, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 04:13:24,951 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745404_4580 replica FinalizedReplica, blk_1073745404_4580, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745404 for deletion
2025-07-12 04:13:24,952 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745404_4580 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073745404
2025-07-12 04:18:34,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745409_4585 src: /192.168.158.5:51136 dest: /192.168.158.4:9866
2025-07-12 04:18:34,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1915504100_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745409_4585, duration(ns): 21204471
2025-07-12 04:18:34,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745409_4585, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 04:18:39,963 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745409_4585 replica FinalizedReplica, blk_1073745409_4585, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745409 for deletion
2025-07-12 04:18:39,964 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745409_4585 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745409
2025-07-12 04:19:34,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745410_4586 src: /192.168.158.9:60604 dest: /192.168.158.4:9866
2025-07-12 04:19:35,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60604, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1437879248_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745410_4586, duration(ns): 20260704
2025-07-12 04:19:35,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745410_4586, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 04:19:39,965 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745410_4586 replica FinalizedReplica, blk_1073745410_4586, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745410 for deletion
2025-07-12 04:19:39,967 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745410_4586 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745410
2025-07-12 04:20:34,975 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745411_4587 src: /192.168.158.1:45238 dest: /192.168.158.4:9866
2025-07-12 04:20:35,008 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45238, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1698109841_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745411_4587, duration(ns): 23838348
2025-07-12 04:20:35,008 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745411_4587, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-12 04:20:42,966 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745411_4587 replica FinalizedReplica, blk_1073745411_4587, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745411 for deletion
2025-07-12 04:20:42,967 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745411_4587 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745411
2025-07-12 04:21:34,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745412_4588 src: /192.168.158.5:45282 dest: /192.168.158.4:9866
2025-07-12 04:21:34,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45282, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1862058645_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745412_4588, duration(ns): 17267188
2025-07-12 04:21:34,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745412_4588, type=LAST_IN_PIPELINE terminating
2025-07-12 04:21:39,968 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745412_4588 replica FinalizedReplica, blk_1073745412_4588, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745412 for deletion
2025-07-12 04:21:39,969 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745412_4588 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745412
2025-07-12 04:22:34,990 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745413_4589 src: /192.168.158.5:46122 dest: /192.168.158.4:9866
2025-07-12 04:22:35,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46122, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-491917800_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745413_4589, duration(ns): 19739579
2025-07-12 04:22:35,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745413_4589, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 04:22:39,971 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745413_4589 replica FinalizedReplica, blk_1073745413_4589, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745413 for deletion
2025-07-12 04:22:39,972 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745413_4589 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745413
2025-07-12 04:26:34,996 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745417_4593 src: /192.168.158.7:37840 dest: /192.168.158.4:9866
2025-07-12 04:26:35,014 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37840, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1020883490_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745417_4593, duration(ns): 15202040
2025-07-12 04:26:35,014 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745417_4593, type=LAST_IN_PIPELINE terminating
2025-07-12 04:26:39,987 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745417_4593 replica FinalizedReplica, blk_1073745417_4593, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745417 for deletion
2025-07-12 04:26:39,988 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745417_4593 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745417
2025-07-12 04:28:34,996 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745419_4595 src: /192.168.158.7:41374 dest: /192.168.158.4:9866
2025-07-12 04:28:35,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41374, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1544618158_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745419_4595, duration(ns): 18805253
2025-07-12 04:28:35,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745419_4595, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 04:28:39,990 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745419_4595 replica FinalizedReplica, blk_1073745419_4595, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745419 for deletion
2025-07-12 04:28:39,991 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745419_4595 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745419
2025-07-12 04:29:39,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745420_4596 src: /192.168.158.1:46408 dest: /192.168.158.4:9866
2025-07-12 04:29:40,022 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46408, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1041371376_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745420_4596, duration(ns): 24969375
2025-07-12 04:29:40,022 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745420_4596, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-12 04:29:45,990 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745420_4596 replica FinalizedReplica, blk_1073745420_4596, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745420 for deletion
2025-07-12 04:29:45,991 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745420_4596 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745420
2025-07-12 04:30:39,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745421_4597 src: /192.168.158.1:40588 dest: /192.168.158.4:9866
2025-07-12 04:30:40,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40588, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_716435713_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745421_4597, duration(ns): 22459550
2025-07-12 04:30:40,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745421_4597, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-12 04:30:42,993 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745421_4597 replica FinalizedReplica, blk_1073745421_4597, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745421 for deletion
2025-07-12 04:30:42,994 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745421_4597 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745421
2025-07-12 04:31:40,009 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745422_4598 src: /192.168.158.6:48852 dest: /192.168.158.4:9866
2025-07-12 04:31:40,034 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48852, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1382351905_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745422_4598, duration(ns): 19491966
2025-07-12 04:31:40,034 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745422_4598, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 04:31:42,994 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745422_4598 replica FinalizedReplica, blk_1073745422_4598, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745422 for deletion
2025-07-12 04:31:42,996 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745422_4598 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745422
2025-07-12 04:32:45,004 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745423_4599 src: /192.168.158.7:38948 dest: /192.168.158.4:9866
2025-07-12 04:32:45,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38948,
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1634628027_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745423_4599, duration(ns): 18465954 2025-07-12 04:32:45,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745423_4599, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-12 04:32:48,997 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745423_4599 replica FinalizedReplica, blk_1073745423_4599, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745423 for deletion 2025-07-12 04:32:48,998 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745423_4599 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745423 2025-07-12 04:34:45,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745425_4601 src: /192.168.158.5:59946 dest: /192.168.158.4:9866 2025-07-12 04:34:45,040 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59946, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_174072289_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745425_4601, duration(ns): 18650810 2025-07-12 04:34:45,040 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745425_4601, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=1:[192.168.158.7:9866] terminating 2025-07-12 04:34:49,000 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745425_4601 replica FinalizedReplica, blk_1073745425_4601, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745425 for deletion 2025-07-12 04:34:49,002 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745425_4601 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745425 2025-07-12 04:37:50,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745428_4604 src: /192.168.158.5:51818 dest: /192.168.158.4:9866 2025-07-12 04:37:50,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51818, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1768511002_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745428_4604, duration(ns): 17525173 2025-07-12 04:37:50,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745428_4604, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-12 04:37:55,007 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745428_4604 replica FinalizedReplica, blk_1073745428_4604, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745428 for deletion 2025-07-12 04:37:55,008 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745428_4604 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745428 2025-07-12 04:41:50,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745432_4608 src: /192.168.158.1:39640 dest: /192.168.158.4:9866 2025-07-12 04:41:50,059 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39640, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1972458203_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745432_4608, duration(ns): 25199206 2025-07-12 04:41:50,059 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745432_4608, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-12 04:41:58,015 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745432_4608 replica FinalizedReplica, blk_1073745432_4608, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745432 for deletion 2025-07-12 04:41:58,016 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745432_4608 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745432 2025-07-12 04:45:55,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745436_4612 src: /192.168.158.8:33964 dest: /192.168.158.4:9866 2025-07-12 04:45:55,055 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33964, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1437145823_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745436_4612, duration(ns): 18744271 2025-07-12 04:45:55,055 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745436_4612, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-12 04:45:58,022 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745436_4612 replica FinalizedReplica, blk_1073745436_4612, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745436 for deletion 2025-07-12 04:45:58,023 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745436_4612 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745436 2025-07-12 04:50:05,027 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745440_4616 src: /192.168.158.1:35748 dest: /192.168.158.4:9866 2025-07-12 04:50:05,058 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35748, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_560854343_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745440_4616, duration(ns): 21994377 2025-07-12 04:50:05,059 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745440_4616, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-12 04:50:10,030 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745440_4616 replica FinalizedReplica, blk_1073745440_4616, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745440 for deletion 2025-07-12 04:50:10,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745440_4616 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745440 2025-07-12 04:51:05,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745441_4617 src: /192.168.158.9:49034 dest: /192.168.158.4:9866 2025-07-12 04:51:05,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49034, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_885461207_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745441_4617, duration(ns): 15308077 2025-07-12 04:51:05,051 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745441_4617, 
type=LAST_IN_PIPELINE terminating 2025-07-12 04:51:10,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745441_4617 replica FinalizedReplica, blk_1073745441_4617, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745441 for deletion 2025-07-12 04:51:10,033 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745441_4617 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745441 2025-07-12 04:55:05,047 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745445_4621 src: /192.168.158.8:37524 dest: /192.168.158.4:9866 2025-07-12 04:55:05,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37524, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_339030566_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745445_4621, duration(ns): 15383867 2025-07-12 04:55:05,065 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745445_4621, type=LAST_IN_PIPELINE terminating 2025-07-12 04:55:13,036 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745445_4621 replica FinalizedReplica, blk_1073745445_4621, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745445 for deletion 2025-07-12 
04:55:13,037 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745445_4621 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745445 2025-07-12 04:56:05,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745446_4622 src: /192.168.158.8:49246 dest: /192.168.158.4:9866 2025-07-12 04:56:05,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49246, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-523826822_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745446_4622, duration(ns): 19994837 2025-07-12 04:56:05,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745446_4622, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-12 04:56:10,037 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745446_4622 replica FinalizedReplica, blk_1073745446_4622, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745446 for deletion 2025-07-12 04:56:10,038 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745446_4622 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745446 2025-07-12 04:59:05,057 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745449_4625 
src: /192.168.158.7:48736 dest: /192.168.158.4:9866 2025-07-12 04:59:05,074 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48736, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-300736094_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745449_4625, duration(ns): 15257892 2025-07-12 04:59:05,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745449_4625, type=LAST_IN_PIPELINE terminating 2025-07-12 04:59:10,041 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745449_4625 replica FinalizedReplica, blk_1073745449_4625, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745449 for deletion 2025-07-12 04:59:10,043 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745449_4625 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745449 2025-07-12 05:01:10,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745451_4627 src: /192.168.158.9:60100 dest: /192.168.158.4:9866 2025-07-12 05:01:10,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60100, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1460361798_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745451_4627, duration(ns): 17270902 2025-07-12 05:01:10,069 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745451_4627, type=LAST_IN_PIPELINE terminating 2025-07-12 05:01:13,050 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745451_4627 replica FinalizedReplica, blk_1073745451_4627, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745451 for deletion 2025-07-12 05:01:13,051 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745451_4627 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745451 2025-07-12 05:02:10,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745452_4628 src: /192.168.158.8:56564 dest: /192.168.158.4:9866 2025-07-12 05:02:10,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56564, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1353197550_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745452_4628, duration(ns): 16561210 2025-07-12 05:02:10,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745452_4628, type=LAST_IN_PIPELINE terminating 2025-07-12 05:02:13,051 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745452_4628 replica FinalizedReplica, blk_1073745452_4628, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745452 for deletion 2025-07-12 05:02:13,052 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745452_4628 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745452 2025-07-12 05:03:10,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745453_4629 src: /192.168.158.8:44308 dest: /192.168.158.4:9866 2025-07-12 05:03:10,076 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44308, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_173880304_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745453_4629, duration(ns): 18302316 2025-07-12 05:03:10,076 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745453_4629, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-12 05:03:16,053 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745453_4629 replica FinalizedReplica, blk_1073745453_4629, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745453 for deletion 2025-07-12 05:03:16,054 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745453_4629 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745453 
2025-07-12 05:05:10,059 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745455_4631 src: /192.168.158.6:51664 dest: /192.168.158.4:9866 2025-07-12 05:05:10,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51664, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1774034538_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745455_4631, duration(ns): 15339710 2025-07-12 05:05:10,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745455_4631, type=LAST_IN_PIPELINE terminating 2025-07-12 05:05:13,057 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745455_4631 replica FinalizedReplica, blk_1073745455_4631, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745455 for deletion 2025-07-12 05:05:13,058 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745455_4631 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745455 2025-07-12 05:15:15,087 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745465_4641 src: /192.168.158.7:38922 dest: /192.168.158.4:9866 2025-07-12 05:15:15,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38922, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1331845473_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073745465_4641, duration(ns): 17334684 2025-07-12 05:15:15,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745465_4641, type=LAST_IN_PIPELINE terminating 2025-07-12 05:15:19,070 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745465_4641 replica FinalizedReplica, blk_1073745465_4641, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745465 for deletion 2025-07-12 05:15:19,071 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745465_4641 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745465 2025-07-12 05:18:20,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745468_4644 src: /192.168.158.9:60376 dest: /192.168.158.4:9866 2025-07-12 05:18:20,121 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60376, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1245963446_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745468_4644, duration(ns): 18746316 2025-07-12 05:18:20,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745468_4644, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-12 05:18:28,076 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745468_4644 replica 
FinalizedReplica, blk_1073745468_4644, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745468 for deletion 2025-07-12 05:18:28,077 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745468_4644 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745468 2025-07-12 05:20:20,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745470_4646 src: /192.168.158.7:38136 dest: /192.168.158.4:9866 2025-07-12 05:20:20,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1546779391_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745470_4646, duration(ns): 15921444 2025-07-12 05:20:20,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745470_4646, type=LAST_IN_PIPELINE terminating 2025-07-12 05:20:25,081 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745470_4646 replica FinalizedReplica, blk_1073745470_4646, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745470 for deletion 2025-07-12 05:20:25,082 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745470_4646 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745470
2025-07-12 05:21:20,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745471_4647 src: /192.168.158.1:47586 dest: /192.168.158.4:9866
2025-07-12 05:21:20,140 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47586, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-722148431_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745471_4647, duration(ns): 24306311
2025-07-12 05:21:20,140 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745471_4647, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-12 05:21:28,084 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745471_4647 replica FinalizedReplica, blk_1073745471_4647, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745471 for deletion
2025-07-12 05:21:28,086 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745471_4647 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745471
2025-07-12 05:22:20,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745472_4648 src: /192.168.158.7:43038 dest: /192.168.158.4:9866
2025-07-12 05:22:20,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43038, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1678910803_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745472_4648, duration(ns): 21119610
2025-07-12 05:22:20,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745472_4648, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 05:22:28,088 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745472_4648 replica FinalizedReplica, blk_1073745472_4648, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745472 for deletion
2025-07-12 05:22:28,089 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745472_4648 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745472
2025-07-12 05:23:25,112 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745473_4649 src: /192.168.158.7:36014 dest: /192.168.158.4:9866
2025-07-12 05:23:25,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36014, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1023378820_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745473_4649, duration(ns): 16660059
2025-07-12 05:23:25,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745473_4649, type=LAST_IN_PIPELINE terminating
2025-07-12 05:23:28,091 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745473_4649 replica FinalizedReplica, blk_1073745473_4649, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745473 for deletion
2025-07-12 05:23:28,092 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745473_4649 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745473
2025-07-12 05:24:25,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745474_4650 src: /192.168.158.6:55104 dest: /192.168.158.4:9866
2025-07-12 05:24:25,111 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55104, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-519120709_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745474_4650, duration(ns): 20685712
2025-07-12 05:24:25,112 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745474_4650, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 05:24:28,091 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745474_4650 replica FinalizedReplica, blk_1073745474_4650, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745474 for deletion
2025-07-12 05:24:28,092 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745474_4650 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745474
2025-07-12 05:26:25,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745476_4652 src: /192.168.158.9:59672 dest: /192.168.158.4:9866
2025-07-12 05:26:25,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1589986763_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745476_4652, duration(ns): 17615844
2025-07-12 05:26:25,117 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745476_4652, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 05:26:28,100 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745476_4652 replica FinalizedReplica, blk_1073745476_4652, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745476 for deletion
2025-07-12 05:26:28,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745476_4652 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745476
2025-07-12 05:28:25,100 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745478_4654 src: /192.168.158.8:59350 dest: /192.168.158.4:9866
2025-07-12 05:28:25,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59350, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_772936794_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745478_4654, duration(ns): 13750348
2025-07-12 05:28:25,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745478_4654, type=LAST_IN_PIPELINE terminating
2025-07-12 05:28:31,106 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745478_4654 replica FinalizedReplica, blk_1073745478_4654, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745478 for deletion
2025-07-12 05:28:31,107 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745478_4654 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745478
2025-07-12 05:32:35,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745482_4658 src: /192.168.158.5:36516 dest: /192.168.158.4:9866
2025-07-12 05:32:35,141 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36516, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2141269598_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745482_4658, duration(ns): 16197683
2025-07-12 05:32:35,141 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745482_4658, type=LAST_IN_PIPELINE terminating
2025-07-12 05:32:43,111 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745482_4658 replica FinalizedReplica, blk_1073745482_4658, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745482 for deletion
2025-07-12 05:32:43,112 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745482_4658 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745482
2025-07-12 05:33:35,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745483_4659 src: /192.168.158.9:38084 dest: /192.168.158.4:9866
2025-07-12 05:33:35,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38084, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-195919847_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745483_4659, duration(ns): 14782649
2025-07-12 05:33:35,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745483_4659, type=LAST_IN_PIPELINE terminating
2025-07-12 05:33:43,111 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745483_4659 replica FinalizedReplica, blk_1073745483_4659, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745483 for deletion
2025-07-12 05:33:43,112 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745483_4659 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745483
2025-07-12 05:35:35,112 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745485_4661 src: /192.168.158.8:47912 dest: /192.168.158.4:9866
2025-07-12 05:35:35,128 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47912, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1549662355_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745485_4661, duration(ns): 14035275
2025-07-12 05:35:35,128 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745485_4661, type=LAST_IN_PIPELINE terminating
2025-07-12 05:35:40,111 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745485_4661 replica FinalizedReplica, blk_1073745485_4661, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745485 for deletion
2025-07-12 05:35:40,112 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745485_4661 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745485
2025-07-12 05:36:13,271 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-12 05:37:19,117 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f2f, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-12 05:37:19,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-12 05:37:45,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745487_4663 src: /192.168.158.7:52148 dest: /192.168.158.4:9866
2025-07-12 05:37:45,114 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52148, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-308047271_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745487_4663, duration(ns): 18424721
2025-07-12 05:37:45,114 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745487_4663, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 05:37:52,113 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745487_4663 replica FinalizedReplica, blk_1073745487_4663, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745487 for deletion
2025-07-12 05:37:52,114 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745487_4663 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745487
2025-07-12 05:38:50,134 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745488_4664 src: /192.168.158.8:57698 dest: /192.168.158.4:9866
2025-07-12 05:38:50,159 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57698, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1535500127_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745488_4664, duration(ns): 19371740
2025-07-12 05:38:50,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745488_4664, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 05:38:55,116 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745488_4664 replica FinalizedReplica, blk_1073745488_4664, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745488 for deletion
2025-07-12 05:38:55,118 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745488_4664 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745488
2025-07-12 05:39:55,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745489_4665 src: /192.168.158.1:45836 dest: /192.168.158.4:9866
2025-07-12 05:39:55,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45836, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1085985038_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745489_4665, duration(ns): 23845348
2025-07-12 05:39:55,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745489_4665, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-12 05:39:58,120 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745489_4665 replica FinalizedReplica, blk_1073745489_4665, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745489 for deletion
2025-07-12 05:39:58,122 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745489_4665 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745489
2025-07-12 05:40:55,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745490_4666 src: /192.168.158.1:45770 dest: /192.168.158.4:9866
2025-07-12 05:40:55,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45770, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_934456504_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745490_4666, duration(ns): 24592003
2025-07-12 05:40:55,165 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745490_4666, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-12 05:40:58,121 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745490_4666 replica FinalizedReplica, blk_1073745490_4666, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745490 for deletion
2025-07-12 05:40:58,123 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745490_4666 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745490
2025-07-12 05:44:00,145 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745493_4669 src: /192.168.158.1:36412 dest: /192.168.158.4:9866
2025-07-12 05:44:00,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36412, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1634905294_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745493_4669, duration(ns): 26024964
2025-07-12 05:44:00,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745493_4669, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-12 05:44:04,127 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745493_4669 replica FinalizedReplica, blk_1073745493_4669, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745493 for deletion
2025-07-12 05:44:04,129 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745493_4669 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745493
2025-07-12 05:45:05,143 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745494_4670 src: /192.168.158.1:60078 dest: /192.168.158.4:9866
2025-07-12 05:45:05,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60078, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1760929711_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745494_4670, duration(ns): 25800736
2025-07-12 05:45:05,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745494_4670, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-12 05:45:10,130 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745494_4670 replica FinalizedReplica, blk_1073745494_4670, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745494 for deletion
2025-07-12 05:45:10,131 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745494_4670 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745494
2025-07-12 05:49:05,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745498_4674 src: /192.168.158.9:45848 dest: /192.168.158.4:9866
2025-07-12 05:49:05,225 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45848, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1589551275_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745498_4674, duration(ns): 18819452
2025-07-12 05:49:05,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745498_4674, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 05:49:13,140 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745498_4674 replica FinalizedReplica, blk_1073745498_4674, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745498 for deletion
2025-07-12 05:49:13,141 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745498_4674 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745498
2025-07-12 05:50:05,199 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745499_4675 src: /192.168.158.1:57264 dest: /192.168.158.4:9866
2025-07-12 05:50:05,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57264, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1625385048_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745499_4675, duration(ns): 20561321
2025-07-12 05:50:05,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745499_4675, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-12 05:50:10,139 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745499_4675 replica FinalizedReplica, blk_1073745499_4675, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745499 for deletion
2025-07-12 05:50:10,140 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745499_4675 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745499
2025-07-12 05:53:10,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745502_4678 src: /192.168.158.1:48756 dest: /192.168.158.4:9866
2025-07-12 05:53:10,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48756, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_809573670_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745502_4678, duration(ns): 20735982
2025-07-12 05:53:10,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745502_4678, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-12 05:53:13,145 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745502_4678 replica FinalizedReplica, blk_1073745502_4678, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745502 for deletion
2025-07-12 05:53:13,146 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745502_4678 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745502
2025-07-12 05:58:15,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745507_4683 src: /192.168.158.5:54296 dest: /192.168.158.4:9866
2025-07-12 05:58:15,206 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54296, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-717868147_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745507_4683, duration(ns): 20501817
2025-07-12 05:58:15,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745507_4683, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 05:58:19,157 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745507_4683 replica FinalizedReplica, blk_1073745507_4683, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745507 for deletion
2025-07-12 05:58:19,158 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745507_4683 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745507
2025-07-12 06:00:15,166 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745509_4685 src: /192.168.158.9:37396 dest: /192.168.158.4:9866
2025-07-12 06:00:15,183 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37396, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-639942804_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745509_4685, duration(ns): 14814511
2025-07-12 06:00:15,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745509_4685, type=LAST_IN_PIPELINE terminating
2025-07-12 06:00:19,159 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745509_4685 replica FinalizedReplica, blk_1073745509_4685, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745509 for deletion
2025-07-12 06:00:19,160 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745509_4685 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745509
2025-07-12 06:02:15,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745511_4687 src: /192.168.158.9:41434 dest: /192.168.158.4:9866
2025-07-12 06:02:15,194 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41434, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-463177359_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745511_4687, duration(ns): 14002188
2025-07-12 06:02:15,194 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745511_4687, type=LAST_IN_PIPELINE terminating
2025-07-12 06:02:19,162 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745511_4687 replica FinalizedReplica, blk_1073745511_4687, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745511 for deletion
2025-07-12 06:02:19,163 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745511_4687 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745511
2025-07-12 06:03:15,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745512_4688 src: /192.168.158.7:57506 dest: /192.168.158.4:9866
2025-07-12 06:03:15,199 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57506, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-162945639_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745512_4688, duration(ns): 19630348
2025-07-12 06:03:15,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745512_4688, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 06:03:22,162 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745512_4688 replica FinalizedReplica, blk_1073745512_4688, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745512 for deletion
2025-07-12 06:03:22,163 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745512_4688 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745512
2025-07-12 06:04:20,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745513_4689 src: /192.168.158.8:46944 dest: /192.168.158.4:9866
2025-07-12 06:04:20,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46944, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1960360785_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745513_4689, duration(ns): 19603571
2025-07-12 06:04:20,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745513_4689, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 06:04:28,163 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745513_4689 replica FinalizedReplica, blk_1073745513_4689, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745513 for deletion
2025-07-12 06:04:28,164 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745513_4689 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745513
2025-07-12 06:06:20,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745515_4691 src: /192.168.158.9:38282 dest: /192.168.158.4:9866
2025-07-12 06:06:20,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38282, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2565601_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745515_4691, duration(ns): 20084486
2025-07-12 06:06:20,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745515_4691, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 06:06:25,167 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745515_4691 replica FinalizedReplica, blk_1073745515_4691, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745515 for deletion
2025-07-12 06:06:25,168 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745515_4691 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745515
2025-07-12 06:08:25,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745517_4693 src: /192.168.158.1:53472 dest: /192.168.158.4:9866
2025-07-12 06:08:25,263 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-814658505_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745517_4693, duration(ns): 26480143
2025-07-12 06:08:25,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745517_4693, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-12 06:08:28,173 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745517_4693 replica FinalizedReplica, blk_1073745517_4693, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745517 for deletion
2025-07-12 06:08:28,174 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745517_4693 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745517
2025-07-12 06:09:25,248 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745518_4694 src: /192.168.158.8:51130 dest: /192.168.158.4:9866
2025-07-12 06:09:25,268 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51130, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1182251118_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745518_4694, duration(ns): 15368666
2025-07-12 06:09:25,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745518_4694, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 06:09:28,174 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745518_4694 replica FinalizedReplica, blk_1073745518_4694, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745518 for deletion
2025-07-12 06:09:28,175 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745518_4694 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745518
2025-07-12 06:10:25,212 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745519_4695 src: /192.168.158.9:34816 dest: /192.168.158.4:9866
2025-07-12 06:10:25,235 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34816, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_131969020_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745519_4695, duration(ns): 18018622 2025-07-12 06:10:25,235 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745519_4695, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-12 06:10:31,175 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745519_4695 replica FinalizedReplica, blk_1073745519_4695, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745519 for deletion 2025-07-12 06:10:31,176 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745519_4695 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745519 2025-07-12 06:11:25,208 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745520_4696 src: /192.168.158.9:35732 dest: /192.168.158.4:9866 2025-07-12 06:11:25,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35732, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_560350620_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745520_4696, duration(ns): 19774686 2025-07-12 06:11:25,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745520_4696, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=1:[192.168.158.5:9866] terminating 2025-07-12 06:11:28,179 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745520_4696 replica FinalizedReplica, blk_1073745520_4696, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745520 for deletion 2025-07-12 06:11:28,180 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745520_4696 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745520 2025-07-12 06:12:30,222 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745521_4697 src: /192.168.158.5:60202 dest: /192.168.158.4:9866 2025-07-12 06:12:30,240 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60202, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-176058691_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745521_4697, duration(ns): 15351066 2025-07-12 06:12:30,240 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745521_4697, type=LAST_IN_PIPELINE terminating 2025-07-12 06:12:37,184 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745521_4697 replica FinalizedReplica, blk_1073745521_4697, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745521 for deletion 
2025-07-12 06:12:37,185 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745521_4697 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745521 2025-07-12 06:14:30,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745523_4699 src: /192.168.158.9:41472 dest: /192.168.158.4:9866 2025-07-12 06:14:30,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-397436129_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745523_4699, duration(ns): 19410457 2025-07-12 06:14:30,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745523_4699, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-12 06:14:37,189 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745523_4699 replica FinalizedReplica, blk_1073745523_4699, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745523 for deletion 2025-07-12 06:14:37,191 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745523_4699 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745523 2025-07-12 06:18:35,203 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073745527_4703 src: /192.168.158.1:47954 dest: /192.168.158.4:9866 2025-07-12 06:18:35,235 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47954, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1576774280_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745527_4703, duration(ns): 21904011 2025-07-12 06:18:35,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745527_4703, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-12 06:18:43,193 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745527_4703 replica FinalizedReplica, blk_1073745527_4703, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745527 for deletion 2025-07-12 06:18:43,194 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745527_4703 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745527 2025-07-12 06:19:40,203 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745528_4704 src: /192.168.158.1:41070 dest: /192.168.158.4:9866 2025-07-12 06:19:40,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41070, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_358772396_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073745528_4704, duration(ns): 24812975 2025-07-12 06:19:40,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745528_4704, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-12 06:19:43,195 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745528_4704 replica FinalizedReplica, blk_1073745528_4704, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745528 for deletion 2025-07-12 06:19:43,196 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745528_4704 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745528 2025-07-12 06:20:40,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745529_4705 src: /192.168.158.1:40456 dest: /192.168.158.4:9866 2025-07-12 06:20:40,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40456, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1987197763_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745529_4705, duration(ns): 21999936 2025-07-12 06:20:40,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745529_4705, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-12 06:20:46,196 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745529_4705 replica FinalizedReplica, blk_1073745529_4705, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745529 for deletion 2025-07-12 06:20:46,197 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745529_4705 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745529 2025-07-12 06:24:45,206 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745533_4709 src: /192.168.158.9:38218 dest: /192.168.158.4:9866 2025-07-12 06:24:45,230 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38218, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1714037246_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745533_4709, duration(ns): 18573797 2025-07-12 06:24:45,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745533_4709, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-12 06:24:49,204 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745533_4709 replica FinalizedReplica, blk_1073745533_4709, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745533 for deletion 2025-07-12 06:24:49,206 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745533_4709 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745533 2025-07-12 06:25:45,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745534_4710 src: /192.168.158.1:58218 dest: /192.168.158.4:9866 2025-07-12 06:25:45,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58218, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_928986863_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745534_4710, duration(ns): 22034810 2025-07-12 06:25:45,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745534_4710, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-12 06:25:52,207 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745534_4710 replica FinalizedReplica, blk_1073745534_4710, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745534 for deletion 2025-07-12 06:25:52,208 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745534_4710 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745534 2025-07-12 06:27:50,225 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745536_4712 
src: /192.168.158.1:58516 dest: /192.168.158.4:9866 2025-07-12 06:27:50,255 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58516, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_278776835_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745536_4712, duration(ns): 21720402 2025-07-12 06:27:50,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745536_4712, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-12 06:27:55,210 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745536_4712 replica FinalizedReplica, blk_1073745536_4712, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745536 for deletion 2025-07-12 06:27:55,211 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745536_4712 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745536 2025-07-12 06:29:55,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745538_4714 src: /192.168.158.6:39594 dest: /192.168.158.4:9866 2025-07-12 06:29:55,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39594, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1438111323_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745538_4714, duration(ns): 15386576 
2025-07-12 06:29:55,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745538_4714, type=LAST_IN_PIPELINE terminating 2025-07-12 06:30:01,213 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745538_4714 replica FinalizedReplica, blk_1073745538_4714, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745538 for deletion 2025-07-12 06:30:01,214 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745538_4714 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745538 2025-07-12 06:32:00,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745540_4716 src: /192.168.158.5:45782 dest: /192.168.158.4:9866 2025-07-12 06:32:00,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45782, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-448362754_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745540_4716, duration(ns): 18524802 2025-07-12 06:32:00,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745540_4716, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-12 06:32:04,216 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745540_4716 replica FinalizedReplica, blk_1073745540_4716, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745540 for deletion 2025-07-12 06:32:04,217 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745540_4716 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745540 2025-07-12 06:35:00,215 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745543_4719 src: /192.168.158.9:41310 dest: /192.168.158.4:9866 2025-07-12 06:35:00,241 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41310, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1396754453_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745543_4719, duration(ns): 19735486 2025-07-12 06:35:00,241 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745543_4719, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-12 06:35:04,224 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745543_4719 replica FinalizedReplica, blk_1073745543_4719, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745543 for deletion 2025-07-12 06:35:04,225 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745543_4719 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745543 2025-07-12 06:37:10,242 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745545_4721 src: /192.168.158.9:50252 dest: /192.168.158.4:9866 2025-07-12 06:37:10,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50252, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_652098455_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745545_4721, duration(ns): 19043318 2025-07-12 06:37:10,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745545_4721, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-12 06:37:16,226 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745545_4721 replica FinalizedReplica, blk_1073745545_4721, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745545 for deletion 2025-07-12 06:37:16,227 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745545_4721 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745545 2025-07-12 06:38:10,275 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745546_4722 src: /192.168.158.8:41186 dest: /192.168.158.4:9866 2025-07-12 06:38:10,301 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41186, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1298336897_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745546_4722, duration(ns): 20690807 2025-07-12 06:38:10,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745546_4722, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-12 06:38:13,228 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745546_4722 replica FinalizedReplica, blk_1073745546_4722, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745546 for deletion 2025-07-12 06:38:13,230 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745546_4722 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745546 2025-07-12 06:43:15,261 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745551_4727 src: /192.168.158.8:36686 dest: /192.168.158.4:9866 2025-07-12 06:43:15,279 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36686, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1740147064_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745551_4727, duration(ns): 16311663 2025-07-12 06:43:15,279 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745551_4727, type=LAST_IN_PIPELINE terminating 
2025-07-12 06:43:22,239 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745551_4727 replica FinalizedReplica, blk_1073745551_4727, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745551 for deletion 2025-07-12 06:43:22,240 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745551_4727 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745551 2025-07-12 06:46:25,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745554_4730 src: /192.168.158.8:47114 dest: /192.168.158.4:9866 2025-07-12 06:46:25,280 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47114, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_811028303_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745554_4730, duration(ns): 20757862 2025-07-12 06:46:25,280 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745554_4730, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-12 06:46:28,247 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745554_4730 replica FinalizedReplica, blk_1073745554_4730, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745554 for deletion 
2025-07-12 06:46:28,248 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745554_4730 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745554
2025-07-12 06:48:25,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745556_4732 src: /192.168.158.6:38018 dest: /192.168.158.4:9866
2025-07-12 06:48:25,263 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38018, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2033614836_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745556_4732, duration(ns): 15980586
2025-07-12 06:48:25,263 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745556_4732, type=LAST_IN_PIPELINE terminating
2025-07-12 06:48:28,253 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745556_4732 replica FinalizedReplica, blk_1073745556_4732, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745556 for deletion
2025-07-12 06:48:28,254 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745556_4732 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745556
2025-07-12 06:49:25,247 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745557_4733 src: /192.168.158.6:42128 dest: /192.168.158.4:9866
2025-07-12 06:49:25,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42128, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-995090165_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745557_4733, duration(ns): 18543114
2025-07-12 06:49:25,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745557_4733, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 06:49:31,256 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745557_4733 replica FinalizedReplica, blk_1073745557_4733, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745557 for deletion
2025-07-12 06:49:31,257 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745557_4733 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745557
2025-07-12 06:50:25,249 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745558_4734 src: /192.168.158.7:53766 dest: /192.168.158.4:9866
2025-07-12 06:50:25,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53766, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_373657973_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745558_4734, duration(ns): 18697719
2025-07-12 06:50:25,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745558_4734, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 06:50:28,259 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745558_4734 replica FinalizedReplica, blk_1073745558_4734, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745558 for deletion
2025-07-12 06:50:28,260 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745558_4734 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745558
2025-07-12 06:54:30,268 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745562_4738 src: /192.168.158.1:60164 dest: /192.168.158.4:9866
2025-07-12 06:54:30,298 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60164, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1169690600_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745562_4738, duration(ns): 21273199
2025-07-12 06:54:30,298 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745562_4738, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-12 06:54:34,269 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745562_4738 replica FinalizedReplica, blk_1073745562_4738, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745562 for deletion
2025-07-12 06:54:34,270 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745562_4738 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745562
2025-07-12 06:55:30,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745563_4739 src: /192.168.158.6:34196 dest: /192.168.158.4:9866
2025-07-12 06:55:30,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34196, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_534196097_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745563_4739, duration(ns): 13596851
2025-07-12 06:55:30,283 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745563_4739, type=LAST_IN_PIPELINE terminating
2025-07-12 06:55:34,272 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745563_4739 replica FinalizedReplica, blk_1073745563_4739, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745563 for deletion
2025-07-12 06:55:34,273 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745563_4739 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745563
2025-07-12 06:56:30,261 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745564_4740 src: /192.168.158.1:53420 dest: /192.168.158.4:9866
2025-07-12 06:56:30,290 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53420, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_763250358_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745564_4740, duration(ns): 20339739
2025-07-12 06:56:30,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745564_4740, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-12 06:56:37,277 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745564_4740 replica FinalizedReplica, blk_1073745564_4740, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745564 for deletion
2025-07-12 06:56:37,278 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745564_4740 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745564
2025-07-12 07:03:35,272 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745571_4747 src: /192.168.158.9:44642 dest: /192.168.158.4:9866
2025-07-12 07:03:35,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44642, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_704647449_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745571_4747, duration(ns): 16722937
2025-07-12 07:03:35,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745571_4747, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 07:03:40,291 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745571_4747 replica FinalizedReplica, blk_1073745571_4747, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745571 for deletion
2025-07-12 07:03:40,293 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745571_4747 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745571
2025-07-12 07:05:45,274 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745573_4749 src: /192.168.158.8:37782 dest: /192.168.158.4:9866
2025-07-12 07:05:45,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37782, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1650462613_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745573_4749, duration(ns): 15865751
2025-07-12 07:05:45,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745573_4749, type=LAST_IN_PIPELINE terminating
2025-07-12 07:05:49,293 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745573_4749 replica FinalizedReplica, blk_1073745573_4749, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745573 for deletion
2025-07-12 07:05:49,295 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745573_4749 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745573
2025-07-12 07:07:50,284 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745575_4751 src: /192.168.158.9:43894 dest: /192.168.158.4:9866
2025-07-12 07:07:50,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43894, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-470927109_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745575_4751, duration(ns): 17078551
2025-07-12 07:07:50,304 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745575_4751, type=LAST_IN_PIPELINE terminating
2025-07-12 07:07:55,297 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745575_4751 replica FinalizedReplica, blk_1073745575_4751, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745575 for deletion
2025-07-12 07:07:55,299 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745575_4751 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745575
2025-07-12 07:10:50,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745578_4754 src: /192.168.158.7:34782 dest: /192.168.158.4:9866
2025-07-12 07:10:50,312 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34782, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1763873941_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745578_4754, duration(ns): 18104464
2025-07-12 07:10:50,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745578_4754, type=LAST_IN_PIPELINE terminating
2025-07-12 07:10:55,305 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745578_4754 replica FinalizedReplica, blk_1073745578_4754, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745578 for deletion
2025-07-12 07:10:55,307 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745578_4754 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745578
2025-07-12 07:12:55,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745580_4756 src: /192.168.158.6:34954 dest: /192.168.158.4:9866
2025-07-12 07:12:55,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34954, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_764352586_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745580_4756, duration(ns): 14531571
2025-07-12 07:12:55,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745580_4756, type=LAST_IN_PIPELINE terminating
2025-07-12 07:13:01,311 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745580_4756 replica FinalizedReplica, blk_1073745580_4756, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745580 for deletion
2025-07-12 07:13:01,312 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745580_4756 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745580
2025-07-12 07:14:00,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745581_4757 src: /192.168.158.5:48580 dest: /192.168.158.4:9866
2025-07-12 07:14:00,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48580, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-198732042_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745581_4757, duration(ns): 14759778
2025-07-12 07:14:00,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745581_4757, type=LAST_IN_PIPELINE terminating
2025-07-12 07:14:07,312 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745581_4757 replica FinalizedReplica, blk_1073745581_4757, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745581 for deletion
2025-07-12 07:14:07,313 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745581_4757 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745581
2025-07-12 07:19:00,296 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745586_4762 src: /192.168.158.8:49780 dest: /192.168.158.4:9866
2025-07-12 07:19:00,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1861274935_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745586_4762, duration(ns): 15390813
2025-07-12 07:19:00,314 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745586_4762, type=LAST_IN_PIPELINE terminating
2025-07-12 07:19:04,322 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745586_4762 replica FinalizedReplica, blk_1073745586_4762, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745586 for deletion
2025-07-12 07:19:04,323 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745586_4762 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745586
2025-07-12 07:22:15,299 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745589_4765 src: /192.168.158.6:41762 dest: /192.168.158.4:9866
2025-07-12 07:22:15,326 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41762, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_355290725_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745589_4765, duration(ns): 21773964
2025-07-12 07:22:15,326 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745589_4765, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 07:22:22,329 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745589_4765 replica FinalizedReplica, blk_1073745589_4765, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745589 for deletion
2025-07-12 07:22:22,330 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745589_4765 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745589
2025-07-12 07:23:15,312 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745590_4766 src: /192.168.158.6:59518 dest: /192.168.158.4:9866
2025-07-12 07:23:15,331 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59518, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1283954866_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745590_4766, duration(ns): 16632202
2025-07-12 07:23:15,331 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745590_4766, type=LAST_IN_PIPELINE terminating
2025-07-12 07:23:22,332 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745590_4766 replica FinalizedReplica, blk_1073745590_4766, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745590 for deletion
2025-07-12 07:23:22,333 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745590_4766 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745590
2025-07-12 07:25:15,325 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745592_4768 src: /192.168.158.9:45366 dest: /192.168.158.4:9866
2025-07-12 07:25:15,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45366, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_41351094_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745592_4768, duration(ns): 16797863
2025-07-12 07:25:15,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745592_4768, type=LAST_IN_PIPELINE terminating
2025-07-12 07:25:19,338 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745592_4768 replica FinalizedReplica, blk_1073745592_4768, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745592 for deletion
2025-07-12 07:25:19,339 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745592_4768 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745592
2025-07-12 07:27:20,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745594_4770 src: /192.168.158.6:49918 dest: /192.168.158.4:9866
2025-07-12 07:27:20,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49918, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-227433352_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745594_4770, duration(ns): 16131662
2025-07-12 07:27:20,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745594_4770, type=LAST_IN_PIPELINE terminating
2025-07-12 07:27:28,342 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745594_4770 replica FinalizedReplica, blk_1073745594_4770, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745594 for deletion
2025-07-12 07:27:28,343 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745594_4770 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745594
2025-07-12 07:29:20,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745596_4772 src: /192.168.158.1:34648 dest: /192.168.158.4:9866
2025-07-12 07:29:20,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34648, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-427327649_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745596_4772, duration(ns): 23778661
2025-07-12 07:29:20,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745596_4772, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-12 07:29:25,349 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745596_4772 replica FinalizedReplica, blk_1073745596_4772, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745596 for deletion
2025-07-12 07:29:25,350 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745596_4772 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745596
2025-07-12 07:30:20,318 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745597_4773 src: /192.168.158.1:50242 dest: /192.168.158.4:9866
2025-07-12 07:30:20,350 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50242, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2080185412_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745597_4773, duration(ns): 22916668
2025-07-12 07:30:20,350 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745597_4773, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-12 07:30:25,351 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745597_4773 replica FinalizedReplica, blk_1073745597_4773, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745597 for deletion
2025-07-12 07:30:25,352 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745597_4773 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745597
2025-07-12 07:32:30,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745599_4775 src: /192.168.158.9:40752 dest: /192.168.158.4:9866
2025-07-12 07:32:30,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40752, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1371105200_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745599_4775, duration(ns): 18711991
2025-07-12 07:32:30,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745599_4775, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 07:32:34,360 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745599_4775 replica FinalizedReplica, blk_1073745599_4775, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745599 for deletion
2025-07-12 07:32:34,361 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745599_4775 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745599
2025-07-12 07:33:30,325 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745600_4776 src: /192.168.158.1:42504 dest: /192.168.158.4:9866
2025-07-12 07:33:30,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42504, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_9328256_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745600_4776, duration(ns): 41888238
2025-07-12 07:33:30,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745600_4776, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-12 07:33:37,360 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745600_4776 replica FinalizedReplica, blk_1073745600_4776, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745600 for deletion
2025-07-12 07:33:37,361 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745600_4776 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745600
2025-07-12 07:37:30,332 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745604_4780 src: /192.168.158.1:38014 dest: /192.168.158.4:9866
2025-07-12 07:37:30,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38014, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1449310634_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745604_4780, duration(ns): 21619940
2025-07-12 07:37:30,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745604_4780, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-12 07:37:34,362 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745604_4780 replica FinalizedReplica, blk_1073745604_4780, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745604 for deletion
2025-07-12 07:37:34,364 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745604_4780 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745604
2025-07-12 07:38:35,362 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745605_4781 src: /192.168.158.6:55014 dest: /192.168.158.4:9866
2025-07-12 07:38:35,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55014, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1260190971_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745605_4781, duration(ns): 16277993
2025-07-12 07:38:35,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745605_4781, type=LAST_IN_PIPELINE terminating
2025-07-12 07:38:43,363 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745605_4781 replica FinalizedReplica, blk_1073745605_4781, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745605 for deletion
2025-07-12 07:38:43,364 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745605_4781 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745605
2025-07-12 07:41:35,362 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745608_4784 src: /192.168.158.7:37342 dest: /192.168.158.4:9866
2025-07-12 07:41:35,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37342, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1755155760_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745608_4784, duration(ns): 18629487
2025-07-12 07:41:35,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745608_4784, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 07:41:40,366 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745608_4784 replica FinalizedReplica, blk_1073745608_4784, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745608 for deletion
2025-07-12 07:41:40,367 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745608_4784 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745608
2025-07-12 07:42:35,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745609_4785 src: /192.168.158.1:51766 dest: /192.168.158.4:9866
2025-07-12 07:42:35,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51766, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_550419327_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745609_4785, duration(ns): 20799218
2025-07-12 07:42:35,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745609_4785, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-12 07:42:43,369 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745609_4785 replica FinalizedReplica, blk_1073745609_4785, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745609 for deletion
2025-07-12 07:42:43,370 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745609_4785 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745609
2025-07-12 07:43:35,377 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745610_4786 src: /192.168.158.6:43522 dest: /192.168.158.4:9866
2025-07-12 07:43:35,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43522, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_177115964_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745610_4786, duration(ns): 16430902
2025-07-12 07:43:35,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745610_4786, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 07:43:40,370 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745610_4786 replica FinalizedReplica, blk_1073745610_4786, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745610 for deletion
2025-07-12 07:43:40,371 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745610_4786 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745610
2025-07-12 07:47:45,369 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745614_4790 src: /192.168.158.7:48604 dest: /192.168.158.4:9866
2025-07-12 07:47:45,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48604, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_708858455_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745614_4790, duration(ns): 18377904
2025-07-12 07:47:45,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745614_4790, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 07:47:49,375 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745614_4790 replica FinalizedReplica, blk_1073745614_4790, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745614 for deletion
2025-07-12 07:47:49,376 INFO
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745614_4790 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745614 2025-07-12 07:48:45,347 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745615_4791 src: /192.168.158.5:50158 dest: /192.168.158.4:9866 2025-07-12 07:48:45,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50158, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1652982333_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745615_4791, duration(ns): 17426648 2025-07-12 07:48:45,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745615_4791, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-12 07:48:52,378 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745615_4791 replica FinalizedReplica, blk_1073745615_4791, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745615 for deletion 2025-07-12 07:48:52,380 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745615_4791 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745615 2025-07-12 07:50:45,373 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745617_4793 src: 
/192.168.158.1:43322 dest: /192.168.158.4:9866 2025-07-12 07:50:45,404 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43322, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1639228567_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745617_4793, duration(ns): 22078373 2025-07-12 07:50:45,404 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745617_4793, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-12 07:50:52,383 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745617_4793 replica FinalizedReplica, blk_1073745617_4793, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745617 for deletion 2025-07-12 07:50:52,385 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745617_4793 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745617 2025-07-12 07:52:55,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745619_4795 src: /192.168.158.7:59138 dest: /192.168.158.4:9866 2025-07-12 07:52:55,376 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59138, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2068304263_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745619_4795, duration(ns): 16520265 
2025-07-12 07:52:55,377 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745619_4795, type=LAST_IN_PIPELINE terminating 2025-07-12 07:52:58,387 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745619_4795 replica FinalizedReplica, blk_1073745619_4795, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745619 for deletion 2025-07-12 07:52:58,389 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745619_4795 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745619 2025-07-12 07:59:00,369 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745625_4801 src: /192.168.158.1:33546 dest: /192.168.158.4:9866 2025-07-12 07:59:00,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33546, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_101344092_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745625_4801, duration(ns): 21620289 2025-07-12 07:59:00,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745625_4801, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-12 07:59:04,403 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745625_4801 replica FinalizedReplica, blk_1073745625_4801, FINALIZED getNumBytes() = 56 getBytesOnDisk() 
= 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745625 for deletion 2025-07-12 07:59:04,404 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745625_4801 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745625 2025-07-12 08:03:10,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745629_4805 src: /192.168.158.7:37324 dest: /192.168.158.4:9866 2025-07-12 08:03:10,425 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37324, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1993436875_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745629_4805, duration(ns): 17280738 2025-07-12 08:03:10,426 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745629_4805, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-12 08:03:13,413 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745629_4805 replica FinalizedReplica, blk_1073745629_4805, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745629 for deletion 2025-07-12 08:03:13,415 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745629_4805 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745629 2025-07-12 08:04:10,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745630_4806 src: /192.168.158.1:43408 dest: /192.168.158.4:9866 2025-07-12 08:04:10,440 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43408, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1225865120_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745630_4806, duration(ns): 25077650 2025-07-12 08:04:10,441 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745630_4806, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-12 08:04:13,417 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745630_4806 replica FinalizedReplica, blk_1073745630_4806, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745630 for deletion 2025-07-12 08:04:13,418 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745630_4806 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745630 2025-07-12 08:08:10,391 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745634_4810 src: /192.168.158.1:42212 dest: /192.168.158.4:9866 2025-07-12 08:08:10,422 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:42212, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-267915411_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745634_4810, duration(ns): 22494449 2025-07-12 08:08:10,422 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745634_4810, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-12 08:08:16,428 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745634_4810 replica FinalizedReplica, blk_1073745634_4810, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745634 for deletion 2025-07-12 08:08:16,429 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745634_4810 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745634 2025-07-12 08:11:15,419 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745637_4813 src: /192.168.158.8:44962 dest: /192.168.158.4:9866 2025-07-12 08:11:15,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44962, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-881626641_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745637_4813, duration(ns): 16889107 2025-07-12 08:11:15,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073745637_4813, type=LAST_IN_PIPELINE terminating 2025-07-12 08:11:22,435 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745637_4813 replica FinalizedReplica, blk_1073745637_4813, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745637 for deletion 2025-07-12 08:11:22,436 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745637_4813 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745637 2025-07-12 08:16:20,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745642_4818 src: /192.168.158.8:37722 dest: /192.168.158.4:9866 2025-07-12 08:16:20,432 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37722, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1794851825_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745642_4818, duration(ns): 20967398 2025-07-12 08:16:20,432 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745642_4818, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-12 08:16:25,448 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745642_4818 replica FinalizedReplica, blk_1073745642_4818, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745642 for deletion 2025-07-12 08:16:25,449 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745642_4818 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745642 2025-07-12 08:23:30,454 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745649_4825 src: /192.168.158.7:44764 dest: /192.168.158.4:9866 2025-07-12 08:23:30,479 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44764, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_564239136_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745649_4825, duration(ns): 20295528 2025-07-12 08:23:30,479 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745649_4825, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-12 08:23:34,460 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745649_4825 replica FinalizedReplica, blk_1073745649_4825, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745649 for deletion 2025-07-12 08:23:34,461 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745649_4825 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745649 
2025-07-12 08:25:35,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745651_4827 src: /192.168.158.9:42032 dest: /192.168.158.4:9866 2025-07-12 08:25:35,529 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42032, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-132617579_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745651_4827, duration(ns): 20536058 2025-07-12 08:25:35,529 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745651_4827, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-12 08:25:40,463 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745651_4827 replica FinalizedReplica, blk_1073745651_4827, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745651 for deletion 2025-07-12 08:25:40,464 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745651_4827 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745651 2025-07-12 08:26:35,472 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745652_4828 src: /192.168.158.7:43300 dest: /192.168.158.4:9866 2025-07-12 08:26:35,501 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43300, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_910358377_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745652_4828, duration(ns): 22149994 2025-07-12 08:26:35,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745652_4828, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-12 08:26:40,464 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745652_4828 replica FinalizedReplica, blk_1073745652_4828, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745652 for deletion 2025-07-12 08:26:40,466 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745652_4828 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745652 2025-07-12 08:29:40,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745655_4831 src: /192.168.158.1:39950 dest: /192.168.158.4:9866 2025-07-12 08:29:40,536 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39950, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1232231141_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745655_4831, duration(ns): 23052722 2025-07-12 08:29:40,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745655_4831, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-12 08:29:46,468 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745655_4831 replica FinalizedReplica, blk_1073745655_4831, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745655 for deletion 2025-07-12 08:29:46,470 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745655_4831 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745655 2025-07-12 08:32:45,468 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745658_4834 src: /192.168.158.5:43886 dest: /192.168.158.4:9866 2025-07-12 08:32:45,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:43886, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1094876492_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745658_4834, duration(ns): 17679585 2025-07-12 08:32:45,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745658_4834, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-12 08:32:49,473 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745658_4834 replica FinalizedReplica, blk_1073745658_4834, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745658 for deletion 2025-07-12 08:32:49,475 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745658_4834 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745658 2025-07-12 08:35:45,453 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745661_4837 src: /192.168.158.1:58484 dest: /192.168.158.4:9866 2025-07-12 08:35:45,484 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58484, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_289203403_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745661_4837, duration(ns): 21280252 2025-07-12 08:35:45,484 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745661_4837, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-12 08:35:52,483 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745661_4837 replica FinalizedReplica, blk_1073745661_4837, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745661 for deletion 2025-07-12 08:35:52,484 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745661_4837 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073745661 2025-07-12 08:40:45,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745666_4842 
src: /192.168.158.5:51876 dest: /192.168.158.4:9866 2025-07-12 08:40:45,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51876, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2072490107_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745666_4842, duration(ns): 19523322 2025-07-12 08:40:45,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745666_4842, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-12 08:40:49,497 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745666_4842 replica FinalizedReplica, blk_1073745666_4842, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745666 for deletion 2025-07-12 08:40:49,498 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745666_4842 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745666 2025-07-12 08:50:00,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745675_4851 src: /192.168.158.9:33700 dest: /192.168.158.4:9866 2025-07-12 08:50:00,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33700, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1940913127_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745675_4851, duration(ns): 18375397 2025-07-12 08:50:00,565 
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745675_4851, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 08:50:04,519 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745675_4851 replica FinalizedReplica, blk_1073745675_4851, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745675 for deletion
2025-07-12 08:50:04,520 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745675_4851 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745675
2025-07-12 08:51:00,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745676_4852 src: /192.168.158.7:41874 dest: /192.168.158.4:9866
2025-07-12 08:51:00,551 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41874, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_184583376_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745676_4852, duration(ns): 19894206
2025-07-12 08:51:00,551 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745676_4852, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 08:51:04,521 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745676_4852 replica FinalizedReplica, blk_1073745676_4852, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745676 for deletion
2025-07-12 08:51:04,522 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745676_4852 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745676
2025-07-12 08:52:00,510 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745677_4853 src: /192.168.158.1:41364 dest: /192.168.158.4:9866
2025-07-12 08:52:00,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41364, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-186338521_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745677_4853, duration(ns): 23115814
2025-07-12 08:52:00,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745677_4853, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-12 08:52:04,525 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745677_4853 replica FinalizedReplica, blk_1073745677_4853, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745677 for deletion
2025-07-12 08:52:04,526 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745677_4853 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745677
2025-07-12 09:00:25,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745685_4861 src: /192.168.158.8:44084 dest: /192.168.158.4:9866
2025-07-12 09:00:25,535 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44084, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1786522972_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745685_4861, duration(ns): 16272846
2025-07-12 09:00:25,535 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745685_4861, type=LAST_IN_PIPELINE terminating
2025-07-12 09:00:28,538 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745685_4861 replica FinalizedReplica, blk_1073745685_4861, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745685 for deletion
2025-07-12 09:00:28,539 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745685_4861 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745685
2025-07-12 09:01:25,507 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745686_4862 src: /192.168.158.8:44326 dest: /192.168.158.4:9866
2025-07-12 09:01:25,531 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44326, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1505285070_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745686_4862, duration(ns): 18871320
2025-07-12 09:01:25,531 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745686_4862, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 09:01:31,541 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745686_4862 replica FinalizedReplica, blk_1073745686_4862, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745686 for deletion
2025-07-12 09:01:31,542 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745686_4862 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745686
2025-07-12 09:02:25,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745687_4863 src: /192.168.158.1:40684 dest: /192.168.158.4:9866
2025-07-12 09:02:25,547 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40684, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1964376792_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745687_4863, duration(ns): 24233498
2025-07-12 09:02:25,547 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745687_4863, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-12 09:02:28,546 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745687_4863 replica FinalizedReplica, blk_1073745687_4863, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745687 for deletion
2025-07-12 09:02:28,547 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745687_4863 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745687
2025-07-12 09:05:30,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745690_4866 src: /192.168.158.1:58712 dest: /192.168.158.4:9866
2025-07-12 09:05:30,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1396564548_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745690_4866, duration(ns): 23004301
2025-07-12 09:05:30,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745690_4866, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-12 09:05:34,555 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745690_4866 replica FinalizedReplica, blk_1073745690_4866, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745690 for deletion
2025-07-12 09:05:34,556 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745690_4866 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745690
2025-07-12 09:10:35,509 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745695_4871 src: /192.168.158.6:46228 dest: /192.168.158.4:9866
2025-07-12 09:10:35,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46228, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-129749323_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745695_4871, duration(ns): 21891721
2025-07-12 09:10:35,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745695_4871, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 09:10:40,560 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745695_4871 replica FinalizedReplica, blk_1073745695_4871, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745695 for deletion
2025-07-12 09:10:40,561 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745695_4871 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745695
2025-07-12 09:11:35,529 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745696_4872 src: /192.168.158.1:45286 dest: /192.168.158.4:9866
2025-07-12 09:11:35,560 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45286, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1436492354_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745696_4872, duration(ns): 21864037
2025-07-12 09:11:35,560 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745696_4872, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-12 09:11:37,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745696_4872 replica FinalizedReplica, blk_1073745696_4872, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745696 for deletion
2025-07-12 09:11:37,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745696_4872 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745696
2025-07-12 09:12:40,540 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745697_4873 src: /192.168.158.5:41216 dest: /192.168.158.4:9866
2025-07-12 09:12:40,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41216, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2027052826_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745697_4873, duration(ns): 19871974
2025-07-12 09:12:40,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745697_4873, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 09:12:46,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745697_4873 replica FinalizedReplica, blk_1073745697_4873, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745697 for deletion
2025-07-12 09:12:46,565 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745697_4873 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745697
2025-07-12 09:16:40,551 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745701_4877 src: /192.168.158.5:57338 dest: /192.168.158.4:9866
2025-07-12 09:16:40,569 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57338, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-599473532_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745701_4877, duration(ns): 15330547
2025-07-12 09:16:40,569 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745701_4877, type=LAST_IN_PIPELINE terminating
2025-07-12 09:16:43,576 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745701_4877 replica FinalizedReplica, blk_1073745701_4877, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745701 for deletion
2025-07-12 09:16:43,577 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745701_4877 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745701
2025-07-12 09:21:45,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745706_4882 src: /192.168.158.7:39200 dest: /192.168.158.4:9866
2025-07-12 09:21:45,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39200, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-697868747_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745706_4882, duration(ns): 15390273
2025-07-12 09:21:45,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745706_4882, type=LAST_IN_PIPELINE terminating
2025-07-12 09:21:49,586 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745706_4882 replica FinalizedReplica, blk_1073745706_4882, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745706 for deletion
2025-07-12 09:21:49,588 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745706_4882 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745706
2025-07-12 09:22:45,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745707_4883 src: /192.168.158.1:55752 dest: /192.168.158.4:9866
2025-07-12 09:22:45,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55752, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1710777942_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745707_4883, duration(ns): 23765652
2025-07-12 09:22:45,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745707_4883, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-12 09:22:52,591 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745707_4883 replica FinalizedReplica, blk_1073745707_4883, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745707 for deletion
2025-07-12 09:22:52,592 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745707_4883 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745707
2025-07-12 09:23:45,544 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745708_4884 src: /192.168.158.1:45220 dest: /192.168.158.4:9866
2025-07-12 09:23:45,575 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45220, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1955558832_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745708_4884, duration(ns): 22680004
2025-07-12 09:23:45,575 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745708_4884, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-12 09:23:49,591 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745708_4884 replica FinalizedReplica, blk_1073745708_4884, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745708 for deletion
2025-07-12 09:23:49,592 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745708_4884 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745708
2025-07-12 09:24:45,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745709_4885 src: /192.168.158.8:42194 dest: /192.168.158.4:9866
2025-07-12 09:24:45,591 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42194, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-995858620_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745709_4885, duration(ns): 15207960
2025-07-12 09:24:45,591 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745709_4885, type=LAST_IN_PIPELINE terminating
2025-07-12 09:24:49,592 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745709_4885 replica FinalizedReplica, blk_1073745709_4885, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745709 for deletion
2025-07-12 09:24:49,593 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745709_4885 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745709
2025-07-12 09:29:45,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745714_4890 src: /192.168.158.1:53862 dest: /192.168.158.4:9866
2025-07-12 09:29:45,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53862, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1873412600_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745714_4890, duration(ns): 23329192
2025-07-12 09:29:45,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745714_4890, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-12 09:29:49,605 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745714_4890 replica FinalizedReplica, blk_1073745714_4890, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745714 for deletion
2025-07-12 09:29:49,607 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745714_4890 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745714
2025-07-12 09:30:50,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745715_4891 src: /192.168.158.1:46626 dest: /192.168.158.4:9866
2025-07-12 09:30:50,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46626, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_422835050_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745715_4891, duration(ns): 21589722
2025-07-12 09:30:50,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745715_4891, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-12 09:30:55,606 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745715_4891 replica FinalizedReplica, blk_1073745715_4891, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745715 for deletion
2025-07-12 09:30:55,608 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745715_4891 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745715
2025-07-12 09:35:55,570 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745720_4896 src: /192.168.158.1:49388 dest: /192.168.158.4:9866
2025-07-12 09:35:55,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49388, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_971170345_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745720_4896, duration(ns): 22823669
2025-07-12 09:35:55,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745720_4896, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-12 09:35:58,616 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745720_4896 replica FinalizedReplica, blk_1073745720_4896, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745720 for deletion
2025-07-12 09:35:58,618 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745720_4896 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745720
2025-07-12 09:38:00,604 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745722_4898 src: /192.168.158.9:36530 dest: /192.168.158.4:9866
2025-07-12 09:38:00,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36530, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1842547726_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745722_4898, duration(ns): 16870033
2025-07-12 09:38:00,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745722_4898, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 09:38:04,621 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745722_4898 replica FinalizedReplica, blk_1073745722_4898, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745722 for deletion
2025-07-12 09:38:04,622 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745722_4898 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745722
2025-07-12 09:39:00,577 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745723_4899 src: /192.168.158.6:38506 dest: /192.168.158.4:9866
2025-07-12 09:39:00,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38506, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1867020435_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745723_4899, duration(ns): 18343287
2025-07-12 09:39:00,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745723_4899, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 09:39:04,622 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745723_4899 replica FinalizedReplica, blk_1073745723_4899, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745723 for deletion
2025-07-12 09:39:04,624 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745723_4899 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745723
2025-07-12 09:40:05,576 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745724_4900 src: /192.168.158.9:49336 dest: /192.168.158.4:9866
2025-07-12 09:40:05,599 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1711669108_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745724_4900, duration(ns): 17991270
2025-07-12 09:40:05,599 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745724_4900, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 09:40:07,628 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745724_4900 replica FinalizedReplica, blk_1073745724_4900, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745724 for deletion
2025-07-12 09:40:07,629 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745724_4900 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745724
2025-07-12 09:42:05,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745726_4902 src: /192.168.158.7:54526 dest: /192.168.158.4:9866
2025-07-12 09:42:05,610 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54526, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1509863425_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745726_4902, duration(ns): 15616482
2025-07-12 09:42:05,611 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745726_4902, type=LAST_IN_PIPELINE terminating
2025-07-12 09:42:07,630 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745726_4902 replica FinalizedReplica, blk_1073745726_4902, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745726 for deletion
2025-07-12 09:42:07,632 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745726_4902 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745726
2025-07-12 09:43:05,584 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745727_4903 src: /192.168.158.6:59990 dest: /192.168.158.4:9866
2025-07-12 09:43:05,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59990, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-115142805_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745727_4903, duration(ns): 14832019
2025-07-12 09:43:05,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745727_4903, type=LAST_IN_PIPELINE terminating
2025-07-12 09:43:07,634 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745727_4903 replica FinalizedReplica, blk_1073745727_4903, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745727 for deletion
2025-07-12 09:43:07,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745727_4903 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745727
2025-07-12 09:44:10,574 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745728_4904 src: /192.168.158.9:53070 dest: /192.168.158.4:9866
2025-07-12 09:44:10,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53070, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-327498561_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745728_4904, duration(ns): 18273419
2025-07-12 09:44:10,599 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745728_4904, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 09:44:13,635 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745728_4904 replica FinalizedReplica, blk_1073745728_4904, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745728 for deletion
2025-07-12 09:44:13,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745728_4904 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745728
2025-07-12 09:45:10,576 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745729_4905 src: /192.168.158.1:57418 dest: /192.168.158.4:9866
2025-07-12 09:45:10,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57418, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_347827097_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745729_4905, duration(ns): 21417214
2025-07-12 09:45:10,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745729_4905, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-12 09:45:16,637 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745729_4905 replica FinalizedReplica, blk_1073745729_4905, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745729 for deletion
2025-07-12 09:45:16,638 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745729_4905 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745729
2025-07-12 09:46:10,599 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745730_4906 src: /192.168.158.8:59798 dest: /192.168.158.4:9866
2025-07-12 09:46:10,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59798, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_507483355_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745730_4906, duration(ns): 21257957
2025-07-12 09:46:10,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745730_4906, type=LAST_IN_PIPELINE terminating
2025-07-12 09:46:13,638 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745730_4906 replica FinalizedReplica, blk_1073745730_4906, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745730 for deletion
2025-07-12 09:46:13,639 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745730_4906 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745730
2025-07-12 09:47:10,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745731_4907 src: /192.168.158.9:58564 dest: /192.168.158.4:9866
2025-07-12 09:47:10,647 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58564, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1451803071_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745731_4907, duration(ns): 20849100
2025-07-12 09:47:10,647 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745731_4907, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 09:47:13,639 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745731_4907 replica FinalizedReplica, blk_1073745731_4907, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745731 for deletion
2025-07-12 09:47:13,641 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745731_4907 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745731
2025-07-12 09:48:10,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745732_4908 src: 
/192.168.158.1:46598 dest: /192.168.158.4:9866 2025-07-12 09:48:10,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46598, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1947083614_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745732_4908, duration(ns): 21969456 2025-07-12 09:48:10,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745732_4908, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-12 09:48:13,641 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745732_4908 replica FinalizedReplica, blk_1073745732_4908, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745732 for deletion 2025-07-12 09:48:13,642 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745732_4908 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745732 2025-07-12 09:52:25,586 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745736_4912 src: /192.168.158.9:39322 dest: /192.168.158.4:9866 2025-07-12 09:52:25,610 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39322, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1775069143_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745736_4912, duration(ns): 18747866 
2025-07-12 09:52:25,611 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745736_4912, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-12 09:52:28,649 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745736_4912 replica FinalizedReplica, blk_1073745736_4912, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745736 for deletion 2025-07-12 09:52:28,650 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745736_4912 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745736 2025-07-12 09:53:25,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745737_4913 src: /192.168.158.1:37200 dest: /192.168.158.4:9866 2025-07-12 09:53:25,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37200, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1678566519_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745737_4913, duration(ns): 22857761 2025-07-12 09:53:25,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745737_4913, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-12 09:53:28,652 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745737_4913 replica FinalizedReplica, 
blk_1073745737_4913, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745737 for deletion 2025-07-12 09:53:28,653 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745737_4913 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745737 2025-07-12 09:55:30,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745739_4915 src: /192.168.158.8:48634 dest: /192.168.158.4:9866 2025-07-12 09:55:30,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48634, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1299249299_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745739_4915, duration(ns): 15387144 2025-07-12 09:55:30,619 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745739_4915, type=LAST_IN_PIPELINE terminating 2025-07-12 09:55:34,657 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745739_4915 replica FinalizedReplica, blk_1073745739_4915, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745739 for deletion 2025-07-12 09:55:34,658 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745739_4915 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745739 2025-07-12 09:59:35,591 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745743_4919 src: /192.168.158.5:35434 dest: /192.168.158.4:9866 2025-07-12 09:59:35,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35434, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1497798277_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745743_4919, duration(ns): 21104878 2025-07-12 09:59:35,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745743_4919, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-12 09:59:37,668 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745743_4919 replica FinalizedReplica, blk_1073745743_4919, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745743 for deletion 2025-07-12 09:59:37,669 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745743_4919 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745743 2025-07-12 10:05:40,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745749_4925 src: /192.168.158.1:59968 dest: /192.168.158.4:9866 2025-07-12 10:05:40,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59968, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1380506021_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745749_4925, duration(ns): 23031922 2025-07-12 10:05:40,623 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745749_4925, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-12 10:05:43,679 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745749_4925 replica FinalizedReplica, blk_1073745749_4925, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745749 for deletion 2025-07-12 10:05:43,680 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745749_4925 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745749 2025-07-12 10:07:40,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745751_4927 src: /192.168.158.8:41422 dest: /192.168.158.4:9866 2025-07-12 10:07:40,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41422, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-355475512_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745751_4927, duration(ns): 17269857 2025-07-12 10:07:40,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745751_4927, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-12 10:07:46,683 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745751_4927 replica FinalizedReplica, blk_1073745751_4927, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745751 for deletion 2025-07-12 10:07:46,685 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745751_4927 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745751 2025-07-12 10:08:40,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745752_4928 src: /192.168.158.1:50238 dest: /192.168.158.4:9866 2025-07-12 10:08:40,660 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50238, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2115054853_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745752_4928, duration(ns): 24451857 2025-07-12 10:08:40,660 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745752_4928, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-12 10:08:43,686 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745752_4928 replica FinalizedReplica, blk_1073745752_4928, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745752 for deletion 2025-07-12 10:08:43,688 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745752_4928 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745752 2025-07-12 10:09:40,604 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745753_4929 src: /192.168.158.1:58082 dest: /192.168.158.4:9866 2025-07-12 10:09:40,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58082, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-541751621_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745753_4929, duration(ns): 22899368 2025-07-12 10:09:40,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745753_4929, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-12 10:09:43,688 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745753_4929 replica FinalizedReplica, blk_1073745753_4929, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745753 for deletion 2025-07-12 10:09:43,689 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745753_4929 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745753 2025-07-12 10:16:50,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745760_4936 src: /192.168.158.6:35620 dest: /192.168.158.4:9866 2025-07-12 10:16:50,638 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35620, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1299269237_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745760_4936, duration(ns): 18480056 2025-07-12 10:16:50,638 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745760_4936, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-12 10:16:55,701 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745760_4936 replica FinalizedReplica, blk_1073745760_4936, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745760 for deletion 2025-07-12 10:16:55,702 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745760_4936 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745760 2025-07-12 10:17:50,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745761_4937 src: /192.168.158.7:48724 dest: /192.168.158.4:9866 2025-07-12 10:17:50,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48724, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-830949416_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745761_4937, duration(ns): 18175850 2025-07-12 10:17:50,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745761_4937, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-12 10:17:55,702 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745761_4937 replica FinalizedReplica, blk_1073745761_4937, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745761 for deletion 2025-07-12 10:17:55,703 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745761_4937 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745761 2025-07-12 10:18:50,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745762_4938 src: /192.168.158.8:46730 dest: /192.168.158.4:9866 2025-07-12 10:18:50,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46730, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1356258031_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745762_4938, duration(ns): 16036950 2025-07-12 10:18:50,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745762_4938, type=LAST_IN_PIPELINE terminating 
2025-07-12 10:18:55,706 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745762_4938 replica FinalizedReplica, blk_1073745762_4938, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745762 for deletion 2025-07-12 10:18:55,707 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745762_4938 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745762 2025-07-12 10:21:55,623 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745765_4941 src: /192.168.158.6:50354 dest: /192.168.158.4:9866 2025-07-12 10:21:55,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50354, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1193616828_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745765_4941, duration(ns): 17534517 2025-07-12 10:21:55,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745765_4941, type=LAST_IN_PIPELINE terminating 2025-07-12 10:21:58,712 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745765_4941 replica FinalizedReplica, blk_1073745765_4941, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745765 for deletion 2025-07-12 10:21:58,713 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745765_4941 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745765 2025-07-12 10:29:00,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745772_4948 src: /192.168.158.7:44252 dest: /192.168.158.4:9866 2025-07-12 10:29:00,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44252, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-318745524_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745772_4948, duration(ns): 19238550 2025-07-12 10:29:00,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745772_4948, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-12 10:29:07,726 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745772_4948 replica FinalizedReplica, blk_1073745772_4948, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745772 for deletion 2025-07-12 10:29:07,728 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745772_4948 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745772 2025-07-12 10:31:00,646 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745774_4950 src: 
/192.168.158.1:36230 dest: /192.168.158.4:9866 2025-07-12 10:31:00,678 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36230, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1143328933_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745774_4950, duration(ns): 23458789 2025-07-12 10:31:00,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745774_4950, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-12 10:31:04,730 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745774_4950 replica FinalizedReplica, blk_1073745774_4950, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745774 for deletion 2025-07-12 10:31:04,732 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745774_4950 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745774 2025-07-12 10:32:05,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745775_4951 src: /192.168.158.1:53128 dest: /192.168.158.4:9866 2025-07-12 10:32:05,680 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53128, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1174852740_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745775_4951, duration(ns): 22179600 
2025-07-12 10:32:05,680 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745775_4951, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-12 10:32:07,732 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745775_4951 replica FinalizedReplica, blk_1073745775_4951, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745775 for deletion 2025-07-12 10:32:07,733 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745775_4951 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745775 2025-07-12 10:33:05,654 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745776_4952 src: /192.168.158.9:46540 dest: /192.168.158.4:9866 2025-07-12 10:33:05,680 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46540, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_471558554_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745776_4952, duration(ns): 20617345 2025-07-12 10:33:05,681 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745776_4952, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-12 10:33:07,731 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745776_4952 replica FinalizedReplica, blk_1073745776_4952, 
FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745776 for deletion
2025-07-12 10:33:07,733 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745776_4952 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745776
2025-07-12 10:36:10,656 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745779_4955 src: /192.168.158.6:56228 dest: /192.168.158.4:9866
2025-07-12 10:36:10,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56228, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-391218479_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745779_4955, duration(ns): 15990576
2025-07-12 10:36:10,675 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745779_4955, type=LAST_IN_PIPELINE terminating
2025-07-12 10:36:13,740 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745779_4955 replica FinalizedReplica, blk_1073745779_4955, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745779 for deletion
2025-07-12 10:36:13,741 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745779_4955 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745779
2025-07-12 10:38:10,665 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745781_4957 src: /192.168.158.5:49714 dest: /192.168.158.4:9866
2025-07-12 10:38:10,686 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49714, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_585550149_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745781_4957, duration(ns): 19045687
2025-07-12 10:38:10,686 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745781_4957, type=LAST_IN_PIPELINE terminating
2025-07-12 10:38:13,743 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745781_4957 replica FinalizedReplica, blk_1073745781_4957, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745781 for deletion
2025-07-12 10:38:13,744 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745781_4957 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745781
2025-07-12 10:41:15,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745784_4960 src: /192.168.158.8:35956 dest: /192.168.158.4:9866
2025-07-12 10:41:15,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35956, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1093721704_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745784_4960, duration(ns): 17087187
2025-07-12 10:41:15,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745784_4960, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 10:41:19,748 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745784_4960 replica FinalizedReplica, blk_1073745784_4960, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745784 for deletion
2025-07-12 10:41:19,750 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745784_4960 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745784
2025-07-12 10:44:20,675 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745787_4963 src: /192.168.158.6:37854 dest: /192.168.158.4:9866
2025-07-12 10:44:20,699 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37854, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-580722647_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745787_4963, duration(ns): 18670164
2025-07-12 10:44:20,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745787_4963, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 10:44:22,750 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745787_4963 replica FinalizedReplica, blk_1073745787_4963, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745787 for deletion
2025-07-12 10:44:22,751 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745787_4963 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745787
2025-07-12 10:45:20,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745788_4964 src: /192.168.158.1:57976 dest: /192.168.158.4:9866
2025-07-12 10:45:20,713 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57976, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1745402888_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745788_4964, duration(ns): 22028144
2025-07-12 10:45:20,713 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745788_4964, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-12 10:45:25,752 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745788_4964 replica FinalizedReplica, blk_1073745788_4964, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745788 for deletion
2025-07-12 10:45:25,753 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745788_4964 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745788
2025-07-12 10:47:20,678 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745790_4966 src: /192.168.158.8:43184 dest: /192.168.158.4:9866
2025-07-12 10:47:20,703 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43184, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1730341808_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745790_4966, duration(ns): 18370551
2025-07-12 10:47:20,703 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745790_4966, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 10:47:22,755 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745790_4966 replica FinalizedReplica, blk_1073745790_4966, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745790 for deletion
2025-07-12 10:47:22,756 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745790_4966 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745790
2025-07-12 10:48:20,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745791_4967 src: /192.168.158.1:35196 dest: /192.168.158.4:9866
2025-07-12 10:48:20,710 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35196, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1557983832_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745791_4967, duration(ns): 21853895
2025-07-12 10:48:20,710 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745791_4967, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-12 10:48:22,758 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745791_4967 replica FinalizedReplica, blk_1073745791_4967, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745791 for deletion
2025-07-12 10:48:22,759 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745791_4967 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745791
2025-07-12 10:50:25,685 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745793_4969 src: /192.168.158.6:52550 dest: /192.168.158.4:9866
2025-07-12 10:50:25,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52550, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2059595349_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745793_4969, duration(ns): 18339463
2025-07-12 10:50:25,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745793_4969, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 10:50:28,759 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745793_4969 replica FinalizedReplica, blk_1073745793_4969, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745793 for deletion
2025-07-12 10:50:28,760 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745793_4969 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745793
2025-07-12 10:51:25,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745794_4970 src: /192.168.158.1:56962 dest: /192.168.158.4:9866
2025-07-12 10:51:25,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56962, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-766069314_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745794_4970, duration(ns): 23664757
2025-07-12 10:51:25,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745794_4970, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-12 10:51:28,763 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745794_4970 replica FinalizedReplica, blk_1073745794_4970, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745794 for deletion
2025-07-12 10:51:28,765 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745794_4970 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745794
2025-07-12 10:53:25,685 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745796_4972 src: /192.168.158.1:39974 dest: /192.168.158.4:9866
2025-07-12 10:53:25,716 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39974, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_907256578_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745796_4972, duration(ns): 22268881
2025-07-12 10:53:25,716 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745796_4972, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-12 10:53:28,767 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745796_4972 replica FinalizedReplica, blk_1073745796_4972, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745796 for deletion
2025-07-12 10:53:28,768 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745796_4972 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745796
2025-07-12 10:54:25,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745797_4973 src: /192.168.158.1:37428 dest: /192.168.158.4:9866
2025-07-12 10:54:25,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37428, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1188499635_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745797_4973, duration(ns): 24628234
2025-07-12 10:54:25,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745797_4973, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-12 10:54:31,769 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745797_4973 replica FinalizedReplica, blk_1073745797_4973, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745797 for deletion
2025-07-12 10:54:31,771 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745797_4973 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745797
2025-07-12 10:56:30,690 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745799_4975 src: /192.168.158.1:58332 dest: /192.168.158.4:9866
2025-07-12 10:56:30,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58332, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1909519558_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745799_4975, duration(ns): 23858760
2025-07-12 10:56:30,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745799_4975, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-12 10:56:34,776 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745799_4975 replica FinalizedReplica, blk_1073745799_4975, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745799 for deletion
2025-07-12 10:56:34,777 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745799_4975 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745799
2025-07-12 10:57:30,698 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745800_4976 src: /192.168.158.5:36142 dest: /192.168.158.4:9866
2025-07-12 10:57:30,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36142, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_261484844_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745800_4976, duration(ns): 15239924
2025-07-12 10:57:30,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745800_4976, type=LAST_IN_PIPELINE terminating
2025-07-12 10:57:37,777 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745800_4976 replica FinalizedReplica, blk_1073745800_4976, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745800 for deletion
2025-07-12 10:57:37,778 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745800_4976 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745800
2025-07-12 10:59:30,698 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745802_4978 src: /192.168.158.1:55494 dest: /192.168.158.4:9866
2025-07-12 10:59:30,728 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55494, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1110543779_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745802_4978, duration(ns): 21268694
2025-07-12 10:59:30,728 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745802_4978, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-12 10:59:37,777 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745802_4978 replica FinalizedReplica, blk_1073745802_4978, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745802 for deletion
2025-07-12 10:59:37,778 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745802_4978 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745802
2025-07-12 11:00:30,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745803_4979 src: /192.168.158.7:36452 dest: /192.168.158.4:9866
2025-07-12 11:00:30,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36452, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_366144878_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745803_4979, duration(ns): 14919203
2025-07-12 11:00:30,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745803_4979, type=LAST_IN_PIPELINE terminating
2025-07-12 11:00:34,778 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745803_4979 replica FinalizedReplica, blk_1073745803_4979, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745803 for deletion
2025-07-12 11:00:34,779 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745803_4979 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745803
2025-07-12 11:01:30,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745804_4980 src: /192.168.158.1:59376 dest: /192.168.158.4:9866
2025-07-12 11:01:30,734 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59376, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1460551521_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745804_4980, duration(ns): 24464891
2025-07-12 11:01:30,734 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745804_4980, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-12 11:01:34,781 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745804_4980 replica FinalizedReplica, blk_1073745804_4980, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745804 for deletion
2025-07-12 11:01:34,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745804_4980 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745804
2025-07-12 11:02:30,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745805_4981 src: /192.168.158.7:56310 dest: /192.168.158.4:9866
2025-07-12 11:02:30,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56310, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-53163945_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745805_4981, duration(ns): 15306946
2025-07-12 11:02:30,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745805_4981, type=LAST_IN_PIPELINE terminating
2025-07-12 11:02:34,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745805_4981 replica FinalizedReplica, blk_1073745805_4981, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745805 for deletion
2025-07-12 11:02:34,784 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745805_4981 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745805
2025-07-12 11:03:30,712 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745806_4982 src: /192.168.158.1:34016 dest: /192.168.158.4:9866
2025-07-12 11:03:30,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34016, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1504970843_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745806_4982, duration(ns): 20962533
2025-07-12 11:03:30,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745806_4982, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-12 11:03:34,786 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745806_4982 replica FinalizedReplica, blk_1073745806_4982, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745806 for deletion
2025-07-12 11:03:34,787 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745806_4982 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745806
2025-07-12 11:04:30,712 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745807_4983 src: /192.168.158.7:54600 dest: /192.168.158.4:9866
2025-07-12 11:04:30,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54600, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1121003571_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745807_4983, duration(ns): 15577504
2025-07-12 11:04:30,730 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745807_4983, type=LAST_IN_PIPELINE terminating
2025-07-12 11:04:34,787 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745807_4983 replica FinalizedReplica, blk_1073745807_4983, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745807 for deletion
2025-07-12 11:04:34,789 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745807_4983 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745807
2025-07-12 11:06:30,711 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745809_4985 src: /192.168.158.7:55250 dest: /192.168.158.4:9866
2025-07-12 11:06:30,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55250, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-876342257_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745809_4985, duration(ns): 19119796
2025-07-12 11:06:30,736 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745809_4985, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 11:06:37,792 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745809_4985 replica FinalizedReplica, blk_1073745809_4985, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745809 for deletion
2025-07-12 11:06:37,793 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745809_4985 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745809
2025-07-12 11:07:30,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745810_4986 src: /192.168.158.9:47014 dest: /192.168.158.4:9866
2025-07-12 11:07:30,768 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47014, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1099398219_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745810_4986, duration(ns): 20554362
2025-07-12 11:07:30,768 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745810_4986, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 11:07:37,793 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745810_4986 replica FinalizedReplica, blk_1073745810_4986, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745810 for deletion
2025-07-12 11:07:37,796 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745810_4986 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745810
2025-07-12 11:09:30,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745812_4988 src: /192.168.158.9:33922 dest: /192.168.158.4:9866
2025-07-12 11:09:30,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33922, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_725088086_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745812_4988, duration(ns): 21245870
2025-07-12 11:09:30,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745812_4988, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 11:09:34,796 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745812_4988 replica FinalizedReplica, blk_1073745812_4988, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745812 for deletion
2025-07-12 11:09:34,797 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745812_4988 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745812
2025-07-12 11:10:30,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745813_4989 src: /192.168.158.7:38170 dest: /192.168.158.4:9866
2025-07-12 11:10:30,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38170, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1011272075_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745813_4989, duration(ns): 15391066
2025-07-12 11:10:30,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745813_4989, type=LAST_IN_PIPELINE terminating
2025-07-12 11:10:34,799 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745813_4989 replica FinalizedReplica, blk_1073745813_4989, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745813 for deletion
2025-07-12 11:10:34,800 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745813_4989 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745813
2025-07-12 11:11:35,716 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745814_4990 src: /192.168.158.5:55218 dest: /192.168.158.4:9866
2025-07-12 11:11:35,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55218, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_501683105_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745814_4990, duration(ns): 19716661
2025-07-12 11:11:35,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745814_4990, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 11:11:37,800 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745814_4990 replica FinalizedReplica, blk_1073745814_4990, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745814 for deletion
2025-07-12 11:11:37,801 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745814_4990 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745814
2025-07-12 11:14:40,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745817_4993 src: /192.168.158.1:35204 dest: /192.168.158.4:9866
2025-07-12 11:14:40,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35204, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_617442004_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745817_4993, duration(ns): 23771496
2025-07-12 11:14:40,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745817_4993, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-12 11:14:43,804 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745817_4993 replica FinalizedReplica, blk_1073745817_4993, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745817 for deletion
2025-07-12 11:14:43,806 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745817_4993 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745817
2025-07-12 11:16:40,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745819_4995 src: /192.168.158.6:52244 dest: /192.168.158.4:9866
2025-07-12 11:16:40,747 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52244, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1752108262_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745819_4995, duration(ns): 15609334 2025-07-12 11:16:40,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745819_4995, type=LAST_IN_PIPELINE terminating 2025-07-12 11:16:46,807 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745819_4995 replica FinalizedReplica, blk_1073745819_4995, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745819 for deletion 2025-07-12 11:16:46,809 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745819_4995 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745819 2025-07-12 11:19:40,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745822_4998 src: /192.168.158.9:54630 dest: /192.168.158.4:9866 2025-07-12 11:19:40,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54630, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-889746810_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745822_4998, duration(ns): 16412439 2025-07-12 11:19:40,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073745822_4998, type=LAST_IN_PIPELINE terminating 2025-07-12 11:19:43,813 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745822_4998 replica FinalizedReplica, blk_1073745822_4998, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745822 for deletion 2025-07-12 11:19:43,814 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745822_4998 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745822 2025-07-12 11:20:45,733 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745823_4999 src: /192.168.158.5:34804 dest: /192.168.158.4:9866 2025-07-12 11:20:45,755 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34804, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2021837042_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745823_4999, duration(ns): 16922621 2025-07-12 11:20:45,755 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745823_4999, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-12 11:20:49,815 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745823_4999 replica FinalizedReplica, blk_1073745823_4999, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745823 for deletion 2025-07-12 11:20:49,816 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745823_4999 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745823 2025-07-12 11:21:50,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745824_5000 src: /192.168.158.9:59272 dest: /192.168.158.4:9866 2025-07-12 11:21:50,756 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59272, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1628039835_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745824_5000, duration(ns): 19257612 2025-07-12 11:21:50,756 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745824_5000, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-12 11:21:52,818 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745824_5000 replica FinalizedReplica, blk_1073745824_5000, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745824 for deletion 2025-07-12 11:21:52,819 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745824_5000 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745824 
2025-07-12 11:23:55,734 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745826_5002 src: /192.168.158.9:46316 dest: /192.168.158.4:9866
2025-07-12 11:23:55,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46316, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1795734954_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745826_5002, duration(ns): 18676314
2025-07-12 11:23:55,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745826_5002, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 11:24:01,823 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745826_5002 replica FinalizedReplica, blk_1073745826_5002, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745826 for deletion
2025-07-12 11:24:01,824 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745826_5002 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745826
2025-07-12 11:27:55,739 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745830_5006 src: /192.168.158.1:40030 dest: /192.168.158.4:9866
2025-07-12 11:27:55,775 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40030, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1472893819_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745830_5006, duration(ns): 26587592
2025-07-12 11:27:55,775 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745830_5006, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-12 11:27:58,830 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745830_5006 replica FinalizedReplica, blk_1073745830_5006, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745830 for deletion
2025-07-12 11:27:58,831 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745830_5006 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745830
2025-07-12 11:28:55,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745831_5007 src: /192.168.158.8:33312 dest: /192.168.158.4:9866
2025-07-12 11:28:55,768 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33312, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1745252441_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745831_5007, duration(ns): 20689295
2025-07-12 11:28:55,768 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745831_5007, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 11:28:58,833 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745831_5007 replica FinalizedReplica, blk_1073745831_5007, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745831 for deletion
2025-07-12 11:28:58,834 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745831_5007 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745831
2025-07-12 11:29:55,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745832_5008 src: /192.168.158.5:58110 dest: /192.168.158.4:9866
2025-07-12 11:29:55,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58110, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_979225454_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745832_5008, duration(ns): 20725323
2025-07-12 11:29:55,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745832_5008, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 11:30:01,834 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745832_5008 replica FinalizedReplica, blk_1073745832_5008, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745832 for deletion
2025-07-12 11:30:01,835 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745832_5008 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745832
2025-07-12 11:30:55,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745833_5009 src: /192.168.158.5:44064 dest: /192.168.158.4:9866
2025-07-12 11:30:55,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44064, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1586984228_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745833_5009, duration(ns): 15563049
2025-07-12 11:30:55,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745833_5009, type=LAST_IN_PIPELINE terminating
2025-07-12 11:30:58,835 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745833_5009 replica FinalizedReplica, blk_1073745833_5009, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745833 for deletion
2025-07-12 11:30:58,836 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745833_5009 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745833
2025-07-12 11:32:00,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745834_5010 src: /192.168.158.6:59430 dest: /192.168.158.4:9866
2025-07-12 11:32:00,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59430, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-75916589_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745834_5010, duration(ns): 15067179
2025-07-12 11:32:00,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745834_5010, type=LAST_IN_PIPELINE terminating
2025-07-12 11:32:04,839 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745834_5010 replica FinalizedReplica, blk_1073745834_5010, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745834 for deletion
2025-07-12 11:32:04,840 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745834_5010 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745834
2025-07-12 11:35:15,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745837_5013 src: /192.168.158.6:42998 dest: /192.168.158.4:9866
2025-07-12 11:35:15,774 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42998, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_588927272_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745837_5013, duration(ns): 17860776
2025-07-12 11:35:15,774 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745837_5013, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 11:35:22,841 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745837_5013 replica FinalizedReplica, blk_1073745837_5013, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745837 for deletion
2025-07-12 11:35:22,843 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745837_5013 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745837
2025-07-12 11:36:13,268 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-12 11:37:19,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f30, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 1 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-12 11:37:19,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-12 11:41:20,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745843_5019 src: /192.168.158.7:38888 dest: /192.168.158.4:9866
2025-07-12 11:41:20,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38888, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1404919707_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745843_5019, duration(ns): 19634284
2025-07-12 11:41:20,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745843_5019, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 11:41:22,850 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745843_5019 replica FinalizedReplica, blk_1073745843_5019, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745843 for deletion
2025-07-12 11:41:22,851 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745843_5019 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745843
2025-07-12 11:42:25,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745844_5020 src: /192.168.158.8:50244 dest: /192.168.158.4:9866
2025-07-12 11:42:25,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50244, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1228630084_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745844_5020, duration(ns): 13607818
2025-07-12 11:42:25,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745844_5020, type=LAST_IN_PIPELINE terminating
2025-07-12 11:42:28,854 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745844_5020 replica FinalizedReplica, blk_1073745844_5020, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745844 for deletion
2025-07-12 11:42:28,855 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745844_5020 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745844
2025-07-12 11:43:25,762 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745845_5021 src: /192.168.158.6:59546 dest: /192.168.158.4:9866
2025-07-12 11:43:25,786 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59546, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1928767940_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745845_5021, duration(ns): 19175642
2025-07-12 11:43:25,786 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745845_5021, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 11:43:31,857 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745845_5021 replica FinalizedReplica, blk_1073745845_5021, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745845 for deletion
2025-07-12 11:43:31,858 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745845_5021 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745845
2025-07-12 11:44:30,765 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745846_5022 src: /192.168.158.7:58526 dest: /192.168.158.4:9866
2025-07-12 11:44:30,782 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58526, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1717758894_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745846_5022, duration(ns): 15485669
2025-07-12 11:44:30,782 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745846_5022, type=LAST_IN_PIPELINE terminating
2025-07-12 11:44:34,860 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745846_5022 replica FinalizedReplica, blk_1073745846_5022, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745846 for deletion
2025-07-12 11:44:34,861 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745846_5022 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745846
2025-07-12 11:45:30,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745847_5023 src: /192.168.158.6:51626 dest: /192.168.158.4:9866
2025-07-12 11:45:30,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51626, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-44323926_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745847_5023, duration(ns): 15050450
2025-07-12 11:45:30,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745847_5023, type=LAST_IN_PIPELINE terminating
2025-07-12 11:45:37,860 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745847_5023 replica FinalizedReplica, blk_1073745847_5023, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745847 for deletion
2025-07-12 11:45:37,862 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745847_5023 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745847
2025-07-12 11:47:30,760 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745849_5025 src: /192.168.158.1:45090 dest: /192.168.158.4:9866
2025-07-12 11:47:30,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45090, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1540637980_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745849_5025, duration(ns): 23714680
2025-07-12 11:47:30,793 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745849_5025, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-12 11:47:34,862 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745849_5025 replica FinalizedReplica, blk_1073745849_5025, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745849 for deletion
2025-07-12 11:47:34,863 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745849_5025 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745849
2025-07-12 11:48:30,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745850_5026 src: /192.168.158.1:44436 dest: /192.168.158.4:9866
2025-07-12 11:48:30,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44436, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_133161422_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745850_5026, duration(ns): 24109257
2025-07-12 11:48:30,800 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745850_5026, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-12 11:48:34,863 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745850_5026 replica FinalizedReplica, blk_1073745850_5026, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745850 for deletion
2025-07-12 11:48:34,865 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745850_5026 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745850
2025-07-12 11:49:30,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745851_5027 src: /192.168.158.6:36842 dest: /192.168.158.4:9866
2025-07-12 11:49:30,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36842, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_855442759_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745851_5027, duration(ns): 17720267
2025-07-12 11:49:30,793 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745851_5027, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 11:49:37,865 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745851_5027 replica FinalizedReplica, blk_1073745851_5027, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745851 for deletion
2025-07-12 11:49:37,867 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745851_5027 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745851
2025-07-12 11:50:30,772 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745852_5028 src: /192.168.158.9:51074 dest: /192.168.158.4:9866
2025-07-12 11:50:30,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51074, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1340405977_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745852_5028, duration(ns): 16618580
2025-07-12 11:50:30,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745852_5028, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 11:50:34,871 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745852_5028 replica FinalizedReplica, blk_1073745852_5028, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745852 for deletion
2025-07-12 11:50:34,872 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745852_5028 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745852
2025-07-12 11:52:40,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745854_5030 src: /192.168.158.8:48422 dest: /192.168.158.4:9866
2025-07-12 11:52:40,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48422, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_236435667_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745854_5030, duration(ns): 15403693
2025-07-12 11:52:40,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745854_5030, type=LAST_IN_PIPELINE terminating
2025-07-12 11:52:43,880 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745854_5030 replica FinalizedReplica, blk_1073745854_5030, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745854 for deletion
2025-07-12 11:52:43,881 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745854_5030 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745854
2025-07-12 11:54:45,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745856_5032 src: /192.168.158.6:37712 dest: /192.168.158.4:9866
2025-07-12 11:54:45,800 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1895154700_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745856_5032, duration(ns): 14905023
2025-07-12 11:54:45,800 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745856_5032, type=LAST_IN_PIPELINE terminating
2025-07-12 11:54:52,887 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745856_5032 replica FinalizedReplica, blk_1073745856_5032, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745856 for deletion
2025-07-12 11:54:52,888 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745856_5032 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745856
2025-07-12 11:55:50,780 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745857_5033 src: /192.168.158.8:52058 dest: /192.168.158.4:9866
2025-07-12 11:55:50,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52058, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_900378755_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745857_5033, duration(ns): 17799383
2025-07-12 11:55:50,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073745857_5033, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-12 11:55:52,891 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745857_5033 replica FinalizedReplica, blk_1073745857_5033, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745857 for deletion 2025-07-12 11:55:52,892 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745857_5033 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745857 2025-07-12 12:01:55,795 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745863_5039 src: /192.168.158.1:35966 dest: /192.168.158.4:9866 2025-07-12 12:01:55,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35966, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1771843146_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745863_5039, duration(ns): 23952691 2025-07-12 12:01:55,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745863_5039, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-12 12:01:58,901 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745863_5039 replica FinalizedReplica, blk_1073745863_5039, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745863 for deletion 2025-07-12 12:01:58,902 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745863_5039 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745863 2025-07-12 12:02:55,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745864_5040 src: /192.168.158.1:43582 dest: /192.168.158.4:9866 2025-07-12 12:02:55,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43582, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1515785561_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745864_5040, duration(ns): 22416212 2025-07-12 12:02:55,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745864_5040, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-12 12:03:01,904 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745864_5040 replica FinalizedReplica, blk_1073745864_5040, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745864 for deletion 2025-07-12 12:03:01,905 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745864_5040 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745864 2025-07-12 12:04:00,793 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745865_5041 src: /192.168.158.1:37946 dest: /192.168.158.4:9866 2025-07-12 12:04:00,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37946, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_353520755_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745865_5041, duration(ns): 27889440 2025-07-12 12:04:00,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745865_5041, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-12 12:04:04,905 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745865_5041 replica FinalizedReplica, blk_1073745865_5041, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745865 for deletion 2025-07-12 12:04:04,906 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745865_5041 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745865 2025-07-12 12:06:00,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745867_5043 src: /192.168.158.7:59368 dest: /192.168.158.4:9866 2025-07-12 12:06:00,821 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.7:59368, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_763702879_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745867_5043, duration(ns): 16846563 2025-07-12 12:06:00,821 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745867_5043, type=LAST_IN_PIPELINE terminating 2025-07-12 12:06:04,908 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745867_5043 replica FinalizedReplica, blk_1073745867_5043, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745867 for deletion 2025-07-12 12:06:04,909 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745867_5043 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745867 2025-07-12 12:07:00,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745868_5044 src: /192.168.158.8:33884 dest: /192.168.158.4:9866 2025-07-12 12:07:00,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33884, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1739014800_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745868_5044, duration(ns): 17195364 2025-07-12 12:07:00,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745868_5044, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=1:[192.168.158.7:9866] terminating 2025-07-12 12:07:04,908 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745868_5044 replica FinalizedReplica, blk_1073745868_5044, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745868 for deletion 2025-07-12 12:07:04,909 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745868_5044 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745868 2025-07-12 12:11:10,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745872_5048 src: /192.168.158.5:60624 dest: /192.168.158.4:9866 2025-07-12 12:11:10,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60624, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1301885142_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745872_5048, duration(ns): 15848554 2025-07-12 12:11:10,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745872_5048, type=LAST_IN_PIPELINE terminating 2025-07-12 12:11:16,914 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745872_5048 replica FinalizedReplica, blk_1073745872_5048, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745872 for deletion 
2025-07-12 12:11:16,915 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745872_5048 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745872 2025-07-12 12:13:15,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745874_5050 src: /192.168.158.8:38492 dest: /192.168.158.4:9866 2025-07-12 12:13:15,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38492, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2012547926_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745874_5050, duration(ns): 15592202 2025-07-12 12:13:15,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745874_5050, type=LAST_IN_PIPELINE terminating 2025-07-12 12:13:22,918 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745874_5050 replica FinalizedReplica, blk_1073745874_5050, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745874 for deletion 2025-07-12 12:13:22,919 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745874_5050 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745874 2025-07-12 12:14:15,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745875_5051 src: /192.168.158.1:42180 dest: 
/192.168.158.4:9866 2025-07-12 12:14:15,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42180, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_54250704_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745875_5051, duration(ns): 23852863 2025-07-12 12:14:15,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745875_5051, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-12 12:14:19,921 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745875_5051 replica FinalizedReplica, blk_1073745875_5051, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745875 for deletion 2025-07-12 12:14:19,922 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745875_5051 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745875 2025-07-12 12:15:15,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745876_5052 src: /192.168.158.6:49080 dest: /192.168.158.4:9866 2025-07-12 12:15:15,846 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49080, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-404907400_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745876_5052, duration(ns): 14842082 2025-07-12 12:15:15,847 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745876_5052, type=LAST_IN_PIPELINE terminating 2025-07-12 12:15:22,924 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745876_5052 replica FinalizedReplica, blk_1073745876_5052, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745876 for deletion 2025-07-12 12:15:22,925 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745876_5052 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745876 2025-07-12 12:16:20,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745877_5053 src: /192.168.158.9:39058 dest: /192.168.158.4:9866 2025-07-12 12:16:20,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39058, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2007035250_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745877_5053, duration(ns): 15271635 2025-07-12 12:16:20,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745877_5053, type=LAST_IN_PIPELINE terminating 2025-07-12 12:16:22,923 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745877_5053 replica FinalizedReplica, blk_1073745877_5053, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745877 for deletion 2025-07-12 12:16:22,925 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745877_5053 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745877 2025-07-12 12:18:20,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745879_5055 src: /192.168.158.9:53002 dest: /192.168.158.4:9866 2025-07-12 12:18:20,859 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53002, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2068824391_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745879_5055, duration(ns): 15665866 2025-07-12 12:18:20,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745879_5055, type=LAST_IN_PIPELINE terminating 2025-07-12 12:18:22,926 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745879_5055 replica FinalizedReplica, blk_1073745879_5055, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745879 for deletion 2025-07-12 12:18:22,927 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745879_5055 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745879 2025-07-12 12:19:20,830 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745880_5056 src: /192.168.158.6:38688 dest: /192.168.158.4:9866 2025-07-12 12:19:20,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38688, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1447969849_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745880_5056, duration(ns): 17236316 2025-07-12 12:19:20,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745880_5056, type=LAST_IN_PIPELINE terminating 2025-07-12 12:19:22,926 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745880_5056 replica FinalizedReplica, blk_1073745880_5056, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745880 for deletion 2025-07-12 12:19:22,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745880_5056 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745880 2025-07-12 12:20:20,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745881_5057 src: /192.168.158.1:41064 dest: /192.168.158.4:9866 2025-07-12 12:20:20,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41064, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1409337482_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073745881_5057, duration(ns): 23601193 2025-07-12 12:20:20,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745881_5057, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-12 12:20:25,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745881_5057 replica FinalizedReplica, blk_1073745881_5057, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745881 for deletion 2025-07-12 12:20:25,929 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745881_5057 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745881 2025-07-12 12:22:25,821 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745883_5059 src: /192.168.158.1:59752 dest: /192.168.158.4:9866 2025-07-12 12:22:25,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59752, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1949481920_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745883_5059, duration(ns): 23023155 2025-07-12 12:22:25,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745883_5059, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-12 12:22:31,930 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745883_5059 replica FinalizedReplica, blk_1073745883_5059, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745883 for deletion 2025-07-12 12:22:31,932 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745883_5059 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745883 2025-07-12 12:23:25,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745884_5060 src: /192.168.158.1:59868 dest: /192.168.158.4:9866 2025-07-12 12:23:25,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59868, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_714014943_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745884_5060, duration(ns): 23233698 2025-07-12 12:23:25,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745884_5060, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-12 12:23:28,933 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745884_5060 replica FinalizedReplica, blk_1073745884_5060, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745884 for deletion 2025-07-12 
12:23:28,934 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745884_5060 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745884 2025-07-12 12:25:25,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745886_5062 src: /192.168.158.9:41450 dest: /192.168.158.4:9866 2025-07-12 12:25:25,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41450, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1794090116_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745886_5062, duration(ns): 19146857 2025-07-12 12:25:25,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745886_5062, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-12 12:25:31,934 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745886_5062 replica FinalizedReplica, blk_1073745886_5062, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745886 for deletion 2025-07-12 12:25:31,935 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745886_5062 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745886 2025-07-12 12:26:31,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745887_5063 
src: /192.168.158.1:38970 dest: /192.168.158.4:9866 2025-07-12 12:26:31,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38970, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1577370432_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745887_5063, duration(ns): 19458468 2025-07-12 12:26:31,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745887_5063, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-12 12:26:37,934 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745887_5063 replica FinalizedReplica, blk_1073745887_5063, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745887 for deletion 2025-07-12 12:26:37,935 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745887_5063 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745887 2025-07-12 12:27:30,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745888_5064 src: /192.168.158.1:44358 dest: /192.168.158.4:9866 2025-07-12 12:27:30,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44358, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1376914254_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745888_5064, duration(ns): 21527477 
2025-07-12 12:27:30,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745888_5064, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-12 12:27:34,935 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745888_5064 replica FinalizedReplica, blk_1073745888_5064, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745888 for deletion
2025-07-12 12:27:34,936 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745888_5064 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745888
2025-07-12 12:28:35,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745889_5065 src: /192.168.158.1:49860 dest: /192.168.158.4:9866
2025-07-12 12:28:35,872 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49860, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-799564347_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745889_5065, duration(ns): 20510732
2025-07-12 12:28:35,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745889_5065, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-12 12:28:40,938 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745889_5065 replica FinalizedReplica, blk_1073745889_5065, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745889 for deletion
2025-07-12 12:28:40,939 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745889_5065 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745889
2025-07-12 12:29:35,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745890_5066 src: /192.168.158.8:35254 dest: /192.168.158.4:9866
2025-07-12 12:29:35,876 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35254, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-441432486_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745890_5066, duration(ns): 20443227
2025-07-12 12:29:35,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745890_5066, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 12:29:40,941 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745890_5066 replica FinalizedReplica, blk_1073745890_5066, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745890 for deletion
2025-07-12 12:29:40,942 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745890_5066 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745890
2025-07-12 12:30:35,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745891_5067 src: /192.168.158.1:39666 dest: /192.168.158.4:9866
2025-07-12 12:30:35,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39666, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1675991612_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745891_5067, duration(ns): 24406047
2025-07-12 12:30:35,862 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745891_5067, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-12 12:30:40,945 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745891_5067 replica FinalizedReplica, blk_1073745891_5067, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745891 for deletion
2025-07-12 12:30:40,946 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745891_5067 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745891
2025-07-12 12:31:35,835 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745892_5068 src: /192.168.158.9:33792 dest: /192.168.158.4:9866
2025-07-12 12:31:35,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33792, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_612870194_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745892_5068, duration(ns): 16081504
2025-07-12 12:31:35,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745892_5068, type=LAST_IN_PIPELINE terminating
2025-07-12 12:31:40,944 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745892_5068 replica FinalizedReplica, blk_1073745892_5068, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745892 for deletion
2025-07-12 12:31:40,945 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745892_5068 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745892
2025-07-12 12:32:35,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745893_5069 src: /192.168.158.5:49354 dest: /192.168.158.4:9866
2025-07-12 12:32:35,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49354, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-178092213_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745893_5069, duration(ns): 20987398
2025-07-12 12:32:35,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745893_5069, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 12:32:37,947 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745893_5069 replica FinalizedReplica, blk_1073745893_5069, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745893 for deletion
2025-07-12 12:32:37,948 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745893_5069 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745893
2025-07-12 12:33:40,837 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745894_5070 src: /192.168.158.7:60178 dest: /192.168.158.4:9866
2025-07-12 12:33:40,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60178, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-501552589_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745894_5070, duration(ns): 16283013
2025-07-12 12:33:40,856 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745894_5070, type=LAST_IN_PIPELINE terminating
2025-07-12 12:33:43,949 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745894_5070 replica FinalizedReplica, blk_1073745894_5070, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745894 for deletion
2025-07-12 12:33:43,950 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745894_5070 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745894
2025-07-12 12:39:50,847 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745900_5076 src: /192.168.158.1:41494 dest: /192.168.158.4:9866
2025-07-12 12:39:50,879 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41494, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_924866150_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745900_5076, duration(ns): 21842311
2025-07-12 12:39:50,879 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745900_5076, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-12 12:39:52,958 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745900_5076 replica FinalizedReplica, blk_1073745900_5076, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745900 for deletion
2025-07-12 12:39:52,959 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745900_5076 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745900
2025-07-12 12:41:50,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745902_5078 src: /192.168.158.1:34734 dest: /192.168.158.4:9866
2025-07-12 12:41:50,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34734, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1152224585_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745902_5078, duration(ns): 22284743
2025-07-12 12:41:50,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745902_5078, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-12 12:41:55,962 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745902_5078 replica FinalizedReplica, blk_1073745902_5078, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745902 for deletion
2025-07-12 12:41:55,963 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745902_5078 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745902
2025-07-12 12:42:50,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745903_5079 src: /192.168.158.1:54588 dest: /192.168.158.4:9866
2025-07-12 12:42:50,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54588, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2135782796_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745903_5079, duration(ns): 22626226
2025-07-12 12:42:50,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745903_5079, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-12 12:42:55,965 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745903_5079 replica FinalizedReplica, blk_1073745903_5079, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745903 for deletion
2025-07-12 12:42:55,966 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745903_5079 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745903
2025-07-12 12:46:00,870 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745906_5082 src: /192.168.158.1:51860 dest: /192.168.158.4:9866
2025-07-12 12:46:00,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51860, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1225450885_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745906_5082, duration(ns): 24953937
2025-07-12 12:46:00,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745906_5082, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-12 12:46:04,971 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745906_5082 replica FinalizedReplica, blk_1073745906_5082, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745906 for deletion
2025-07-12 12:46:04,972 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745906_5082 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745906
2025-07-12 12:53:10,897 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745913_5089 src: /192.168.158.5:40118 dest: /192.168.158.4:9866
2025-07-12 12:53:10,922 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40118, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1924852646_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745913_5089, duration(ns): 18764091
2025-07-12 12:53:10,922 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745913_5089, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 12:53:13,987 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745913_5089 replica FinalizedReplica, blk_1073745913_5089, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745913 for deletion
2025-07-12 12:53:13,988 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745913_5089 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745913
2025-07-12 12:54:10,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745914_5090 src: /192.168.158.5:51082 dest: /192.168.158.4:9866
2025-07-12 12:54:10,928 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51082, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-162750588_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745914_5090, duration(ns): 19209712
2025-07-12 12:54:10,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745914_5090, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 12:54:13,989 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745914_5090 replica FinalizedReplica, blk_1073745914_5090, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745914 for deletion
2025-07-12 12:54:13,990 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745914_5090 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745914
2025-07-12 12:55:15,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745915_5091 src: /192.168.158.1:37444 dest: /192.168.158.4:9866
2025-07-12 12:55:15,928 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37444, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1208996291_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745915_5091, duration(ns): 22082322
2025-07-12 12:55:15,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745915_5091, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-12 12:55:19,990 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745915_5091 replica FinalizedReplica, blk_1073745915_5091, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745915 for deletion
2025-07-12 12:55:19,991 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745915_5091 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745915
2025-07-12 12:57:20,905 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745917_5093 src: /192.168.158.1:56016 dest: /192.168.158.4:9866
2025-07-12 12:57:20,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56016, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1437781547_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745917_5093, duration(ns): 22294169
2025-07-12 12:57:20,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745917_5093, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-12 12:57:22,997 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745917_5093 replica FinalizedReplica, blk_1073745917_5093, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745917 for deletion
2025-07-12 12:57:22,998 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745917_5093 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745917
2025-07-12 12:58:20,905 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745918_5094 src: /192.168.158.1:57100 dest: /192.168.158.4:9866
2025-07-12 12:58:20,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57100, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_904766155_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745918_5094, duration(ns): 23588619
2025-07-12 12:58:20,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745918_5094, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-12 12:58:23,000 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745918_5094 replica FinalizedReplica, blk_1073745918_5094, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745918 for deletion
2025-07-12 12:58:23,001 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745918_5094 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073745918
2025-07-12 13:00:25,920 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745920_5096 src: /192.168.158.1:51972 dest: /192.168.158.4:9866
2025-07-12 13:00:25,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51972, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1375654999_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745920_5096, duration(ns): 25197339
2025-07-12 13:00:25,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745920_5096, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-12 13:00:29,002 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745920_5096 replica FinalizedReplica, blk_1073745920_5096, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745920 for deletion
2025-07-12 13:00:29,003 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745920_5096 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745920
2025-07-12 13:02:25,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745922_5098 src: /192.168.158.8:54474 dest: /192.168.158.4:9866
2025-07-12 13:02:25,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54474, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-119967527_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745922_5098, duration(ns): 16195921
2025-07-12 13:02:25,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745922_5098, type=LAST_IN_PIPELINE terminating
2025-07-12 13:02:32,008 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745922_5098 replica FinalizedReplica, blk_1073745922_5098, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745922 for deletion
2025-07-12 13:02:32,009 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745922_5098 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745922
2025-07-12 13:05:25,923 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745925_5101 src: /192.168.158.7:51904 dest: /192.168.158.4:9866
2025-07-12 13:05:25,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51904, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1333517025_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745925_5101, duration(ns): 16829249
2025-07-12 13:05:25,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745925_5101, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 13:05:29,015 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745925_5101 replica FinalizedReplica, blk_1073745925_5101, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745925 for deletion
2025-07-12 13:05:29,016 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745925_5101 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745925
2025-07-12 13:07:30,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745927_5103 src: /192.168.158.7:54392 dest: /192.168.158.4:9866
2025-07-12 13:07:30,939 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54392, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1518113522_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745927_5103, duration(ns): 18060493
2025-07-12 13:07:30,940 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745927_5103, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 13:07:35,019 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745927_5103 replica FinalizedReplica, blk_1073745927_5103, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745927 for deletion
2025-07-12 13:07:35,020 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745927_5103 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745927
2025-07-12 13:08:30,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745928_5104 src: /192.168.158.6:37398 dest: /192.168.158.4:9866
2025-07-12 13:08:30,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37398, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-77695293_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745928_5104, duration(ns): 18699861
2025-07-12 13:08:30,928 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745928_5104, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 13:08:35,019 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745928_5104 replica FinalizedReplica, blk_1073745928_5104, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745928 for deletion
2025-07-12 13:08:35,020 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745928_5104 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745928
2025-07-12 13:11:30,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745931_5107 src: /192.168.158.6:39076 dest: /192.168.158.4:9866
2025-07-12 13:11:30,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39076, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_822348103_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745931_5107, duration(ns): 20188288
2025-07-12 13:11:30,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745931_5107, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 13:11:38,027 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745931_5107 replica FinalizedReplica, blk_1073745931_5107, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745931 for deletion
2025-07-12 13:11:38,028 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745931_5107 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745931
2025-07-12 13:12:30,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745932_5108 src: /192.168.158.8:33346 dest: /192.168.158.4:9866
2025-07-12 13:12:30,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33346, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-670324277_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745932_5108, duration(ns): 21433594
2025-07-12 13:12:30,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745932_5108, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 13:12:38,028 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745932_5108 replica FinalizedReplica, blk_1073745932_5108, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745932 for deletion
2025-07-12 13:12:38,030 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745932_5108 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745932
2025-07-12 13:13:30,915 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745933_5109 src: /192.168.158.8:38180 dest: /192.168.158.4:9866
2025-07-12 13:13:30,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38180, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1363014044_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745933_5109, duration(ns): 16704219
2025-07-12 13:13:30,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745933_5109, type=LAST_IN_PIPELINE terminating
2025-07-12 13:13:35,030 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745933_5109 replica FinalizedReplica, blk_1073745933_5109, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745933 for deletion
2025-07-12 13:13:35,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745933_5109 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745933
2025-07-12 13:17:35,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745937_5113 src: /192.168.158.1:36598 dest: /192.168.158.4:9866
2025-07-12 13:17:35,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36598, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1847585206_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745937_5113, duration(ns): 22376127
2025-07-12 13:17:35,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745937_5113, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-12 13:17:38,035 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745937_5113 replica FinalizedReplica, blk_1073745937_5113, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745937 for deletion
2025-07-12 13:17:38,036 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745937_5113 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745937
2025-07-12 13:18:35,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745938_5114 src: /192.168.158.6:57006 dest: /192.168.158.4:9866
2025-07-12 13:18:35,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57006, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2049553962_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745938_5114, duration(ns): 21746548
2025-07-12 13:18:35,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745938_5114, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 13:18:41,037 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745938_5114 replica FinalizedReplica, blk_1073745938_5114, FINALIZED getNumBytes() = 
56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745938 for deletion 2025-07-12 13:18:41,038 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745938_5114 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745938 2025-07-12 13:19:35,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745939_5115 src: /192.168.158.1:40224 dest: /192.168.158.4:9866 2025-07-12 13:19:35,969 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40224, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-968369074_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745939_5115, duration(ns): 22800887 2025-07-12 13:19:35,970 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745939_5115, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-12 13:19:38,039 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745939_5115 replica FinalizedReplica, blk_1073745939_5115, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745939 for deletion 2025-07-12 13:19:38,040 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745939_5115 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745939 2025-07-12 13:20:40,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745940_5116 src: /192.168.158.7:41890 dest: /192.168.158.4:9866 2025-07-12 13:20:40,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41890, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_351245095_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745940_5116, duration(ns): 14829520 2025-07-12 13:20:40,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745940_5116, type=LAST_IN_PIPELINE terminating 2025-07-12 13:20:44,041 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745940_5116 replica FinalizedReplica, blk_1073745940_5116, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745940 for deletion 2025-07-12 13:20:44,043 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745940_5116 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745940 2025-07-12 13:22:45,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745942_5118 src: /192.168.158.9:42096 dest: /192.168.158.4:9866 2025-07-12 13:22:45,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42096, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1218439686_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745942_5118, duration(ns): 22014687 2025-07-12 13:22:45,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745942_5118, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-12 13:22:50,047 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745942_5118 replica FinalizedReplica, blk_1073745942_5118, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745942 for deletion 2025-07-12 13:22:50,048 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745942_5118 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745942 2025-07-12 13:24:45,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745944_5120 src: /192.168.158.1:55616 dest: /192.168.158.4:9866 2025-07-12 13:24:45,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55616, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_347106602_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745944_5120, duration(ns): 21067246 2025-07-12 13:24:45,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745944_5120, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 
192.168.158.6:9866] terminating 2025-07-12 13:24:50,049 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745944_5120 replica FinalizedReplica, blk_1073745944_5120, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745944 for deletion 2025-07-12 13:24:50,050 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745944_5120 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745944 2025-07-12 13:25:45,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745945_5121 src: /192.168.158.1:34262 dest: /192.168.158.4:9866 2025-07-12 13:25:45,970 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34262, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-28985186_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745945_5121, duration(ns): 21236926 2025-07-12 13:25:45,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745945_5121, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-12 13:25:53,054 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745945_5121 replica FinalizedReplica, blk_1073745945_5121, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745945 for deletion 2025-07-12 13:25:53,055 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745945_5121 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745945 2025-07-12 13:27:55,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745947_5123 src: /192.168.158.1:41138 dest: /192.168.158.4:9866 2025-07-12 13:27:55,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41138, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_731919073_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745947_5123, duration(ns): 25370760 2025-07-12 13:27:55,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745947_5123, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-12 13:27:59,060 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745947_5123 replica FinalizedReplica, blk_1073745947_5123, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745947 for deletion 2025-07-12 13:27:59,061 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745947_5123 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745947 2025-07-12 13:32:05,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745951_5127 src: /192.168.158.7:57904 dest: /192.168.158.4:9866 2025-07-12 13:32:05,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57904, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-891835944_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745951_5127, duration(ns): 17290531 2025-07-12 13:32:05,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745951_5127, type=LAST_IN_PIPELINE terminating 2025-07-12 13:32:08,070 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745951_5127 replica FinalizedReplica, blk_1073745951_5127, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745951 for deletion 2025-07-12 13:32:08,071 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745951_5127 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745951 2025-07-12 13:33:05,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745952_5128 src: /192.168.158.9:40936 dest: /192.168.158.4:9866 2025-07-12 13:33:06,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40936, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_693958353_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745952_5128, duration(ns): 17152700 2025-07-12 13:33:06,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745952_5128, type=LAST_IN_PIPELINE terminating 2025-07-12 13:33:11,072 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745952_5128 replica FinalizedReplica, blk_1073745952_5128, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745952 for deletion 2025-07-12 13:33:11,073 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745952_5128 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745952 2025-07-12 13:34:05,975 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745953_5129 src: /192.168.158.9:49652 dest: /192.168.158.4:9866 2025-07-12 13:34:05,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49652, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2120280785_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745953_5129, duration(ns): 16241560 2025-07-12 13:34:05,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745953_5129, type=LAST_IN_PIPELINE terminating 2025-07-12 13:34:08,074 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745953_5129 replica FinalizedReplica, blk_1073745953_5129, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745953 for deletion 2025-07-12 13:34:08,075 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745953_5129 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745953 2025-07-12 13:35:05,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745954_5130 src: /192.168.158.5:58746 dest: /192.168.158.4:9866 2025-07-12 13:35:06,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58746, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_398547332_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745954_5130, duration(ns): 15885386 2025-07-12 13:35:06,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745954_5130, type=LAST_IN_PIPELINE terminating 2025-07-12 13:35:08,077 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745954_5130 replica FinalizedReplica, blk_1073745954_5130, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745954 for deletion 2025-07-12 13:35:08,078 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745954_5130 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745954 2025-07-12 13:36:05,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745955_5131 src: /192.168.158.7:58716 dest: /192.168.158.4:9866 2025-07-12 13:36:06,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58716, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1683021837_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745955_5131, duration(ns): 20862520 2025-07-12 13:36:06,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745955_5131, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-12 13:36:08,081 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745955_5131 replica FinalizedReplica, blk_1073745955_5131, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745955 for deletion 2025-07-12 13:36:08,082 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745955_5131 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745955 2025-07-12 13:38:10,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745957_5133 src: 
/192.168.158.5:48062 dest: /192.168.158.4:9866 2025-07-12 13:38:11,014 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48062, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_80909337_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745957_5133, duration(ns): 19156836 2025-07-12 13:38:11,014 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745957_5133, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-12 13:38:17,084 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745957_5133 replica FinalizedReplica, blk_1073745957_5133, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745957 for deletion 2025-07-12 13:38:17,085 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745957_5133 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745957 2025-07-12 13:39:10,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745958_5134 src: /192.168.158.9:49198 dest: /192.168.158.4:9866 2025-07-12 13:39:11,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49198, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679827310_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745958_5134, duration(ns): 18896605 2025-07-12 13:39:11,012 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745958_5134, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-12 13:39:17,085 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745958_5134 replica FinalizedReplica, blk_1073745958_5134, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745958 for deletion 2025-07-12 13:39:17,087 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745958_5134 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745958 2025-07-12 13:41:11,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745960_5136 src: /192.168.158.9:48326 dest: /192.168.158.4:9866 2025-07-12 13:41:11,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48326, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1881129509_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745960_5136, duration(ns): 15383035 2025-07-12 13:41:11,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745960_5136, type=LAST_IN_PIPELINE terminating 2025-07-12 13:41:17,090 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745960_5136 replica FinalizedReplica, blk_1073745960_5136, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745960 for deletion 2025-07-12 13:41:17,091 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745960_5136 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745960 2025-07-12 13:42:11,012 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745961_5137 src: /192.168.158.5:42844 dest: /192.168.158.4:9866 2025-07-12 13:42:11,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42844, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1184745023_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745961_5137, duration(ns): 16544399 2025-07-12 13:42:11,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745961_5137, type=LAST_IN_PIPELINE terminating 2025-07-12 13:42:14,091 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745961_5137 replica FinalizedReplica, blk_1073745961_5137, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745961 for deletion 2025-07-12 13:42:14,092 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745961_5137 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745961 2025-07-12 
13:43:16,054 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745962_5138 src: /192.168.158.7:57342 dest: /192.168.158.4:9866 2025-07-12 13:43:16,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57342, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1392729977_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745962_5138, duration(ns): 16175566 2025-07-12 13:43:16,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745962_5138, type=LAST_IN_PIPELINE terminating 2025-07-12 13:43:20,095 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745962_5138 replica FinalizedReplica, blk_1073745962_5138, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745962 for deletion 2025-07-12 13:43:20,096 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745962_5138 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745962 2025-07-12 13:45:21,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745964_5140 src: /192.168.158.6:37748 dest: /192.168.158.4:9866 2025-07-12 13:45:21,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37748, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_642202621_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073745964_5140, duration(ns): 18420067
2025-07-12 13:45:21,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745964_5140, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 13:45:23,100 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745964_5140 replica FinalizedReplica, blk_1073745964_5140, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745964 for deletion
2025-07-12 13:45:23,101 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745964_5140 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745964
2025-07-12 13:52:25,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745971_5147 src: /192.168.158.9:35926 dest: /192.168.158.4:9866
2025-07-12 13:52:26,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35926, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1533184161_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745971_5147, duration(ns): 18700459
2025-07-12 13:52:26,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745971_5147, type=LAST_IN_PIPELINE terminating
2025-07-12 13:52:29,116 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745971_5147 replica FinalizedReplica, blk_1073745971_5147, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745971 for deletion
2025-07-12 13:52:29,117 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745971_5147 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745971
2025-07-12 13:53:25,998 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745972_5148 src: /192.168.158.8:46016 dest: /192.168.158.4:9866
2025-07-12 13:53:26,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46016, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-615302504_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745972_5148, duration(ns): 19943933
2025-07-12 13:53:26,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745972_5148, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 13:53:32,117 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745972_5148 replica FinalizedReplica, blk_1073745972_5148, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745972 for deletion
2025-07-12 13:53:32,119 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745972_5148 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745972
2025-07-12 13:56:26,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745975_5151 src: /192.168.158.1:40460 dest: /192.168.158.4:9866
2025-07-12 13:56:26,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40460, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1454362447_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745975_5151, duration(ns): 23474325
2025-07-12 13:56:26,037 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745975_5151, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-12 13:56:29,123 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745975_5151 replica FinalizedReplica, blk_1073745975_5151, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745975 for deletion
2025-07-12 13:56:29,125 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745975_5151 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745975
2025-07-12 13:57:26,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745976_5152 src: /192.168.158.1:33426 dest: /192.168.158.4:9866
2025-07-12 13:57:26,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33426, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1485131856_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745976_5152, duration(ns): 22484403
2025-07-12 13:57:26,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745976_5152, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-12 13:57:29,126 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745976_5152 replica FinalizedReplica, blk_1073745976_5152, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745976 for deletion
2025-07-12 13:57:29,127 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745976_5152 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745976
2025-07-12 13:58:26,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745977_5153 src: /192.168.158.8:55472 dest: /192.168.158.4:9866
2025-07-12 13:58:26,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1912357061_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745977_5153, duration(ns): 17801072
2025-07-12 13:58:26,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745977_5153, type=LAST_IN_PIPELINE terminating
2025-07-12 13:58:29,130 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745977_5153 replica FinalizedReplica, blk_1073745977_5153, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745977 for deletion
2025-07-12 13:58:29,131 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745977_5153 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745977
2025-07-12 13:59:31,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745978_5154 src: /192.168.158.6:40180 dest: /192.168.158.4:9866
2025-07-12 13:59:31,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40180, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_320617572_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745978_5154, duration(ns): 19444205
2025-07-12 13:59:31,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745978_5154, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 13:59:35,132 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745978_5154 replica FinalizedReplica, blk_1073745978_5154, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745978 for deletion
2025-07-12 13:59:35,133 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745978_5154 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745978
2025-07-12 14:04:36,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745983_5159 src: /192.168.158.1:56298 dest: /192.168.158.4:9866
2025-07-12 14:04:36,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56298, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-986903418_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745983_5159, duration(ns): 21974309
2025-07-12 14:04:36,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745983_5159, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-12 14:04:41,155 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745983_5159 replica FinalizedReplica, blk_1073745983_5159, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745983 for deletion
2025-07-12 14:04:41,156 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745983_5159 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745983
2025-07-12 14:10:51,040 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745989_5165 src: /192.168.158.8:48780 dest: /192.168.158.4:9866
2025-07-12 14:10:51,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_298678860_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745989_5165, duration(ns): 18329637
2025-07-12 14:10:51,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745989_5165, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 14:10:53,165 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745989_5165 replica FinalizedReplica, blk_1073745989_5165, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745989 for deletion
2025-07-12 14:10:53,167 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745989_5165 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745989
2025-07-12 14:11:56,051 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745990_5166 src: /192.168.158.1:39386 dest: /192.168.158.4:9866
2025-07-12 14:11:56,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39386, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_834530340_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745990_5166, duration(ns): 24298234
2025-07-12 14:11:56,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745990_5166, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-12 14:11:59,168 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745990_5166 replica FinalizedReplica, blk_1073745990_5166, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745990 for deletion
2025-07-12 14:11:59,169 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745990_5166 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745990
2025-07-12 14:13:01,057 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745991_5167 src: /192.168.158.6:36466 dest: /192.168.158.4:9866
2025-07-12 14:13:01,081 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36466, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_992559945_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745991_5167, duration(ns): 18506877
2025-07-12 14:13:01,081 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745991_5167, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 14:13:08,173 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745991_5167 replica FinalizedReplica, blk_1073745991_5167, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745991 for deletion
2025-07-12 14:13:08,175 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745991_5167 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745991
2025-07-12 14:14:01,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745992_5168 src: /192.168.158.1:41894 dest: /192.168.158.4:9866
2025-07-12 14:14:01,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41894, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1020133866_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745992_5168, duration(ns): 22322988
2025-07-12 14:14:01,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745992_5168, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-12 14:14:05,176 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745992_5168 replica FinalizedReplica, blk_1073745992_5168, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745992 for deletion
2025-07-12 14:14:05,177 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745992_5168 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745992
2025-07-12 14:16:06,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745994_5170 src: /192.168.158.9:40808 dest: /192.168.158.4:9866
2025-07-12 14:16:06,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40808, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1534459220_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745994_5170, duration(ns): 15157407
2025-07-12 14:16:06,098 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745994_5170, type=LAST_IN_PIPELINE terminating
2025-07-12 14:16:08,182 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745994_5170 replica FinalizedReplica, blk_1073745994_5170, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745994 for deletion
2025-07-12 14:16:08,184 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745994_5170 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745994
2025-07-12 14:17:06,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073745995_5171 src: /192.168.158.7:55580 dest: /192.168.158.4:9866
2025-07-12 14:17:06,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55580, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1475473905_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073745995_5171, duration(ns): 19106471
2025-07-12 14:17:06,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073745995_5171, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 14:17:11,185 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073745995_5171 replica FinalizedReplica, blk_1073745995_5171, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745995 for deletion
2025-07-12 14:17:11,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073745995_5171 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073745995
2025-07-12 14:22:16,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746000_5176 src: /192.168.158.9:39634 dest: /192.168.158.4:9866
2025-07-12 14:22:16,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39634, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-739126026_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746000_5176, duration(ns): 20123326
2025-07-12 14:22:16,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746000_5176, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 14:22:23,197 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746000_5176 replica FinalizedReplica, blk_1073746000_5176, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746000 for deletion
2025-07-12 14:22:23,198 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746000_5176 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746000
2025-07-12 14:24:21,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746002_5178 src: /192.168.158.1:55764 dest: /192.168.158.4:9866
2025-07-12 14:24:21,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55764, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1554901128_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746002_5178, duration(ns): 24237527
2025-07-12 14:24:21,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746002_5178, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-12 14:24:23,204 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746002_5178 replica FinalizedReplica, blk_1073746002_5178, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746002 for deletion
2025-07-12 14:24:23,205 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746002_5178 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746002
2025-07-12 14:26:21,113 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746004_5180 src: /192.168.158.5:50384 dest: /192.168.158.4:9866
2025-07-12 14:26:21,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50384, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_30369218_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746004_5180, duration(ns): 16272283
2025-07-12 14:26:21,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746004_5180, type=LAST_IN_PIPELINE terminating
2025-07-12 14:26:23,210 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746004_5180 replica FinalizedReplica, blk_1073746004_5180, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746004 for deletion
2025-07-12 14:26:23,211 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746004_5180 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746004
2025-07-12 14:27:21,096 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746005_5181 src: /192.168.158.5:45236 dest: /192.168.158.4:9866
2025-07-12 14:27:21,121 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45236, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1195496731_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746005_5181, duration(ns): 19166522
2025-07-12 14:27:21,121 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746005_5181, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 14:27:23,211 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746005_5181 replica FinalizedReplica, blk_1073746005_5181, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746005 for deletion
2025-07-12 14:27:23,213 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746005_5181 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746005
2025-07-12 14:28:21,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746006_5182 src: /192.168.158.9:43898 dest: /192.168.158.4:9866
2025-07-12 14:28:21,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1948022363_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746006_5182, duration(ns): 21708441
2025-07-12 14:28:21,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746006_5182, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 14:28:23,214 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746006_5182 replica FinalizedReplica, blk_1073746006_5182, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746006 for deletion
2025-07-12 14:28:23,215 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746006_5182 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746006
2025-07-12 14:29:26,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746007_5183 src: /192.168.158.9:48108 dest: /192.168.158.4:9866
2025-07-12 14:29:26,121 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48108, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1260260087_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746007_5183, duration(ns): 16396927
2025-07-12 14:29:26,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746007_5183, type=LAST_IN_PIPELINE terminating
2025-07-12 14:29:29,216 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746007_5183 replica FinalizedReplica, blk_1073746007_5183, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746007 for deletion
2025-07-12 14:29:29,217 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746007_5183 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746007
2025-07-12 14:31:26,112 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746009_5185 src: /192.168.158.6:40190 dest: /192.168.158.4:9866
2025-07-12 14:31:26,136 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40190, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-729452332_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746009_5185, duration(ns): 18187571
2025-07-12 14:31:26,136 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746009_5185, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 14:31:32,219 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746009_5185 replica FinalizedReplica, blk_1073746009_5185, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746009 for deletion
2025-07-12 14:31:32,220 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746009_5185 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746009
2025-07-12 14:38:36,111 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746016_5192 src: /192.168.158.1:32832 dest: /192.168.158.4:9866
2025-07-12 14:38:36,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:32832, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_347120204_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746016_5192, duration(ns): 26846554
2025-07-12 14:38:36,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746016_5192, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-12 14:38:41,227 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746016_5192 replica FinalizedReplica, blk_1073746016_5192, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746016 for deletion
2025-07-12 14:38:41,228 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746016_5192 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746016
2025-07-12 14:40:36,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746018_5194 src: /192.168.158.6:34346 dest: /192.168.158.4:9866
2025-07-12 14:40:36,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34346, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-339261594_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746018_5194, duration(ns): 18220649
2025-07-12 14:40:36,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746018_5194, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 14:40:41,230 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746018_5194 replica FinalizedReplica, blk_1073746018_5194, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746018 for deletion
2025-07-12 14:40:41,231 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746018_5194 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746018
2025-07-12 14:44:46,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746022_5198 src: /192.168.158.8:45654 dest: /192.168.158.4:9866
2025-07-12 14:44:46,150 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45654, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-774368184_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746022_5198, duration(ns): 18759815
2025-07-12 14:44:46,150 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746022_5198, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 14:44:50,235 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746022_5198 replica FinalizedReplica, blk_1073746022_5198, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746022 for deletion
2025-07-12 14:44:50,236 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746022_5198 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746022
2025-07-12 14:47:46,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746025_5201 src: /192.168.158.5:42224 dest: /192.168.158.4:9866
2025-07-12 14:47:46,169 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42224, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-256724789_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746025_5201, duration(ns): 19978257
2025-07-12 14:47:46,169 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746025_5201, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 14:47:50,241 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746025_5201 replica FinalizedReplica, blk_1073746025_5201, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746025 for deletion
2025-07-12 14:47:50,242 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746025_5201 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746025
2025-07-12 14:49:51,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746027_5203 src: /192.168.158.7:57104 dest: /192.168.158.4:9866
2025-07-12 14:49:51,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57104, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1871173500_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746027_5203, duration(ns): 14462721
2025-07-12 14:49:51,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746027_5203, type=LAST_IN_PIPELINE terminating
2025-07-12 14:49:53,245 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746027_5203 replica FinalizedReplica, blk_1073746027_5203, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746027 for deletion
2025-07-12 14:49:53,246 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746027_5203 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746027 2025-07-12 14:50:51,183 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746028_5204 src: /192.168.158.9:38796 dest: /192.168.158.4:9866 2025-07-12 14:50:51,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38796, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1248891258_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746028_5204, duration(ns): 13000671 2025-07-12 14:50:51,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746028_5204, type=LAST_IN_PIPELINE terminating 2025-07-12 14:50:53,245 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746028_5204 replica FinalizedReplica, blk_1073746028_5204, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746028 for deletion 2025-07-12 14:50:53,246 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746028_5204 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746028 2025-07-12 14:52:51,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746030_5206 src: /192.168.158.1:43946 dest: 
/192.168.158.4:9866 2025-07-12 14:52:51,195 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43946, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1099586292_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746030_5206, duration(ns): 21651558 2025-07-12 14:52:51,195 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746030_5206, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-12 14:52:56,248 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746030_5206 replica FinalizedReplica, blk_1073746030_5206, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746030 for deletion 2025-07-12 14:52:56,249 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746030_5206 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746030 2025-07-12 14:54:56,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746032_5208 src: /192.168.158.7:41604 dest: /192.168.158.4:9866 2025-07-12 14:54:56,175 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41604, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1531938391_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746032_5208, duration(ns): 18692026 2025-07-12 14:54:56,175 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746032_5208, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-12 14:54:59,251 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746032_5208 replica FinalizedReplica, blk_1073746032_5208, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746032 for deletion 2025-07-12 14:54:59,253 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746032_5208 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746032 2025-07-12 14:55:56,158 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746033_5209 src: /192.168.158.1:47836 dest: /192.168.158.4:9866 2025-07-12 14:55:56,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47836, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1422697370_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746033_5209, duration(ns): 22590776 2025-07-12 14:55:56,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746033_5209, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-12 14:55:59,252 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746033_5209 replica FinalizedReplica, blk_1073746033_5209, FINALIZED getNumBytes() = 
56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746033 for deletion 2025-07-12 14:55:59,254 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746033_5209 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746033 2025-07-12 14:57:56,159 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746035_5211 src: /192.168.158.1:48344 dest: /192.168.158.4:9866 2025-07-12 14:57:56,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48344, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1388537649_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746035_5211, duration(ns): 22035362 2025-07-12 14:57:56,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746035_5211, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-12 14:57:59,257 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746035_5211 replica FinalizedReplica, blk_1073746035_5211, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746035 for deletion 2025-07-12 14:57:59,258 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746035_5211 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746035 2025-07-12 15:00:56,166 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746038_5214 src: /192.168.158.7:53546 dest: /192.168.158.4:9866 2025-07-12 15:00:56,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53546, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-64447243_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746038_5214, duration(ns): 19182380 2025-07-12 15:00:56,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746038_5214, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-12 15:00:59,258 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746038_5214 replica FinalizedReplica, blk_1073746038_5214, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746038 for deletion 2025-07-12 15:00:59,260 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746038_5214 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746038 2025-07-12 15:03:01,166 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746040_5216 src: /192.168.158.1:38364 dest: /192.168.158.4:9866 2025-07-12 15:03:01,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38364, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1016524743_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746040_5216, duration(ns): 22379420 2025-07-12 15:03:01,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746040_5216, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-12 15:03:05,264 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746040_5216 replica FinalizedReplica, blk_1073746040_5216, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746040 for deletion 2025-07-12 15:03:05,265 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746040_5216 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746040 2025-07-12 15:05:01,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746042_5218 src: /192.168.158.7:53386 dest: /192.168.158.4:9866 2025-07-12 15:05:01,215 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53386, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1640010304_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746042_5218, duration(ns): 20720711 2025-07-12 15:05:01,215 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746042_5218, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-12 15:05:08,269 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746042_5218 replica FinalizedReplica, blk_1073746042_5218, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746042 for deletion 2025-07-12 15:05:08,271 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746042_5218 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746042 2025-07-12 15:06:01,157 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746043_5219 src: /192.168.158.8:54900 dest: /192.168.158.4:9866 2025-07-12 15:06:01,175 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54900, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1721404423_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746043_5219, duration(ns): 16007117 2025-07-12 15:06:01,175 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746043_5219, type=LAST_IN_PIPELINE terminating 2025-07-12 15:06:05,271 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746043_5219 replica FinalizedReplica, blk_1073746043_5219, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746043 for deletion 2025-07-12 15:06:05,272 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746043_5219 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746043 2025-07-12 15:07:01,165 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746044_5220 src: /192.168.158.6:33668 dest: /192.168.158.4:9866 2025-07-12 15:07:01,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33668, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1476796346_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746044_5220, duration(ns): 19737490 2025-07-12 15:07:01,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746044_5220, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-12 15:07:05,273 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746044_5220 replica FinalizedReplica, blk_1073746044_5220, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746044 for deletion 2025-07-12 15:07:05,275 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746044_5220 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746044 
2025-07-12 15:08:01,168 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746045_5221 src: /192.168.158.8:54842 dest: /192.168.158.4:9866 2025-07-12 15:08:01,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54842, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1078794092_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746045_5221, duration(ns): 19912820 2025-07-12 15:08:01,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746045_5221, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-12 15:08:08,274 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746045_5221 replica FinalizedReplica, blk_1073746045_5221, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746045 for deletion 2025-07-12 15:08:08,275 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746045_5221 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746045 2025-07-12 15:11:01,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746048_5224 src: /192.168.158.7:51974 dest: /192.168.158.4:9866 2025-07-12 15:11:01,188 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51974, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1939009939_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746048_5224, duration(ns): 18383281 2025-07-12 15:11:01,188 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746048_5224, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-12 15:11:08,281 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746048_5224 replica FinalizedReplica, blk_1073746048_5224, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746048 for deletion 2025-07-12 15:11:08,283 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746048_5224 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746048 2025-07-12 15:13:01,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746050_5226 src: /192.168.158.7:49284 dest: /192.168.158.4:9866 2025-07-12 15:13:01,199 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49284, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-624535437_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746050_5226, duration(ns): 15847267 2025-07-12 15:13:01,199 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746050_5226, type=LAST_IN_PIPELINE terminating 2025-07-12 15:13:05,286 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073746050_5226 replica FinalizedReplica, blk_1073746050_5226, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746050 for deletion 2025-07-12 15:13:05,287 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746050_5226 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746050 2025-07-12 15:14:06,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746051_5227 src: /192.168.158.1:44478 dest: /192.168.158.4:9866 2025-07-12 15:14:06,205 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44478, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2135356533_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746051_5227, duration(ns): 22105019 2025-07-12 15:14:06,205 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746051_5227, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-12 15:14:08,289 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746051_5227 replica FinalizedReplica, blk_1073746051_5227, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746051 for deletion 2025-07-12 15:14:08,290 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746051_5227 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746051 2025-07-12 15:15:06,172 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746052_5228 src: /192.168.158.1:59742 dest: /192.168.158.4:9866 2025-07-12 15:15:06,204 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59742, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_478710727_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746052_5228, duration(ns): 23093946 2025-07-12 15:15:06,204 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746052_5228, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-12 15:15:08,291 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746052_5228 replica FinalizedReplica, blk_1073746052_5228, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746052 for deletion 2025-07-12 15:15:08,292 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746052_5228 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746052 2025-07-12 15:16:06,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746053_5229 
src: /192.168.158.1:40018 dest: /192.168.158.4:9866 2025-07-12 15:16:06,212 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40018, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1181195734_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746053_5229, duration(ns): 26288405 2025-07-12 15:16:06,213 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746053_5229, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-12 15:16:08,292 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746053_5229 replica FinalizedReplica, blk_1073746053_5229, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746053 for deletion 2025-07-12 15:16:08,293 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746053_5229 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746053 2025-07-12 15:17:06,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746054_5230 src: /192.168.158.1:49346 dest: /192.168.158.4:9866 2025-07-12 15:17:06,208 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49346, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_318210019_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746054_5230, duration(ns): 22271532 
2025-07-12 15:17:06,208 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746054_5230, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-12 15:17:11,295 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746054_5230 replica FinalizedReplica, blk_1073746054_5230, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746054 for deletion 2025-07-12 15:17:11,297 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746054_5230 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746054 2025-07-12 15:19:06,173 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746056_5232 src: /192.168.158.1:36588 dest: /192.168.158.4:9866 2025-07-12 15:19:06,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36588, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1499130371_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746056_5232, duration(ns): 25260550 2025-07-12 15:19:06,208 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746056_5232, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-12 15:19:08,297 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746056_5232 replica 
FinalizedReplica, blk_1073746056_5232, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746056 for deletion
2025-07-12 15:19:08,298 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746056_5232 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746056
2025-07-12 15:21:06,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746058_5234 src: /192.168.158.1:41156 dest: /192.168.158.4:9866
2025-07-12 15:21:06,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41156, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1682976814_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746058_5234, duration(ns): 22346990
2025-07-12 15:21:06,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746058_5234, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-12 15:21:08,299 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746058_5234 replica FinalizedReplica, blk_1073746058_5234, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746058 for deletion
2025-07-12 15:21:08,301 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746058_5234 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746058
2025-07-12 15:23:06,194 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746060_5236 src: /192.168.158.7:36632 dest: /192.168.158.4:9866
2025-07-12 15:23:06,213 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36632, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_959431823_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746060_5236, duration(ns): 17146443
2025-07-12 15:23:06,213 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746060_5236, type=LAST_IN_PIPELINE terminating
2025-07-12 15:23:08,301 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746060_5236 replica FinalizedReplica, blk_1073746060_5236, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746060 for deletion
2025-07-12 15:23:08,303 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746060_5236 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746060
2025-07-12 15:24:06,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746061_5237 src: /192.168.158.6:56746 dest: /192.168.158.4:9866
2025-07-12 15:24:06,212 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56746, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1219787937_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746061_5237, duration(ns): 13540335
2025-07-12 15:24:06,213 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746061_5237, type=LAST_IN_PIPELINE terminating
2025-07-12 15:24:11,304 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746061_5237 replica FinalizedReplica, blk_1073746061_5237, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746061 for deletion
2025-07-12 15:24:11,305 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746061_5237 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746061
2025-07-12 15:25:06,195 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746062_5238 src: /192.168.158.1:41338 dest: /192.168.158.4:9866
2025-07-12 15:25:06,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41338, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_762921466_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746062_5238, duration(ns): 22673380
2025-07-12 15:25:06,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746062_5238, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-12 15:25:11,306 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746062_5238 replica FinalizedReplica, blk_1073746062_5238, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746062 for deletion
2025-07-12 15:25:11,307 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746062_5238 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746062
2025-07-12 15:27:11,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746064_5240 src: /192.168.158.1:33294 dest: /192.168.158.4:9866
2025-07-12 15:27:11,261 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33294, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1424300775_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746064_5240, duration(ns): 19780437
2025-07-12 15:27:11,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746064_5240, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-12 15:27:14,311 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746064_5240 replica FinalizedReplica, blk_1073746064_5240, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746064 for deletion
2025-07-12 15:27:14,312 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746064_5240 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746064
2025-07-12 15:28:16,255 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746065_5241 src: /192.168.158.6:56516 dest: /192.168.158.4:9866
2025-07-12 15:28:16,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56516, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1332740873_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746065_5241, duration(ns): 13883174
2025-07-12 15:28:16,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746065_5241, type=LAST_IN_PIPELINE terminating
2025-07-12 15:28:20,315 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746065_5241 replica FinalizedReplica, blk_1073746065_5241, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746065 for deletion
2025-07-12 15:28:20,316 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746065_5241 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746065
2025-07-12 15:30:21,235 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746067_5243 src: /192.168.158.1:51766 dest: /192.168.158.4:9866
2025-07-12 15:30:21,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51766, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_257870403_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746067_5243, duration(ns): 23191300
2025-07-12 15:30:21,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746067_5243, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-12 15:30:26,317 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746067_5243 replica FinalizedReplica, blk_1073746067_5243, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746067 for deletion
2025-07-12 15:30:26,319 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746067_5243 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746067
2025-07-12 15:33:21,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746070_5246 src: /192.168.158.1:55696 dest: /192.168.158.4:9866
2025-07-12 15:33:21,265 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55696, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-495954838_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746070_5246, duration(ns): 25818004
2025-07-12 15:33:21,265 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746070_5246, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-12 15:33:23,326 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746070_5246 replica FinalizedReplica, blk_1073746070_5246, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746070 for deletion
2025-07-12 15:33:23,327 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746070_5246 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746070
2025-07-12 15:34:21,235 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746071_5247 src: /192.168.158.9:39728 dest: /192.168.158.4:9866
2025-07-12 15:34:21,261 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39728, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1799604872_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746071_5247, duration(ns): 20329693
2025-07-12 15:34:21,261 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746071_5247, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 15:34:26,329 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746071_5247 replica FinalizedReplica, blk_1073746071_5247, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746071 for deletion
2025-07-12 15:34:26,330 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746071_5247 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746071
2025-07-12 15:35:21,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746072_5248 src: /192.168.158.1:33712 dest: /192.168.158.4:9866
2025-07-12 15:35:21,272 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1341049601_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746072_5248, duration(ns): 24138755
2025-07-12 15:35:21,272 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746072_5248, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-12 15:35:26,330 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746072_5248 replica FinalizedReplica, blk_1073746072_5248, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746072 for deletion
2025-07-12 15:35:26,330 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746072_5248 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746072
2025-07-12 15:36:26,241 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746073_5249 src: /192.168.158.5:45380 dest: /192.168.158.4:9866
2025-07-12 15:36:26,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45380, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1698173684_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746073_5249, duration(ns): 21181080
2025-07-12 15:36:26,268 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746073_5249, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 15:36:29,331 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746073_5249 replica FinalizedReplica, blk_1073746073_5249, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746073 for deletion
2025-07-12 15:36:29,332 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746073_5249 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746073
2025-07-12 15:37:26,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746074_5250 src: /192.168.158.9:60718 dest: /192.168.158.4:9866
2025-07-12 15:37:26,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60718, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_150525052_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746074_5250, duration(ns): 14655715
2025-07-12 15:37:26,274 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746074_5250, type=LAST_IN_PIPELINE terminating
2025-07-12 15:37:29,335 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746074_5250 replica FinalizedReplica, blk_1073746074_5250, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746074 for deletion
2025-07-12 15:37:29,336 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746074_5250 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746074
2025-07-12 15:39:31,253 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746076_5252 src: /192.168.158.8:46428 dest: /192.168.158.4:9866
2025-07-12 15:39:31,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46428, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2127872466_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746076_5252, duration(ns): 15716642
2025-07-12 15:39:31,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746076_5252, type=LAST_IN_PIPELINE terminating
2025-07-12 15:39:38,340 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746076_5252 replica FinalizedReplica, blk_1073746076_5252, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746076 for deletion
2025-07-12 15:39:38,341 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746076_5252 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746076
2025-07-12 15:43:36,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746080_5256 src: /192.168.158.8:45160 dest: /192.168.158.4:9866
2025-07-12 15:43:36,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45160, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-84565422_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746080_5256, duration(ns): 15706481
2025-07-12 15:43:36,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746080_5256, type=LAST_IN_PIPELINE terminating
2025-07-12 15:43:38,343 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746080_5256 replica FinalizedReplica, blk_1073746080_5256, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746080 for deletion
2025-07-12 15:43:38,344 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746080_5256 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746080
2025-07-12 15:44:41,265 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746081_5257 src: /192.168.158.7:52068 dest: /192.168.158.4:9866
2025-07-12 15:44:41,288 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52068, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1956784082_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746081_5257, duration(ns): 18160275
2025-07-12 15:44:41,288 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746081_5257, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 15:44:44,344 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746081_5257 replica FinalizedReplica, blk_1073746081_5257, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746081 for deletion
2025-07-12 15:44:44,345 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746081_5257 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746081
2025-07-12 15:46:46,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746083_5259 src: /192.168.158.6:54440 dest: /192.168.158.4:9866
2025-07-12 15:46:46,284 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54440, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-988870969_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746083_5259, duration(ns): 14573205
2025-07-12 15:46:46,284 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746083_5259, type=LAST_IN_PIPELINE terminating
2025-07-12 15:46:53,349 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746083_5259 replica FinalizedReplica, blk_1073746083_5259, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746083 for deletion
2025-07-12 15:46:53,350 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746083_5259 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746083
2025-07-12 15:52:46,286 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746089_5265 src: /192.168.158.7:56210 dest: /192.168.158.4:9866
2025-07-12 15:52:46,312 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56210, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_441391758_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746089_5265, duration(ns): 20068596
2025-07-12 15:52:46,312 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746089_5265, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 15:52:50,359 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746089_5265 replica FinalizedReplica, blk_1073746089_5265, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746089 for deletion
2025-07-12 15:52:50,360 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746089_5265 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746089
2025-07-12 15:54:51,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746091_5267 src: /192.168.158.6:36790 dest: /192.168.158.4:9866
2025-07-12 15:54:51,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36790, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_832049848_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746091_5267, duration(ns): 19324779
2025-07-12 15:54:51,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746091_5267, type=LAST_IN_PIPELINE terminating
2025-07-12 15:54:56,361 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746091_5267 replica FinalizedReplica, blk_1073746091_5267, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746091 for deletion
2025-07-12 15:54:56,362 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746091_5267 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746091
2025-07-12 15:55:51,272 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746092_5268 src: /192.168.158.7:59524 dest: /192.168.158.4:9866
2025-07-12 15:55:51,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59524, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-697889497_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746092_5268, duration(ns): 20254622
2025-07-12 15:55:51,295 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746092_5268, type=LAST_IN_PIPELINE terminating
2025-07-12 15:55:53,364 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746092_5268 replica FinalizedReplica, blk_1073746092_5268, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746092 for deletion
2025-07-12 15:55:53,365 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746092_5268 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746092
2025-07-12 15:56:51,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746093_5269 src: /192.168.158.1:41898 dest: /192.168.158.4:9866
2025-07-12 15:56:51,320 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1153522189_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746093_5269, duration(ns): 25846901
2025-07-12 15:56:51,320 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746093_5269, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-12 15:56:53,368 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746093_5269 replica FinalizedReplica, blk_1073746093_5269, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746093 for deletion
2025-07-12 15:56:53,369 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746093_5269 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746093
2025-07-12 15:59:51,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746096_5272 src: /192.168.158.1:55994 dest: /192.168.158.4:9866
2025-07-12 15:59:51,293 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55994, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1832149724_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746096_5272, duration(ns): 24766260
2025-07-12 15:59:51,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746096_5272, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-12 15:59:53,376 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746096_5272 replica FinalizedReplica, blk_1073746096_5272, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746096 for deletion
2025-07-12 15:59:53,377 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746096_5272 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746096
2025-07-12 16:01:51,275 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746098_5274 src: /192.168.158.5:33962 dest: /192.168.158.4:9866
2025-07-12 16:01:51,299 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33962, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_608284782_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746098_5274, duration(ns): 18713563
2025-07-12 16:01:51,299 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746098_5274, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 16:01:53,381 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746098_5274 replica FinalizedReplica, blk_1073746098_5274, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746098 for deletion
2025-07-12 16:01:53,382 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746098_5274 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746098
2025-07-12 16:04:56,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746101_5277 src: /192.168.158.6:33626 dest: /192.168.158.4:9866
2025-07-12 16:04:56,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33626, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1542168658_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746101_5277, duration(ns): 16107789
2025-07-12 16:04:56,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746101_5277, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 16:04:59,385 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746101_5277 replica FinalizedReplica, blk_1073746101_5277, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746101 for deletion
2025-07-12 16:04:59,386 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746101_5277 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746101
2025-07-12 16:05:56,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746102_5278 src: /192.168.158.7:45882 dest: /192.168.158.4:9866
2025-07-12 16:05:56,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45882, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1072479162_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746102_5278, duration(ns): 17865542
2025-07-12 16:05:56,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746102_5278, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 16:06:02,388 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746102_5278 replica FinalizedReplica, blk_1073746102_5278, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746102 for deletion
2025-07-12 16:06:02,389 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746102_5278 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746102
2025-07-12 16:06:56,289 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746103_5279 src: /192.168.158.7:34390 dest: /192.168.158.4:9866
2025-07-12 16:06:56,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34390, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_679532466_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746103_5279, duration(ns): 18074189
2025-07-12 16:06:56,314 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746103_5279, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 16:06:59,389 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746103_5279 replica FinalizedReplica, blk_1073746103_5279, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746103 for deletion
2025-07-12 16:06:59,390 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746103_5279 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746103
2025-07-12 16:08:01,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746104_5280 src: /192.168.158.7:37092 dest: /192.168.158.4:9866
2025-07-12 16:08:01,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37092, dest:
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1277765007_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746104_5280, duration(ns): 16553748 2025-07-12 16:08:01,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746104_5280, type=LAST_IN_PIPELINE terminating 2025-07-12 16:08:08,393 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746104_5280 replica FinalizedReplica, blk_1073746104_5280, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746104 for deletion 2025-07-12 16:08:08,394 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746104_5280 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746104 2025-07-12 16:10:01,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746106_5282 src: /192.168.158.7:37138 dest: /192.168.158.4:9866 2025-07-12 16:10:01,328 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37138, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1916680295_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746106_5282, duration(ns): 16325954 2025-07-12 16:10:01,328 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746106_5282, type=LAST_IN_PIPELINE terminating 2025-07-12 16:10:05,396 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746106_5282 replica FinalizedReplica, blk_1073746106_5282, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746106 for deletion 2025-07-12 16:10:05,397 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746106_5282 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746106 2025-07-12 16:12:01,295 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746108_5284 src: /192.168.158.7:56556 dest: /192.168.158.4:9866 2025-07-12 16:12:01,319 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56556, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1701362371_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746108_5284, duration(ns): 18759104 2025-07-12 16:12:01,320 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746108_5284, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-12 16:12:08,401 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746108_5284 replica FinalizedReplica, blk_1073746108_5284, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746108 for deletion 2025-07-12 16:12:08,402 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746108_5284 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746108 2025-07-12 16:13:01,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746109_5285 src: /192.168.158.1:40760 dest: /192.168.158.4:9866 2025-07-12 16:13:01,325 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40760, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-693946341_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746109_5285, duration(ns): 22249898 2025-07-12 16:13:01,325 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746109_5285, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-12 16:13:05,403 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746109_5285 replica FinalizedReplica, blk_1073746109_5285, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746109 for deletion 2025-07-12 16:13:05,404 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746109_5285 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746109 2025-07-12 16:14:01,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073746110_5286 src: /192.168.158.1:53204 dest: /192.168.158.4:9866 2025-07-12 16:14:01,312 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53204, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1271705343_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746110_5286, duration(ns): 22921453 2025-07-12 16:14:01,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746110_5286, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-12 16:14:08,405 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746110_5286 replica FinalizedReplica, blk_1073746110_5286, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746110 for deletion 2025-07-12 16:14:08,406 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746110_5286 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746110 2025-07-12 16:15:01,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746111_5287 src: /192.168.158.6:39184 dest: /192.168.158.4:9866 2025-07-12 16:15:01,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39184, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_244941132_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073746111_5287, duration(ns): 20139017 2025-07-12 16:15:01,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746111_5287, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-12 16:15:05,408 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746111_5287 replica FinalizedReplica, blk_1073746111_5287, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746111 for deletion 2025-07-12 16:15:05,409 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746111_5287 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746111 2025-07-12 16:21:11,300 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746117_5293 src: /192.168.158.1:46712 dest: /192.168.158.4:9866 2025-07-12 16:21:11,330 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1031060919_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746117_5293, duration(ns): 21298411 2025-07-12 16:21:11,331 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746117_5293, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-12 16:21:17,418 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746117_5293 replica FinalizedReplica, blk_1073746117_5293, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746117 for deletion 2025-07-12 16:21:17,419 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746117_5293 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746117 2025-07-12 16:25:11,307 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746121_5297 src: /192.168.158.1:44594 dest: /192.168.158.4:9866 2025-07-12 16:25:11,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44594, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1244677123_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746121_5297, duration(ns): 22447131 2025-07-12 16:25:11,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746121_5297, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-12 16:25:14,424 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746121_5297 replica FinalizedReplica, blk_1073746121_5297, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746121 for deletion 2025-07-12 
16:25:14,425 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746121_5297 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746121 2025-07-12 16:26:11,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746122_5298 src: /192.168.158.5:37150 dest: /192.168.158.4:9866 2025-07-12 16:26:11,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37150, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_373555738_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746122_5298, duration(ns): 19156122 2025-07-12 16:26:11,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746122_5298, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-12 16:26:14,427 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746122_5298 replica FinalizedReplica, blk_1073746122_5298, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746122 for deletion 2025-07-12 16:26:14,428 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746122_5298 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746122 2025-07-12 16:28:11,320 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746124_5300 
src: /192.168.158.9:50346 dest: /192.168.158.4:9866 2025-07-12 16:28:11,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50346, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-858987450_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746124_5300, duration(ns): 18872878 2025-07-12 16:28:11,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746124_5300, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-12 16:28:14,430 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746124_5300 replica FinalizedReplica, blk_1073746124_5300, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746124 for deletion 2025-07-12 16:28:14,431 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746124_5300 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746124 2025-07-12 16:30:11,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746126_5302 src: /192.168.158.5:45478 dest: /192.168.158.4:9866 2025-07-12 16:30:11,347 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45478, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1332821242_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746126_5302, duration(ns): 19150193 2025-07-12 16:30:11,347 
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746126_5302, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-12 16:30:17,433 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746126_5302 replica FinalizedReplica, blk_1073746126_5302, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746126 for deletion 2025-07-12 16:30:17,434 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746126_5302 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746126 2025-07-12 16:34:21,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746130_5306 src: /192.168.158.9:37800 dest: /192.168.158.4:9866 2025-07-12 16:34:21,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37800, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1655696343_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746130_5306, duration(ns): 20664207 2025-07-12 16:34:21,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746130_5306, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-12 16:34:26,439 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746130_5306 replica FinalizedReplica, blk_1073746130_5306, FINALIZED getNumBytes() = 56 
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746130 for deletion 2025-07-12 16:34:26,440 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746130_5306 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746130 2025-07-12 16:35:26,330 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746131_5307 src: /192.168.158.6:58424 dest: /192.168.158.4:9866 2025-07-12 16:35:26,357 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58424, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1225586307_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746131_5307, duration(ns): 21160513 2025-07-12 16:35:26,357 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746131_5307, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-12 16:35:32,440 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746131_5307 replica FinalizedReplica, blk_1073746131_5307, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746131 for deletion 2025-07-12 16:35:32,441 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746131_5307 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746131 2025-07-12 16:37:26,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746133_5309 src: /192.168.158.6:50446 dest: /192.168.158.4:9866 2025-07-12 16:37:26,369 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50446, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-50468629_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746133_5309, duration(ns): 19385071 2025-07-12 16:37:26,369 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746133_5309, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-12 16:37:29,446 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746133_5309 replica FinalizedReplica, blk_1073746133_5309, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746133 for deletion 2025-07-12 16:37:29,447 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746133_5309 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746133 2025-07-12 16:38:31,347 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746134_5310 src: /192.168.158.6:54176 dest: /192.168.158.4:9866 2025-07-12 16:38:31,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54176, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1224760056_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746134_5310, duration(ns): 21395420 2025-07-12 16:38:31,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746134_5310, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-12 16:38:32,447 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746134_5310 replica FinalizedReplica, blk_1073746134_5310, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746134 for deletion 2025-07-12 16:38:32,449 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746134_5310 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746134 2025-07-12 16:39:36,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746135_5311 src: /192.168.158.9:45918 dest: /192.168.158.4:9866 2025-07-12 16:39:36,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45918, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-126271344_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746135_5311, duration(ns): 16184231 2025-07-12 16:39:36,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746135_5311, type=LAST_IN_PIPELINE terminating 
2025-07-12 16:39:38,449 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746135_5311 replica FinalizedReplica, blk_1073746135_5311, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746135 for deletion 2025-07-12 16:39:38,450 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746135_5311 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746135 2025-07-12 16:40:36,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746136_5312 src: /192.168.158.7:36190 dest: /192.168.158.4:9866 2025-07-12 16:40:36,412 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36190, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-839501748_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746136_5312, duration(ns): 18727502 2025-07-12 16:40:36,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746136_5312, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-12 16:40:41,450 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746136_5312 replica FinalizedReplica, blk_1073746136_5312, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746136 for deletion 
2025-07-12 16:40:41,451 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746136_5312 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746136 2025-07-12 16:41:36,376 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746137_5313 src: /192.168.158.5:34988 dest: /192.168.158.4:9866 2025-07-12 16:41:36,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34988, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1575482253_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746137_5313, duration(ns): 18839115 2025-07-12 16:41:36,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746137_5313, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-12 16:41:38,452 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746137_5313 replica FinalizedReplica, blk_1073746137_5313, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746137 for deletion 2025-07-12 16:41:38,453 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746137_5313 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746137 2025-07-12 16:46:41,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073746142_5318 src: /192.168.158.1:60150 dest: /192.168.158.4:9866 2025-07-12 16:46:41,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60150, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_388122151_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746142_5318, duration(ns): 22091462 2025-07-12 16:46:41,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746142_5318, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-12 16:46:44,460 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746142_5318 replica FinalizedReplica, blk_1073746142_5318, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746142 for deletion 2025-07-12 16:46:44,462 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746142_5318 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746142 2025-07-12 16:48:51,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746144_5320 src: /192.168.158.9:45098 dest: /192.168.158.4:9866 2025-07-12 16:48:51,371 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45098, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-256258449_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073746144_5320, duration(ns): 14135399
2025-07-12 16:48:51,371 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746144_5320, type=LAST_IN_PIPELINE terminating
2025-07-12 16:48:53,466 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746144_5320 replica FinalizedReplica, blk_1073746144_5320, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746144 for deletion
2025-07-12 16:48:53,467 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746144_5320 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746144
2025-07-12 16:51:56,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746147_5323 src: /192.168.158.6:41132 dest: /192.168.158.4:9866
2025-07-12 16:51:56,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41132, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1888385118_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746147_5323, duration(ns): 13953456
2025-07-12 16:51:56,373 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746147_5323, type=LAST_IN_PIPELINE terminating
2025-07-12 16:51:59,473 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746147_5323 replica FinalizedReplica, blk_1073746147_5323, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746147 for deletion
2025-07-12 16:51:59,475 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746147_5323 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746147
2025-07-12 16:54:56,357 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746150_5326 src: /192.168.158.1:40022 dest: /192.168.158.4:9866
2025-07-12 16:54:56,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40022, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_46176656_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746150_5326, duration(ns): 28493403
2025-07-12 16:54:56,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746150_5326, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-12 16:54:59,480 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746150_5326 replica FinalizedReplica, blk_1073746150_5326, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746150 for deletion
2025-07-12 16:54:59,481 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746150_5326 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746150
2025-07-12 16:55:56,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746151_5327 src: /192.168.158.1:34878 dest: /192.168.158.4:9866
2025-07-12 16:55:56,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34878, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1934998977_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746151_5327, duration(ns): 22804766
2025-07-12 16:55:56,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746151_5327, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-12 16:55:59,481 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746151_5327 replica FinalizedReplica, blk_1073746151_5327, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746151 for deletion
2025-07-12 16:55:59,483 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746151_5327 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746151
2025-07-12 16:58:01,361 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746153_5329 src: /192.168.158.9:50416 dest: /192.168.158.4:9866
2025-07-12 16:58:01,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50416, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1591574968_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746153_5329, duration(ns): 19293291
2025-07-12 16:58:01,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746153_5329, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 16:58:02,486 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746153_5329 replica FinalizedReplica, blk_1073746153_5329, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746153 for deletion
2025-07-12 16:58:02,487 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746153_5329 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746153
2025-07-12 17:02:01,371 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746157_5333 src: /192.168.158.1:33502 dest: /192.168.158.4:9866
2025-07-12 17:02:01,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33502, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1757363145_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746157_5333, duration(ns): 21223789
2025-07-12 17:02:01,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746157_5333, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-12 17:02:02,494 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746157_5333 replica FinalizedReplica, blk_1073746157_5333, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746157 for deletion
2025-07-12 17:02:02,495 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746157_5333 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746157
2025-07-12 17:05:06,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746160_5336 src: /192.168.158.6:54734 dest: /192.168.158.4:9866
2025-07-12 17:05:06,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54734, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_248155717_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746160_5336, duration(ns): 16683406
2025-07-12 17:05:06,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746160_5336, type=LAST_IN_PIPELINE terminating
2025-07-12 17:05:08,501 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746160_5336 replica FinalizedReplica, blk_1073746160_5336, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746160 for deletion
2025-07-12 17:05:08,502 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746160_5336 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746160
2025-07-12 17:07:06,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746162_5338 src: /192.168.158.5:44498 dest: /192.168.158.4:9866
2025-07-12 17:07:06,392 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44498, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_894486074_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746162_5338, duration(ns): 14904833
2025-07-12 17:07:06,392 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746162_5338, type=LAST_IN_PIPELINE terminating
2025-07-12 17:07:08,504 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746162_5338 replica FinalizedReplica, blk_1073746162_5338, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746162 for deletion
2025-07-12 17:07:08,505 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746162_5338 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746162
2025-07-12 17:11:11,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746166_5342 src: /192.168.158.5:42000 dest: /192.168.158.4:9866
2025-07-12 17:11:11,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42000, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-959891666_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746166_5342, duration(ns): 16116465
2025-07-12 17:11:11,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746166_5342, type=LAST_IN_PIPELINE terminating
2025-07-12 17:11:14,512 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746166_5342 replica FinalizedReplica, blk_1073746166_5342, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746166 for deletion
2025-07-12 17:11:14,514 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746166_5342 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746166
2025-07-12 17:14:16,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746169_5345 src: /192.168.158.1:57762 dest: /192.168.158.4:9866
2025-07-12 17:14:16,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57762, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1063550751_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746169_5345, duration(ns): 21288959
2025-07-12 17:14:16,417 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746169_5345, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-12 17:14:17,518 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746169_5345 replica FinalizedReplica, blk_1073746169_5345, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746169 for deletion
2025-07-12 17:14:17,519 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746169_5345 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746169
2025-07-12 17:15:16,389 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746170_5346 src: /192.168.158.9:44256 dest: /192.168.158.4:9866
2025-07-12 17:15:16,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44256, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-813059080_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746170_5346, duration(ns): 19649106
2025-07-12 17:15:16,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746170_5346, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 17:15:17,520 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746170_5346 replica FinalizedReplica, blk_1073746170_5346, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746170 for deletion
2025-07-12 17:15:17,521 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746170_5346 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746170
2025-07-12 17:16:21,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746171_5347 src: /192.168.158.8:36508 dest: /192.168.158.4:9866
2025-07-12 17:16:21,408 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36508, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1570345198_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746171_5347, duration(ns): 19166978
2025-07-12 17:16:21,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746171_5347, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 17:16:23,522 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746171_5347 replica FinalizedReplica, blk_1073746171_5347, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746171 for deletion
2025-07-12 17:16:23,523 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746171_5347 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746171
2025-07-12 17:17:21,392 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746172_5348 src: /192.168.158.7:41186 dest: /192.168.158.4:9866
2025-07-12 17:17:21,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41186, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-476314248_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746172_5348, duration(ns): 15796753
2025-07-12 17:17:21,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746172_5348, type=LAST_IN_PIPELINE terminating
2025-07-12 17:17:23,523 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746172_5348 replica FinalizedReplica, blk_1073746172_5348, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746172 for deletion
2025-07-12 17:17:23,525 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746172_5348 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746172
2025-07-12 17:19:21,391 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746174_5350 src: /192.168.158.5:49442 dest: /192.168.158.4:9866
2025-07-12 17:19:21,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49442, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1038121463_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746174_5350, duration(ns): 19157728
2025-07-12 17:19:21,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746174_5350, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 17:19:23,527 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746174_5350 replica FinalizedReplica, blk_1073746174_5350, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746174 for deletion
2025-07-12 17:19:23,529 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746174_5350 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073746174
2025-07-12 17:22:31,390 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746177_5353 src: /192.168.158.1:38214 dest: /192.168.158.4:9866
2025-07-12 17:22:31,421 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38214, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_533635689_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746177_5353, duration(ns): 21852299
2025-07-12 17:22:31,421 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746177_5353, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-12 17:22:32,536 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746177_5353 replica FinalizedReplica, blk_1073746177_5353, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746177 for deletion
2025-07-12 17:22:32,537 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746177_5353 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746177
2025-07-12 17:23:31,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746178_5354 src: /192.168.158.9:56516 dest: /192.168.158.4:9866
2025-07-12 17:23:31,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56516, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-72458346_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746178_5354, duration(ns): 15163101
2025-07-12 17:23:31,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746178_5354, type=LAST_IN_PIPELINE terminating
2025-07-12 17:23:32,539 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746178_5354 replica FinalizedReplica, blk_1073746178_5354, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746178 for deletion
2025-07-12 17:23:32,540 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746178_5354 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746178
2025-07-12 17:24:36,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746179_5355 src: /192.168.158.1:57350 dest: /192.168.158.4:9866
2025-07-12 17:24:36,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57350, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_571469433_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746179_5355, duration(ns): 28069600
2025-07-12 17:24:36,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746179_5355, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-12 17:24:38,545 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746179_5355 replica FinalizedReplica, blk_1073746179_5355, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746179 for deletion
2025-07-12 17:24:38,546 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746179_5355 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746179
2025-07-12 17:27:46,407 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746182_5358 src: /192.168.158.1:59524 dest: /192.168.158.4:9866
2025-07-12 17:27:46,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59524, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_81539965_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746182_5358, duration(ns): 22632952
2025-07-12 17:27:46,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746182_5358, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-12 17:27:50,550 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746182_5358 replica FinalizedReplica, blk_1073746182_5358, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746182 for deletion
2025-07-12 17:27:50,552 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746182_5358 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746182
2025-07-12 17:28:46,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746183_5359 src: /192.168.158.1:45228 dest: /192.168.158.4:9866
2025-07-12 17:28:46,434 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45228, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_948184660_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746183_5359, duration(ns): 22324725
2025-07-12 17:28:46,434 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746183_5359, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-12 17:28:50,552 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746183_5359 replica FinalizedReplica, blk_1073746183_5359, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746183 for deletion
2025-07-12 17:28:50,554 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746183_5359 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746183
2025-07-12 17:32:01,432 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746186_5362 src: /192.168.158.1:38592 dest: /192.168.158.4:9866
2025-07-12 17:32:01,461 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38592, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1758441630_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746186_5362, duration(ns): 20457479
2025-07-12 17:32:01,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746186_5362, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-12 17:32:05,556 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746186_5362 replica FinalizedReplica, blk_1073746186_5362, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746186 for deletion
2025-07-12 17:32:05,557 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746186_5362 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746186
2025-07-12 17:33:01,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746187_5363 src: /192.168.158.1:40296 dest: /192.168.158.4:9866
2025-07-12 17:33:01,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40296, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_270638783_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746187_5363, duration(ns): 24262887
2025-07-12 17:33:01,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746187_5363, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-12 17:33:02,557 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746187_5363 replica FinalizedReplica, blk_1073746187_5363, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746187 for deletion
2025-07-12 17:33:02,558 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746187_5363 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746187
2025-07-12 17:34:06,408 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746188_5364 src: /192.168.158.6:36904 dest: /192.168.158.4:9866
2025-07-12 17:34:06,433 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36904, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1044315303_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746188_5364, duration(ns): 19173242
2025-07-12 17:34:06,433 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746188_5364, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 17:34:08,559 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746188_5364 replica FinalizedReplica, blk_1073746188_5364, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746188 for deletion
2025-07-12 17:34:08,560 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746188_5364 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746188
2025-07-12 17:35:06,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746189_5365 src: /192.168.158.7:51574 dest: /192.168.158.4:9866
2025-07-12 17:35:06,436 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51574, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-907111485_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746189_5365, duration(ns): 19509773
2025-07-12 17:35:06,436 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746189_5365, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 17:35:08,560 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746189_5365 replica FinalizedReplica, blk_1073746189_5365, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746189 for deletion
2025-07-12 17:35:08,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746189_5365 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746189
2025-07-12 17:36:13,268 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-12 17:37:20,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f31, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-12 17:37:20,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-12 17:38:06,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746192_5368 src: /192.168.158.1:46862 dest: /192.168.158.4:9866
2025-07-12 17:38:06,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46862, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_232972107_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746192_5368, duration(ns): 23017879
2025-07-12 17:38:06,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746192_5368, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-12 17:38:08,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746192_5368 replica FinalizedReplica, blk_1073746192_5368, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746192 for deletion
2025-07-12 17:38:08,564 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746192_5368 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746192
2025-07-12 17:41:21,421 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746195_5371 src: /192.168.158.1:51374 dest: /192.168.158.4:9866
2025-07-12 17:41:21,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51374, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1300112288_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746195_5371, duration(ns): 23108768
2025-07-12 17:41:21,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746195_5371, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-12 17:41:26,573 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746195_5371 replica FinalizedReplica, blk_1073746195_5371, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746195 for deletion
2025-07-12 17:41:26,574 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746195_5371 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746195
2025-07-12 17:42:21,424 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746196_5372 src: /192.168.158.5:47556 dest: /192.168.158.4:9866
2025-07-12 17:42:21,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.5:47556, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_336509855_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746196_5372, duration(ns): 19384652 2025-07-12 17:42:21,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746196_5372, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-12 17:42:26,573 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746196_5372 replica FinalizedReplica, blk_1073746196_5372, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746196 for deletion 2025-07-12 17:42:26,574 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746196_5372 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746196 2025-07-12 17:44:31,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746198_5374 src: /192.168.158.1:48640 dest: /192.168.158.4:9866 2025-07-12 17:44:31,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48640, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1158325794_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746198_5374, duration(ns): 21485166 2025-07-12 17:44:31,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746198_5374, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-12 17:44:35,578 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746198_5374 replica FinalizedReplica, blk_1073746198_5374, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746198 for deletion 2025-07-12 17:44:35,579 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746198_5374 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746198 2025-07-12 17:45:31,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746199_5375 src: /192.168.158.1:58444 dest: /192.168.158.4:9866 2025-07-12 17:45:31,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58444, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1953144760_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746199_5375, duration(ns): 21194810 2025-07-12 17:45:31,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746199_5375, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-12 17:45:35,578 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746199_5375 replica FinalizedReplica, blk_1073746199_5375, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746199 for deletion 2025-07-12 17:45:35,580 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746199_5375 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746199 2025-07-12 17:50:36,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746204_5380 src: /192.168.158.7:51672 dest: /192.168.158.4:9866 2025-07-12 17:50:36,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-627648492_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746204_5380, duration(ns): 15213703 2025-07-12 17:50:36,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746204_5380, type=LAST_IN_PIPELINE terminating 2025-07-12 17:50:41,587 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746204_5380 replica FinalizedReplica, blk_1073746204_5380, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746204 for deletion 2025-07-12 17:50:41,588 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746204_5380 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746204 2025-07-12 17:55:41,441 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746209_5385 src: /192.168.158.9:45200 dest: /192.168.158.4:9866 2025-07-12 17:55:41,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45200, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1410801210_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746209_5385, duration(ns): 18396119 2025-07-12 17:55:41,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746209_5385, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-12 17:55:44,597 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746209_5385 replica FinalizedReplica, blk_1073746209_5385, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746209 for deletion 2025-07-12 17:55:44,599 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746209_5385 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746209 2025-07-12 17:56:41,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746210_5386 src: /192.168.158.1:45826 dest: /192.168.158.4:9866 2025-07-12 17:56:41,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45826, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1369366985_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746210_5386, duration(ns): 20871206 2025-07-12 17:56:41,468 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746210_5386, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-12 17:56:47,604 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746210_5386 replica FinalizedReplica, blk_1073746210_5386, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746210 for deletion 2025-07-12 17:56:47,605 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746210_5386 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746210 2025-07-12 17:57:46,436 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746211_5387 src: /192.168.158.7:50642 dest: /192.168.158.4:9866 2025-07-12 17:57:46,461 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50642, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_16223318_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746211_5387, duration(ns): 19360145 2025-07-12 17:57:46,461 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746211_5387, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-12 17:57:47,607 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746211_5387 replica FinalizedReplica, blk_1073746211_5387, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746211 for deletion 2025-07-12 17:57:47,608 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746211_5387 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746211 2025-07-12 17:58:51,435 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746212_5388 src: /192.168.158.1:59198 dest: /192.168.158.4:9866 2025-07-12 17:58:51,466 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59198, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2059210898_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746212_5388, duration(ns): 22631632 2025-07-12 17:58:51,466 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746212_5388, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-12 17:58:53,609 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746212_5388 replica FinalizedReplica, blk_1073746212_5388, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746212 for deletion 2025-07-12 
17:58:53,610 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746212_5388 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746212 2025-07-12 17:59:56,472 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746213_5389 src: /192.168.158.9:36854 dest: /192.168.158.4:9866 2025-07-12 17:59:56,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36854, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1304454273_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746213_5389, duration(ns): 17228825 2025-07-12 17:59:56,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746213_5389, type=LAST_IN_PIPELINE terminating 2025-07-12 17:59:59,611 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746213_5389 replica FinalizedReplica, blk_1073746213_5389, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746213 for deletion 2025-07-12 17:59:59,612 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746213_5389 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746213 2025-07-12 18:00:56,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746214_5390 src: /192.168.158.9:41592 dest: 
/192.168.158.4:9866 2025-07-12 18:00:56,463 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41592, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1770050548_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746214_5390, duration(ns): 15517612 2025-07-12 18:00:56,463 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746214_5390, type=LAST_IN_PIPELINE terminating 2025-07-12 18:00:59,612 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746214_5390 replica FinalizedReplica, blk_1073746214_5390, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746214 for deletion 2025-07-12 18:00:59,614 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746214_5390 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746214 2025-07-12 18:02:01,454 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746215_5391 src: /192.168.158.9:35556 dest: /192.168.158.4:9866 2025-07-12 18:02:01,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35556, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_944256878_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746215_5391, duration(ns): 15629751 2025-07-12 18:02:01,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073746215_5391, type=LAST_IN_PIPELINE terminating 2025-07-12 18:02:02,616 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746215_5391 replica FinalizedReplica, blk_1073746215_5391, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746215 for deletion 2025-07-12 18:02:02,617 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746215_5391 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746215 2025-07-12 18:03:01,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746216_5392 src: /192.168.158.1:52040 dest: /192.168.158.4:9866 2025-07-12 18:03:01,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52040, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1720244796_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746216_5392, duration(ns): 22065100 2025-07-12 18:03:01,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746216_5392, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-12 18:03:05,616 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746216_5392 replica FinalizedReplica, blk_1073746216_5392, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746216 for deletion 2025-07-12 18:03:05,618 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746216_5392 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746216 2025-07-12 18:04:01,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746217_5393 src: /192.168.158.7:34390 dest: /192.168.158.4:9866 2025-07-12 18:04:01,490 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34390, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1403362855_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746217_5393, duration(ns): 17519420 2025-07-12 18:04:01,490 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746217_5393, type=LAST_IN_PIPELINE terminating 2025-07-12 18:04:02,618 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746217_5393 replica FinalizedReplica, blk_1073746217_5393, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746217 for deletion 2025-07-12 18:04:02,619 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746217_5393 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746217 2025-07-12 18:05:01,457 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746218_5394 src: /192.168.158.1:36684 dest: /192.168.158.4:9866 2025-07-12 18:05:01,490 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36684, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1542141494_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746218_5394, duration(ns): 22287180 2025-07-12 18:05:01,490 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746218_5394, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-12 18:05:02,619 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746218_5394 replica FinalizedReplica, blk_1073746218_5394, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746218 for deletion 2025-07-12 18:05:02,620 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746218_5394 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746218 2025-07-12 18:06:06,466 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746219_5395 src: /192.168.158.8:49724 dest: /192.168.158.4:9866 2025-07-12 18:06:06,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49724, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-261535561_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746219_5395, duration(ns): 14688237 2025-07-12 18:06:06,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746219_5395, type=LAST_IN_PIPELINE terminating 2025-07-12 18:06:11,619 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746219_5395 replica FinalizedReplica, blk_1073746219_5395, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746219 for deletion 2025-07-12 18:06:11,620 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746219_5395 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746219 2025-07-12 18:07:06,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746220_5396 src: /192.168.158.1:43272 dest: /192.168.158.4:9866 2025-07-12 18:07:06,475 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43272, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1963529540_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746220_5396, duration(ns): 21844665 2025-07-12 18:07:06,475 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746220_5396, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-12 18:07:08,621 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746220_5396 replica FinalizedReplica, blk_1073746220_5396, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746220 for deletion 2025-07-12 18:07:08,622 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746220_5396 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746220 2025-07-12 18:08:06,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746221_5397 src: /192.168.158.6:59252 dest: /192.168.158.4:9866 2025-07-12 18:08:06,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59252, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1535050997_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746221_5397, duration(ns): 18600611 2025-07-12 18:08:06,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746221_5397, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-12 18:08:08,621 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746221_5397 replica FinalizedReplica, blk_1073746221_5397, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746221 for deletion 2025-07-12 18:08:08,622 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746221_5397 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746221
2025-07-12 18:09:06,459 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746222_5398 src: /192.168.158.5:35592 dest: /192.168.158.4:9866
2025-07-12 18:09:06,480 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35592, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2127346819_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746222_5398, duration(ns): 19183716
2025-07-12 18:09:06,481 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746222_5398, type=LAST_IN_PIPELINE terminating
2025-07-12 18:09:11,624 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746222_5398 replica FinalizedReplica, blk_1073746222_5398, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746222 for deletion
2025-07-12 18:09:11,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746222_5398 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746222
2025-07-12 18:10:11,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746223_5399 src: /192.168.158.8:60604 dest: /192.168.158.4:9866
2025-07-12 18:10:11,488 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60604, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_379744812_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746223_5399, duration(ns): 18223442
2025-07-12 18:10:11,489 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746223_5399, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 18:10:14,626 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746223_5399 replica FinalizedReplica, blk_1073746223_5399, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746223 for deletion
2025-07-12 18:10:14,627 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746223_5399 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746223
2025-07-12 18:16:16,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746229_5405 src: /192.168.158.1:51708 dest: /192.168.158.4:9866
2025-07-12 18:16:16,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51708, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_247637660_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746229_5405, duration(ns): 21722159
2025-07-12 18:16:16,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746229_5405, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-12 18:16:17,642 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746229_5405 replica FinalizedReplica, blk_1073746229_5405, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746229 for deletion
2025-07-12 18:16:17,643 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746229_5405 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746229
2025-07-12 18:19:21,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746232_5408 src: /192.168.158.8:52466 dest: /192.168.158.4:9866
2025-07-12 18:19:21,522 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52466, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_888311671_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746232_5408, duration(ns): 18742679
2025-07-12 18:19:21,522 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746232_5408, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 18:19:23,651 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746232_5408 replica FinalizedReplica, blk_1073746232_5408, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746232 for deletion
2025-07-12 18:19:23,652 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746232_5408 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746232
2025-07-12 18:20:21,501 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746233_5409 src: /192.168.158.9:54380 dest: /192.168.158.4:9866
2025-07-12 18:20:21,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54380, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1379423245_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746233_5409, duration(ns): 17167983
2025-07-12 18:20:21,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746233_5409, type=LAST_IN_PIPELINE terminating
2025-07-12 18:20:26,653 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746233_5409 replica FinalizedReplica, blk_1073746233_5409, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746233 for deletion
2025-07-12 18:20:26,654 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746233_5409 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746233
2025-07-12 18:24:31,507 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746237_5413 src: /192.168.158.6:37282 dest: /192.168.158.4:9866
2025-07-12 18:24:31,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37282, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_925751449_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746237_5413, duration(ns): 19641366
2025-07-12 18:24:31,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746237_5413, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 18:24:32,664 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746237_5413 replica FinalizedReplica, blk_1073746237_5413, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746237 for deletion
2025-07-12 18:24:32,666 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746237_5413 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746237
2025-07-12 18:26:31,511 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746239_5415 src: /192.168.158.1:44818 dest: /192.168.158.4:9866
2025-07-12 18:26:31,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44818, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_582394213_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746239_5415, duration(ns): 21519665
2025-07-12 18:26:31,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746239_5415, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-12 18:26:35,671 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746239_5415 replica FinalizedReplica, blk_1073746239_5415, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746239 for deletion
2025-07-12 18:26:35,672 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746239_5415 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746239
2025-07-12 18:27:31,511 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746240_5416 src: /192.168.158.5:60558 dest: /192.168.158.4:9866
2025-07-12 18:27:31,534 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60558, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-146789308_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746240_5416, duration(ns): 17766697
2025-07-12 18:27:31,534 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746240_5416, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 18:27:32,671 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746240_5416 replica FinalizedReplica, blk_1073746240_5416, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746240 for deletion
2025-07-12 18:27:32,672 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746240_5416 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746240
2025-07-12 18:31:41,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746244_5420 src: /192.168.158.1:43264 dest: /192.168.158.4:9866
2025-07-12 18:31:41,527 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43264, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_20873914_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746244_5420, duration(ns): 23877422
2025-07-12 18:31:41,527 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746244_5420, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-12 18:31:44,683 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746244_5420 replica FinalizedReplica, blk_1073746244_5420, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746244 for deletion
2025-07-12 18:31:44,684 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746244_5420 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746244
2025-07-12 18:33:41,524 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746246_5422 src: /192.168.158.1:60304 dest: /192.168.158.4:9866
2025-07-12 18:33:41,557 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60304, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1813199514_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746246_5422, duration(ns): 24407667
2025-07-12 18:33:41,558 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746246_5422, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-12 18:33:44,687 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746246_5422 replica FinalizedReplica, blk_1073746246_5422, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746246 for deletion
2025-07-12 18:33:44,688 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746246_5422 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746246
2025-07-12 18:35:46,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746248_5424 src: /192.168.158.1:51012 dest: /192.168.158.4:9866
2025-07-12 18:35:46,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51012, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_477606789_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746248_5424, duration(ns): 22349315
2025-07-12 18:35:46,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746248_5424, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-12 18:35:47,689 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746248_5424 replica FinalizedReplica, blk_1073746248_5424, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746248 for deletion
2025-07-12 18:35:47,690 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746248_5424 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746248
2025-07-12 18:37:51,511 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746250_5426 src: /192.168.158.1:54798 dest: /192.168.158.4:9866
2025-07-12 18:37:51,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54798, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1993382729_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746250_5426, duration(ns): 23457286
2025-07-12 18:37:51,544 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746250_5426, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-12 18:37:53,691 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746250_5426 replica FinalizedReplica, blk_1073746250_5426, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746250 for deletion
2025-07-12 18:37:53,692 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746250_5426 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746250
2025-07-12 18:39:56,531 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746252_5428 src: /192.168.158.9:58802 dest: /192.168.158.4:9866
2025-07-12 18:39:56,557 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58802, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-48723774_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746252_5428, duration(ns): 20116049
2025-07-12 18:39:56,557 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746252_5428, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 18:39:59,695 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746252_5428 replica FinalizedReplica, blk_1073746252_5428, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746252 for deletion
2025-07-12 18:39:59,696 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746252_5428 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746252
2025-07-12 18:46:06,530 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746258_5434 src: /192.168.158.7:35480 dest: /192.168.158.4:9866
2025-07-12 18:46:06,548 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35480, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_165285679_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746258_5434, duration(ns): 15650676
2025-07-12 18:46:06,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746258_5434, type=LAST_IN_PIPELINE terminating
2025-07-12 18:46:08,705 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746258_5434 replica FinalizedReplica, blk_1073746258_5434, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746258 for deletion
2025-07-12 18:46:08,706 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746258_5434 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746258
2025-07-12 18:48:06,535 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746260_5436 src: /192.168.158.6:57772 dest: /192.168.158.4:9866
2025-07-12 18:48:06,553 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57772, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_802172530_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746260_5436, duration(ns): 15819982
2025-07-12 18:48:06,553 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746260_5436, type=LAST_IN_PIPELINE terminating
2025-07-12 18:48:08,707 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746260_5436 replica FinalizedReplica, blk_1073746260_5436, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746260 for deletion
2025-07-12 18:48:08,708 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746260_5436 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746260
2025-07-12 18:53:11,534 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746265_5441 src: /192.168.158.1:33782 dest: /192.168.158.4:9866
2025-07-12 18:53:11,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33782, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-763820880_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746265_5441, duration(ns): 21218942
2025-07-12 18:53:11,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746265_5441, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-12 18:53:14,713 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746265_5441 replica FinalizedReplica, blk_1073746265_5441, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746265 for deletion
2025-07-12 18:53:14,714 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746265_5441 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746265
2025-07-12 18:54:11,534 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746266_5442 src: /192.168.158.8:43588 dest: /192.168.158.4:9866
2025-07-12 18:54:11,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43588, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_229481638_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746266_5442, duration(ns): 17051569
2025-07-12 18:54:11,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746266_5442, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 18:54:14,713 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746266_5442 replica FinalizedReplica, blk_1073746266_5442, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746266 for deletion
2025-07-12 18:54:14,715 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746266_5442 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746266
2025-07-12 18:56:16,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746268_5444 src: /192.168.158.1:56990 dest: /192.168.158.4:9866
2025-07-12 18:56:16,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56990, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_257988597_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746268_5444, duration(ns): 21803892
2025-07-12 18:56:16,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746268_5444, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-12 18:56:17,715 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746268_5444 replica FinalizedReplica, blk_1073746268_5444, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746268 for deletion
2025-07-12 18:56:17,717 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746268_5444 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746268
2025-07-12 18:57:21,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746269_5445 src: /192.168.158.6:55636 dest: /192.168.158.4:9866
2025-07-12 18:57:21,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55636, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_63208134_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746269_5445, duration(ns): 18440020
2025-07-12 18:57:21,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746269_5445, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 18:57:23,717 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746269_5445 replica FinalizedReplica, blk_1073746269_5445, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746269 for deletion
2025-07-12 18:57:23,718 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746269_5445 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746269
2025-07-12 18:58:26,534 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746270_5446 src: /192.168.158.9:48538 dest: /192.168.158.4:9866
2025-07-12 18:58:26,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48538, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1627647851_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746270_5446, duration(ns): 19919563
2025-07-12 18:58:26,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746270_5446, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 18:58:29,718 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746270_5446 replica FinalizedReplica, blk_1073746270_5446, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746270 for deletion
2025-07-12 18:58:29,719 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746270_5446 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746270
2025-07-12 19:01:36,544 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746273_5449 src: /192.168.158.1:51162 dest: /192.168.158.4:9866
2025-07-12 19:01:36,576 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51162, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1795592649_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746273_5449, duration(ns): 22854024
2025-07-12 19:01:36,576 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746273_5449, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-12 19:01:38,720 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746273_5449 replica FinalizedReplica, blk_1073746273_5449, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746273 for deletion
2025-07-12 19:01:38,721 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746273_5449 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746273
2025-07-12 19:02:36,558 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746274_5450 src: /192.168.158.8:37474 dest: /192.168.158.4:9866
2025-07-12 19:02:36,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37474, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-311374385_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746274_5450, duration(ns): 20716007
2025-07-12 19:02:36,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746274_5450, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 19:02:38,722 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746274_5450 replica FinalizedReplica, blk_1073746274_5450, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746274 for deletion
2025-07-12 19:02:38,723 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746274_5450 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746274
2025-07-12 19:03:41,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746275_5451 src: /192.168.158.1:49156 dest: /192.168.158.4:9866
2025-07-12 19:03:41,584 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49156, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2108112279_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746275_5451, duration(ns): 24499856
2025-07-12 19:03:41,584 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746275_5451, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-12 19:03:44,721 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746275_5451 replica FinalizedReplica, blk_1073746275_5451, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746275 for deletion
2025-07-12 19:03:44,722 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746275_5451 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746275
2025-07-12 19:04:41,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746276_5452 src: /192.168.158.1:47764 dest: /192.168.158.4:9866
2025-07-12 19:04:41,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47764, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-88863047_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746276_5452, duration(ns): 20557783
2025-07-12 19:04:41,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746276_5452, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-12 19:04:44,724 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746276_5452 replica FinalizedReplica, blk_1073746276_5452, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746276 for deletion
2025-07-12 19:04:44,725 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746276_5452 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746276
2025-07-12 19:06:51,553 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746278_5454 src: /192.168.158.8:56740 dest: /192.168.158.4:9866
2025-07-12 19:06:51,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56740, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1082110825_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746278_5454, duration(ns): 23111350
2025-07-12 19:06:51,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746278_5454, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 19:06:56,728 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746278_5454 replica FinalizedReplica, blk_1073746278_5454, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746278 for deletion
2025-07-12 19:06:56,729 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746278_5454 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746278
2025-07-12 19:07:51,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746279_5455 src: /192.168.158.8:48548 dest: /192.168.158.4:9866
2025-07-12 19:07:51,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48548, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1073308994_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746279_5455, duration(ns): 15581519
2025-07-12 19:07:51,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746279_5455, type=LAST_IN_PIPELINE terminating
2025-07-12 19:07:53,729 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746279_5455 replica FinalizedReplica, blk_1073746279_5455, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746279 for deletion
2025-07-12 19:07:53,730 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746279_5455 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746279
2025-07-12 19:08:56,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746280_5456 src: /192.168.158.7:34164 dest: /192.168.158.4:9866
2025-07-12 19:08:56,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34164, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-154297533_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746280_5456, duration(ns): 15026882
2025-07-12 19:08:56,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746280_5456, type=LAST_IN_PIPELINE terminating
2025-07-12 19:08:59,730 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746280_5456 replica FinalizedReplica, blk_1073746280_5456, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746280 for deletion 2025-07-12 19:08:59,731 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746280_5456 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746280 2025-07-12 19:10:01,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746281_5457 src: /192.168.158.7:41282 dest: /192.168.158.4:9866 2025-07-12 19:10:01,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41282, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1644378896_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746281_5457, duration(ns): 17839891 2025-07-12 19:10:01,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746281_5457, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-12 19:10:05,732 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746281_5457 replica FinalizedReplica, blk_1073746281_5457, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746281 for deletion 2025-07-12 19:10:05,733 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746281_5457 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746281 2025-07-12 19:13:01,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746284_5460 src: /192.168.158.8:47462 dest: /192.168.158.4:9866 2025-07-12 19:13:01,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47462, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2132011035_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746284_5460, duration(ns): 18883887 2025-07-12 19:13:01,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746284_5460, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-12 19:13:02,735 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746284_5460 replica FinalizedReplica, blk_1073746284_5460, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746284 for deletion 2025-07-12 19:13:02,736 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746284_5460 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746284 2025-07-12 19:16:01,574 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746287_5463 src: 
/192.168.158.8:36226 dest: /192.168.158.4:9866 2025-07-12 19:16:01,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36226, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1968086994_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746287_5463, duration(ns): 16016125 2025-07-12 19:16:01,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746287_5463, type=LAST_IN_PIPELINE terminating 2025-07-12 19:16:02,742 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746287_5463 replica FinalizedReplica, blk_1073746287_5463, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746287 for deletion 2025-07-12 19:16:02,744 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746287_5463 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746287 2025-07-12 19:17:01,576 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746288_5464 src: /192.168.158.1:49884 dest: /192.168.158.4:9866 2025-07-12 19:17:01,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49884, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1764467477_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746288_5464, duration(ns): 23979123 2025-07-12 19:17:01,609 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746288_5464, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-12 19:17:02,745 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746288_5464 replica FinalizedReplica, blk_1073746288_5464, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746288 for deletion 2025-07-12 19:17:02,746 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746288_5464 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746288 2025-07-12 19:18:01,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746289_5465 src: /192.168.158.8:51238 dest: /192.168.158.4:9866 2025-07-12 19:18:01,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51238, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_7240923_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746289_5465, duration(ns): 19887284 2025-07-12 19:18:01,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746289_5465, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-12 19:18:02,747 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746289_5465 replica FinalizedReplica, blk_1073746289_5465, FINALIZED getNumBytes() = 56 
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746289 for deletion 2025-07-12 19:18:02,749 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746289_5465 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746289 2025-07-12 19:19:01,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746290_5466 src: /192.168.158.5:46558 dest: /192.168.158.4:9866 2025-07-12 19:19:01,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46558, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-346780031_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746290_5466, duration(ns): 20285158 2025-07-12 19:19:01,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746290_5466, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-12 19:19:05,748 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746290_5466 replica FinalizedReplica, blk_1073746290_5466, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746290 for deletion 2025-07-12 19:19:05,749 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746290_5466 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746290 2025-07-12 19:20:01,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746291_5467 src: /192.168.158.9:37530 dest: /192.168.158.4:9866 2025-07-12 19:20:01,599 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37530, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1271080318_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746291_5467, duration(ns): 19998307 2025-07-12 19:20:01,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746291_5467, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-12 19:20:05,749 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746291_5467 replica FinalizedReplica, blk_1073746291_5467, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746291 for deletion 2025-07-12 19:20:05,751 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746291_5467 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746291 2025-07-12 19:22:06,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746293_5469 src: /192.168.158.8:55932 dest: /192.168.158.4:9866 2025-07-12 19:22:06,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55932, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_46260575_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746293_5469, duration(ns): 16925494 2025-07-12 19:22:06,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746293_5469, type=LAST_IN_PIPELINE terminating 2025-07-12 19:22:11,754 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746293_5469 replica FinalizedReplica, blk_1073746293_5469, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746293 for deletion 2025-07-12 19:22:11,755 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746293_5469 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746293 2025-07-12 19:23:11,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746294_5470 src: /192.168.158.1:50554 dest: /192.168.158.4:9866 2025-07-12 19:23:11,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50554, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1751307130_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746294_5470, duration(ns): 24165686 2025-07-12 19:23:11,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746294_5470, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 
192.168.158.6:9866] terminating 2025-07-12 19:23:14,757 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746294_5470 replica FinalizedReplica, blk_1073746294_5470, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746294 for deletion 2025-07-12 19:23:14,758 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746294_5470 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746294 2025-07-12 19:26:16,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746297_5473 src: /192.168.158.1:55784 dest: /192.168.158.4:9866 2025-07-12 19:26:16,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55784, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1653158215_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746297_5473, duration(ns): 23996078 2025-07-12 19:26:16,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746297_5473, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-12 19:26:20,762 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746297_5473 replica FinalizedReplica, blk_1073746297_5473, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746297 for deletion 2025-07-12 19:26:20,763 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746297_5473 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746297 2025-07-12 19:28:21,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746299_5475 src: /192.168.158.7:55996 dest: /192.168.158.4:9866 2025-07-12 19:28:21,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55996, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-65175149_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746299_5475, duration(ns): 20506586 2025-07-12 19:28:21,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746299_5475, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-12 19:28:26,764 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746299_5475 replica FinalizedReplica, blk_1073746299_5475, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746299 for deletion 2025-07-12 19:28:26,765 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746299_5475 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746299 
2025-07-12 19:30:26,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746301_5477 src: /192.168.158.1:38928 dest: /192.168.158.4:9866 2025-07-12 19:30:26,619 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38928, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-90089885_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746301_5477, duration(ns): 21660551 2025-07-12 19:30:26,619 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746301_5477, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-12 19:30:32,766 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746301_5477 replica FinalizedReplica, blk_1073746301_5477, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746301 for deletion 2025-07-12 19:30:32,767 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746301_5477 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746301 2025-07-12 19:31:26,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746302_5478 src: /192.168.158.6:36230 dest: /192.168.158.4:9866 2025-07-12 19:31:26,603 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36230, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1043540580_106, 
offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746302_5478, duration(ns): 17909754 2025-07-12 19:31:26,603 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746302_5478, type=LAST_IN_PIPELINE terminating 2025-07-12 19:31:29,767 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746302_5478 replica FinalizedReplica, blk_1073746302_5478, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746302 for deletion 2025-07-12 19:31:29,768 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746302_5478 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746302 2025-07-12 19:35:36,594 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746306_5482 src: /192.168.158.8:44698 dest: /192.168.158.4:9866 2025-07-12 19:35:36,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44698, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1997717711_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746306_5482, duration(ns): 21885497 2025-07-12 19:35:36,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746306_5482, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-12 19:35:38,777 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746306_5482 replica FinalizedReplica, blk_1073746306_5482, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746306 for deletion 2025-07-12 19:35:38,778 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746306_5482 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746306 2025-07-12 19:36:41,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746307_5483 src: /192.168.158.1:52484 dest: /192.168.158.4:9866 2025-07-12 19:36:41,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52484, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-926758278_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746307_5483, duration(ns): 24337000 2025-07-12 19:36:41,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746307_5483, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-12 19:36:44,778 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746307_5483 replica FinalizedReplica, blk_1073746307_5483, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746307 for deletion 2025-07-12 
19:36:44,779 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746307_5483 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746307 2025-07-12 19:38:46,603 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746309_5485 src: /192.168.158.5:59476 dest: /192.168.158.4:9866 2025-07-12 19:38:46,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59476, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1271567020_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746309_5485, duration(ns): 21963959 2025-07-12 19:38:46,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746309_5485, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-12 19:38:47,781 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746309_5485 replica FinalizedReplica, blk_1073746309_5485, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746309 for deletion 2025-07-12 19:38:47,782 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746309_5485 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746309 2025-07-12 19:44:51,611 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746315_5491 
src: /192.168.158.9:41418 dest: /192.168.158.4:9866
2025-07-12 19:44:51,628 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41418, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1650788253_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746315_5491, duration(ns): 15392657
2025-07-12 19:44:51,628 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746315_5491, type=LAST_IN_PIPELINE terminating
2025-07-12 19:44:56,782 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746315_5491 replica FinalizedReplica, blk_1073746315_5491, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746315 for deletion
2025-07-12 19:44:56,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746315_5491 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746315
2025-07-12 19:45:51,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746316_5492 src: /192.168.158.9:42928 dest: /192.168.158.4:9866
2025-07-12 19:45:51,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42928, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1596338954_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746316_5492, duration(ns): 16502428
2025-07-12 19:45:51,640 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746316_5492, type=LAST_IN_PIPELINE terminating
2025-07-12 19:45:53,781 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746316_5492 replica FinalizedReplica, blk_1073746316_5492, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746316 for deletion
2025-07-12 19:45:53,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746316_5492 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746316
2025-07-12 19:47:51,624 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746318_5494 src: /192.168.158.8:44538 dest: /192.168.158.4:9866
2025-07-12 19:47:51,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44538, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_353838866_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746318_5494, duration(ns): 20270613
2025-07-12 19:47:51,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746318_5494, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 19:47:53,782 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746318_5494 replica FinalizedReplica, blk_1073746318_5494, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746318 for deletion
2025-07-12 19:47:53,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746318_5494 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746318
2025-07-12 19:49:56,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746320_5496 src: /192.168.158.8:36650 dest: /192.168.158.4:9866
2025-07-12 19:49:56,640 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36650, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1113955622_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746320_5496, duration(ns): 15437924
2025-07-12 19:49:56,640 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746320_5496, type=LAST_IN_PIPELINE terminating
2025-07-12 19:50:02,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746320_5496 replica FinalizedReplica, blk_1073746320_5496, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746320 for deletion
2025-07-12 19:50:02,784 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746320_5496 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746320
2025-07-12 19:52:56,624 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746323_5499 src: /192.168.158.8:58774 dest: /192.168.158.4:9866
2025-07-12 19:52:56,642 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58774, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_325729646_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746323_5499, duration(ns): 15945083
2025-07-12 19:52:56,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746323_5499, type=LAST_IN_PIPELINE terminating
2025-07-12 19:52:59,793 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746323_5499 replica FinalizedReplica, blk_1073746323_5499, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746323 for deletion
2025-07-12 19:52:59,794 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746323_5499 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746323
2025-07-12 20:04:06,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746334_5510 src: /192.168.158.1:60602 dest: /192.168.158.4:9866
2025-07-12 20:04:06,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60602, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_251114464_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746334_5510, duration(ns): 21406951
2025-07-12 20:04:06,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746334_5510, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-12 20:04:08,822 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746334_5510 replica FinalizedReplica, blk_1073746334_5510, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746334 for deletion
2025-07-12 20:04:08,823 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746334_5510 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746334
2025-07-12 20:07:06,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746337_5513 src: /192.168.158.9:49314 dest: /192.168.158.4:9866
2025-07-12 20:07:06,671 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49314, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_245210251_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746337_5513, duration(ns): 16201179
2025-07-12 20:07:06,671 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746337_5513, type=LAST_IN_PIPELINE terminating
2025-07-12 20:07:11,831 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746337_5513 replica FinalizedReplica, blk_1073746337_5513, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746337 for deletion
2025-07-12 20:07:11,832 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746337_5513 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746337
2025-07-12 20:10:11,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746340_5516 src: /192.168.158.5:44324 dest: /192.168.158.4:9866
2025-07-12 20:10:11,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44324, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1586483859_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746340_5516, duration(ns): 18357012
2025-07-12 20:10:11,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746340_5516, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 20:10:14,837 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746340_5516 replica FinalizedReplica, blk_1073746340_5516, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746340 for deletion
2025-07-12 20:10:14,838 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746340_5516 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746340
2025-07-12 20:11:16,660 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746341_5517 src: /192.168.158.8:37596 dest: /192.168.158.4:9866
2025-07-12 20:11:16,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37596, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1603164428_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746341_5517, duration(ns): 16761109
2025-07-12 20:11:16,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746341_5517, type=LAST_IN_PIPELINE terminating
2025-07-12 20:11:17,839 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746341_5517 replica FinalizedReplica, blk_1073746341_5517, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746341 for deletion
2025-07-12 20:11:17,840 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746341_5517 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746341
2025-07-12 20:14:16,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746344_5520 src: /192.168.158.8:42236 dest: /192.168.158.4:9866
2025-07-12 20:14:16,688 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42236, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1589231225_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746344_5520, duration(ns): 18276488
2025-07-12 20:14:16,688 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746344_5520, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 20:14:20,845 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746344_5520 replica FinalizedReplica, blk_1073746344_5520, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746344 for deletion
2025-07-12 20:14:20,846 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746344_5520 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746344
2025-07-12 20:15:21,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746345_5521 src: /192.168.158.5:35814 dest: /192.168.158.4:9866
2025-07-12 20:15:21,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35814, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-944634082_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746345_5521, duration(ns): 21187702
2025-07-12 20:15:21,680 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746345_5521, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 20:15:26,850 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746345_5521 replica FinalizedReplica, blk_1073746345_5521, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746345 for deletion
2025-07-12 20:15:26,851 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746345_5521 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746345
2025-07-12 20:16:21,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746346_5522 src: /192.168.158.9:33838 dest: /192.168.158.4:9866
2025-07-12 20:16:21,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33838, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1861800463_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746346_5522, duration(ns): 16132211
2025-07-12 20:16:21,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746346_5522, type=LAST_IN_PIPELINE terminating
2025-07-12 20:16:23,852 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746346_5522 replica FinalizedReplica, blk_1073746346_5522, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746346 for deletion
2025-07-12 20:16:23,853 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746346_5522 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746346
2025-07-12 20:20:26,660 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746350_5526 src: /192.168.158.1:39830 dest: /192.168.158.4:9866
2025-07-12 20:20:26,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39830, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_288222270_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746350_5526, duration(ns): 23352051
2025-07-12 20:20:26,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746350_5526, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-12 20:20:32,857 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746350_5526 replica FinalizedReplica, blk_1073746350_5526, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746350 for deletion
2025-07-12 20:20:32,859 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746350_5526 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746350
2025-07-12 20:22:31,704 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746352_5528 src: /192.168.158.9:46608 dest: /192.168.158.4:9866
2025-07-12 20:22:31,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46608, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_570951488_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746352_5528, duration(ns): 19708661
2025-07-12 20:22:31,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746352_5528, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 20:22:32,860 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746352_5528 replica FinalizedReplica, blk_1073746352_5528, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746352 for deletion
2025-07-12 20:22:32,861 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746352_5528 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746352
2025-07-12 20:26:36,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746356_5532 src: /192.168.158.6:33026 dest: /192.168.158.4:9866
2025-07-12 20:26:36,678 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33026, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-448181102_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746356_5532, duration(ns): 13729464
2025-07-12 20:26:36,678 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746356_5532, type=LAST_IN_PIPELINE terminating
2025-07-12 20:26:38,871 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746356_5532 replica FinalizedReplica, blk_1073746356_5532, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746356 for deletion
2025-07-12 20:26:38,873 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746356_5532 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746356
2025-07-12 20:27:41,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746357_5533 src: /192.168.158.1:48340 dest: /192.168.158.4:9866
2025-07-12 20:27:41,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48340, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-974480671_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746357_5533, duration(ns): 20279827
2025-07-12 20:27:41,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746357_5533, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-12 20:27:44,872 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746357_5533 replica FinalizedReplica, blk_1073746357_5533, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746357 for deletion
2025-07-12 20:27:44,874 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746357_5533 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746357
2025-07-12 20:30:46,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746360_5536 src: /192.168.158.5:55864 dest: /192.168.158.4:9866
2025-07-12 20:30:46,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55864, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1004313998_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746360_5536, duration(ns): 20743759
2025-07-12 20:30:46,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746360_5536, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 20:30:50,880 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746360_5536 replica FinalizedReplica, blk_1073746360_5536, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746360 for deletion
2025-07-12 20:30:50,881 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746360_5536 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746360
2025-07-12 20:33:46,676 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746363_5539 src: /192.168.158.6:35774 dest: /192.168.158.4:9866
2025-07-12 20:33:46,699 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35774, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1370176656_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746363_5539, duration(ns): 18235903
2025-07-12 20:33:46,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746363_5539, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 20:33:47,894 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746363_5539 replica FinalizedReplica, blk_1073746363_5539, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746363 for deletion
2025-07-12 20:33:47,895 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746363_5539 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746363
2025-07-12 20:35:46,678 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746365_5541 src: /192.168.158.1:39408 dest: /192.168.158.4:9866
2025-07-12 20:35:46,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39408, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_537891288_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746365_5541, duration(ns): 21714156
2025-07-12 20:35:46,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746365_5541, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-12 20:35:50,901 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746365_5541 replica FinalizedReplica, blk_1073746365_5541, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746365 for deletion
2025-07-12 20:35:50,902 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746365_5541 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746365
2025-07-12 20:36:46,678 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746366_5542 src: /192.168.158.1:41764 dest: /192.168.158.4:9866
2025-07-12 20:36:46,710 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41764, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-151535111_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746366_5542, duration(ns): 23516339
2025-07-12 20:36:46,710 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746366_5542, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-12 20:36:47,903 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746366_5542 replica FinalizedReplica, blk_1073746366_5542, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746366 for deletion
2025-07-12 20:36:47,904 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746366_5542 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746366
2025-07-12 20:38:46,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746368_5544 src: /192.168.158.8:56026 dest: /192.168.158.4:9866
2025-07-12 20:38:46,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56026, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-326372163_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746368_5544, duration(ns): 20532925
2025-07-12 20:38:46,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746368_5544, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 20:38:47,906 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746368_5544 replica FinalizedReplica, blk_1073746368_5544, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746368 for deletion
2025-07-12 20:38:47,907 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746368_5544 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746368
2025-07-12 20:39:51,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746369_5545 src: /192.168.158.1:47406 dest: /192.168.158.4:9866
2025-07-12 20:39:51,711 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47406, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1858292167_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746369_5545, duration(ns): 22388327
2025-07-12 20:39:51,711 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746369_5545, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-12 20:39:56,908 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746369_5545 replica FinalizedReplica, blk_1073746369_5545, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746369 for deletion
2025-07-12 20:39:56,909 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746369_5545 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746369
2025-07-12 20:40:56,688 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746370_5546 src: /192.168.158.5:59440 dest: /192.168.158.4:9866
2025-07-12 20:40:56,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59440, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_441337019_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746370_5546, duration(ns): 21159406
2025-07-12 20:40:56,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746370_5546, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 20:40:59,913 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746370_5546 replica FinalizedReplica, blk_1073746370_5546, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746370 for deletion
2025-07-12 20:40:59,914 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746370_5546 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746370
2025-07-12 20:42:01,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746371_5547 src: /192.168.158.6:38534 dest: /192.168.158.4:9866
2025-07-12 20:42:01,716 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38534, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_832172927_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746371_5547, duration(ns): 19053018
2025-07-12 20:42:01,716 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746371_5547, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 20:42:05,915 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746371_5547 replica FinalizedReplica, blk_1073746371_5547, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746371 for deletion
2025-07-12 20:42:05,916 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746371_5547 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746371
2025-07-12 20:43:01,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746372_5548 src: /192.168.158.9:35130 dest: /192.168.158.4:9866
2025-07-12 20:43:01,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35130, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-57171023_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746372_5548, duration(ns): 15738832
2025-07-12 20:43:01,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746372_5548, type=LAST_IN_PIPELINE terminating
2025-07-12 20:43:05,919 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746372_5548 replica FinalizedReplica, blk_1073746372_5548, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746372 for deletion
2025-07-12 20:43:05,920 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746372_5548 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746372
2025-07-12 20:45:06,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746374_5550 src: /192.168.158.1:38082 dest: /192.168.158.4:9866
2025-07-12 20:45:06,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38082, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1994864130_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746374_5550, duration(ns): 22023846
2025-07-12 20:45:06,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746374_5550, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-12 20:45:08,922 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746374_5550 replica FinalizedReplica, blk_1073746374_5550, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746374 for deletion
2025-07-12 20:45:08,923 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746374_5550 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746374
2025-07-12 20:48:11,703 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746377_5553 src: /192.168.158.8:44868 dest: /192.168.158.4:9866
2025-07-12 20:48:11,728 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44868, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1403815588_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746377_5553, duration(ns): 19229688
2025-07-12 20:48:11,728 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746377_5553, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 20:48:17,926 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746377_5553 replica FinalizedReplica, blk_1073746377_5553, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746377 for deletion
2025-07-12 20:48:17,927 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746377_5553 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746377 2025-07-12 20:53:16,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746382_5558 src: /192.168.158.8:59280 dest: /192.168.158.4:9866 2025-07-12 20:53:16,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59280, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-706128613_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746382_5558, duration(ns): 14683819 2025-07-12 20:53:16,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746382_5558, type=LAST_IN_PIPELINE terminating 2025-07-12 20:53:20,934 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746382_5558 replica FinalizedReplica, blk_1073746382_5558, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746382 for deletion 2025-07-12 20:53:20,936 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746382_5558 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746382 2025-07-12 20:54:16,711 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746383_5559 src: /192.168.158.5:39134 dest: /192.168.158.4:9866 2025-07-12 20:54:16,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39134, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_427451242_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746383_5559, duration(ns): 21866730 2025-07-12 20:54:16,739 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746383_5559, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-12 20:54:20,939 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746383_5559 replica FinalizedReplica, blk_1073746383_5559, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746383 for deletion 2025-07-12 20:54:20,940 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746383_5559 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746383 2025-07-12 20:59:21,710 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746388_5564 src: /192.168.158.7:38734 dest: /192.168.158.4:9866 2025-07-12 20:59:21,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38734, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1222801551_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746388_5564, duration(ns): 15428546 2025-07-12 20:59:21,728 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746388_5564, type=LAST_IN_PIPELINE terminating 2025-07-12 20:59:23,950 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746388_5564 replica FinalizedReplica, blk_1073746388_5564, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746388 for deletion 2025-07-12 20:59:23,952 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746388_5564 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746388 2025-07-12 21:00:21,765 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746389_5565 src: /192.168.158.8:51280 dest: /192.168.158.4:9866 2025-07-12 21:00:21,781 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51280, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_373756278_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746389_5565, duration(ns): 13578464 2025-07-12 21:00:21,782 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746389_5565, type=LAST_IN_PIPELINE terminating 2025-07-12 21:00:23,954 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746389_5565 replica FinalizedReplica, blk_1073746389_5565, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746389 for deletion 2025-07-12 21:00:23,955 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746389_5565 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746389 2025-07-12 21:01:26,720 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746390_5566 src: /192.168.158.8:60740 dest: /192.168.158.4:9866 2025-07-12 21:01:26,746 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60740, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-410748462_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746390_5566, duration(ns): 20722755 2025-07-12 21:01:26,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746390_5566, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-12 21:01:32,954 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746390_5566 replica FinalizedReplica, blk_1073746390_5566, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746390 for deletion 2025-07-12 21:01:32,956 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746390_5566 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746390 2025-07-12 21:04:26,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746393_5569 src: 
/192.168.158.6:51070 dest: /192.168.158.4:9866 2025-07-12 21:04:26,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51070, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1493954095_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746393_5569, duration(ns): 18283656 2025-07-12 21:04:26,746 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746393_5569, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-12 21:04:29,962 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746393_5569 replica FinalizedReplica, blk_1073746393_5569, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746393 for deletion 2025-07-12 21:04:29,963 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746393_5569 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746393 2025-07-12 21:05:31,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746394_5570 src: /192.168.158.8:53612 dest: /192.168.158.4:9866 2025-07-12 21:05:31,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53612, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-149784446_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746394_5570, duration(ns): 15116108 2025-07-12 21:05:31,749 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746394_5570, type=LAST_IN_PIPELINE terminating 2025-07-12 21:05:32,964 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746394_5570 replica FinalizedReplica, blk_1073746394_5570, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746394 for deletion 2025-07-12 21:05:32,965 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746394_5570 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746394 2025-07-12 21:06:36,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746395_5571 src: /192.168.158.6:42426 dest: /192.168.158.4:9866 2025-07-12 21:06:36,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42426, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_720827233_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746395_5571, duration(ns): 18348205 2025-07-12 21:06:36,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746395_5571, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-12 21:06:41,966 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746395_5571 replica FinalizedReplica, blk_1073746395_5571, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746395 for deletion 2025-07-12 21:06:41,967 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746395_5571 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746395 2025-07-12 21:08:36,759 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746397_5573 src: /192.168.158.1:39000 dest: /192.168.158.4:9866 2025-07-12 21:08:36,793 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39000, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1653633927_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746397_5573, duration(ns): 23603213 2025-07-12 21:08:36,793 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746397_5573, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-12 21:08:38,970 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746397_5573 replica FinalizedReplica, blk_1073746397_5573, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746397 for deletion 2025-07-12 21:08:38,971 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746397_5573 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746397 2025-07-12 21:10:41,736 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746399_5575 src: /192.168.158.1:50306 dest: /192.168.158.4:9866 2025-07-12 21:10:41,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50306, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2091465039_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746399_5575, duration(ns): 24032311 2025-07-12 21:10:41,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746399_5575, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-12 21:10:44,975 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746399_5575 replica FinalizedReplica, blk_1073746399_5575, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746399 for deletion 2025-07-12 21:10:44,976 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746399_5575 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746399 2025-07-12 21:11:41,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746400_5576 src: /192.168.158.1:52292 dest: /192.168.158.4:9866 2025-07-12 21:11:41,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:52292, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2138533724_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746400_5576, duration(ns): 23321861 2025-07-12 21:11:41,762 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746400_5576, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-12 21:11:44,975 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746400_5576 replica FinalizedReplica, blk_1073746400_5576, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746400 for deletion 2025-07-12 21:11:44,976 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746400_5576 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746400 2025-07-12 21:14:41,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746403_5579 src: /192.168.158.6:41108 dest: /192.168.158.4:9866 2025-07-12 21:14:41,765 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41108, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1699182991_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746403_5579, duration(ns): 14736874 2025-07-12 21:14:41,765 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073746403_5579, type=LAST_IN_PIPELINE terminating 2025-07-12 21:14:44,984 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746403_5579 replica FinalizedReplica, blk_1073746403_5579, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746403 for deletion 2025-07-12 21:14:44,985 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746403_5579 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746403 2025-07-12 21:16:46,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746405_5581 src: /192.168.158.1:37670 dest: /192.168.158.4:9866 2025-07-12 21:16:46,775 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37670, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1970938932_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746405_5581, duration(ns): 21872670 2025-07-12 21:16:46,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746405_5581, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-12 21:16:47,993 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746405_5581 replica FinalizedReplica, blk_1073746405_5581, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746405 for deletion 2025-07-12 21:16:47,994 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746405_5581 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746405 2025-07-12 21:17:51,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746406_5582 src: /192.168.158.7:58390 dest: /192.168.158.4:9866 2025-07-12 21:17:51,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58390, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_476603465_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746406_5582, duration(ns): 15326998 2025-07-12 21:17:51,789 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746406_5582, type=LAST_IN_PIPELINE terminating 2025-07-12 21:17:56,995 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746406_5582 replica FinalizedReplica, blk_1073746406_5582, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746406 for deletion 2025-07-12 21:17:56,996 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746406_5582 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746406 2025-07-12 21:18:56,743 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746407_5583 src: /192.168.158.1:58602 dest: /192.168.158.4:9866 2025-07-12 21:18:56,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58602, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2089079568_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746407_5583, duration(ns): 23395372 2025-07-12 21:18:56,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746407_5583, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-12 21:18:59,996 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746407_5583 replica FinalizedReplica, blk_1073746407_5583, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746407 for deletion 2025-07-12 21:18:59,997 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746407_5583 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746407 2025-07-12 21:19:56,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746408_5584 src: /192.168.158.1:54884 dest: /192.168.158.4:9866 2025-07-12 21:19:56,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54884, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_442838855_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746408_5584, duration(ns): 24059448 2025-07-12 21:19:56,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746408_5584, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-12 21:19:59,999 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746408_5584 replica FinalizedReplica, blk_1073746408_5584, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746408 for deletion 2025-07-12 21:20:00,000 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746408_5584 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746408 2025-07-12 21:24:06,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746412_5588 src: /192.168.158.9:33140 dest: /192.168.158.4:9866 2025-07-12 21:24:06,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33140, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_282891120_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746412_5588, duration(ns): 21007598 2025-07-12 21:24:06,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746412_5588, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-12 21:24:12,007 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746412_5588 replica FinalizedReplica, blk_1073746412_5588, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746412 for deletion
2025-07-12 21:24:12,009 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746412_5588 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746412
2025-07-12 21:25:06,770 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746413_5589 src: /192.168.158.7:37134 dest: /192.168.158.4:9866
2025-07-12 21:25:06,789 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37134, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1648547527_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746413_5589, duration(ns): 16497836
2025-07-12 21:25:06,789 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746413_5589, type=LAST_IN_PIPELINE terminating
2025-07-12 21:25:12,010 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746413_5589 replica FinalizedReplica, blk_1073746413_5589, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746413 for deletion
2025-07-12 21:25:12,011 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746413_5589 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746413
2025-07-12 21:29:16,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746417_5593 src: /192.168.158.5:50758 dest: /192.168.158.4:9866
2025-07-12 21:29:16,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50758, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1245093072_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746417_5593, duration(ns): 19987278
2025-07-12 21:29:16,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746417_5593, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 21:29:21,020 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746417_5593 replica FinalizedReplica, blk_1073746417_5593, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746417 for deletion
2025-07-12 21:29:21,021 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746417_5593 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746417
2025-07-12 21:30:16,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746418_5594 src: /192.168.158.5:34782 dest: /192.168.158.4:9866
2025-07-12 21:30:16,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34782, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-509817779_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746418_5594, duration(ns): 18261785
2025-07-12 21:30:16,812 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746418_5594, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 21:30:21,020 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746418_5594 replica FinalizedReplica, blk_1073746418_5594, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746418 for deletion
2025-07-12 21:30:21,021 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746418_5594 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746418
2025-07-12 21:31:21,784 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746419_5595 src: /192.168.158.1:47000 dest: /192.168.158.4:9866
2025-07-12 21:31:21,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47000, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1842288376_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746419_5595, duration(ns): 24286080
2025-07-12 21:31:21,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746419_5595, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-12 21:31:24,020 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746419_5595 replica FinalizedReplica, blk_1073746419_5595, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746419 for deletion
2025-07-12 21:31:24,021 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746419_5595 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746419
2025-07-12 21:32:26,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746420_5596 src: /192.168.158.9:52586 dest: /192.168.158.4:9866
2025-07-12 21:32:26,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52586, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1740486487_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746420_5596, duration(ns): 16322669
2025-07-12 21:32:26,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746420_5596, type=LAST_IN_PIPELINE terminating
2025-07-12 21:32:33,025 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746420_5596 replica FinalizedReplica, blk_1073746420_5596, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746420 for deletion
2025-07-12 21:32:33,026 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746420_5596 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746420
2025-07-12 21:34:31,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746422_5598 src: /192.168.158.1:38762 dest: /192.168.158.4:9866
2025-07-12 21:34:31,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38762, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1285674439_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746422_5598, duration(ns): 21480505
2025-07-12 21:34:31,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746422_5598, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-12 21:34:33,027 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746422_5598 replica FinalizedReplica, blk_1073746422_5598, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746422 for deletion
2025-07-12 21:34:33,028 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746422_5598 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746422
2025-07-12 21:37:36,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746425_5601 src: /192.168.158.5:40660 dest: /192.168.158.4:9866
2025-07-12 21:37:36,807 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40660, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2055796628_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746425_5601, duration(ns): 21460625
2025-07-12 21:37:36,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746425_5601, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 21:37:39,035 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746425_5601 replica FinalizedReplica, blk_1073746425_5601, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746425 for deletion
2025-07-12 21:37:39,036 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746425_5601 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746425
2025-07-12 21:38:36,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746426_5602 src: /192.168.158.1:55704 dest: /192.168.158.4:9866
2025-07-12 21:38:36,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55704, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-6535685_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746426_5602, duration(ns): 24120501
2025-07-12 21:38:36,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746426_5602, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-12 21:38:39,036 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746426_5602 replica FinalizedReplica, blk_1073746426_5602, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746426 for deletion
2025-07-12 21:38:39,038 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746426_5602 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746426
2025-07-12 21:39:36,773 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746427_5603 src: /192.168.158.1:54524 dest: /192.168.158.4:9866
2025-07-12 21:39:36,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54524, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2127036877_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746427_5603, duration(ns): 21984663
2025-07-12 21:39:36,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746427_5603, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-12 21:39:42,036 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746427_5603 replica FinalizedReplica, blk_1073746427_5603, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746427 for deletion
2025-07-12 21:39:42,038 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746427_5603 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746427
2025-07-12 21:42:46,793 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746430_5606 src: /192.168.158.6:38242 dest: /192.168.158.4:9866
2025-07-12 21:42:46,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38242, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-114520072_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746430_5606, duration(ns): 18566156
2025-07-12 21:42:46,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746430_5606, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 21:42:48,041 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746430_5606 replica FinalizedReplica, blk_1073746430_5606, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746430 for deletion
2025-07-12 21:42:48,042 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746430_5606 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746430
2025-07-12 21:43:51,780 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746431_5607 src: /192.168.158.9:36452 dest: /192.168.158.4:9866
2025-07-12 21:43:51,805 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36452, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1193342690_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746431_5607, duration(ns): 19522260
2025-07-12 21:43:51,805 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746431_5607, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 21:43:54,046 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746431_5607 replica FinalizedReplica, blk_1073746431_5607, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746431 for deletion
2025-07-12 21:43:54,047 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746431_5607 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073746431
2025-07-12 21:44:56,795 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746432_5608 src: /192.168.158.9:50868 dest: /192.168.158.4:9866
2025-07-12 21:44:56,821 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50868, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1532771711_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746432_5608, duration(ns): 19950547
2025-07-12 21:44:56,821 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746432_5608, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 21:45:03,048 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746432_5608 replica FinalizedReplica, blk_1073746432_5608, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746432 for deletion
2025-07-12 21:45:03,049 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746432_5608 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746432
2025-07-12 21:47:56,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746435_5611 src: /192.168.158.9:35012 dest: /192.168.158.4:9866
2025-07-12 21:47:56,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35012, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1244293730_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746435_5611, duration(ns): 17750907
2025-07-12 21:47:56,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746435_5611, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 21:48:03,052 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746435_5611 replica FinalizedReplica, blk_1073746435_5611, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746435 for deletion
2025-07-12 21:48:03,053 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746435_5611 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746435
2025-07-12 21:54:01,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746441_5617 src: /192.168.158.5:57224 dest: /192.168.158.4:9866
2025-07-12 21:54:01,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57224, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-950908999_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746441_5617, duration(ns): 17917469
2025-07-12 21:54:01,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746441_5617, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 21:54:06,066 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746441_5617 replica FinalizedReplica, blk_1073746441_5617, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746441 for deletion
2025-07-12 21:54:06,067 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746441_5617 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746441
2025-07-12 21:56:06,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746443_5619 src: /192.168.158.6:44984 dest: /192.168.158.4:9866
2025-07-12 21:56:06,820 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44984, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1839033430_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746443_5619, duration(ns): 15463361
2025-07-12 21:56:06,820 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746443_5619, type=LAST_IN_PIPELINE terminating
2025-07-12 21:56:09,069 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746443_5619 replica FinalizedReplica, blk_1073746443_5619, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746443 for deletion
2025-07-12 21:56:09,070 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746443_5619 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746443
2025-07-12 22:03:21,820 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746450_5626 src: /192.168.158.5:57610 dest: /192.168.158.4:9866
2025-07-12 22:03:21,848 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57610, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2082869155_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746450_5626, duration(ns): 22512369
2025-07-12 22:03:21,848 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746450_5626, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 22:03:24,083 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746450_5626 replica FinalizedReplica, blk_1073746450_5626, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746450 for deletion
2025-07-12 22:03:24,084 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746450_5626 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746450
2025-07-12 22:04:26,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746451_5627 src: /192.168.158.1:40536 dest: /192.168.158.4:9866
2025-07-12 22:04:26,837 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40536, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1805096737_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746451_5627, duration(ns): 22339485
2025-07-12 22:04:26,837 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746451_5627, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-12 22:04:30,084 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746451_5627 replica FinalizedReplica, blk_1073746451_5627, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746451 for deletion
2025-07-12 22:04:30,086 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746451_5627 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746451
2025-07-12 22:07:31,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746454_5630 src: /192.168.158.5:49636 dest: /192.168.158.4:9866
2025-07-12 22:07:31,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49636, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-884697087_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746454_5630, duration(ns): 19458427
2025-07-12 22:07:31,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746454_5630, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 22:07:33,092 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746454_5630 replica FinalizedReplica, blk_1073746454_5630, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746454 for deletion
2025-07-12 22:07:33,093 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746454_5630 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746454
2025-07-12 22:10:36,841 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746457_5633 src: /192.168.158.1:56122 dest: /192.168.158.4:9866
2025-07-12 22:10:36,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56122, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-706245201_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746457_5633, duration(ns): 21099266
2025-07-12 22:10:36,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746457_5633, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-12 22:10:39,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746457_5633 replica FinalizedReplica, blk_1073746457_5633, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746457 for deletion
2025-07-12 22:10:39,103 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746457_5633 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746457
2025-07-12 22:13:36,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746460_5636 src: /192.168.158.8:46504 dest: /192.168.158.4:9866
2025-07-12 22:13:36,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46504, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1012982286_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746460_5636, duration(ns): 14985416
2025-07-12 22:13:36,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746460_5636, type=LAST_IN_PIPELINE terminating
2025-07-12 22:13:39,111 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746460_5636 replica FinalizedReplica, blk_1073746460_5636, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746460 for deletion
2025-07-12 22:13:39,112 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746460_5636 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746460
2025-07-12 22:15:36,820 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746462_5638 src: /192.168.158.1:35330 dest: /192.168.158.4:9866
2025-07-12 22:15:36,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35330, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1127682305_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746462_5638, duration(ns): 24464534
2025-07-12 22:15:36,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746462_5638, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-12 22:15:39,117 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746462_5638 replica FinalizedReplica, blk_1073746462_5638, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746462 for deletion
2025-07-12 22:15:39,119 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746462_5638 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746462
2025-07-12 22:17:41,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746464_5640 src: /192.168.158.6:48110 dest: /192.168.158.4:9866
2025-07-12 22:17:41,848 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48110, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-54137198_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746464_5640, duration(ns): 15223067
2025-07-12 22:17:41,848 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746464_5640, type=LAST_IN_PIPELINE terminating
2025-07-12 22:17:48,122 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746464_5640 replica FinalizedReplica, blk_1073746464_5640, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746464 for deletion
2025-07-12 22:17:48,123 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746464_5640 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746464
2025-07-12 22:18:41,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746465_5641 src: /192.168.158.5:44922 dest: /192.168.158.4:9866
2025-07-12 22:18:41,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44922, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2022960534_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746465_5641, duration(ns): 19149835
2025-07-12 22:18:41,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746465_5641, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 22:18:45,121 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746465_5641 replica FinalizedReplica, blk_1073746465_5641, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746465 for deletion
2025-07-12 22:18:45,122 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746465_5641 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746465
2025-07-12 22:21:46,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746468_5644 src: /192.168.158.1:51186 dest: /192.168.158.4:9866
2025-07-12 22:21:46,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51186, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2008204903_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746468_5644, duration(ns): 22319175
2025-07-12 22:21:46,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746468_5644, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-12 22:21:51,135 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746468_5644 replica FinalizedReplica, blk_1073746468_5644, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746468 for deletion
2025-07-12 22:21:51,136 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746468_5644 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746468
2025-07-12 22:23:46,847 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746470_5646 src: /192.168.158.1:52442 dest: /192.168.158.4:9866
2025-07-12 22:23:46,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52442, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1913541011_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746470_5646, duration(ns): 25600988
2025-07-12 22:23:46,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746470_5646, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-12 22:23:48,138 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746470_5646 replica FinalizedReplica, blk_1073746470_5646, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746470 for deletion
2025-07-12 22:23:48,139 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746470_5646 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746470
2025-07-12 22:24:46,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746471_5647 src: /192.168.158.1:34642 dest: /192.168.158.4:9866
2025-07-12 22:24:46,881 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34642, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_562544260_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746471_5647, duration(ns): 21798138
2025-07-12 22:24:46,881 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746471_5647, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-12 22:24:51,143 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746471_5647 replica FinalizedReplica, blk_1073746471_5647, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746471 for deletion
2025-07-12 22:24:51,144 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746471_5647 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746471
2025-07-12 22:25:51,856 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746472_5648 src: /192.168.158.7:34568 dest: /192.168.158.4:9866
2025-07-12 22:25:51,881 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src:
/192.168.158.7:34568, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1782734077_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746472_5648, duration(ns): 19805610 2025-07-12 22:25:51,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746472_5648, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-12 22:25:54,145 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746472_5648 replica FinalizedReplica, blk_1073746472_5648, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746472 for deletion 2025-07-12 22:25:54,146 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746472_5648 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746472 2025-07-12 22:27:56,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746474_5650 src: /192.168.158.9:45730 dest: /192.168.158.4:9866 2025-07-12 22:27:56,872 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45730, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-41052100_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746474_5650, duration(ns): 16203578 2025-07-12 22:27:56,872 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746474_5650, 
type=LAST_IN_PIPELINE terminating 2025-07-12 22:28:00,180 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746474_5650 replica FinalizedReplica, blk_1073746474_5650, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746474 for deletion 2025-07-12 22:28:00,181 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746474_5650 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746474 2025-07-12 22:28:56,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746475_5651 src: /192.168.158.1:53130 dest: /192.168.158.4:9866 2025-07-12 22:28:56,876 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53130, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1187865365_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746475_5651, duration(ns): 22049139 2025-07-12 22:28:56,876 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746475_5651, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-12 22:29:00,149 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746475_5651 replica FinalizedReplica, blk_1073746475_5651, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746475 for deletion 2025-07-12 22:29:00,150 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746475_5651 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746475 2025-07-12 22:30:56,851 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746477_5653 src: /192.168.158.1:44480 dest: /192.168.158.4:9866 2025-07-12 22:30:56,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44480, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1439721119_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746477_5653, duration(ns): 22189431 2025-07-12 22:30:56,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746477_5653, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-12 22:31:03,155 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746477_5653 replica FinalizedReplica, blk_1073746477_5653, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746477 for deletion 2025-07-12 22:31:03,156 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746477_5653 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746477 2025-07-12 22:31:56,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746478_5654 src: /192.168.158.1:33002 dest: /192.168.158.4:9866 2025-07-12 22:31:56,907 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33002, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1722313000_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746478_5654, duration(ns): 23950981 2025-07-12 22:31:56,908 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746478_5654, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-12 22:32:00,155 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746478_5654 replica FinalizedReplica, blk_1073746478_5654, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746478 for deletion 2025-07-12 22:32:00,156 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746478_5654 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746478 2025-07-12 22:33:56,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746480_5656 src: /192.168.158.6:58384 dest: /192.168.158.4:9866 2025-07-12 22:33:56,893 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.6:58384, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1333187448_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746480_5656, duration(ns): 16682436 2025-07-12 22:33:56,893 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746480_5656, type=LAST_IN_PIPELINE terminating 2025-07-12 22:34:03,158 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746480_5656 replica FinalizedReplica, blk_1073746480_5656, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746480 for deletion 2025-07-12 22:34:03,159 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746480_5656 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746480 2025-07-12 22:34:56,870 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746481_5657 src: /192.168.158.7:45360 dest: /192.168.158.4:9866 2025-07-12 22:34:56,888 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45360, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1507944480_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746481_5657, duration(ns): 15247131 2025-07-12 22:34:56,888 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746481_5657, type=LAST_IN_PIPELINE terminating 2025-07-12 22:35:00,157 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746481_5657 replica FinalizedReplica, blk_1073746481_5657, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746481 for deletion 2025-07-12 22:35:00,159 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746481_5657 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746481 2025-07-12 22:37:56,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746484_5660 src: /192.168.158.9:41066 dest: /192.168.158.4:9866 2025-07-12 22:37:56,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41066, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_155527136_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746484_5660, duration(ns): 20681140 2025-07-12 22:37:56,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746484_5660, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-12 22:38:00,162 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746484_5660 replica FinalizedReplica, blk_1073746484_5660, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746484 for deletion 2025-07-12 22:38:00,163 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746484_5660 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746484 2025-07-12 22:40:01,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746486_5662 src: /192.168.158.1:40308 dest: /192.168.158.4:9866 2025-07-12 22:40:01,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40308, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_79407840_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746486_5662, duration(ns): 24130697 2025-07-12 22:40:01,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746486_5662, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-12 22:40:06,166 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746486_5662 replica FinalizedReplica, blk_1073746486_5662, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746486 for deletion 2025-07-12 22:40:06,167 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746486_5662 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746486 2025-07-12 22:42:11,935 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073746488_5664 src: /192.168.158.1:43552 dest: /192.168.158.4:9866 2025-07-12 22:42:11,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43552, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_616870180_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746488_5664, duration(ns): 23369623 2025-07-12 22:42:11,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746488_5664, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-12 22:42:15,172 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746488_5664 replica FinalizedReplica, blk_1073746488_5664, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746488 for deletion 2025-07-12 22:42:15,174 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746488_5664 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746488 2025-07-12 22:43:11,913 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746489_5665 src: /192.168.158.9:33594 dest: /192.168.158.4:9866 2025-07-12 22:43:11,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33594, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1646096078_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073746489_5665, duration(ns): 20013557 2025-07-12 22:43:11,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746489_5665, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-12 22:43:15,175 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746489_5665 replica FinalizedReplica, blk_1073746489_5665, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746489 for deletion 2025-07-12 22:43:15,176 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746489_5665 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746489 2025-07-12 22:45:11,889 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746491_5667 src: /192.168.158.9:57100 dest: /192.168.158.4:9866 2025-07-12 22:45:11,913 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57100, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1209828144_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746491_5667, duration(ns): 18968145 2025-07-12 22:45:11,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746491_5667, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-12 22:45:18,178 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling 
blk_1073746491_5667 replica FinalizedReplica, blk_1073746491_5667, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746491 for deletion 2025-07-12 22:45:18,179 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746491_5667 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746491 2025-07-12 22:46:11,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746492_5668 src: /192.168.158.1:51862 dest: /192.168.158.4:9866 2025-07-12 22:46:11,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51862, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1519489692_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746492_5668, duration(ns): 20877501 2025-07-12 22:46:11,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746492_5668, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-12 22:46:15,181 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746492_5668 replica FinalizedReplica, blk_1073746492_5668, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746492 for deletion 2025-07-12 22:46:15,182 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746492_5668 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746492 2025-07-12 22:47:11,894 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746493_5669 src: /192.168.158.8:50086 dest: /192.168.158.4:9866 2025-07-12 22:47:11,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50086, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1110257289_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746493_5669, duration(ns): 15456623 2025-07-12 22:47:11,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746493_5669, type=LAST_IN_PIPELINE terminating 2025-07-12 22:47:15,183 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746493_5669 replica FinalizedReplica, blk_1073746493_5669, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746493 for deletion 2025-07-12 22:47:15,184 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746493_5669 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746493 2025-07-12 22:49:11,894 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746495_5671 src: /192.168.158.7:50892 dest: /192.168.158.4:9866 2025-07-12 
22:49:11,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50892, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1931160289_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746495_5671, duration(ns): 18846340 2025-07-12 22:49:11,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746495_5671, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-12 22:49:18,183 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746495_5671 replica FinalizedReplica, blk_1073746495_5671, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746495 for deletion 2025-07-12 22:49:18,185 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746495_5671 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746495 2025-07-12 22:51:11,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746497_5673 src: /192.168.158.9:40094 dest: /192.168.158.4:9866 2025-07-12 22:51:11,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40094, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1852833993_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746497_5673, duration(ns): 15312509 2025-07-12 22:51:11,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746497_5673, type=LAST_IN_PIPELINE terminating 2025-07-12 22:51:15,187 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746497_5673 replica FinalizedReplica, blk_1073746497_5673, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746497 for deletion 2025-07-12 22:51:15,189 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746497_5673 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746497 2025-07-12 22:52:11,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746498_5674 src: /192.168.158.5:53388 dest: /192.168.158.4:9866 2025-07-12 22:52:11,920 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53388, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1838636728_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746498_5674, duration(ns): 16328531 2025-07-12 22:52:11,920 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746498_5674, type=LAST_IN_PIPELINE terminating 2025-07-12 22:52:15,192 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746498_5674 replica FinalizedReplica, blk_1073746498_5674, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746498 for deletion 2025-07-12 22:52:15,194 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746498_5674 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746498 2025-07-12 22:53:16,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746499_5675 src: /192.168.158.1:49246 dest: /192.168.158.4:9866 2025-07-12 22:53:16,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49246, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_526274264_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746499_5675, duration(ns): 21520481 2025-07-12 22:53:16,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746499_5675, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-12 22:53:21,193 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746499_5675 replica FinalizedReplica, blk_1073746499_5675, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746499 for deletion 2025-07-12 22:53:21,194 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746499_5675 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746499
2025-07-12 22:54:16,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746500_5676 src: /192.168.158.7:44218 dest: /192.168.158.4:9866
2025-07-12 22:54:16,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44218, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_645665801_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746500_5676, duration(ns): 20118372
2025-07-12 22:54:16,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746500_5676, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 22:54:18,194 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746500_5676 replica FinalizedReplica, blk_1073746500_5676, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746500 for deletion
2025-07-12 22:54:18,195 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746500_5676 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746500
2025-07-12 22:56:26,915 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746502_5678 src: /192.168.158.1:59116 dest: /192.168.158.4:9866
2025-07-12 22:56:26,949 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59116, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1401092626_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746502_5678, duration(ns): 24717397
2025-07-12 22:56:26,949 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746502_5678, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-12 22:56:33,198 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746502_5678 replica FinalizedReplica, blk_1073746502_5678, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746502 for deletion
2025-07-12 22:56:33,199 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746502_5678 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746502
2025-07-12 22:57:26,924 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746503_5679 src: /192.168.158.5:42284 dest: /192.168.158.4:9866
2025-07-12 22:57:26,942 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42284, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-719167949_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746503_5679, duration(ns): 15322346
2025-07-12 22:57:26,942 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746503_5679, type=LAST_IN_PIPELINE terminating
2025-07-12 22:57:30,200 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746503_5679 replica FinalizedReplica, blk_1073746503_5679, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746503 for deletion
2025-07-12 22:57:30,201 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746503_5679 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746503
2025-07-12 22:59:31,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746505_5681 src: /192.168.158.7:49854 dest: /192.168.158.4:9866
2025-07-12 22:59:31,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49854, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1241463730_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746505_5681, duration(ns): 20210568
2025-07-12 22:59:31,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746505_5681, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 22:59:36,203 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746505_5681 replica FinalizedReplica, blk_1073746505_5681, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746505 for deletion
2025-07-12 22:59:36,204 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746505_5681 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746505
2025-07-12 23:00:31,906 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746506_5682 src: /192.168.158.1:37346 dest: /192.168.158.4:9866
2025-07-12 23:00:31,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37346, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1295755860_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746506_5682, duration(ns): 20842319
2025-07-12 23:00:31,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746506_5682, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-12 23:00:33,207 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746506_5682 replica FinalizedReplica, blk_1073746506_5682, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746506 for deletion
2025-07-12 23:00:33,208 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746506_5682 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746506
2025-07-12 23:03:36,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746509_5685 src: /192.168.158.9:59016 dest: /192.168.158.4:9866
2025-07-12 23:03:36,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59016, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-233238386_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746509_5685, duration(ns): 17376058
2025-07-12 23:03:36,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746509_5685, type=LAST_IN_PIPELINE terminating
2025-07-12 23:03:39,214 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746509_5685 replica FinalizedReplica, blk_1073746509_5685, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746509 for deletion
2025-07-12 23:03:39,215 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746509_5685 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746509
2025-07-12 23:04:36,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746510_5686 src: /192.168.158.8:39948 dest: /192.168.158.4:9866
2025-07-12 23:04:36,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39948, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1440037266_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746510_5686, duration(ns): 21176907
2025-07-12 23:04:36,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746510_5686, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 23:04:39,215 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746510_5686 replica FinalizedReplica, blk_1073746510_5686, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746510 for deletion
2025-07-12 23:04:39,217 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746510_5686 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746510
2025-07-12 23:06:41,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746512_5688 src: /192.168.158.8:54282 dest: /192.168.158.4:9866
2025-07-12 23:06:41,950 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54282, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-964416462_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746512_5688, duration(ns): 18210325
2025-07-12 23:06:41,950 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746512_5688, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 23:06:48,221 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746512_5688 replica FinalizedReplica, blk_1073746512_5688, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746512 for deletion
2025-07-12 23:06:48,222 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746512_5688 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746512
2025-07-12 23:07:41,920 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746513_5689 src: /192.168.158.5:34702 dest: /192.168.158.4:9866
2025-07-12 23:07:41,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34702, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1039555414_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746513_5689, duration(ns): 19293452
2025-07-12 23:07:41,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746513_5689, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 23:07:45,222 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746513_5689 replica FinalizedReplica, blk_1073746513_5689, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746513 for deletion
2025-07-12 23:07:45,224 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746513_5689 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746513
2025-07-12 23:10:46,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746516_5692 src: /192.168.158.7:53056 dest: /192.168.158.4:9866
2025-07-12 23:10:46,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53056, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1433867499_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746516_5692, duration(ns): 15305986
2025-07-12 23:10:46,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746516_5692, type=LAST_IN_PIPELINE terminating
2025-07-12 23:10:48,230 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746516_5692 replica FinalizedReplica, blk_1073746516_5692, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746516 for deletion
2025-07-12 23:10:48,231 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746516_5692 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746516
2025-07-12 23:11:51,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746517_5693 src: /192.168.158.9:52100 dest: /192.168.158.4:9866
2025-07-12 23:11:51,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52100, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-655395472_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746517_5693, duration(ns): 15004950
2025-07-12 23:11:51,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746517_5693, type=LAST_IN_PIPELINE terminating
2025-07-12 23:11:54,231 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746517_5693 replica FinalizedReplica, blk_1073746517_5693, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746517 for deletion
2025-07-12 23:11:54,233 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746517_5693 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746517
2025-07-12 23:19:56,952 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746525_5701 src: /192.168.158.6:53484 dest: /192.168.158.4:9866
2025-07-12 23:19:56,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53484, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-370600242_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746525_5701, duration(ns): 18506487
2025-07-12 23:19:56,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746525_5701, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 23:20:00,252 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746525_5701 replica FinalizedReplica, blk_1073746525_5701, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746525 for deletion
2025-07-12 23:20:00,253 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746525_5701 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746525
2025-07-12 23:21:56,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746527_5703 src: /192.168.158.7:48732 dest: /192.168.158.4:9866
2025-07-12 23:21:56,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48732, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1408437716_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746527_5703, duration(ns): 18557100
2025-07-12 23:21:56,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746527_5703, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 23:22:00,259 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746527_5703 replica FinalizedReplica, blk_1073746527_5703, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746527 for deletion
2025-07-12 23:22:00,260 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746527_5703 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746527
2025-07-12 23:22:56,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746528_5704 src: /192.168.158.7:38286 dest: /192.168.158.4:9866
2025-07-12 23:22:56,979 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38286, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_891726939_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746528_5704, duration(ns): 20158768
2025-07-12 23:22:56,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746528_5704, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 23:23:00,264 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746528_5704 replica FinalizedReplica, blk_1073746528_5704, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746528 for deletion
2025-07-12 23:23:00,265 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746528_5704 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746528
2025-07-12 23:24:56,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746530_5706 src: /192.168.158.1:57522 dest: /192.168.158.4:9866
2025-07-12 23:24:56,990 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57522, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1392492473_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746530_5706, duration(ns): 23472439
2025-07-12 23:24:56,990 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746530_5706, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-12 23:25:00,264 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746530_5706 replica FinalizedReplica, blk_1073746530_5706, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746530 for deletion
2025-07-12 23:25:00,265 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746530_5706 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746530
2025-07-12 23:26:56,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746532_5708 src: /192.168.158.8:40802 dest: /192.168.158.4:9866
2025-07-12 23:26:56,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40802, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1430769954_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746532_5708, duration(ns): 15906265
2025-07-12 23:26:56,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746532_5708, type=LAST_IN_PIPELINE terminating
2025-07-12 23:27:00,270 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746532_5708 replica FinalizedReplica, blk_1073746532_5708, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746532 for deletion
2025-07-12 23:27:00,272 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746532_5708 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746532
2025-07-12 23:31:01,979 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746536_5712 src: /192.168.158.6:41822 dest: /192.168.158.4:9866
2025-07-12 23:31:01,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41822, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1718661468_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746536_5712, duration(ns): 15949704
2025-07-12 23:31:01,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746536_5712, type=LAST_IN_PIPELINE terminating
2025-07-12 23:31:03,279 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746536_5712 replica FinalizedReplica, blk_1073746536_5712, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746536 for deletion
2025-07-12 23:31:03,280 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746536_5712 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746536
2025-07-12 23:34:06,988 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746539_5715 src: /192.168.158.9:53492 dest: /192.168.158.4:9866
2025-07-12 23:34:07,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53492, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-322261172_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746539_5715, duration(ns): 15697992
2025-07-12 23:34:07,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746539_5715, type=LAST_IN_PIPELINE terminating
2025-07-12 23:34:12,279 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746539_5715 replica FinalizedReplica, blk_1073746539_5715, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746539 for deletion
2025-07-12 23:34:12,281 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746539_5715 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746539
2025-07-12 23:35:11,981 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746540_5716 src: /192.168.158.6:35296 dest: /192.168.158.4:9866
2025-07-12 23:35:11,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35296, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_340132742_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746540_5716, duration(ns): 15446803
2025-07-12 23:35:11,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746540_5716, type=LAST_IN_PIPELINE terminating
2025-07-12 23:35:18,284 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746540_5716 replica FinalizedReplica, blk_1073746540_5716, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746540 for deletion
2025-07-12 23:35:18,285 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746540_5716 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746540
2025-07-12 23:36:11,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746541_5717 src: /192.168.158.9:45338 dest: /192.168.158.4:9866
2025-07-12 23:36:12,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45338, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_627988541_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746541_5717, duration(ns): 18532580
2025-07-12 23:36:12,008 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746541_5717, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 23:36:13,268 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 9, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-12 23:36:15,288 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746541_5717 replica FinalizedReplica, blk_1073746541_5717, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746541 for deletion
2025-07-12 23:36:15,289 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746541_5717 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746541
2025-07-12 23:37:21,293 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f32, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 1 msec to generate and 3 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-12 23:37:21,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-12 23:38:11,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746543_5719 src: /192.168.158.5:50664 dest: /192.168.158.4:9866
2025-07-12 23:38:12,004 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50664, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1477124750_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746543_5719, duration(ns): 17760953
2025-07-12 23:38:12,005 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746543_5719, type=LAST_IN_PIPELINE terminating
2025-07-12 23:38:18,291 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746543_5719 replica FinalizedReplica, blk_1073746543_5719, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746543 for deletion
2025-07-12 23:38:18,292 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746543_5719 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746543
2025-07-12 23:39:16,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746544_5720 src: /192.168.158.8:58558 dest: /192.168.158.4:9866
2025-07-12 23:39:17,008 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58558, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1599441426_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746544_5720, duration(ns): 17892908
2025-07-12 23:39:17,008 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746544_5720, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 23:39:18,291 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746544_5720 replica FinalizedReplica, blk_1073746544_5720, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746544 for deletion
2025-07-12 23:39:18,292 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746544_5720 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746544
2025-07-12 23:41:16,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746546_5722 src: /192.168.158.7:52334 dest: /192.168.158.4:9866
2025-07-12 23:41:17,013 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52334, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-12456248_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746546_5722, duration(ns): 20000653
2025-07-12 23:41:17,013 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746546_5722, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-12 23:41:18,292 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746546_5722 replica FinalizedReplica, blk_1073746546_5722, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746546 for deletion
2025-07-12 23:41:18,293 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746546_5722 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746546
2025-07-12 23:42:21,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746547_5723 src: /192.168.158.1:49404 dest: /192.168.158.4:9866
2025-07-12 23:42:22,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49404, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-973398149_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746547_5723, duration(ns): 21805032
2025-07-12 23:42:22,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746547_5723, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-12 23:42:24,295 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746547_5723 replica FinalizedReplica, blk_1073746547_5723, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746547 for deletion
2025-07-12 23:42:24,296 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746547_5723 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746547
2025-07-12 23:44:26,993 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746549_5725 src: /192.168.158.6:49400 dest: /192.168.158.4:9866
2025-07-12 23:44:27,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49400, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2147109879_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746549_5725, duration(ns): 17295687
2025-07-12 23:44:27,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746549_5725, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-12 23:44:30,302 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746549_5725 replica FinalizedReplica, blk_1073746549_5725, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746549 for deletion
2025-07-12 23:44:30,303 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746549_5725 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746549
2025-07-12 23:46:26,993 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746551_5727 src: /192.168.158.6:49738 dest: /192.168.158.4:9866
2025-07-12 23:46:27,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49738, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-963849500_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746551_5727, duration(ns): 17609239
2025-07-12 23:46:27,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746551_5727, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-12 23:46:27,304 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746551_5727 replica FinalizedReplica, blk_1073746551_5727, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746551 for deletion
2025-07-12 23:46:27,305 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746551_5727 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746551
2025-07-12 23:47:31,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746552_5728 src: /192.168.158.6:43710 dest: /192.168.158.4:9866
2025-07-12 23:47:32,021 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43710, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1971854178_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746552_5728, duration(ns): 20452243
2025-07-12 23:47:32,021 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746552_5728, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-12 23:47:33,310 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746552_5728 replica FinalizedReplica, blk_1073746552_5728, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746552 for deletion
2025-07-12 23:47:33,311 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746552_5728 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746552
2025-07-12 23:49:36,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746554_5730 src: /192.168.158.1:53070 dest: /192.168.158.4:9866
2025-07-12 23:49:37,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53070, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-278152947_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746554_5730, duration(ns): 21371499
2025-07-12 23:49:37,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746554_5730, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-12 23:49:39,313 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746554_5730 replica FinalizedReplica, blk_1073746554_5730, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746554 for deletion
2025-07-12 23:49:39,314 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746554_5730 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746554
2025-07-12 23:50:37,000 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746555_5731 src: /192.168.158.6:40180 dest: /192.168.158.4:9866
2025-07-12 23:50:37,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40180, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1279735002_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746555_5731, duration(ns): 19768409
2025-07-12 23:50:37,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746555_5731, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 23:50:42,313 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746555_5731 replica FinalizedReplica, blk_1073746555_5731, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746555 for deletion
2025-07-12 23:50:42,315 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746555_5731 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746555
2025-07-12 23:52:37,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746557_5733 src: /192.168.158.9:42366 dest: /192.168.158.4:9866
2025-07-12 23:52:37,022 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42366, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1815969561_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746557_5733, duration(ns): 14209094
2025-07-12 23:52:37,022 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746557_5733, type=LAST_IN_PIPELINE terminating
2025-07-12 23:52:39,314 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746557_5733 replica FinalizedReplica, blk_1073746557_5733, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746557 for deletion
2025-07-12 23:52:39,315 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746557_5733 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746557
2025-07-12 23:54:36,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746559_5735 src: /192.168.158.1:35264 dest: /192.168.158.4:9866
2025-07-12 23:54:37,029 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35264, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-783079092_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746559_5735, duration(ns): 21070151
2025-07-12 23:54:37,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746559_5735, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-12 23:54:39,316 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746559_5735 replica FinalizedReplica, blk_1073746559_5735, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746559 for deletion
2025-07-12 23:54:39,317 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746559_5735 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746559
2025-07-12 23:58:42,010 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746563_5739 src: /192.168.158.6:48622 dest: /192.168.158.4:9866
2025-07-12 23:58:42,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48622, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_596194099_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746563_5739, duration(ns): 17154120
2025-07-12 23:58:42,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746563_5739, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-12 23:58:42,322 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746563_5739 replica FinalizedReplica, blk_1073746563_5739, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746563 for deletion
2025-07-12 23:58:42,323 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746563_5739 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746563
2025-07-12 23:59:47,008 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746564_5740 src: /192.168.158.1:43996 dest: /192.168.158.4:9866
2025-07-12 23:59:47,038 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43996, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-21531270_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746564_5740, duration(ns): 21882871
2025-07-12 23:59:47,039 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746564_5740, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-12 23:59:51,325 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746564_5740 replica FinalizedReplica, blk_1073746564_5740, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746564 for deletion
2025-07-12 23:59:51,327 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746564_5740 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746564
2025-07-13 00:00:47,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746565_5741 src: /192.168.158.1:32952 dest: /192.168.158.4:9866
2025-07-13 00:00:47,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:32952, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_563132062_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746565_5741, duration(ns): 22698305
2025-07-13 00:00:47,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746565_5741, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-13 00:00:51,328 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746565_5741 replica FinalizedReplica, blk_1073746565_5741, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746565 for deletion
2025-07-13 00:00:51,329 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746565_5741 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746565
2025-07-13 00:01:47,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746566_5742 src: /192.168.158.6:41928 dest: /192.168.158.4:9866
2025-07-13 00:01:47,043 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41928, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1087012819_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746566_5742, duration(ns): 17003984
2025-07-13 00:01:47,043 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746566_5742, type=LAST_IN_PIPELINE terminating
2025-07-13 00:01:51,331 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746566_5742 replica FinalizedReplica, blk_1073746566_5742, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746566 for deletion
2025-07-13 00:01:51,332 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746566_5742 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746566
2025-07-13 00:02:52,014 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746567_5743 src: /192.168.158.1:35768 dest: /192.168.158.4:9866
2025-07-13 00:02:52,045 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35768, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-810151592_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746567_5743, duration(ns): 22295759
2025-07-13 00:02:52,045 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746567_5743, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-13 00:02:57,336 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746567_5743 replica FinalizedReplica, blk_1073746567_5743, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746567 for deletion
2025-07-13 00:02:57,337 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746567_5743 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746567
2025-07-13 00:05:57,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746570_5746 src: /192.168.158.5:41354 dest: /192.168.158.4:9866
2025-07-13 00:05:57,053 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41354, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_145581449_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746570_5746, duration(ns): 17466243
2025-07-13 00:05:57,053 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746570_5746, type=LAST_IN_PIPELINE terminating
2025-07-13 00:05:57,340 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746570_5746 replica FinalizedReplica, blk_1073746570_5746, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746570 for deletion
2025-07-13 00:05:57,341 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746570_5746 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746570
2025-07-13 00:08:02,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746572_5748 src: /192.168.158.1:37098 dest: /192.168.158.4:9866
2025-07-13 00:08:02,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37098, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1203775377_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746572_5748, duration(ns): 22776995
2025-07-13 00:08:02,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746572_5748, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-13 00:08:03,342 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746572_5748 replica FinalizedReplica, blk_1073746572_5748, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746572 for deletion
2025-07-13 00:08:03,343 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746572_5748 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746572
2025-07-13 00:09:02,027 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746573_5749 src: /192.168.158.8:55136 dest: /192.168.158.4:9866
2025-07-13 00:09:02,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-165352290_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746573_5749, duration(ns): 19282872
2025-07-13 00:09:02,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746573_5749, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 00:09:06,343 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746573_5749 replica FinalizedReplica, blk_1073746573_5749, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746573 for deletion
2025-07-13 00:09:06,344 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746573_5749 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746573
2025-07-13 00:10:07,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746574_5750 src: /192.168.158.5:60344 dest: /192.168.158.4:9866
2025-07-13 00:10:07,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60344, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1273549312_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746574_5750, duration(ns): 14020645
2025-07-13 00:10:07,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746574_5750, type=LAST_IN_PIPELINE terminating
2025-07-13 00:10:12,346 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746574_5750 replica FinalizedReplica, blk_1073746574_5750, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746574 for deletion
2025-07-13 00:10:12,348 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746574_5750 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746574
2025-07-13 00:12:12,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746576_5752 src: /192.168.158.1:34410 dest: /192.168.158.4:9866
2025-07-13 00:12:12,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34410, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_796501480_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746576_5752, duration(ns): 24564011
2025-07-13 00:12:12,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746576_5752, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-13 00:12:12,351 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746576_5752 replica FinalizedReplica, blk_1073746576_5752, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746576 for deletion
2025-07-13 00:12:12,353 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746576_5752 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746576
2025-07-13 00:13:12,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746577_5753 src: /192.168.158.6:53158 dest: /192.168.158.4:9866
2025-07-13 00:13:12,057 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53158, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1056949355_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746577_5753, duration(ns): 14675572
2025-07-13 00:13:12,058 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746577_5753, type=LAST_IN_PIPELINE terminating
2025-07-13 00:13:12,355 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746577_5753 replica FinalizedReplica, blk_1073746577_5753, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746577 for deletion
2025-07-13 00:13:12,356 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746577_5753 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746577
2025-07-13 00:16:17,048 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746580_5756 src: /192.168.158.1:43822 dest: /192.168.158.4:9866
2025-07-13 00:16:17,082 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43822, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1381980628_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746580_5756, duration(ns): 24133677
2025-07-13 00:16:17,082 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746580_5756, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-13 00:16:18,360 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746580_5756 replica FinalizedReplica, blk_1073746580_5756, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746580 for deletion
2025-07-13 00:16:18,361 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746580_5756 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746580
2025-07-13 00:17:17,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746581_5757 src: /192.168.158.8:42394 dest: /192.168.158.4:9866
2025-07-13 00:17:17,076 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42394, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1425827444_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746581_5757, duration(ns): 24231535
2025-07-13 00:17:17,076 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746581_5757, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-13 00:17:18,361 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746581_5757 replica FinalizedReplica, blk_1073746581_5757, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746581 for deletion
2025-07-13 00:17:18,363 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746581_5757 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746581
2025-07-13 00:22:17,059 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746586_5762 src: /192.168.158.1:48162 dest: /192.168.158.4:9866
2025-07-13 00:22:17,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48162, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_197106750_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746586_5762, duration(ns): 22847498
2025-07-13 00:22:17,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746586_5762, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-13 00:22:18,374 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746586_5762 replica FinalizedReplica, blk_1073746586_5762, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746586 for deletion
2025-07-13 00:22:18,375 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746586_5762 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746586
2025-07-13 00:23:17,057 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746587_5763 src: /192.168.158.8:47446 dest: /192.168.158.4:9866
2025-07-13 00:23:17,081 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47446, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-534272854_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746587_5763, duration(ns): 18923296
2025-07-13 00:23:17,081 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746587_5763, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 00:23:18,375 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746587_5763 replica FinalizedReplica, blk_1073746587_5763, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746587 for deletion
2025-07-13 00:23:18,377 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746587_5763 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746587
2025-07-13 00:24:17,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746588_5764 src: /192.168.158.1:48938 dest: /192.168.158.4:9866
2025-07-13 00:24:17,081 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48938, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_532482539_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746588_5764, duration(ns): 23680742
2025-07-13 00:24:17,081 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746588_5764, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-13 00:24:21,375 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746588_5764 replica FinalizedReplica, blk_1073746588_5764, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746588 for deletion 2025-07-13 00:24:21,377 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746588_5764 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746588 2025-07-13 00:26:17,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746590_5766 src: /192.168.158.7:47710 dest: /192.168.158.4:9866 2025-07-13 00:26:17,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47710, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2019601843_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746590_5766, duration(ns): 19232905 2025-07-13 00:26:17,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746590_5766, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-13 00:26:18,379 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746590_5766 replica FinalizedReplica, blk_1073746590_5766, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746590 for deletion 2025-07-13 00:26:18,380 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746590_5766 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746590 
2025-07-13 00:29:22,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746593_5769 src: /192.168.158.7:58344 dest: /192.168.158.4:9866 2025-07-13 00:29:22,111 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58344, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1357742650_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746593_5769, duration(ns): 16086245 2025-07-13 00:29:22,112 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746593_5769, type=LAST_IN_PIPELINE terminating 2025-07-13 00:29:27,383 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746593_5769 replica FinalizedReplica, blk_1073746593_5769, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746593 for deletion 2025-07-13 00:29:27,385 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746593_5769 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746593 2025-07-13 00:30:22,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746594_5770 src: /192.168.158.5:40508 dest: /192.168.158.4:9866 2025-07-13 00:30:22,111 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40508, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2084395290_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073746594_5770, duration(ns): 14949392 2025-07-13 00:30:22,111 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746594_5770, type=LAST_IN_PIPELINE terminating 2025-07-13 00:30:24,384 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746594_5770 replica FinalizedReplica, blk_1073746594_5770, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746594 for deletion 2025-07-13 00:30:24,385 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746594_5770 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746594 2025-07-13 00:35:32,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746599_5775 src: /192.168.158.5:45864 dest: /192.168.158.4:9866 2025-07-13 00:35:32,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45864, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1373775562_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746599_5775, duration(ns): 18165626 2025-07-13 00:35:32,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746599_5775, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-13 00:35:33,397 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746599_5775 replica 
FinalizedReplica, blk_1073746599_5775, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746599 for deletion 2025-07-13 00:35:33,399 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746599_5775 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746599 2025-07-13 00:36:32,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746600_5776 src: /192.168.158.9:43252 dest: /192.168.158.4:9866 2025-07-13 00:36:32,082 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43252, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1639497325_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746600_5776, duration(ns): 14566795 2025-07-13 00:36:32,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746600_5776, type=LAST_IN_PIPELINE terminating 2025-07-13 00:36:33,398 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746600_5776 replica FinalizedReplica, blk_1073746600_5776, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746600 for deletion 2025-07-13 00:36:33,399 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746600_5776 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746600 2025-07-13 00:38:32,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746602_5778 src: /192.168.158.7:52832 dest: /192.168.158.4:9866 2025-07-13 00:38:32,102 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52832, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2005184201_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746602_5778, duration(ns): 17184385 2025-07-13 00:38:32,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746602_5778, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-13 00:38:33,404 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746602_5778 replica FinalizedReplica, blk_1073746602_5778, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746602 for deletion 2025-07-13 00:38:33,406 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746602_5778 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746602 2025-07-13 00:39:32,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746603_5779 src: /192.168.158.1:39314 dest: /192.168.158.4:9866 2025-07-13 00:39:32,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39314, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-348642152_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746603_5779, duration(ns): 23637673 2025-07-13 00:39:32,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746603_5779, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-13 00:39:36,407 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746603_5779 replica FinalizedReplica, blk_1073746603_5779, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746603 for deletion 2025-07-13 00:39:36,408 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746603_5779 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746603 2025-07-13 00:45:37,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746609_5785 src: /192.168.158.8:40784 dest: /192.168.158.4:9866 2025-07-13 00:45:37,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40784, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-877266085_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746609_5785, duration(ns): 14762745 2025-07-13 00:45:37,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746609_5785, 
type=LAST_IN_PIPELINE terminating 2025-07-13 00:45:39,420 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746609_5785 replica FinalizedReplica, blk_1073746609_5785, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746609 for deletion 2025-07-13 00:45:39,421 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746609_5785 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746609 2025-07-13 00:48:42,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746612_5788 src: /192.168.158.6:45508 dest: /192.168.158.4:9866 2025-07-13 00:48:42,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45508, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_743279426_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746612_5788, duration(ns): 19489168 2025-07-13 00:48:42,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746612_5788, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-13 00:48:42,430 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746612_5788 replica FinalizedReplica, blk_1073746612_5788, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746612 for deletion 2025-07-13 00:48:42,431 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746612_5788 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746612 2025-07-13 00:51:47,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746615_5791 src: /192.168.158.5:33136 dest: /192.168.158.4:9866 2025-07-13 00:51:47,121 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1028092891_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746615_5791, duration(ns): 18203266 2025-07-13 00:51:47,121 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746615_5791, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-13 00:51:48,438 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746615_5791 replica FinalizedReplica, blk_1073746615_5791, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746615 for deletion 2025-07-13 00:51:48,439 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746615_5791 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746615 
2025-07-13 00:53:57,101 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746617_5793 src: /192.168.158.6:50506 dest: /192.168.158.4:9866 2025-07-13 00:53:57,117 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50506, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1641001391_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746617_5793, duration(ns): 15087348 2025-07-13 00:53:57,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746617_5793, type=LAST_IN_PIPELINE terminating 2025-07-13 00:53:57,443 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746617_5793 replica FinalizedReplica, blk_1073746617_5793, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746617 for deletion 2025-07-13 00:53:57,444 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746617_5793 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746617 2025-07-13 00:56:57,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746620_5796 src: /192.168.158.1:54260 dest: /192.168.158.4:9866 2025-07-13 00:56:57,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54260, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-67355586_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073746620_5796, duration(ns): 23813992 2025-07-13 00:56:57,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746620_5796, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-13 00:57:00,450 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746620_5796 replica FinalizedReplica, blk_1073746620_5796, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746620 for deletion 2025-07-13 00:57:00,451 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746620_5796 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746620 2025-07-13 00:58:57,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746622_5798 src: /192.168.158.6:55054 dest: /192.168.158.4:9866 2025-07-13 00:58:57,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55054, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-380300488_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746622_5798, duration(ns): 18463322 2025-07-13 00:58:57,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746622_5798, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-13 00:58:57,456 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746622_5798 replica FinalizedReplica, blk_1073746622_5798, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746622 for deletion 2025-07-13 00:58:57,457 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746622_5798 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746622 2025-07-13 00:59:57,113 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746623_5799 src: /192.168.158.6:44730 dest: /192.168.158.4:9866 2025-07-13 00:59:57,129 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44730, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1314416827_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746623_5799, duration(ns): 14783674 2025-07-13 00:59:57,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746623_5799, type=LAST_IN_PIPELINE terminating 2025-07-13 01:00:00,457 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746623_5799 replica FinalizedReplica, blk_1073746623_5799, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746623 for deletion 2025-07-13 01:00:00,458 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746623_5799 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746623 2025-07-13 01:03:02,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746626_5802 src: /192.168.158.9:40094 dest: /192.168.158.4:9866 2025-07-13 01:03:02,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40094, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_475482192_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746626_5802, duration(ns): 17401786 2025-07-13 01:03:02,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746626_5802, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-13 01:03:06,463 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746626_5802 replica FinalizedReplica, blk_1073746626_5802, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746626 for deletion 2025-07-13 01:03:06,464 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746626_5802 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746626 2025-07-13 01:04:02,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746627_5803 src: 
/192.168.158.8:57932 dest: /192.168.158.4:9866 2025-07-13 01:04:02,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57932, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1997187893_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746627_5803, duration(ns): 14001773 2025-07-13 01:04:02,143 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746627_5803, type=LAST_IN_PIPELINE terminating 2025-07-13 01:04:03,467 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746627_5803 replica FinalizedReplica, blk_1073746627_5803, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746627 for deletion 2025-07-13 01:04:03,468 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746627_5803 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746627 2025-07-13 01:05:02,119 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746628_5804 src: /192.168.158.7:53536 dest: /192.168.158.4:9866 2025-07-13 01:05:02,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53536, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789579173_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746628_5804, duration(ns): 16888445 2025-07-13 01:05:02,138 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746628_5804, type=LAST_IN_PIPELINE terminating
2025-07-13 01:05:03,473 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746628_5804 replica FinalizedReplica, blk_1073746628_5804, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746628 for deletion
2025-07-13 01:05:03,474 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746628_5804 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746628
2025-07-13 01:08:12,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746631_5807 src: /192.168.158.8:46998 dest: /192.168.158.4:9866
2025-07-13 01:08:12,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46998, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1521633964_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746631_5807, duration(ns): 18849824
2025-07-13 01:08:12,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746631_5807, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-13 01:08:12,483 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746631_5807 replica FinalizedReplica, blk_1073746631_5807, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746631 for deletion
2025-07-13 01:08:12,484 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746631_5807 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746631
2025-07-13 01:09:12,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746632_5808 src: /192.168.158.5:47656 dest: /192.168.158.4:9866
2025-07-13 01:09:12,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47656, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1402606327_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746632_5808, duration(ns): 15477571
2025-07-13 01:09:12,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746632_5808, type=LAST_IN_PIPELINE terminating
2025-07-13 01:09:15,485 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746632_5808 replica FinalizedReplica, blk_1073746632_5808, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746632 for deletion
2025-07-13 01:09:15,486 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746632_5808 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746632
2025-07-13 01:12:12,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746635_5811 src: /192.168.158.7:49090 dest: /192.168.158.4:9866
2025-07-13 01:12:12,150 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49090, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1363118089_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746635_5811, duration(ns): 19516360
2025-07-13 01:12:12,150 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746635_5811, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-13 01:12:15,492 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746635_5811 replica FinalizedReplica, blk_1073746635_5811, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746635 for deletion
2025-07-13 01:12:15,493 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746635_5811 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746635
2025-07-13 01:13:12,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746636_5812 src: /192.168.158.7:37258 dest: /192.168.158.4:9866
2025-07-13 01:13:12,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37258, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2018232365_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746636_5812, duration(ns): 15078810
2025-07-13 01:13:12,143 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746636_5812, type=LAST_IN_PIPELINE terminating
2025-07-13 01:13:12,494 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746636_5812 replica FinalizedReplica, blk_1073746636_5812, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746636 for deletion
2025-07-13 01:13:12,495 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746636_5812 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746636
2025-07-13 01:14:17,132 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746637_5813 src: /192.168.158.1:59366 dest: /192.168.158.4:9866
2025-07-13 01:14:17,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59366, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1107182214_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746637_5813, duration(ns): 20591659
2025-07-13 01:14:17,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746637_5813, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-13 01:14:18,497 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746637_5813 replica FinalizedReplica, blk_1073746637_5813, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746637 for deletion
2025-07-13 01:14:18,498 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746637_5813 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746637
2025-07-13 01:17:22,110 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746640_5816 src: /192.168.158.6:42898 dest: /192.168.158.4:9866
2025-07-13 01:17:22,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_92738265_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746640_5816, duration(ns): 12872549
2025-07-13 01:17:22,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746640_5816, type=LAST_IN_PIPELINE terminating
2025-07-13 01:17:24,503 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746640_5816 replica FinalizedReplica, blk_1073746640_5816, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746640 for deletion
2025-07-13 01:17:24,504 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746640_5816 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746640
2025-07-13 01:20:22,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746643_5819 src: /192.168.158.1:33410 dest: /192.168.158.4:9866
2025-07-13 01:20:22,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33410, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1364051040_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746643_5819, duration(ns): 24873340
2025-07-13 01:20:22,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746643_5819, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-13 01:20:24,509 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746643_5819 replica FinalizedReplica, blk_1073746643_5819, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746643 for deletion
2025-07-13 01:20:24,510 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746643_5819 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746643
2025-07-13 01:21:22,143 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746644_5820 src: /192.168.158.8:48148 dest: /192.168.158.4:9866
2025-07-13 01:21:22,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48148, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2026425135_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746644_5820, duration(ns): 14701011
2025-07-13 01:21:22,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746644_5820, type=LAST_IN_PIPELINE terminating
2025-07-13 01:21:24,510 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746644_5820 replica FinalizedReplica, blk_1073746644_5820, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746644 for deletion
2025-07-13 01:21:24,512 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746644_5820 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746644
2025-07-13 01:22:27,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746645_5821 src: /192.168.158.7:38174 dest: /192.168.158.4:9866
2025-07-13 01:22:27,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38174, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1861420321_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746645_5821, duration(ns): 13405836
2025-07-13 01:22:27,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746645_5821, type=LAST_IN_PIPELINE terminating
2025-07-13 01:22:27,511 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746645_5821 replica FinalizedReplica, blk_1073746645_5821, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746645 for deletion
2025-07-13 01:22:27,512 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746645_5821 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746645
2025-07-13 01:23:32,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746646_5822 src: /192.168.158.8:34362 dest: /192.168.158.4:9866
2025-07-13 01:23:32,186 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34362, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2097010298_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746646_5822, duration(ns): 20593307
2025-07-13 01:23:32,186 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746646_5822, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 01:23:33,515 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746646_5822 replica FinalizedReplica, blk_1073746646_5822, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746646 for deletion
2025-07-13 01:23:33,516 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746646_5822 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746646
2025-07-13 01:26:32,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746649_5825 src: /192.168.158.6:58728 dest: /192.168.158.4:9866
2025-07-13 01:26:32,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58728, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1646638710_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746649_5825, duration(ns): 20849555
2025-07-13 01:26:32,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746649_5825, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 01:26:36,522 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746649_5825 replica FinalizedReplica, blk_1073746649_5825, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746649 for deletion
2025-07-13 01:26:36,523 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746649_5825 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746649
2025-07-13 01:27:37,163 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746650_5826 src: /192.168.158.5:53930 dest: /192.168.158.4:9866
2025-07-13 01:27:37,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53930, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1871050244_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746650_5826, duration(ns): 16966318
2025-07-13 01:27:37,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746650_5826, type=LAST_IN_PIPELINE terminating
2025-07-13 01:27:42,525 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746650_5826 replica FinalizedReplica, blk_1073746650_5826, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746650 for deletion
2025-07-13 01:27:42,526 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746650_5826 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746650
2025-07-13 01:32:42,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746655_5831 src: /192.168.158.7:34348 dest: /192.168.158.4:9866
2025-07-13 01:32:42,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34348, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1990909011_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746655_5831, duration(ns): 18024534
2025-07-13 01:32:42,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746655_5831, type=LAST_IN_PIPELINE terminating
2025-07-13 01:32:42,537 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746655_5831 replica FinalizedReplica, blk_1073746655_5831, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746655 for deletion
2025-07-13 01:32:42,538 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746655_5831 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746655
2025-07-13 01:34:42,179 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746657_5833 src: /192.168.158.6:49330 dest: /192.168.158.4:9866
2025-07-13 01:34:42,204 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49330, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1200009357_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746657_5833, duration(ns): 19918063
2025-07-13 01:34:42,204 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746657_5833, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-13 01:34:42,542 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746657_5833 replica FinalizedReplica, blk_1073746657_5833, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746657 for deletion
2025-07-13 01:34:42,543 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746657_5833 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746657
2025-07-13 01:35:47,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746658_5834 src: /192.168.158.9:58846 dest: /192.168.158.4:9866
2025-07-13 01:35:47,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58846, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2092528291_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746658_5834, duration(ns): 19113912
2025-07-13 01:35:47,199 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746658_5834, type=LAST_IN_PIPELINE terminating
2025-07-13 01:35:48,544 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746658_5834 replica FinalizedReplica, blk_1073746658_5834, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746658 for deletion
2025-07-13 01:35:48,545 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746658_5834 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746658
2025-07-13 01:38:47,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746661_5837 src: /192.168.158.6:36992 dest: /192.168.158.4:9866
2025-07-13 01:38:47,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36992, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_106852302_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746661_5837, duration(ns): 15293773
2025-07-13 01:38:47,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746661_5837, type=LAST_IN_PIPELINE terminating
2025-07-13 01:38:48,552 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746661_5837 replica FinalizedReplica, blk_1073746661_5837, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746661 for deletion
2025-07-13 01:38:48,553 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746661_5837 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746661
2025-07-13 01:39:47,188 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746662_5838 src: /192.168.158.7:51698 dest: /192.168.158.4:9866
2025-07-13 01:39:47,205 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51698, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-791294241_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746662_5838, duration(ns): 15207182
2025-07-13 01:39:47,205 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746662_5838, type=LAST_IN_PIPELINE terminating
2025-07-13 01:39:48,551 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746662_5838 replica FinalizedReplica, blk_1073746662_5838, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746662 for deletion
2025-07-13 01:39:48,552 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746662_5838 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746662
2025-07-13 01:40:52,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746663_5839 src: /192.168.158.6:37126 dest: /192.168.158.4:9866
2025-07-13 01:40:52,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37126, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1534949670_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746663_5839, duration(ns): 14116866
2025-07-13 01:40:52,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746663_5839, type=LAST_IN_PIPELINE terminating
2025-07-13 01:40:54,556 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746663_5839 replica FinalizedReplica, blk_1073746663_5839, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746663 for deletion
2025-07-13 01:40:54,557 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746663_5839 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746663
2025-07-13 01:42:57,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746665_5841 src: /192.168.158.6:44466 dest: /192.168.158.4:9866
2025-07-13 01:42:57,194 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44466, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_397674283_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746665_5841, duration(ns): 13778324
2025-07-13 01:42:57,195 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746665_5841, type=LAST_IN_PIPELINE terminating
2025-07-13 01:42:57,559 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746665_5841 replica FinalizedReplica, blk_1073746665_5841, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746665 for deletion
2025-07-13 01:42:57,560 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746665_5841 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746665
2025-07-13 01:44:57,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746667_5843 src: /192.168.158.9:36642 dest: /192.168.158.4:9866
2025-07-13 01:44:57,204 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36642, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-665017145_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746667_5843, duration(ns): 17739513
2025-07-13 01:44:57,204 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746667_5843, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 01:44:57,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746667_5843 replica FinalizedReplica, blk_1073746667_5843, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746667 for deletion
2025-07-13 01:44:57,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746667_5843 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746667
2025-07-13 01:49:02,186 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746671_5847 src: /192.168.158.9:49548 dest: /192.168.158.4:9866
2025-07-13 01:49:02,202 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49548, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-140524916_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746671_5847, duration(ns): 13621825
2025-07-13 01:49:02,202 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746671_5847, type=LAST_IN_PIPELINE terminating
2025-07-13 01:49:03,568 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746671_5847 replica FinalizedReplica, blk_1073746671_5847, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746671 for deletion
2025-07-13 01:49:03,569 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746671_5847 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746671
2025-07-13 01:50:02,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746672_5848 src: /192.168.158.1:60104 dest: /192.168.158.4:9866
2025-07-13 01:50:02,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60104, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_133161261_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746672_5848, duration(ns): 21515692
2025-07-13 01:50:02,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746672_5848, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-13 01:50:06,569 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746672_5848 replica FinalizedReplica, blk_1073746672_5848, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746672 for deletion
2025-07-13 01:50:06,570 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746672_5848 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746672
2025-07-13 01:51:02,206 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746673_5849 src: /192.168.158.8:58808 dest: /192.168.158.4:9866
2025-07-13 01:51:02,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58808, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-168947089_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746673_5849, duration(ns): 17779424
2025-07-13 01:51:02,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746673_5849, type=LAST_IN_PIPELINE terminating
2025-07-13 01:51:03,570 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746673_5849 replica FinalizedReplica, blk_1073746673_5849, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746673 for deletion
2025-07-13 01:51:03,571 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746673_5849 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746673
2025-07-13 01:53:07,191 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746675_5851 src: /192.168.158.1:51926 dest: /192.168.158.4:9866
2025-07-13 01:53:07,222 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51926, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-746364855_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746675_5851, duration(ns): 22763813
2025-07-13 01:53:07,222 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746675_5851, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-13 01:53:09,575 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746675_5851 replica FinalizedReplica, blk_1073746675_5851, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746675 for deletion
2025-07-13 01:53:09,576 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746675_5851 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746675
2025-07-13 01:56:07,204 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746678_5854 src: /192.168.158.6:55856 dest: /192.168.158.4:9866
2025-07-13 01:56:07,228 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55856, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1530782685_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746678_5854, duration(ns): 19163022
2025-07-13 01:56:07,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746678_5854, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 01:56:09,580 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746678_5854 replica FinalizedReplica, blk_1073746678_5854, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746678 for deletion
2025-07-13 01:56:09,581 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746678_5854 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746678
2025-07-13 01:57:07,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746679_5855 src: /192.168.158.5:43534 dest: /192.168.158.4:9866
2025-07-13 01:57:07,230 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:43534, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-925315286_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746679_5855, duration(ns): 16702241
2025-07-13 01:57:07,230 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746679_5855, type=LAST_IN_PIPELINE terminating
2025-07-13 01:57:09,582 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746679_5855 replica FinalizedReplica, blk_1073746679_5855, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746679 for deletion
2025-07-13 01:57:09,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746679_5855 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746679
2025-07-13 01:58:07,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746680_5856 src: /192.168.158.9:44520 dest: /192.168.158.4:9866
2025-07-13 01:58:07,255 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44520, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-943120948_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746680_5856, duration(ns): 14171003
2025-07-13 01:58:07,255 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder:
BP-1059995147-192.168.158.1-1752101929360:blk_1073746680_5856, type=LAST_IN_PIPELINE terminating 2025-07-13 01:58:09,585 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746680_5856 replica FinalizedReplica, blk_1073746680_5856, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746680 for deletion 2025-07-13 01:58:09,586 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746680_5856 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746680 2025-07-13 02:00:17,213 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746682_5858 src: /192.168.158.5:48234 dest: /192.168.158.4:9866 2025-07-13 02:00:17,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48234, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-743429168_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746682_5858, duration(ns): 18520955 2025-07-13 02:00:17,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746682_5858, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-13 02:00:21,591 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746682_5858 replica FinalizedReplica, blk_1073746682_5858, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746682 for deletion 2025-07-13 02:00:21,592 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746682_5858 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746682 2025-07-13 02:01:22,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746683_5859 src: /192.168.158.6:42132 dest: /192.168.158.4:9866 2025-07-13 02:01:22,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42132, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-137563998_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746683_5859, duration(ns): 19193814 2025-07-13 02:01:22,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746683_5859, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-13 02:01:27,592 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746683_5859 replica FinalizedReplica, blk_1073746683_5859, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746683 for deletion 2025-07-13 02:01:27,593 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746683_5859 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746683 
2025-07-13 02:03:22,217 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746685_5861 src: /192.168.158.8:54388 dest: /192.168.158.4:9866 2025-07-13 02:03:22,240 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54388, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1950782075_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746685_5861, duration(ns): 17961608 2025-07-13 02:03:22,240 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746685_5861, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-13 02:03:27,594 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746685_5861 replica FinalizedReplica, blk_1073746685_5861, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746685 for deletion 2025-07-13 02:03:27,595 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746685_5861 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746685 2025-07-13 02:04:22,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746686_5862 src: /192.168.158.1:50264 dest: /192.168.158.4:9866 2025-07-13 02:04:22,241 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50264, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-223130594_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746686_5862, duration(ns): 22274160 2025-07-13 02:04:22,241 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746686_5862, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-13 02:04:24,598 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746686_5862 replica FinalizedReplica, blk_1073746686_5862, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746686 for deletion 2025-07-13 02:04:24,599 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746686_5862 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746686 2025-07-13 02:05:27,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746687_5863 src: /192.168.158.9:52300 dest: /192.168.158.4:9866 2025-07-13 02:05:27,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52300, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_82869203_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746687_5863, duration(ns): 17628653 2025-07-13 02:05:27,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746687_5863, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-13 02:05:30,599 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746687_5863 replica FinalizedReplica, blk_1073746687_5863, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746687 for deletion 2025-07-13 02:05:30,600 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746687_5863 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073746687 2025-07-13 02:08:32,238 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746690_5866 src: /192.168.158.9:48050 dest: /192.168.158.4:9866 2025-07-13 02:08:32,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48050, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1261601913_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746690_5866, duration(ns): 17310964 2025-07-13 02:08:32,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746690_5866, type=LAST_IN_PIPELINE terminating 2025-07-13 02:08:33,606 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746690_5866 replica FinalizedReplica, blk_1073746690_5866, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746690 for deletion 2025-07-13 02:08:33,607 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746690_5866 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746690 2025-07-13 02:09:32,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746691_5867 src: /192.168.158.5:48810 dest: /192.168.158.4:9866 2025-07-13 02:09:32,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48810, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-774238650_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746691_5867, duration(ns): 14446750 2025-07-13 02:09:32,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746691_5867, type=LAST_IN_PIPELINE terminating 2025-07-13 02:09:36,608 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746691_5867 replica FinalizedReplica, blk_1073746691_5867, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746691 for deletion 2025-07-13 02:09:36,609 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746691_5867 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746691 2025-07-13 02:11:37,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746693_5869 src: /192.168.158.9:58628 dest: /192.168.158.4:9866 2025-07-13 
02:11:37,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58628, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1252379696_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746693_5869, duration(ns): 15990702 2025-07-13 02:11:37,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746693_5869, type=LAST_IN_PIPELINE terminating 2025-07-13 02:11:42,611 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746693_5869 replica FinalizedReplica, blk_1073746693_5869, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746693 for deletion 2025-07-13 02:11:42,612 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746693_5869 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746693 2025-07-13 02:12:37,244 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746694_5870 src: /192.168.158.7:48988 dest: /192.168.158.4:9866 2025-07-13 02:12:37,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48988, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1453756919_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746694_5870, duration(ns): 17526139 2025-07-13 02:12:37,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073746694_5870, type=LAST_IN_PIPELINE terminating 2025-07-13 02:12:39,612 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746694_5870 replica FinalizedReplica, blk_1073746694_5870, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746694 for deletion 2025-07-13 02:12:39,613 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746694_5870 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746694 2025-07-13 02:13:37,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746695_5871 src: /192.168.158.9:59112 dest: /192.168.158.4:9866 2025-07-13 02:13:37,275 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59112, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1251952134_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746695_5871, duration(ns): 16805287 2025-07-13 02:13:37,275 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746695_5871, type=LAST_IN_PIPELINE terminating 2025-07-13 02:13:39,616 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746695_5871 replica FinalizedReplica, blk_1073746695_5871, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746695 for deletion 2025-07-13 02:13:39,617 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746695_5871 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746695 2025-07-13 02:14:42,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746696_5872 src: /192.168.158.1:35386 dest: /192.168.158.4:9866 2025-07-13 02:14:42,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35386, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1662214592_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746696_5872, duration(ns): 23073989 2025-07-13 02:14:42,263 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746696_5872, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-13 02:14:42,615 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746696_5872 replica FinalizedReplica, blk_1073746696_5872, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746696 for deletion 2025-07-13 02:14:42,616 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746696_5872 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746696 2025-07-13 02:15:42,240 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746697_5873 src: /192.168.158.8:55568 dest: /192.168.158.4:9866 2025-07-13 02:15:42,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55568, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-789931055_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746697_5873, duration(ns): 22686828 2025-07-13 02:15:42,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746697_5873, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-13 02:15:42,620 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746697_5873 replica FinalizedReplica, blk_1073746697_5873, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746697 for deletion 2025-07-13 02:15:42,621 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746697_5873 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746697 2025-07-13 02:17:47,241 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746699_5875 src: /192.168.158.6:55322 dest: /192.168.158.4:9866 2025-07-13 02:17:47,265 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55322, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_542115563_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746699_5875, duration(ns): 19495169 2025-07-13 02:17:47,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746699_5875, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-13 02:17:48,624 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746699_5875 replica FinalizedReplica, blk_1073746699_5875, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746699 for deletion 2025-07-13 02:17:48,626 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746699_5875 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746699 2025-07-13 02:22:52,296 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746704_5880 src: /192.168.158.9:55668 dest: /192.168.158.4:9866 2025-07-13 02:22:52,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55668, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1573726454_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746704_5880, duration(ns): 19284883 2025-07-13 02:22:52,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746704_5880, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=1:[192.168.158.8:9866] terminating 2025-07-13 02:22:54,638 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746704_5880 replica FinalizedReplica, blk_1073746704_5880, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746704 for deletion 2025-07-13 02:22:54,639 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746704_5880 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746704 2025-07-13 02:23:57,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746705_5881 src: /192.168.158.5:44952 dest: /192.168.158.4:9866 2025-07-13 02:23:57,270 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44952, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1520744229_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746705_5881, duration(ns): 16613186 2025-07-13 02:23:57,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746705_5881, type=LAST_IN_PIPELINE terminating 2025-07-13 02:24:00,641 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746705_5881 replica FinalizedReplica, blk_1073746705_5881, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746705 for deletion 
2025-07-13 02:24:00,642 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746705_5881 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746705 2025-07-13 02:24:57,263 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746706_5882 src: /192.168.158.8:34958 dest: /192.168.158.4:9866 2025-07-13 02:24:57,288 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34958, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-821643189_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746706_5882, duration(ns): 19954201 2025-07-13 02:24:57,289 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746706_5882, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-13 02:24:57,642 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746706_5882 replica FinalizedReplica, blk_1073746706_5882, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746706 for deletion 2025-07-13 02:24:57,644 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746706_5882 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746706 2025-07-13 02:25:57,259 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073746707_5883 src: /192.168.158.1:45996 dest: /192.168.158.4:9866
2025-07-13 02:25:57,293 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45996, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1269501191_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746707_5883, duration(ns): 25139394
2025-07-13 02:25:57,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746707_5883, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-13 02:26:00,646 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746707_5883 replica FinalizedReplica, blk_1073746707_5883, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746707 for deletion
2025-07-13 02:26:00,647 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746707_5883 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746707
2025-07-13 02:27:02,253 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746708_5884 src: /192.168.158.9:53520 dest: /192.168.158.4:9866
2025-07-13 02:27:02,278 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53520, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1376164882_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746708_5884, duration(ns): 19915360
2025-07-13 02:27:02,279 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746708_5884, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-13 02:27:03,647 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746708_5884 replica FinalizedReplica, blk_1073746708_5884, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746708 for deletion
2025-07-13 02:27:03,648 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746708_5884 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746708
2025-07-13 02:28:02,247 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746709_5885 src: /192.168.158.1:46614 dest: /192.168.158.4:9866
2025-07-13 02:28:02,279 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46614, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1713384983_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746709_5885, duration(ns): 23481543
2025-07-13 02:28:02,280 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746709_5885, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-13 02:28:03,651 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746709_5885 replica FinalizedReplica, blk_1073746709_5885, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746709 for deletion
2025-07-13 02:28:03,652 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746709_5885 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746709
2025-07-13 02:29:02,255 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746710_5886 src: /192.168.158.9:37810 dest: /192.168.158.4:9866
2025-07-13 02:29:02,279 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37810, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1733727084_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746710_5886, duration(ns): 19266461
2025-07-13 02:29:02,279 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746710_5886, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-13 02:29:03,655 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746710_5886 replica FinalizedReplica, blk_1073746710_5886, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746710 for deletion
2025-07-13 02:29:03,656 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746710_5886 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746710
2025-07-13 02:33:12,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746714_5890 src: /192.168.158.9:36722 dest: /192.168.158.4:9866
2025-07-13 02:33:12,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36722, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_332392663_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746714_5890, duration(ns): 15571011
2025-07-13 02:33:12,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746714_5890, type=LAST_IN_PIPELINE terminating
2025-07-13 02:33:15,659 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746714_5890 replica FinalizedReplica, blk_1073746714_5890, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746714 for deletion
2025-07-13 02:33:15,660 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746714_5890 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746714
2025-07-13 02:35:12,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746716_5892 src: /192.168.158.6:50054 dest: /192.168.158.4:9866
2025-07-13 02:35:12,280 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50054, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1142147263_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746716_5892, duration(ns): 14915167
2025-07-13 02:35:12,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746716_5892, type=LAST_IN_PIPELINE terminating
2025-07-13 02:35:18,665 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746716_5892 replica FinalizedReplica, blk_1073746716_5892, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746716 for deletion
2025-07-13 02:35:18,666 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746716_5892 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746716
2025-07-13 02:36:17,274 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746717_5893 src: /192.168.158.5:35176 dest: /192.168.158.4:9866
2025-07-13 02:36:17,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35176, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_386862483_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746717_5893, duration(ns): 16508622
2025-07-13 02:36:17,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746717_5893, type=LAST_IN_PIPELINE terminating
2025-07-13 02:36:24,667 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746717_5893 replica FinalizedReplica, blk_1073746717_5893, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746717 for deletion
2025-07-13 02:36:24,668 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746717_5893 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746717
2025-07-13 02:37:17,276 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746718_5894 src: /192.168.158.1:39364 dest: /192.168.158.4:9866
2025-07-13 02:37:17,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39364, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1076110694_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746718_5894, duration(ns): 23327973
2025-07-13 02:37:17,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746718_5894, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-13 02:37:24,668 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746718_5894 replica FinalizedReplica, blk_1073746718_5894, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746718 for deletion
2025-07-13 02:37:24,669 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746718_5894 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746718
2025-07-13 02:44:27,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746725_5901 src: /192.168.158.8:38950 dest: /192.168.158.4:9866
2025-07-13 02:44:27,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38950, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1037996752_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746725_5901, duration(ns): 15251974
2025-07-13 02:44:27,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746725_5901, type=LAST_IN_PIPELINE terminating
2025-07-13 02:44:33,677 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746725_5901 replica FinalizedReplica, blk_1073746725_5901, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746725 for deletion
2025-07-13 02:44:33,678 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746725_5901 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746725
2025-07-13 02:46:27,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746727_5903 src: /192.168.158.8:58628 dest: /192.168.158.4:9866
2025-07-13 02:46:27,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58628, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-205962480_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746727_5903, duration(ns): 18080846
2025-07-13 02:46:27,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746727_5903, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-13 02:46:30,678 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746727_5903 replica FinalizedReplica, blk_1073746727_5903, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746727 for deletion
2025-07-13 02:46:30,679 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746727_5903 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746727
2025-07-13 02:48:27,290 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746729_5905 src: /192.168.158.9:42170 dest: /192.168.158.4:9866
2025-07-13 02:48:27,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42170, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1057031941_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746729_5905, duration(ns): 17895216
2025-07-13 02:48:27,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746729_5905, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 02:48:30,680 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746729_5905 replica FinalizedReplica, blk_1073746729_5905, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746729 for deletion
2025-07-13 02:48:30,681 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746729_5905 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746729
2025-07-13 02:51:27,296 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746732_5908 src: /192.168.158.1:44558 dest: /192.168.158.4:9866
2025-07-13 02:51:27,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44558, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_395635210_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746732_5908, duration(ns): 22819735
2025-07-13 02:51:27,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746732_5908, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-13 02:51:30,681 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746732_5908 replica FinalizedReplica, blk_1073746732_5908, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746732 for deletion
2025-07-13 02:51:30,682 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746732_5908 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746732
2025-07-13 02:52:27,330 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746733_5909 src: /192.168.158.5:37526 dest: /192.168.158.4:9866
2025-07-13 02:52:27,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37526, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1983686489_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746733_5909, duration(ns): 17392210
2025-07-13 02:52:27,353 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746733_5909, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 02:52:30,684 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746733_5909 replica FinalizedReplica, blk_1073746733_5909, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746733 for deletion
2025-07-13 02:52:30,685 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746733_5909 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746733
2025-07-13 02:53:27,300 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746734_5910 src: /192.168.158.5:57446 dest: /192.168.158.4:9866
2025-07-13 02:53:27,319 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57446, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1169801872_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746734_5910, duration(ns): 17007884
2025-07-13 02:53:27,319 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746734_5910, type=LAST_IN_PIPELINE terminating
2025-07-13 02:53:30,685 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746734_5910 replica FinalizedReplica, blk_1073746734_5910, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746734 for deletion
2025-07-13 02:53:30,687 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746734_5910 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746734
2025-07-13 02:56:32,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746737_5913 src: /192.168.158.1:56842 dest: /192.168.158.4:9866
2025-07-13 02:56:32,337 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56842, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1182783723_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746737_5913, duration(ns): 22789319
2025-07-13 02:56:32,337 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746737_5913, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-13 02:56:36,689 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746737_5913 replica FinalizedReplica, blk_1073746737_5913, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746737 for deletion
2025-07-13 02:56:36,691 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746737_5913 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746737
2025-07-13 02:59:32,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746740_5916 src: /192.168.158.1:55922 dest: /192.168.158.4:9866
2025-07-13 02:59:32,347 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55922, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1038379780_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746740_5916, duration(ns): 25768329
2025-07-13 02:59:32,347 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746740_5916, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-13 02:59:39,698 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746740_5916 replica FinalizedReplica, blk_1073746740_5916, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746740 for deletion
2025-07-13 02:59:39,699 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746740_5916 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746740
2025-07-13 03:02:37,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746743_5919 src: /192.168.158.7:41066 dest: /192.168.158.4:9866
2025-07-13 03:02:37,343 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41066, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1052038365_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746743_5919, duration(ns): 22174744
2025-07-13 03:02:37,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746743_5919, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 03:02:42,703 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746743_5919 replica FinalizedReplica, blk_1073746743_5919, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746743 for deletion
2025-07-13 03:02:42,704 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746743_5919 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746743
2025-07-13 03:03:42,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746744_5920 src: /192.168.158.8:59720 dest: /192.168.158.4:9866
2025-07-13 03:03:42,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59720, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-680533476_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746744_5920, duration(ns): 19794019
2025-07-13 03:03:42,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746744_5920, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-13 03:03:48,703 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746744_5920 replica FinalizedReplica, blk_1073746744_5920, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746744 for deletion
2025-07-13 03:03:48,705 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746744_5920 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746744
2025-07-13 03:04:42,325 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746745_5921 src: /192.168.158.9:43900 dest: /192.168.158.4:9866
2025-07-13 03:04:42,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43900, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1730947132_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746745_5921, duration(ns): 19107992
2025-07-13 03:04:42,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746745_5921, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-13 03:04:45,706 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746745_5921 replica FinalizedReplica, blk_1073746745_5921, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746745 for deletion
2025-07-13 03:04:45,708 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746745_5921 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746745
2025-07-13 03:05:42,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746746_5922 src: /192.168.158.1:51844 dest: /192.168.158.4:9866
2025-07-13 03:05:42,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51844, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1076419863_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746746_5922, duration(ns): 20557095
2025-07-13 03:05:42,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746746_5922, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-13 03:05:45,714 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746746_5922 replica FinalizedReplica, blk_1073746746_5922, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746746 for deletion
2025-07-13 03:05:45,715 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746746_5922 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746746
2025-07-13 03:06:42,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746747_5923 src: /192.168.158.1:33606 dest: /192.168.158.4:9866
2025-07-13 03:06:42,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33606, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-359641692_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746747_5923, duration(ns): 23805972
2025-07-13 03:06:42,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746747_5923, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-13 03:06:48,716 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746747_5923 replica FinalizedReplica, blk_1073746747_5923, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746747 for deletion
2025-07-13 03:06:48,717 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746747_5923 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746747
2025-07-13 03:08:42,338 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746749_5925 src: /192.168.158.8:46824 dest: /192.168.158.4:9866
2025-07-13 03:08:42,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46824, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_419571975_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746749_5925, duration(ns): 14802966
2025-07-13 03:08:42,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746749_5925, type=LAST_IN_PIPELINE terminating
2025-07-13 03:08:48,718 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746749_5925 replica FinalizedReplica, blk_1073746749_5925, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746749 for deletion
2025-07-13 03:08:48,720 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746749_5925 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746749
2025-07-13 03:09:42,338 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746750_5926 src: /192.168.158.7:50752 dest: /192.168.158.4:9866
2025-07-13 03:09:42,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50752, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-267686930_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746750_5926, duration(ns): 16447994
2025-07-13 03:09:42,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746750_5926, type=LAST_IN_PIPELINE terminating
2025-07-13 03:09:48,721 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746750_5926 replica FinalizedReplica, blk_1073746750_5926, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746750 for deletion
2025-07-13 03:09:48,722 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746750_5926 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746750
2025-07-13 03:10:42,376 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746751_5927 src: /192.168.158.5:35200 dest: /192.168.158.4:9866
2025-07-13 03:10:42,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35200, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1553554381_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746751_5927, duration(ns): 16389065
2025-07-13 03:10:42,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746751_5927, type=LAST_IN_PIPELINE terminating
2025-07-13 03:10:45,724 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746751_5927 replica FinalizedReplica, blk_1073746751_5927, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746751 for deletion
2025-07-13 03:10:45,725 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746751_5927 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746751
2025-07-13 03:11:47,335 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746752_5928 src: /192.168.158.9:60838 dest: /192.168.158.4:9866
2025-07-13 03:11:47,353 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60838, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_12564561_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746752_5928, duration(ns): 15793101
2025-07-13 03:11:47,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746752_5928, type=LAST_IN_PIPELINE terminating
2025-07-13 03:11:54,726 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746752_5928 replica FinalizedReplica, blk_1073746752_5928, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746752 for deletion
2025-07-13 03:11:54,727 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746752_5928 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746752
2025-07-13 03:12:52,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746753_5929 src: /192.168.158.6:57908 dest: /192.168.158.4:9866
2025-07-13 03:12:52,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57908, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1179959687_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746753_5929, duration(ns): 18284892
2025-07-13 03:12:52,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746753_5929, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 03:12:57,725 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746753_5929 replica FinalizedReplica, blk_1073746753_5929, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746753 for deletion
2025-07-13 03:12:57,727 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746753_5929 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746753
2025-07-13 03:17:57,343 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746758_5934 src: /192.168.158.9:41898 dest: /192.168.158.4:9866
2025-07-13 03:17:57,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1547647798_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746758_5934, duration(ns): 19548146
2025-07-13 03:17:57,369 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746758_5934, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-13 03:18:00,741 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746758_5934 replica FinalizedReplica, blk_1073746758_5934, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength() = 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746758 for deletion
2025-07-13 03:18:00,743 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted
BP-1059995147-192.168.158.1-1752101929360 blk_1073746758_5934 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746758 2025-07-13 03:18:57,350 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746759_5935 src: /192.168.158.5:53742 dest: /192.168.158.4:9866 2025-07-13 03:18:57,367 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53742, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_978478522_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746759_5935, duration(ns): 15442861 2025-07-13 03:18:57,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746759_5935, type=LAST_IN_PIPELINE terminating 2025-07-13 03:19:03,742 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746759_5935 replica FinalizedReplica, blk_1073746759_5935, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746759 for deletion 2025-07-13 03:19:03,743 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746759_5935 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746759 2025-07-13 03:21:57,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746762_5938 src: /192.168.158.9:39390 dest: /192.168.158.4:9866 2025-07-13 03:21:57,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.9:39390, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_912733143_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746762_5938, duration(ns): 19673370 2025-07-13 03:21:57,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746762_5938, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-13 03:22:00,742 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746762_5938 replica FinalizedReplica, blk_1073746762_5938, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746762 for deletion 2025-07-13 03:22:00,743 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746762_5938 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746762 2025-07-13 03:25:02,348 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746765_5941 src: /192.168.158.5:33994 dest: /192.168.158.4:9866 2025-07-13 03:25:02,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33994, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_834297350_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746765_5941, duration(ns): 13897845 2025-07-13 03:25:02,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746765_5941, 
type=LAST_IN_PIPELINE terminating 2025-07-13 03:25:06,747 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746765_5941 replica FinalizedReplica, blk_1073746765_5941, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746765 for deletion 2025-07-13 03:25:06,748 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746765_5941 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746765 2025-07-13 03:34:12,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746774_5950 src: /192.168.158.1:40524 dest: /192.168.158.4:9866 2025-07-13 03:34:12,387 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40524, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-758946950_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746774_5950, duration(ns): 21489357 2025-07-13 03:34:12,387 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746774_5950, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-13 03:34:15,768 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746774_5950 replica FinalizedReplica, blk_1073746774_5950, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746774 for deletion 2025-07-13 03:34:15,770 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746774_5950 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746774 2025-07-13 03:38:17,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746778_5954 src: /192.168.158.9:37430 dest: /192.168.158.4:9866 2025-07-13 03:38:17,390 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37430, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-255692879_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746778_5954, duration(ns): 18393824 2025-07-13 03:38:17,390 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746778_5954, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-13 03:38:21,779 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746778_5954 replica FinalizedReplica, blk_1073746778_5954, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746778 for deletion 2025-07-13 03:38:21,780 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746778_5954 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746778 
2025-07-13 03:39:17,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746779_5955 src: /192.168.158.5:33742 dest: /192.168.158.4:9866 2025-07-13 03:39:17,392 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33742, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1551991089_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746779_5955, duration(ns): 14856888 2025-07-13 03:39:17,392 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746779_5955, type=LAST_IN_PIPELINE terminating 2025-07-13 03:39:21,780 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746779_5955 replica FinalizedReplica, blk_1073746779_5955, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746779 for deletion 2025-07-13 03:39:21,782 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746779_5955 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746779 2025-07-13 03:41:22,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746781_5957 src: /192.168.158.5:42156 dest: /192.168.158.4:9866 2025-07-13 03:41:22,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42156, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2024421673_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073746781_5957, duration(ns): 20629069 2025-07-13 03:41:22,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746781_5957, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-13 03:41:30,788 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746781_5957 replica FinalizedReplica, blk_1073746781_5957, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746781 for deletion 2025-07-13 03:41:30,789 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746781_5957 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746781 2025-07-13 03:42:27,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746782_5958 src: /192.168.158.1:59554 dest: /192.168.158.4:9866 2025-07-13 03:42:27,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59554, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-128471547_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746782_5958, duration(ns): 22506477 2025-07-13 03:42:27,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746782_5958, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-13 03:42:33,790 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746782_5958 replica FinalizedReplica, blk_1073746782_5958, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746782 for deletion 2025-07-13 03:42:33,792 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746782_5958 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746782 2025-07-13 03:44:32,377 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746784_5960 src: /192.168.158.5:53090 dest: /192.168.158.4:9866 2025-07-13 03:44:32,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53090, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1541846975_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746784_5960, duration(ns): 15577976 2025-07-13 03:44:32,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746784_5960, type=LAST_IN_PIPELINE terminating 2025-07-13 03:44:36,794 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746784_5960 replica FinalizedReplica, blk_1073746784_5960, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746784 for deletion 2025-07-13 03:44:36,795 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746784_5960 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746784 2025-07-13 03:45:37,380 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746785_5961 src: /192.168.158.5:38720 dest: /192.168.158.4:9866 2025-07-13 03:45:37,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38720, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-549297794_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746785_5961, duration(ns): 17543909 2025-07-13 03:45:37,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746785_5961, type=LAST_IN_PIPELINE terminating 2025-07-13 03:45:45,795 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746785_5961 replica FinalizedReplica, blk_1073746785_5961, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746785 for deletion 2025-07-13 03:45:45,796 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746785_5961 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746785 2025-07-13 03:47:42,374 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746787_5963 src: /192.168.158.6:45414 dest: /192.168.158.4:9866 2025-07-13 
03:47:42,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45414, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-663817244_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746787_5963, duration(ns): 18099377 2025-07-13 03:47:42,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746787_5963, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-13 03:47:45,799 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746787_5963 replica FinalizedReplica, blk_1073746787_5963, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746787 for deletion 2025-07-13 03:47:45,800 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746787_5963 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746787 2025-07-13 03:50:52,389 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746790_5966 src: /192.168.158.1:42722 dest: /192.168.158.4:9866 2025-07-13 03:50:52,421 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42722, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1366160002_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746790_5966, duration(ns): 22591092 2025-07-13 03:50:52,421 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746790_5966, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-13 03:51:00,805 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746790_5966 replica FinalizedReplica, blk_1073746790_5966, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746790 for deletion 2025-07-13 03:51:00,806 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746790_5966 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746790 2025-07-13 03:51:57,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746791_5967 src: /192.168.158.6:59994 dest: /192.168.158.4:9866 2025-07-13 03:51:57,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59994, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-155725848_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746791_5967, duration(ns): 15583926 2025-07-13 03:51:57,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746791_5967, type=LAST_IN_PIPELINE terminating 2025-07-13 03:52:00,808 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746791_5967 replica FinalizedReplica, blk_1073746791_5967, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746791 for deletion 2025-07-13 03:52:00,809 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746791_5967 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746791 2025-07-13 03:53:02,376 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746792_5968 src: /192.168.158.1:38380 dest: /192.168.158.4:9866 2025-07-13 03:53:02,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38380, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1696046344_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746792_5968, duration(ns): 21485127 2025-07-13 03:53:02,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746792_5968, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-13 03:53:09,811 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746792_5968 replica FinalizedReplica, blk_1073746792_5968, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746792 for deletion 2025-07-13 03:53:09,812 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746792_5968 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746792 2025-07-13 03:57:17,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746796_5972 src: /192.168.158.9:46476 dest: /192.168.158.4:9866 2025-07-13 03:57:17,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46476, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_631958777_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746796_5972, duration(ns): 14939993 2025-07-13 03:57:17,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746796_5972, type=LAST_IN_PIPELINE terminating 2025-07-13 03:57:21,819 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746796_5972 replica FinalizedReplica, blk_1073746796_5972, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746796 for deletion 2025-07-13 03:57:21,821 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746796_5972 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746796 2025-07-13 03:58:17,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746797_5973 src: /192.168.158.5:54004 dest: /192.168.158.4:9866 2025-07-13 03:58:17,408 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54004, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-608719432_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746797_5973, duration(ns): 16981642 2025-07-13 03:58:17,408 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746797_5973, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-13 03:58:21,820 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746797_5973 replica FinalizedReplica, blk_1073746797_5973, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746797 for deletion 2025-07-13 03:58:21,821 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746797_5973 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746797 2025-07-13 04:05:32,431 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746804_5980 src: /192.168.158.1:41126 dest: /192.168.158.4:9866 2025-07-13 04:05:32,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41126, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_49565104_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746804_5980, duration(ns): 24469703 2025-07-13 04:05:32,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746804_5980, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 
192.168.158.5:9866] terminating
2025-07-13 04:05:39,835 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746804_5980 replica FinalizedReplica, blk_1073746804_5980, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746804 for deletion
2025-07-13 04:05:39,837 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746804_5980 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746804
2025-07-13 04:07:32,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746806_5982 src: /192.168.158.1:56004 dest: /192.168.158.4:9866
2025-07-13 04:07:32,428 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56004, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2074314019_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746806_5982, duration(ns): 22452434
2025-07-13 04:07:32,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746806_5982, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-13 04:07:36,841 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746806_5982 replica FinalizedReplica, blk_1073746806_5982, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746806 for deletion
2025-07-13 04:07:36,842 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746806_5982 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746806
2025-07-13 04:08:37,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746807_5983 src: /192.168.158.6:42748 dest: /192.168.158.4:9866
2025-07-13 04:08:37,425 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42748, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1056033319_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746807_5983, duration(ns): 18366431
2025-07-13 04:08:37,425 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746807_5983, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-13 04:08:42,843 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746807_5983 replica FinalizedReplica, blk_1073746807_5983, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746807 for deletion
2025-07-13 04:08:42,844 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746807_5983 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746807
2025-07-13 04:10:42,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746809_5985 src: /192.168.158.1:57688 dest: /192.168.158.4:9866
2025-07-13 04:10:42,430 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57688, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_590071269_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746809_5985, duration(ns): 23344046
2025-07-13 04:10:42,430 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746809_5985, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-13 04:10:45,848 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746809_5985 replica FinalizedReplica, blk_1073746809_5985, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746809 for deletion
2025-07-13 04:10:45,849 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746809_5985 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746809
2025-07-13 04:14:42,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746813_5989 src: /192.168.158.1:35170 dest: /192.168.158.4:9866
2025-07-13 04:14:42,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35170, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_355768788_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746813_5989, duration(ns): 23865661
2025-07-13 04:14:42,471 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746813_5989, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-13 04:14:45,855 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746813_5989 replica FinalizedReplica, blk_1073746813_5989, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746813 for deletion
2025-07-13 04:14:45,856 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746813_5989 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746813
2025-07-13 04:20:52,408 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746819_5995 src: /192.168.158.1:46272 dest: /192.168.158.4:9866
2025-07-13 04:20:52,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46272, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1176340933_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746819_5995, duration(ns): 22095542
2025-07-13 04:20:52,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746819_5995, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-13 04:20:57,867 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746819_5995 replica FinalizedReplica, blk_1073746819_5995, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746819 for deletion
2025-07-13 04:20:57,868 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746819_5995 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746819
2025-07-13 04:23:57,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746822_5998 src: /192.168.158.7:48960 dest: /192.168.158.4:9866
2025-07-13 04:23:57,434 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48960, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_601278612_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746822_5998, duration(ns): 15976328
2025-07-13 04:23:57,434 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746822_5998, type=LAST_IN_PIPELINE terminating
2025-07-13 04:24:03,875 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746822_5998 replica FinalizedReplica, blk_1073746822_5998, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746822 for deletion
2025-07-13 04:24:03,876 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746822_5998 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746822
2025-07-13 04:24:57,412 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746823_5999 src: /192.168.158.1:46180 dest: /192.168.158.4:9866
2025-07-13 04:24:57,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46180, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1563808957_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746823_5999, duration(ns): 23328104
2025-07-13 04:24:57,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746823_5999, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-13 04:25:03,878 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746823_5999 replica FinalizedReplica, blk_1073746823_5999, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746823 for deletion
2025-07-13 04:25:03,879 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746823_5999 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746823
2025-07-13 04:28:12,419 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746826_6002 src: /192.168.158.6:41384 dest: /192.168.158.4:9866
2025-07-13 04:28:12,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41384, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1289489509_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746826_6002, duration(ns): 17930903
2025-07-13 04:28:12,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746826_6002, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-13 04:28:18,883 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746826_6002 replica FinalizedReplica, blk_1073746826_6002, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746826 for deletion
2025-07-13 04:28:18,884 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746826_6002 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746826
2025-07-13 04:29:12,417 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746827_6003 src: /192.168.158.1:43800 dest: /192.168.158.4:9866
2025-07-13 04:29:12,446 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43800, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1170863252_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746827_6003, duration(ns): 21169077
2025-07-13 04:29:12,447 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746827_6003, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-13 04:29:15,886 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746827_6003 replica FinalizedReplica, blk_1073746827_6003, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746827 for deletion
2025-07-13 04:29:15,887 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746827_6003 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746827
2025-07-13 04:30:12,417 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746828_6004 src: /192.168.158.9:46226 dest: /192.168.158.4:9866
2025-07-13 04:30:12,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46226, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-705910732_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746828_6004, duration(ns): 21290940
2025-07-13 04:30:12,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746828_6004, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 04:30:18,891 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746828_6004 replica FinalizedReplica, blk_1073746828_6004, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746828 for deletion
2025-07-13 04:30:18,892 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746828_6004 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746828
2025-07-13 04:36:27,433 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746834_6010 src: /192.168.158.9:45070 dest: /192.168.158.4:9866
2025-07-13 04:36:27,457 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45070, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-164873768_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746834_6010, duration(ns): 18941592
2025-07-13 04:36:27,458 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746834_6010, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 04:36:30,901 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746834_6010 replica FinalizedReplica, blk_1073746834_6010, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746834 for deletion
2025-07-13 04:36:30,903 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746834_6010 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746834
2025-07-13 04:37:27,433 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746835_6011 src: /192.168.158.7:33194 dest: /192.168.158.4:9866
2025-07-13 04:37:27,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33194, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1841394271_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746835_6011, duration(ns): 14790600
2025-07-13 04:37:27,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746835_6011, type=LAST_IN_PIPELINE terminating
2025-07-13 04:37:30,902 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746835_6011 replica FinalizedReplica, blk_1073746835_6011, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746835 for deletion
2025-07-13 04:37:30,904 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746835_6011 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746835
2025-07-13 04:39:27,435 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746837_6013 src: /192.168.158.1:45166 dest: /192.168.158.4:9866
2025-07-13 04:39:27,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45166, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-769483348_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746837_6013, duration(ns): 23158367
2025-07-13 04:39:27,468 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746837_6013, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-13 04:39:30,911 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746837_6013 replica FinalizedReplica, blk_1073746837_6013, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746837 for deletion
2025-07-13 04:39:30,912 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746837_6013 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746837
2025-07-13 04:41:27,437 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746839_6015 src: /192.168.158.1:56596 dest: /192.168.158.4:9866
2025-07-13 04:41:27,468 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56596, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-625612206_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746839_6015, duration(ns): 22873887
2025-07-13 04:41:27,468 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746839_6015, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-13 04:41:30,916 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746839_6015 replica FinalizedReplica, blk_1073746839_6015, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746839 for deletion
2025-07-13 04:41:30,917 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746839_6015 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746839
2025-07-13 04:42:32,441 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746840_6016 src: /192.168.158.5:56452 dest: /192.168.158.4:9866
2025-07-13 04:42:32,460 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56452, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1983052743_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746840_6016, duration(ns): 17054792
2025-07-13 04:42:32,460 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746840_6016, type=LAST_IN_PIPELINE terminating
2025-07-13 04:42:39,917 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746840_6016 replica FinalizedReplica, blk_1073746840_6016, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746840 for deletion
2025-07-13 04:42:39,919 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746840_6016 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746840
2025-07-13 04:46:42,433 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746844_6020 src: /192.168.158.7:55518 dest: /192.168.158.4:9866
2025-07-13 04:46:42,455 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55518, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-496606963_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746844_6020, duration(ns): 17032948
2025-07-13 04:46:42,456 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746844_6020, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-13 04:46:45,923 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746844_6020 replica FinalizedReplica, blk_1073746844_6020, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746844 for deletion
2025-07-13 04:46:45,924 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746844_6020 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746844
2025-07-13 04:48:42,457 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746846_6022 src: /192.168.158.9:48826 dest: /192.168.158.4:9866
2025-07-13 04:48:42,474 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48826, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_186297962_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746846_6022, duration(ns): 15065819
2025-07-13 04:48:42,475 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746846_6022, type=LAST_IN_PIPELINE terminating
2025-07-13 04:48:45,926 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746846_6022 replica FinalizedReplica, blk_1073746846_6022, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746846 for deletion
2025-07-13 04:48:45,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746846_6022 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746846
2025-07-13 04:50:52,454 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746848_6024 src: /192.168.158.9:45988 dest: /192.168.158.4:9866
2025-07-13 04:50:52,472 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45988, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1535603631_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746848_6024, duration(ns): 16475292
2025-07-13 04:50:52,472 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746848_6024, type=LAST_IN_PIPELINE terminating
2025-07-13 04:50:57,936 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746848_6024 replica FinalizedReplica, blk_1073746848_6024, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746848 for deletion
2025-07-13 04:50:57,937 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746848_6024 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746848
2025-07-13 04:57:02,460 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746854_6030 src: /192.168.158.5:55924 dest: /192.168.158.4:9866
2025-07-13 04:57:02,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55924, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1032997848_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746854_6030, duration(ns): 19904350
2025-07-13 04:57:02,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746854_6030, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 04:57:06,955 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746854_6030 replica FinalizedReplica, blk_1073746854_6030, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746854 for deletion
2025-07-13 04:57:06,956 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746854_6030 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746854
2025-07-13 04:58:02,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746855_6031 src: /192.168.158.6:45132 dest: /192.168.158.4:9866
2025-07-13 04:58:02,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45132, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1740326618_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746855_6031, duration(ns): 16357018
2025-07-13 04:58:02,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746855_6031, type=LAST_IN_PIPELINE terminating
2025-07-13 04:58:09,957 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746855_6031 replica FinalizedReplica, blk_1073746855_6031, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746855 for deletion
2025-07-13 04:58:09,959 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746855_6031 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746855
2025-07-13 04:59:02,463 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746856_6032 src: /192.168.158.5:35890 dest: /192.168.158.4:9866
2025-07-13 04:59:02,489 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35890, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1848094061_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746856_6032, duration(ns): 20341502
2025-07-13 04:59:02,489 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746856_6032, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-13 04:59:06,958 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746856_6032 replica FinalizedReplica, blk_1073746856_6032, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746856 for deletion
2025-07-13 04:59:06,959 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746856_6032 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746856
2025-07-13 05:00:07,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746857_6033 src: /192.168.158.6:41222 dest: /192.168.158.4:9866
2025-07-13 05:00:07,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41222, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1852634582_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746857_6033, duration(ns): 15281196
2025-07-13 05:00:07,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746857_6033, type=LAST_IN_PIPELINE terminating
2025-07-13 05:00:12,960 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746857_6033 replica FinalizedReplica, blk_1073746857_6033, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746857 for deletion
2025-07-13 05:00:12,961 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746857_6033 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746857
2025-07-13 05:02:17,469 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746859_6035 src: /192.168.158.1:35000 dest: /192.168.158.4:9866
2025-07-13 05:02:17,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35000, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_737038547_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746859_6035, duration(ns): 24809428
2025-07-13 05:02:17,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746859_6035, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-13 05:02:21,967 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746859_6035 replica FinalizedReplica, blk_1073746859_6035, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746859 for deletion
2025-07-13 05:02:21,968 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746859_6035 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746859
2025-07-13 05:03:17,474 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746860_6036 src: /192.168.158.9:34996 dest: /192.168.158.4:9866
2025-07-13 05:03:17,499 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34996, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1784656117_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746860_6036, duration(ns): 19245281
2025-07-13 05:03:17,499 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746860_6036, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 05:03:24,967 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746860_6036 replica FinalizedReplica, blk_1073746860_6036, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746860 for deletion
2025-07-13 05:03:24,968 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746860_6036 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746860
2025-07-13 05:04:17,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746861_6037 src: /192.168.158.8:59474 dest: /192.168.158.4:9866
2025-07-13 05:04:17,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59474, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1066844102_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746861_6037, duration(ns): 16235863
2025-07-13 05:04:17,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746861_6037, type=LAST_IN_PIPELINE terminating
2025-07-13 05:04:21,969 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746861_6037 replica FinalizedReplica, blk_1073746861_6037, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746861 for deletion
2025-07-13 05:04:21,970 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746861_6037 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746861
2025-07-13 05:09:17,484 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746866_6042 src: /192.168.158.1:58110 dest: /192.168.158.4:9866
2025-07-13 05:09:17,515 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58110, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2068823109_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746866_6042, duration(ns): 22703645
2025-07-13 05:09:17,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746866_6042, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-13 05:09:21,976 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746866_6042 replica FinalizedReplica, blk_1073746866_6042, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746866 for deletion
2025-07-13 05:09:21,977 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746866_6042 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746866
2025-07-13 05:11:17,487 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746868_6044 src: /192.168.158.8:48290 dest: /192.168.158.4:9866
2025-07-13 05:11:17,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.8:48290, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1262307175_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746868_6044, duration(ns): 14080208 2025-07-13 05:11:17,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746868_6044, type=LAST_IN_PIPELINE terminating 2025-07-13 05:11:21,979 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746868_6044 replica FinalizedReplica, blk_1073746868_6044, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746868 for deletion 2025-07-13 05:11:21,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746868_6044 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746868 2025-07-13 05:12:22,484 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746869_6045 src: /192.168.158.1:59648 dest: /192.168.158.4:9866 2025-07-13 05:12:22,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59648, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1048491130_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746869_6045, duration(ns): 22957875 2025-07-13 05:12:22,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746869_6045, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-13 05:12:27,983 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746869_6045 replica FinalizedReplica, blk_1073746869_6045, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746869 for deletion 2025-07-13 05:12:27,984 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746869_6045 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746869 2025-07-13 05:15:27,490 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746872_6048 src: /192.168.158.1:41640 dest: /192.168.158.4:9866 2025-07-13 05:15:27,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41640, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_411083773_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746872_6048, duration(ns): 21617639 2025-07-13 05:15:27,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746872_6048, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-13 05:15:30,992 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746872_6048 replica FinalizedReplica, blk_1073746872_6048, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746872 for deletion 2025-07-13 05:15:30,993 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746872_6048 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746872 2025-07-13 05:22:42,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746879_6055 src: /192.168.158.1:40562 dest: /192.168.158.4:9866 2025-07-13 05:22:42,536 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40562, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2034364842_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746879_6055, duration(ns): 24096149 2025-07-13 05:22:42,536 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746879_6055, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-13 05:22:46,009 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746879_6055 replica FinalizedReplica, blk_1073746879_6055, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746879 for deletion 2025-07-13 05:22:46,011 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746879_6055 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746879 2025-07-13 05:26:47,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746883_6059 src: /192.168.158.8:38712 dest: /192.168.158.4:9866 2025-07-13 05:26:47,553 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1533292734_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746883_6059, duration(ns): 17470222 2025-07-13 05:26:47,553 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746883_6059, type=LAST_IN_PIPELINE terminating 2025-07-13 05:26:52,021 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746883_6059 replica FinalizedReplica, blk_1073746883_6059, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746883 for deletion 2025-07-13 05:26:52,022 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746883_6059 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746883 2025-07-13 05:27:47,536 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746884_6060 src: /192.168.158.7:37866 dest: /192.168.158.4:9866 2025-07-13 05:27:47,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37866, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1302269580_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746884_6060, duration(ns): 15854784 2025-07-13 05:27:47,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746884_6060, type=LAST_IN_PIPELINE terminating 2025-07-13 05:27:55,025 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746884_6060 replica FinalizedReplica, blk_1073746884_6060, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746884 for deletion 2025-07-13 05:27:55,026 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746884_6060 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746884 2025-07-13 05:28:47,535 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746885_6061 src: /192.168.158.8:46842 dest: /192.168.158.4:9866 2025-07-13 05:28:47,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46842, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-834888610_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746885_6061, duration(ns): 22482303 2025-07-13 05:28:47,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746885_6061, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-13 05:28:52,025 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746885_6061 replica FinalizedReplica, blk_1073746885_6061, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746885 for deletion 2025-07-13 05:28:52,026 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746885_6061 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746885 2025-07-13 05:30:52,531 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746887_6063 src: /192.168.158.1:47734 dest: /192.168.158.4:9866 2025-07-13 05:30:52,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47734, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_549453099_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746887_6063, duration(ns): 21300348 2025-07-13 05:30:52,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746887_6063, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-13 05:30:58,030 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746887_6063 replica FinalizedReplica, blk_1073746887_6063, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746887 for deletion 2025-07-13 
05:30:58,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746887_6063 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746887 2025-07-13 05:35:02,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746891_6067 src: /192.168.158.6:50146 dest: /192.168.158.4:9866 2025-07-13 05:35:02,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50146, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1232317641_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746891_6067, duration(ns): 18625757 2025-07-13 05:35:02,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746891_6067, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-13 05:35:07,041 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746891_6067 replica FinalizedReplica, blk_1073746891_6067, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746891 for deletion 2025-07-13 05:35:07,042 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746891_6067 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746891 2025-07-13 05:36:13,268 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total 
blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0 2025-07-13 05:37:02,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746893_6069 src: /192.168.158.8:34116 dest: /192.168.158.4:9866 2025-07-13 05:37:02,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34116, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1349970802_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746893_6069, duration(ns): 18994289 2025-07-13 05:37:02,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746893_6069, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-13 05:37:07,044 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746893_6069 replica FinalizedReplica, blk_1073746893_6069, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746893 for deletion 2025-07-13 05:37:07,045 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746893_6069 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746893 2025-07-13 05:37:19,048 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f33, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 1 msec to generate and 4 msecs for RPC and NN processing. 
Got back one command: FinalizeCommand/5. 2025-07-13 05:37:19,048 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360 2025-07-13 05:39:02,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746895_6071 src: /192.168.158.7:48138 dest: /192.168.158.4:9866 2025-07-13 05:39:02,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48138, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_718957066_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746895_6071, duration(ns): 15273545 2025-07-13 05:39:02,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746895_6071, type=LAST_IN_PIPELINE terminating 2025-07-13 05:39:10,050 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746895_6071 replica FinalizedReplica, blk_1073746895_6071, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746895 for deletion 2025-07-13 05:39:10,051 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746895_6071 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746895 2025-07-13 05:41:07,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746897_6073 src: /192.168.158.1:51254 dest: /192.168.158.4:9866 2025-07-13 05:41:07,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: 
src: /192.168.158.1:51254, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-708965998_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746897_6073, duration(ns): 21960473 2025-07-13 05:41:07,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746897_6073, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-13 05:41:16,054 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746897_6073 replica FinalizedReplica, blk_1073746897_6073, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746897 for deletion 2025-07-13 05:41:16,055 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746897_6073 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746897 2025-07-13 05:44:12,544 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746900_6076 src: /192.168.158.1:52902 dest: /192.168.158.4:9866 2025-07-13 05:44:12,574 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52902, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2041227273_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746900_6076, duration(ns): 21334840 2025-07-13 05:44:12,574 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073746900_6076, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-13 05:44:19,063 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746900_6076 replica FinalizedReplica, blk_1073746900_6076, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746900 for deletion 2025-07-13 05:44:19,065 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746900_6076 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746900 2025-07-13 05:46:12,557 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746902_6078 src: /192.168.158.1:33324 dest: /192.168.158.4:9866 2025-07-13 05:46:12,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33324, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1879338420_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746902_6078, duration(ns): 23518426 2025-07-13 05:46:12,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746902_6078, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-13 05:46:16,069 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746902_6078 replica FinalizedReplica, blk_1073746902_6078, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 
getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746902 for deletion 2025-07-13 05:46:16,070 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746902_6078 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746902 2025-07-13 05:50:22,570 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746906_6082 src: /192.168.158.8:44250 dest: /192.168.158.4:9866 2025-07-13 05:50:22,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44250, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1143340181_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746906_6082, duration(ns): 13522913 2025-07-13 05:50:22,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746906_6082, type=LAST_IN_PIPELINE terminating 2025-07-13 05:50:28,081 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746906_6082 replica FinalizedReplica, blk_1073746906_6082, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746906 for deletion 2025-07-13 05:50:28,082 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746906_6082 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746906 
2025-07-13 05:54:37,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746910_6086 src: /192.168.158.9:48654 dest: /192.168.158.4:9866 2025-07-13 05:54:37,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48654, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1437097740_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746910_6086, duration(ns): 16334581 2025-07-13 05:54:37,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746910_6086, type=LAST_IN_PIPELINE terminating 2025-07-13 05:54:43,091 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746910_6086 replica FinalizedReplica, blk_1073746910_6086, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746910 for deletion 2025-07-13 05:54:43,092 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746910_6086 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746910 2025-07-13 05:55:37,557 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746911_6087 src: /192.168.158.8:38880 dest: /192.168.158.4:9866 2025-07-13 05:55:37,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38880, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-637012094_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073746911_6087, duration(ns): 19628731 2025-07-13 05:55:37,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746911_6087, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-13 05:55:43,096 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746911_6087 replica FinalizedReplica, blk_1073746911_6087, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746911 for deletion 2025-07-13 05:55:43,097 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746911_6087 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746911 2025-07-13 05:59:37,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746915_6091 src: /192.168.158.7:57796 dest: /192.168.158.4:9866 2025-07-13 05:59:37,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57796, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1124665579_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746915_6091, duration(ns): 17653432 2025-07-13 05:59:37,591 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746915_6091, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-13 05:59:46,108 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073746915_6091 replica FinalizedReplica, blk_1073746915_6091, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746915 for deletion 2025-07-13 05:59:46,109 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746915_6091 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746915 2025-07-13 06:01:37,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746917_6093 src: /192.168.158.6:53742 dest: /192.168.158.4:9866 2025-07-13 06:01:37,609 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53742, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1980396588_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746917_6093, duration(ns): 19870585 2025-07-13 06:01:37,609 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746917_6093, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-13 06:01:43,110 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746917_6093 replica FinalizedReplica, blk_1073746917_6093, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746917 for deletion 2025-07-13 06:01:43,111 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746917_6093 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746917 2025-07-13 06:02:42,577 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746918_6094 src: /192.168.158.5:35514 dest: /192.168.158.4:9866 2025-07-13 06:02:42,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35514, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-351012220_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746918_6094, duration(ns): 13148444 2025-07-13 06:02:42,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746918_6094, type=LAST_IN_PIPELINE terminating 2025-07-13 06:02:46,113 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746918_6094 replica FinalizedReplica, blk_1073746918_6094, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746918 for deletion 2025-07-13 06:02:46,114 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746918_6094 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746918 2025-07-13 06:03:42,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746919_6095 src: /192.168.158.1:49708 dest: /192.168.158.4:9866 2025-07-13 06:03:42,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: 
src: /192.168.158.1:49708, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1461467267_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746919_6095, duration(ns): 20424378 2025-07-13 06:03:42,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746919_6095, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-13 06:03:49,115 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746919_6095 replica FinalizedReplica, blk_1073746919_6095, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746919 for deletion 2025-07-13 06:03:49,116 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746919_6095 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746919 2025-07-13 06:04:42,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746920_6096 src: /192.168.158.7:50492 dest: /192.168.158.4:9866 2025-07-13 06:04:42,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50492, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1926396244_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746920_6096, duration(ns): 18646991 2025-07-13 06:04:42,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073746920_6096, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-13 06:04:49,119 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746920_6096 replica FinalizedReplica, blk_1073746920_6096, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746920 for deletion 2025-07-13 06:04:49,120 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746920_6096 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746920 2025-07-13 06:07:42,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746923_6099 src: /192.168.158.1:50460 dest: /192.168.158.4:9866 2025-07-13 06:07:42,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50460, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1930535643_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746923_6099, duration(ns): 23697863 2025-07-13 06:07:42,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746923_6099, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-13 06:07:46,125 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746923_6099 replica FinalizedReplica, blk_1073746923_6099, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746923 for deletion 2025-07-13 06:07:46,126 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746923_6099 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746923 2025-07-13 06:09:47,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746925_6101 src: /192.168.158.6:54728 dest: /192.168.158.4:9866 2025-07-13 06:09:47,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54728, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1409548358_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746925_6101, duration(ns): 20419956 2025-07-13 06:09:47,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746925_6101, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-13 06:09:55,128 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746925_6101 replica FinalizedReplica, blk_1073746925_6101, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746925 for deletion 2025-07-13 06:09:55,129 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746925_6101 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746925 2025-07-13 06:11:52,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746927_6103 src: /192.168.158.1:41672 dest: /192.168.158.4:9866 2025-07-13 06:11:52,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-476330438_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746927_6103, duration(ns): 25395306 2025-07-13 06:11:52,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746927_6103, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-13 06:12:01,134 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746927_6103 replica FinalizedReplica, blk_1073746927_6103, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746927 for deletion 2025-07-13 06:12:01,135 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746927_6103 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746927 2025-07-13 06:12:52,591 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746928_6104 src: /192.168.158.8:43912 dest: /192.168.158.4:9866 2025-07-13 06:12:52,610 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.8:43912, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1568033775_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746928_6104, duration(ns): 13696183 2025-07-13 06:12:52,611 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746928_6104, type=LAST_IN_PIPELINE terminating 2025-07-13 06:13:01,138 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746928_6104 replica FinalizedReplica, blk_1073746928_6104, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746928 for deletion 2025-07-13 06:13:01,139 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746928_6104 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746928 2025-07-13 06:13:52,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746929_6105 src: /192.168.158.1:48056 dest: /192.168.158.4:9866 2025-07-13 06:13:52,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48056, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_835787734_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746929_6105, duration(ns): 25760571 2025-07-13 06:13:52,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746929_6105, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-13 06:14:01,142 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746929_6105 replica FinalizedReplica, blk_1073746929_6105, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746929 for deletion 2025-07-13 06:14:01,143 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746929_6105 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746929 2025-07-13 06:14:52,586 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746930_6106 src: /192.168.158.7:59588 dest: /192.168.158.4:9866 2025-07-13 06:14:52,611 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59588, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1044105010_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746930_6106, duration(ns): 19863192 2025-07-13 06:14:52,611 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746930_6106, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-13 06:15:01,145 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746930_6106 replica FinalizedReplica, blk_1073746930_6106, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746930 for deletion 2025-07-13 06:15:01,146 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746930_6106 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746930 2025-07-13 06:16:57,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746932_6108 src: /192.168.158.8:37284 dest: /192.168.158.4:9866 2025-07-13 06:16:57,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37284, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2077488171_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746932_6108, duration(ns): 17670967 2025-07-13 06:16:57,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746932_6108, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-13 06:17:01,146 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746932_6108 replica FinalizedReplica, blk_1073746932_6108, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746932 for deletion 2025-07-13 06:17:01,147 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746932_6108 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746932 
2025-07-13 06:21:02,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746936_6112 src: /192.168.158.1:37172 dest: /192.168.158.4:9866 2025-07-13 06:21:02,632 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37172, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2030957301_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746936_6112, duration(ns): 22002216 2025-07-13 06:21:02,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746936_6112, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-13 06:21:07,157 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746936_6112 replica FinalizedReplica, blk_1073746936_6112, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746936 for deletion 2025-07-13 06:21:07,158 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746936_6112 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746936 2025-07-13 06:22:02,617 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746937_6113 src: /192.168.158.5:45918 dest: /192.168.158.4:9866 2025-07-13 06:22:02,642 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45918, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_149557382_106, 
offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746937_6113, duration(ns): 20039711 2025-07-13 06:22:02,642 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746937_6113, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-13 06:22:07,161 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746937_6113 replica FinalizedReplica, blk_1073746937_6113, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746937 for deletion 2025-07-13 06:22:07,162 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746937_6113 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746937 2025-07-13 06:23:02,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746938_6114 src: /192.168.158.1:60054 dest: /192.168.158.4:9866 2025-07-13 06:23:02,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60054, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_453244689_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746938_6114, duration(ns): 22239894 2025-07-13 06:23:02,654 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746938_6114, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-13 06:23:07,165 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746938_6114 replica FinalizedReplica, blk_1073746938_6114, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746938 for deletion 2025-07-13 06:23:07,166 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746938_6114 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746938 2025-07-13 06:24:02,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746939_6115 src: /192.168.158.1:36242 dest: /192.168.158.4:9866 2025-07-13 06:24:02,632 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36242, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2106510900_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746939_6115, duration(ns): 21040022 2025-07-13 06:24:02,632 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746939_6115, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-13 06:24:07,168 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746939_6115 replica FinalizedReplica, blk_1073746939_6115, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746939 for deletion 2025-07-13 
06:24:07,169 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746939_6115 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746939 2025-07-13 06:27:02,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746942_6118 src: /192.168.158.5:40018 dest: /192.168.158.4:9866 2025-07-13 06:27:02,646 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40018, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1260567536_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746942_6118, duration(ns): 19152817 2025-07-13 06:27:02,646 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746942_6118, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-13 06:27:10,179 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746942_6118 replica FinalizedReplica, blk_1073746942_6118, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746942 for deletion 2025-07-13 06:27:10,180 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746942_6118 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746942 2025-07-13 06:28:07,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746943_6119 
src: /192.168.158.6:35986 dest: /192.168.158.4:9866 2025-07-13 06:28:07,640 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35986, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_989452871_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746943_6119, duration(ns): 19031674 2025-07-13 06:28:07,640 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746943_6119, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-13 06:28:16,182 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746943_6119 replica FinalizedReplica, blk_1073746943_6119, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746943 for deletion 2025-07-13 06:28:16,183 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746943_6119 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073746943 2025-07-13 06:30:12,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746945_6121 src: /192.168.158.6:52456 dest: /192.168.158.4:9866 2025-07-13 06:30:12,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52456, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1308757252_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746945_6121, duration(ns): 15065070 2025-07-13 06:30:12,644 
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746945_6121, type=LAST_IN_PIPELINE terminating 2025-07-13 06:30:19,184 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746945_6121 replica FinalizedReplica, blk_1073746945_6121, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746945 for deletion 2025-07-13 06:30:19,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746945_6121 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746945 2025-07-13 06:31:17,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746946_6122 src: /192.168.158.7:58120 dest: /192.168.158.4:9866 2025-07-13 06:31:17,641 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58120, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1023147168_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746946_6122, duration(ns): 19803924 2025-07-13 06:31:17,641 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746946_6122, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-13 06:31:25,187 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746946_6122 replica FinalizedReplica, blk_1073746946_6122, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746946 for deletion 2025-07-13 06:31:25,188 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746946_6122 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746946 2025-07-13 06:35:22,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746950_6126 src: /192.168.158.7:41204 dest: /192.168.158.4:9866 2025-07-13 06:35:22,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41204, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1675662541_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746950_6126, duration(ns): 18361632 2025-07-13 06:35:22,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746950_6126, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-13 06:35:25,199 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746950_6126 replica FinalizedReplica, blk_1073746950_6126, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746950 for deletion 2025-07-13 06:35:25,200 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746950_6126 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746950 2025-07-13 06:36:22,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746951_6127 src: /192.168.158.5:44022 dest: /192.168.158.4:9866 2025-07-13 06:36:22,654 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44022, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1623204464_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746951_6127, duration(ns): 19958287 2025-07-13 06:36:22,655 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746951_6127, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-13 06:36:25,203 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746951_6127 replica FinalizedReplica, blk_1073746951_6127, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746951 for deletion 2025-07-13 06:36:25,205 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746951_6127 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746951 2025-07-13 06:39:27,632 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746954_6130 src: /192.168.158.7:37264 dest: /192.168.158.4:9866 2025-07-13 06:39:27,656 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37264, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_538805499_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746954_6130, duration(ns): 18476714 2025-07-13 06:39:27,656 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746954_6130, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-13 06:39:34,209 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746954_6130 replica FinalizedReplica, blk_1073746954_6130, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746954 for deletion 2025-07-13 06:39:34,210 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746954_6130 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746954 2025-07-13 06:40:27,638 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746955_6131 src: /192.168.158.9:41540 dest: /192.168.158.4:9866 2025-07-13 06:40:27,656 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41540, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1053930904_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746955_6131, duration(ns): 15292364 2025-07-13 06:40:27,656 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746955_6131, type=LAST_IN_PIPELINE terminating 
2025-07-13 06:40:34,213 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746955_6131 replica FinalizedReplica, blk_1073746955_6131, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746955 for deletion
2025-07-13 06:40:34,214 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746955_6131 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746955
2025-07-13 06:41:32,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746956_6132 src: /192.168.158.9:42904 dest: /192.168.158.4:9866
2025-07-13 06:41:32,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42904, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2110128883_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746956_6132, duration(ns): 15771735
2025-07-13 06:41:32,663 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746956_6132, type=LAST_IN_PIPELINE terminating
2025-07-13 06:41:37,215 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746956_6132 replica FinalizedReplica, blk_1073746956_6132, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746956 for deletion
2025-07-13 06:41:37,216 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746956_6132 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746956
2025-07-13 06:43:37,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746958_6134 src: /192.168.158.9:34094 dest: /192.168.158.4:9866
2025-07-13 06:43:37,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34094, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1934822766_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746958_6134, duration(ns): 19348756
2025-07-13 06:43:37,660 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746958_6134, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 06:43:40,218 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746958_6134 replica FinalizedReplica, blk_1073746958_6134, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746958 for deletion
2025-07-13 06:43:40,220 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746958_6134 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746958
2025-07-13 06:46:47,638 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746961_6137 src: /192.168.158.9:55140 dest: /192.168.158.4:9866
2025-07-13 06:46:47,658 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55140, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-583410380_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746961_6137, duration(ns): 17381578
2025-07-13 06:46:47,658 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746961_6137, type=LAST_IN_PIPELINE terminating
2025-07-13 06:46:52,224 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746961_6137 replica FinalizedReplica, blk_1073746961_6137, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746961 for deletion
2025-07-13 06:46:52,225 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746961_6137 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746961
2025-07-13 06:49:52,646 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746964_6140 src: /192.168.158.7:39178 dest: /192.168.158.4:9866
2025-07-13 06:49:52,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39178, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_471327570_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746964_6140, duration(ns): 18330140
2025-07-13 06:49:52,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746964_6140, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-13 06:49:55,229 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746964_6140 replica FinalizedReplica, blk_1073746964_6140, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746964 for deletion
2025-07-13 06:49:55,230 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746964_6140 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746964
2025-07-13 06:50:52,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746965_6141 src: /192.168.158.8:38422 dest: /192.168.158.4:9866
2025-07-13 06:50:52,672 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38422, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1633587173_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746965_6141, duration(ns): 21809280
2025-07-13 06:50:52,673 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746965_6141, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 06:50:55,231 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746965_6141 replica FinalizedReplica, blk_1073746965_6141, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746965 for deletion
2025-07-13 06:50:55,233 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746965_6141 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746965
2025-07-13 06:53:52,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746968_6144 src: /192.168.158.1:58662 dest: /192.168.158.4:9866
2025-07-13 06:53:52,680 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58662, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_605958731_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746968_6144, duration(ns): 23570654
2025-07-13 06:53:52,680 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746968_6144, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-13 06:53:58,239 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746968_6144 replica FinalizedReplica, blk_1073746968_6144, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746968 for deletion
2025-07-13 06:53:58,240 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746968_6144 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746968
2025-07-13 06:54:52,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746969_6145 src: /192.168.158.7:44546 dest: /192.168.158.4:9866
2025-07-13 06:54:52,680 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44546, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1529150996_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746969_6145, duration(ns): 21695824
2025-07-13 06:54:52,680 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746969_6145, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-13 06:54:55,243 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746969_6145 replica FinalizedReplica, blk_1073746969_6145, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746969 for deletion
2025-07-13 06:54:55,244 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746969_6145 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746969
2025-07-13 07:00:07,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746974_6150 src: /192.168.158.1:34496 dest: /192.168.158.4:9866
2025-07-13 07:00:07,713 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34496, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1047320567_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746974_6150, duration(ns): 22424486
2025-07-13 07:00:07,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746974_6150, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-13 07:00:10,257 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746974_6150 replica FinalizedReplica, blk_1073746974_6150, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746974 for deletion
2025-07-13 07:00:10,258 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746974_6150 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746974
2025-07-13 07:01:12,671 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746975_6151 src: /192.168.158.1:47540 dest: /192.168.158.4:9866
2025-07-13 07:01:12,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47540, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-612231029_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746975_6151, duration(ns): 21545436
2025-07-13 07:01:12,703 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746975_6151, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-13 07:01:19,260 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746975_6151 replica FinalizedReplica, blk_1073746975_6151, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746975 for deletion
2025-07-13 07:01:19,261 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746975_6151 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746975
2025-07-13 07:03:12,678 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746977_6153 src: /192.168.158.5:38238 dest: /192.168.158.4:9866
2025-07-13 07:03:12,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38238, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-107842343_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746977_6153, duration(ns): 18929867
2025-07-13 07:03:12,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746977_6153, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-13 07:03:16,267 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746977_6153 replica FinalizedReplica, blk_1073746977_6153, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746977 for deletion
2025-07-13 07:03:16,268 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746977_6153 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746977
2025-07-13 07:04:17,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746978_6154 src: /192.168.158.1:50732 dest: /192.168.158.4:9866
2025-07-13 07:04:17,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50732, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-315626258_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746978_6154, duration(ns): 23380093
2025-07-13 07:04:17,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746978_6154, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-13 07:04:22,270 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746978_6154 replica FinalizedReplica, blk_1073746978_6154, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746978 for deletion
2025-07-13 07:04:22,271 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746978_6154 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746978
2025-07-13 07:09:17,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746983_6159 src: /192.168.158.1:59234 dest: /192.168.158.4:9866
2025-07-13 07:09:17,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59234, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1371229273_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746983_6159, duration(ns): 23477790
2025-07-13 07:09:17,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746983_6159, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-13 07:09:22,278 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746983_6159 replica FinalizedReplica, blk_1073746983_6159, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746983 for deletion
2025-07-13 07:09:22,279 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746983_6159 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746983
2025-07-13 07:10:17,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746984_6160 src: /192.168.158.6:55516 dest: /192.168.158.4:9866
2025-07-13 07:10:17,699 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55516, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1798774508_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746984_6160, duration(ns): 15198577
2025-07-13 07:10:17,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746984_6160, type=LAST_IN_PIPELINE terminating
2025-07-13 07:10:25,279 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746984_6160 replica FinalizedReplica, blk_1073746984_6160, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746984 for deletion
2025-07-13 07:10:25,281 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746984_6160 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746984
2025-07-13 07:11:22,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746985_6161 src: /192.168.158.1:57568 dest: /192.168.158.4:9866
2025-07-13 07:11:22,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57568, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-276606458_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746985_6161, duration(ns): 21708515
2025-07-13 07:11:22,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746985_6161, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-13 07:11:25,279 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746985_6161 replica FinalizedReplica, blk_1073746985_6161, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746985 for deletion
2025-07-13 07:11:25,280 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746985_6161 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746985
2025-07-13 07:15:27,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746989_6165 src: /192.168.158.1:40374 dest: /192.168.158.4:9866
2025-07-13 07:15:27,728 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40374, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2137463372_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746989_6165, duration(ns): 22419116
2025-07-13 07:15:27,728 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746989_6165, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-13 07:15:34,297 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746989_6165 replica FinalizedReplica, blk_1073746989_6165, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746989 for deletion
2025-07-13 07:15:34,298 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746989_6165 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746989
2025-07-13 07:18:32,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746992_6168 src: /192.168.158.9:45646 dest: /192.168.158.4:9866
2025-07-13 07:18:32,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45646, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1772789570_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746992_6168, duration(ns): 20891831
2025-07-13 07:18:32,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746992_6168, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 07:18:40,303 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746992_6168 replica FinalizedReplica, blk_1073746992_6168, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746992 for deletion
2025-07-13 07:18:40,304 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746992_6168 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746992
2025-07-13 07:19:32,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746993_6169 src: /192.168.158.7:50782 dest: /192.168.158.4:9866
2025-07-13 07:19:32,718 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50782, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-816532709_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746993_6169, duration(ns): 16092198
2025-07-13 07:19:32,719 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746993_6169, type=LAST_IN_PIPELINE terminating
2025-07-13 07:19:37,302 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746993_6169 replica FinalizedReplica, blk_1073746993_6169, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746993 for deletion
2025-07-13 07:19:37,304 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746993_6169 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746993
2025-07-13 07:21:32,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073746995_6171 src: /192.168.158.1:56174 dest: /192.168.158.4:9866
2025-07-13 07:21:32,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56174, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2002277126_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073746995_6171, duration(ns): 23217948
2025-07-13 07:21:32,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073746995_6171, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-13 07:21:37,310 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073746995_6171 replica FinalizedReplica, blk_1073746995_6171, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746995 for deletion
2025-07-13 07:21:37,311 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073746995_6171 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073746995
2025-07-13 07:30:47,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747004_6180 src: /192.168.158.5:37904 dest: /192.168.158.4:9866
2025-07-13 07:30:47,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37904, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_949764691_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747004_6180, duration(ns): 19522940
2025-07-13 07:30:47,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747004_6180, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-13 07:30:52,337 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747004_6180 replica FinalizedReplica, blk_1073747004_6180, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747004 for deletion
2025-07-13 07:30:52,338 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747004_6180 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747004
2025-07-13 07:33:52,736 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747007_6183 src: /192.168.158.6:48924 dest: /192.168.158.4:9866
2025-07-13 07:33:52,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48924, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-30424531_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747007_6183, duration(ns): 20956727
2025-07-13 07:33:52,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747007_6183, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 07:33:55,345 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747007_6183 replica FinalizedReplica, blk_1073747007_6183, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747007 for deletion
2025-07-13 07:33:55,346 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747007_6183 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747007
2025-07-13 07:34:52,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747008_6184 src: /192.168.158.5:32930 dest: /192.168.158.4:9866
2025-07-13 07:34:52,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:32930, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-805850449_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747008_6184, duration(ns): 17258806
2025-07-13 07:34:52,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747008_6184, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 07:34:55,349 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747008_6184 replica FinalizedReplica, blk_1073747008_6184, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747008 for deletion
2025-07-13 07:34:55,350 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747008_6184 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747008
2025-07-13 07:35:52,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747009_6185 src: /192.168.158.9:60906 dest: /192.168.158.4:9866
2025-07-13 07:35:52,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60906, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1931894194_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747009_6185, duration(ns): 19632432
2025-07-13 07:35:52,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747009_6185, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-13 07:35:55,349 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747009_6185 replica FinalizedReplica, blk_1073747009_6185, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747009 for deletion
2025-07-13 07:35:55,350 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747009_6185 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747009
2025-07-13 07:36:52,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747010_6186 src: /192.168.158.7:53972 dest: /192.168.158.4:9866
2025-07-13 07:36:52,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53972, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_495558050_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747010_6186, duration(ns): 22482416
2025-07-13 07:36:52,750 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747010_6186, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-13 07:36:58,352 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747010_6186 replica FinalizedReplica, blk_1073747010_6186, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747010 for deletion 2025-07-13 07:36:58,353 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747010_6186 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747010 2025-07-13 07:41:57,719 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747015_6191 src: /192.168.158.1:45720 dest: /192.168.158.4:9866 2025-07-13 07:41:57,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45720, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_716708381_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747015_6191, duration(ns): 21336349 2025-07-13 07:41:57,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747015_6191, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-13 07:42:04,363 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747015_6191 replica FinalizedReplica, blk_1073747015_6191, FINALIZED getNumBytes() = 56 
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747015 for deletion 2025-07-13 07:42:04,364 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747015_6191 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747015 2025-07-13 07:44:57,728 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747018_6194 src: /192.168.158.5:57920 dest: /192.168.158.4:9866 2025-07-13 07:44:57,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57920, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-391634121_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747018_6194, duration(ns): 15830821 2025-07-13 07:44:57,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747018_6194, type=LAST_IN_PIPELINE terminating 2025-07-13 07:45:01,368 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747018_6194 replica FinalizedReplica, blk_1073747018_6194, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747018 for deletion 2025-07-13 07:45:01,370 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747018_6194 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747018 2025-07-13 07:47:02,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747020_6196 src: /192.168.158.6:51346 dest: /192.168.158.4:9866 2025-07-13 07:47:02,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51346, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1899901545_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747020_6196, duration(ns): 19483961 2025-07-13 07:47:02,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747020_6196, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-13 07:47:07,374 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747020_6196 replica FinalizedReplica, blk_1073747020_6196, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747020 for deletion 2025-07-13 07:47:07,376 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747020_6196 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747020 2025-07-13 07:51:17,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747024_6200 src: /192.168.158.5:40526 dest: /192.168.158.4:9866 2025-07-13 07:51:17,762 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40526, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_354545846_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747024_6200, duration(ns): 19155177 2025-07-13 07:51:17,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747024_6200, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-13 07:51:22,388 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747024_6200 replica FinalizedReplica, blk_1073747024_6200, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747024 for deletion 2025-07-13 07:51:22,389 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747024_6200 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747024 2025-07-13 07:52:22,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747025_6201 src: /192.168.158.8:38532 dest: /192.168.158.4:9866 2025-07-13 07:52:22,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38532, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1275987546_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747025_6201, duration(ns): 18991028 2025-07-13 07:52:22,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747025_6201, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=1:[192.168.158.6:9866] terminating 2025-07-13 07:52:25,389 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747025_6201 replica FinalizedReplica, blk_1073747025_6201, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747025 for deletion 2025-07-13 07:52:25,390 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747025_6201 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747025 2025-07-13 07:53:22,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747026_6202 src: /192.168.158.1:52594 dest: /192.168.158.4:9866 2025-07-13 07:53:22,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52594, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-834389717_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747026_6202, duration(ns): 21970445 2025-07-13 07:53:22,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747026_6202, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-13 07:53:28,391 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747026_6202 replica FinalizedReplica, blk_1073747026_6202, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747026 for deletion 2025-07-13 07:53:28,392 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747026_6202 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747026 2025-07-13 07:54:22,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747027_6203 src: /192.168.158.6:48652 dest: /192.168.158.4:9866 2025-07-13 07:54:22,762 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48652, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1364467890_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747027_6203, duration(ns): 19064946 2025-07-13 07:54:22,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747027_6203, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-13 07:54:28,393 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747027_6203 replica FinalizedReplica, blk_1073747027_6203, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747027 for deletion 2025-07-13 07:54:28,394 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747027_6203 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747027 
2025-07-13 07:56:22,739 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747029_6205 src: /192.168.158.1:37806 dest: /192.168.158.4:9866 2025-07-13 07:56:22,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37806, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1086271044_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747029_6205, duration(ns): 22570787 2025-07-13 07:56:22,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747029_6205, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-13 07:56:28,397 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747029_6205 replica FinalizedReplica, blk_1073747029_6205, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747029 for deletion 2025-07-13 07:56:28,399 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747029_6205 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747029 2025-07-13 07:57:27,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747030_6206 src: /192.168.158.7:33870 dest: /192.168.158.4:9866 2025-07-13 07:57:27,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33870, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_602023909_106, 
offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747030_6206, duration(ns): 19722267 2025-07-13 07:57:27,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747030_6206, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-13 07:57:31,404 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747030_6206 replica FinalizedReplica, blk_1073747030_6206, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747030 for deletion 2025-07-13 07:57:31,405 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747030_6206 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747030 2025-07-13 07:58:27,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747031_6207 src: /192.168.158.5:34156 dest: /192.168.158.4:9866 2025-07-13 07:58:27,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34156, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-591777497_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747031_6207, duration(ns): 15449477 2025-07-13 07:58:27,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747031_6207, type=LAST_IN_PIPELINE terminating 2025-07-13 07:58:34,406 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747031_6207 replica FinalizedReplica, blk_1073747031_6207, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747031 for deletion 2025-07-13 07:58:34,408 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747031_6207 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747031 2025-07-13 07:59:32,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747032_6208 src: /192.168.158.1:38080 dest: /192.168.158.4:9866 2025-07-13 07:59:32,782 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38080, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-174200296_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747032_6208, duration(ns): 22402422 2025-07-13 07:59:32,782 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747032_6208, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-13 07:59:40,408 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747032_6208 replica FinalizedReplica, blk_1073747032_6208, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747032 for deletion 2025-07-13 
07:59:40,409 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747032_6208 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747032 2025-07-13 08:02:37,756 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747035_6211 src: /192.168.158.1:40146 dest: /192.168.158.4:9866 2025-07-13 08:02:37,789 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40146, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2015379948_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747035_6211, duration(ns): 24452026 2025-07-13 08:02:37,789 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747035_6211, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-13 08:02:40,416 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747035_6211 replica FinalizedReplica, blk_1073747035_6211, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747035 for deletion 2025-07-13 08:02:40,417 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747035_6211 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747035 2025-07-13 08:03:37,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073747036_6212 src: /192.168.158.5:34668 dest: /192.168.158.4:9866 2025-07-13 08:03:37,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34668, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1024119136_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747036_6212, duration(ns): 19457278 2025-07-13 08:03:37,786 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747036_6212, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-13 08:03:40,420 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747036_6212 replica FinalizedReplica, blk_1073747036_6212, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747036 for deletion 2025-07-13 08:03:40,421 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747036_6212 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747036 2025-07-13 08:06:37,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747039_6215 src: /192.168.158.1:42284 dest: /192.168.158.4:9866 2025-07-13 08:06:37,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42284, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-987217188_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073747039_6215, duration(ns): 22535178 2025-07-13 08:06:37,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747039_6215, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-13 08:06:40,425 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747039_6215 replica FinalizedReplica, blk_1073747039_6215, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747039 for deletion 2025-07-13 08:06:40,426 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747039_6215 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747039 2025-07-13 08:08:47,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747041_6217 src: /192.168.158.1:43560 dest: /192.168.158.4:9866 2025-07-13 08:08:47,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43560, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1558399053_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747041_6217, duration(ns): 24194513 2025-07-13 08:08:47,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747041_6217, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-13 08:08:55,431 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747041_6217 replica FinalizedReplica, blk_1073747041_6217, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747041 for deletion 2025-07-13 08:08:55,432 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747041_6217 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747041 2025-07-13 08:11:47,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747044_6220 src: /192.168.158.9:35008 dest: /192.168.158.4:9866 2025-07-13 08:11:47,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35008, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-232157870_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747044_6220, duration(ns): 18124037 2025-07-13 08:11:47,800 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747044_6220, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-13 08:11:55,437 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747044_6220 replica FinalizedReplica, blk_1073747044_6220, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747044 for deletion 2025-07-13 08:11:55,438 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747044_6220 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747044 2025-07-13 08:12:47,772 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747045_6221 src: /192.168.158.5:35800 dest: /192.168.158.4:9866 2025-07-13 08:12:47,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35800, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1048172815_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747045_6221, duration(ns): 17939290 2025-07-13 08:12:47,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747045_6221, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-13 08:12:55,438 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747045_6221 replica FinalizedReplica, blk_1073747045_6221, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747045 for deletion 2025-07-13 08:12:55,440 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747045_6221 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747045 2025-07-13 08:13:47,780 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747046_6222 src: 
/192.168.158.5:39336 dest: /192.168.158.4:9866 2025-07-13 08:13:47,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1684465529_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747046_6222, duration(ns): 22620309 2025-07-13 08:13:47,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747046_6222, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-13 08:13:55,440 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747046_6222 replica FinalizedReplica, blk_1073747046_6222, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747046 for deletion 2025-07-13 08:13:55,442 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747046_6222 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747046 2025-07-13 08:14:47,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747047_6223 src: /192.168.158.1:57990 dest: /192.168.158.4:9866 2025-07-13 08:14:47,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57990, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_881861650_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747047_6223, duration(ns): 23063952 2025-07-13 08:14:47,815 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747047_6223, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-13 08:14:55,443 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747047_6223 replica FinalizedReplica, blk_1073747047_6223, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747047 for deletion
2025-07-13 08:14:55,444 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747047_6223 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747047
2025-07-13 08:16:47,789 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747049_6225 src: /192.168.158.8:59740 dest: /192.168.158.4:9866
2025-07-13 08:16:47,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59740, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_627830882_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747049_6225, duration(ns): 17039959
2025-07-13 08:16:47,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747049_6225, type=LAST_IN_PIPELINE terminating
2025-07-13 08:16:55,447 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747049_6225 replica FinalizedReplica, blk_1073747049_6225, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747049 for deletion
2025-07-13 08:16:55,448 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747049_6225 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747049
2025-07-13 08:18:52,789 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747051_6227 src: /192.168.158.5:44822 dest: /192.168.158.4:9866
2025-07-13 08:18:52,806 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44822, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-93949280_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747051_6227, duration(ns): 14939094
2025-07-13 08:18:52,807 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747051_6227, type=LAST_IN_PIPELINE terminating
2025-07-13 08:18:55,453 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747051_6227 replica FinalizedReplica, blk_1073747051_6227, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747051 for deletion
2025-07-13 08:18:55,455 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747051_6227 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747051
2025-07-13 08:22:57,835 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747055_6231 src: /192.168.158.8:54246 dest: /192.168.158.4:9866
2025-07-13 08:22:57,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54246, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1562556713_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747055_6231, duration(ns): 23285687
2025-07-13 08:22:57,867 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747055_6231, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-13 08:23:01,460 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747055_6231 replica FinalizedReplica, blk_1073747055_6231, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747055 for deletion
2025-07-13 08:23:01,462 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747055_6231 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747055
2025-07-13 08:25:02,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747057_6233 src: /192.168.158.1:41020 dest: /192.168.158.4:9866
2025-07-13 08:25:02,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41020, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1995014492_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747057_6233, duration(ns): 22433847
2025-07-13 08:25:02,826 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747057_6233, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-13 08:25:07,467 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747057_6233 replica FinalizedReplica, blk_1073747057_6233, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747057 for deletion
2025-07-13 08:25:07,468 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747057_6233 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747057
2025-07-13 08:26:07,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747058_6234 src: /192.168.158.1:53924 dest: /192.168.158.4:9866
2025-07-13 08:26:07,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53924, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1591487042_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747058_6234, duration(ns): 25263090
2025-07-13 08:26:07,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747058_6234, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-13 08:26:10,468 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747058_6234 replica FinalizedReplica, blk_1073747058_6234, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747058 for deletion
2025-07-13 08:26:10,469 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747058_6234 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747058
2025-07-13 08:29:12,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747061_6237 src: /192.168.158.9:53150 dest: /192.168.158.4:9866
2025-07-13 08:29:12,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53150, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1792257682_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747061_6237, duration(ns): 15758027
2025-07-13 08:29:12,823 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747061_6237, type=LAST_IN_PIPELINE terminating
2025-07-13 08:29:19,476 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747061_6237 replica FinalizedReplica, blk_1073747061_6237, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747061 for deletion
2025-07-13 08:29:19,477 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747061_6237 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747061
2025-07-13 08:31:12,806 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747063_6239 src: /192.168.158.7:44292 dest: /192.168.158.4:9866
2025-07-13 08:31:12,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44292, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1653359312_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747063_6239, duration(ns): 16922344
2025-07-13 08:31:12,826 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747063_6239, type=LAST_IN_PIPELINE terminating
2025-07-13 08:31:16,476 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747063_6239 replica FinalizedReplica, blk_1073747063_6239, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747063 for deletion
2025-07-13 08:31:16,477 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747063_6239 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747063
2025-07-13 08:34:12,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747066_6242 src: /192.168.158.1:43526 dest: /192.168.158.4:9866
2025-07-13 08:34:12,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43526, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_49508413_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747066_6242, duration(ns): 21659790
2025-07-13 08:34:12,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747066_6242, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-13 08:34:19,482 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747066_6242 replica FinalizedReplica, blk_1073747066_6242, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747066 for deletion
2025-07-13 08:34:19,483 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747066_6242 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747066
2025-07-13 08:38:17,810 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747070_6246 src: /192.168.158.8:47782 dest: /192.168.158.4:9866
2025-07-13 08:38:17,837 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47782, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_930602666_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747070_6246, duration(ns): 21059776
2025-07-13 08:38:17,837 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747070_6246, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-13 08:38:22,489 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747070_6246 replica FinalizedReplica, blk_1073747070_6246, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747070 for deletion
2025-07-13 08:38:22,491 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747070_6246 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747070
2025-07-13 08:40:17,820 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747072_6248 src: /192.168.158.5:52812 dest: /192.168.158.4:9866
2025-07-13 08:40:17,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52812, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-702331977_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747072_6248, duration(ns): 18925099
2025-07-13 08:40:17,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747072_6248, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-13 08:40:22,492 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747072_6248 replica FinalizedReplica, blk_1073747072_6248, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747072 for deletion
2025-07-13 08:40:22,494 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747072_6248 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747072
2025-07-13 08:47:22,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747079_6255 src: /192.168.158.5:42370 dest: /192.168.158.4:9866
2025-07-13 08:47:22,856 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42370, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-466531719_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747079_6255, duration(ns): 14897205
2025-07-13 08:47:22,857 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747079_6255, type=LAST_IN_PIPELINE terminating
2025-07-13 08:47:28,508 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747079_6255 replica FinalizedReplica, blk_1073747079_6255, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747079 for deletion
2025-07-13 08:47:28,509 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747079_6255 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747079
2025-07-13 08:55:32,848 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747087_6263 src: /192.168.158.6:44316 dest: /192.168.158.4:9866
2025-07-13 08:55:32,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44316, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-18357121_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747087_6263, duration(ns): 20686659
2025-07-13 08:55:32,875 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747087_6263, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-13 08:55:37,526 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747087_6263 replica FinalizedReplica, blk_1073747087_6263, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747087 for deletion
2025-07-13 08:55:37,527 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747087_6263 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747087
2025-07-13 08:57:32,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747089_6265 src: /192.168.158.8:44368 dest: /192.168.158.4:9866
2025-07-13 08:57:32,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44368, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-80812479_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747089_6265, duration(ns): 14373319
2025-07-13 08:57:32,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747089_6265, type=LAST_IN_PIPELINE terminating
2025-07-13 08:57:37,531 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747089_6265 replica FinalizedReplica, blk_1073747089_6265, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747089 for deletion
2025-07-13 08:57:37,532 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747089_6265 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747089
2025-07-13 09:00:42,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747092_6268 src: /192.168.158.1:56182 dest: /192.168.158.4:9866
2025-07-13 09:00:42,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56182, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1177661982_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747092_6268, duration(ns): 24499650
2025-07-13 09:00:42,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747092_6268, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-13 09:00:46,540 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747092_6268 replica FinalizedReplica, blk_1073747092_6268, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747092 for deletion
2025-07-13 09:00:46,541 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747092_6268 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747092
2025-07-13 09:01:42,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747093_6269 src: /192.168.158.5:51476 dest: /192.168.158.4:9866
2025-07-13 09:01:42,888 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51476, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_669978491_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747093_6269, duration(ns): 17227492
2025-07-13 09:01:42,889 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747093_6269, type=LAST_IN_PIPELINE terminating
2025-07-13 09:01:46,541 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747093_6269 replica FinalizedReplica, blk_1073747093_6269, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747093 for deletion
2025-07-13 09:01:46,543 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747093_6269 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747093
2025-07-13 09:02:42,888 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747094_6270 src: /192.168.158.9:35540 dest: /192.168.158.4:9866
2025-07-13 09:02:42,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35540, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_843166584_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747094_6270, duration(ns): 19127936
2025-07-13 09:02:42,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747094_6270, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 09:02:46,544 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747094_6270 replica FinalizedReplica, blk_1073747094_6270, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747094 for deletion
2025-07-13 09:02:46,545 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747094_6270 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747094
2025-07-13 09:04:42,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747096_6272 src: /192.168.158.1:45166 dest: /192.168.158.4:9866
2025-07-13 09:04:42,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45166, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1694253785_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747096_6272, duration(ns): 24307522
2025-07-13 09:04:42,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747096_6272, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-13 09:04:46,548 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747096_6272 replica FinalizedReplica, blk_1073747096_6272, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747096 for deletion
2025-07-13 09:04:46,549 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747096_6272 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747096
2025-07-13 09:06:47,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747098_6274 src: /192.168.158.5:35130 dest: /192.168.158.4:9866
2025-07-13 09:06:47,908 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35130, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1270474872_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747098_6274, duration(ns): 18327164
2025-07-13 09:06:47,908 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747098_6274, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 09:06:55,552 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747098_6274 replica FinalizedReplica, blk_1073747098_6274, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747098 for deletion
2025-07-13 09:06:55,554 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747098_6274 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747098
2025-07-13 09:07:47,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747099_6275 src: /192.168.158.9:49596 dest: /192.168.158.4:9866
2025-07-13 09:07:47,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49596, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1350351773_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747099_6275, duration(ns): 19351575
2025-07-13 09:07:47,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747099_6275, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-13 09:07:52,553 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747099_6275 replica FinalizedReplica, blk_1073747099_6275, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747099 for deletion
2025-07-13 09:07:52,555 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747099_6275 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747099
2025-07-13 09:11:52,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747103_6279 src: /192.168.158.1:59194 dest: /192.168.158.4:9866
2025-07-13 09:11:52,913 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59194, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1609510020_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747103_6279, duration(ns): 22842825
2025-07-13 09:11:52,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747103_6279, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-13 09:11:55,564 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747103_6279 replica FinalizedReplica, blk_1073747103_6279, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747103 for deletion
2025-07-13 09:11:55,564 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747103_6279 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747103
2025-07-13 09:12:52,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747104_6280 src: /192.168.158.9:33762 dest: /192.168.158.4:9866
2025-07-13 09:12:52,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33762, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_555033958_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747104_6280, duration(ns): 16690313
2025-07-13 09:12:52,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747104_6280, type=LAST_IN_PIPELINE terminating
2025-07-13 09:12:58,567 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747104_6280 replica FinalizedReplica, blk_1073747104_6280, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747104 for deletion
2025-07-13 09:12:58,568 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747104_6280 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747104
2025-07-13 09:13:57,885 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747105_6281 src: /192.168.158.1:53510 dest: /192.168.158.4:9866
2025-07-13 09:13:57,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53510, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1336533497_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747105_6281, duration(ns): 23511717
2025-07-13 09:13:57,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747105_6281, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-13 09:14:01,569 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747105_6281 replica FinalizedReplica, blk_1073747105_6281, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747105 for deletion
2025-07-13 09:14:01,570 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747105_6281 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747105
2025-07-13 09:16:57,909 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747108_6284 src: /192.168.158.7:41186 dest: /192.168.158.4:9866
2025-07-13 09:16:57,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41186, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2088814751_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747108_6284, duration(ns): 21142877
2025-07-13 09:16:57,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747108_6284, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-13 09:17:01,577 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747108_6284 replica FinalizedReplica, blk_1073747108_6284, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747108 for deletion
2025-07-13 09:17:01,578 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747108_6284 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747108
2025-07-13 09:18:57,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747110_6286 src: /192.168.158.8:52318 dest: /192.168.158.4:9866
2025-07-13 09:18:57,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52318, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1207219964_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747110_6286, duration(ns): 20928127
2025-07-13 09:18:57,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747110_6286, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-13 09:19:01,582 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747110_6286 replica FinalizedReplica, blk_1073747110_6286, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747110 for deletion
2025-07-13 09:19:01,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747110_6286 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747110
2025-07-13 09:19:57,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747111_6287 src: /192.168.158.8:44132 dest: /192.168.158.4:9866
2025-07-13 09:19:57,950 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44132, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1890611762_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747111_6287, duration(ns): 15049000
2025-07-13 09:19:57,950 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747111_6287, type=LAST_IN_PIPELINE terminating
2025-07-13 09:20:01,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747111_6287 replica FinalizedReplica, blk_1073747111_6287, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747111 for deletion
2025-07-13 09:20:01,585 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747111_6287 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747111
2025-07-13 09:23:02,909 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747114_6290 src: /192.168.158.7:33880 dest: /192.168.158.4:9866 2025-07-13 09:23:02,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33880, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-41820508_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747114_6290, duration(ns): 21455009 2025-07-13 09:23:02,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747114_6290, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-13 09:23:10,591 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747114_6290 replica FinalizedReplica, blk_1073747114_6290, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747114 for deletion 2025-07-13 09:23:10,592 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747114_6290 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747114 2025-07-13 09:24:02,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747115_6291 src: /192.168.158.1:55392 dest: /192.168.158.4:9866 2025-07-13 09:24:02,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55392, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-905221076_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, 
blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747115_6291, duration(ns): 24469226 2025-07-13 09:24:02,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747115_6291, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-13 09:24:07,592 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747115_6291 replica FinalizedReplica, blk_1073747115_6291, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747115 for deletion 2025-07-13 09:24:07,593 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747115_6291 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747115 2025-07-13 09:25:02,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747116_6292 src: /192.168.158.9:59392 dest: /192.168.158.4:9866 2025-07-13 09:25:02,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59392, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_161237037_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747116_6292, duration(ns): 15431301 2025-07-13 09:25:02,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747116_6292, type=LAST_IN_PIPELINE terminating 2025-07-13 09:25:07,591 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling 
blk_1073747116_6292 replica FinalizedReplica, blk_1073747116_6292, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747116 for deletion 2025-07-13 09:25:07,593 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747116_6292 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747116 2025-07-13 09:26:02,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747117_6293 src: /192.168.158.9:35738 dest: /192.168.158.4:9866 2025-07-13 09:26:02,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35738, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_268660928_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747117_6293, duration(ns): 14100741 2025-07-13 09:26:02,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747117_6293, type=LAST_IN_PIPELINE terminating 2025-07-13 09:26:07,594 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747117_6293 replica FinalizedReplica, blk_1073747117_6293, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747117 for deletion 2025-07-13 09:26:07,595 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 
blk_1073747117_6293 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747117 2025-07-13 09:27:02,911 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747118_6294 src: /192.168.158.1:53424 dest: /192.168.158.4:9866 2025-07-13 09:27:02,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53424, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1128503481_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747118_6294, duration(ns): 23048885 2025-07-13 09:27:02,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747118_6294, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-13 09:27:07,594 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747118_6294 replica FinalizedReplica, blk_1073747118_6294, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747118 for deletion 2025-07-13 09:27:07,596 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747118_6294 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747118 2025-07-13 09:31:02,919 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747122_6298 src: /192.168.158.6:33920 dest: /192.168.158.4:9866 2025-07-13 09:31:02,944 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33920, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-568738264_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747122_6298, duration(ns): 20040702 2025-07-13 09:31:02,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747122_6298, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-13 09:31:07,603 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747122_6298 replica FinalizedReplica, blk_1073747122_6298, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747122 for deletion 2025-07-13 09:31:07,604 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747122_6298 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747122 2025-07-13 09:32:02,923 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747123_6299 src: /192.168.158.6:56354 dest: /192.168.158.4:9866 2025-07-13 09:32:02,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56354, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1063626925_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747123_6299, duration(ns): 18634800 2025-07-13 09:32:02,949 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073747123_6299, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-13 09:32:10,605 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747123_6299 replica FinalizedReplica, blk_1073747123_6299, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747123 for deletion 2025-07-13 09:32:10,607 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747123_6299 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747123 2025-07-13 09:34:02,919 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747125_6301 src: /192.168.158.1:37068 dest: /192.168.158.4:9866 2025-07-13 09:34:02,950 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37068, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1849358677_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747125_6301, duration(ns): 23328058 2025-07-13 09:34:02,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747125_6301, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-13 09:34:07,607 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747125_6301 replica FinalizedReplica, blk_1073747125_6301, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747125 for deletion 2025-07-13 09:34:07,608 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747125_6301 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747125 2025-07-13 09:37:07,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747128_6304 src: /192.168.158.8:41132 dest: /192.168.158.4:9866 2025-07-13 09:37:07,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41132, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_699545930_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747128_6304, duration(ns): 19636468 2025-07-13 09:37:07,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747128_6304, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-13 09:37:13,613 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747128_6304 replica FinalizedReplica, blk_1073747128_6304, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747128 for deletion 2025-07-13 09:37:13,614 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747128_6304 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747128 2025-07-13 09:38:12,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747129_6305 src: /192.168.158.1:47702 dest: /192.168.158.4:9866 2025-07-13 09:38:12,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47702, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_705070113_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747129_6305, duration(ns): 21116080 2025-07-13 09:38:12,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747129_6305, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-13 09:38:16,615 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747129_6305 replica FinalizedReplica, blk_1073747129_6305, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747129 for deletion 2025-07-13 09:38:16,616 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747129_6305 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747129 2025-07-13 09:39:17,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747130_6306 src: /192.168.158.1:49016 dest: /192.168.158.4:9866 2025-07-13 09:39:17,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:49016, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-430447381_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747130_6306, duration(ns): 23953505 2025-07-13 09:39:17,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747130_6306, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-13 09:39:22,616 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747130_6306 replica FinalizedReplica, blk_1073747130_6306, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747130 for deletion 2025-07-13 09:39:22,617 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747130_6306 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747130 2025-07-13 09:40:17,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747131_6307 src: /192.168.158.1:43228 dest: /192.168.158.4:9866 2025-07-13 09:40:17,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43228, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_576064737_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747131_6307, duration(ns): 23374136 2025-07-13 09:40:17,960 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073747131_6307, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-13 09:40:22,620 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747131_6307 replica FinalizedReplica, blk_1073747131_6307, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747131 for deletion 2025-07-13 09:40:22,621 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747131_6307 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747131 2025-07-13 09:43:22,953 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747134_6310 src: /192.168.158.7:58928 dest: /192.168.158.4:9866 2025-07-13 09:43:22,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58928, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-726017168_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747134_6310, duration(ns): 18856961 2025-07-13 09:43:22,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747134_6310, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-13 09:43:28,624 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747134_6310 replica FinalizedReplica, blk_1073747134_6310, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747134 for deletion 2025-07-13 09:43:28,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747134_6310 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747134 2025-07-13 09:44:27,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747135_6311 src: /192.168.158.7:51264 dest: /192.168.158.4:9866 2025-07-13 09:44:27,950 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51264, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_447109055_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747135_6311, duration(ns): 15398674 2025-07-13 09:44:27,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747135_6311, type=LAST_IN_PIPELINE terminating 2025-07-13 09:44:31,626 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747135_6311 replica FinalizedReplica, blk_1073747135_6311, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747135 for deletion 2025-07-13 09:44:31,627 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747135_6311 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747135 2025-07-13 
09:45:32,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747136_6312 src: /192.168.158.1:51084 dest: /192.168.158.4:9866 2025-07-13 09:45:32,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51084, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2144444392_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747136_6312, duration(ns): 22182389 2025-07-13 09:45:32,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747136_6312, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-13 09:45:40,630 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747136_6312 replica FinalizedReplica, blk_1073747136_6312, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747136 for deletion 2025-07-13 09:45:40,631 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747136_6312 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747136 2025-07-13 09:46:32,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747137_6313 src: /192.168.158.1:53676 dest: /192.168.158.4:9866 2025-07-13 09:46:32,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53676, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1622412355_106, offset: 0, 
srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747137_6313, duration(ns): 23004286 2025-07-13 09:46:32,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747137_6313, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-13 09:46:37,630 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747137_6313 replica FinalizedReplica, blk_1073747137_6313, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747137 for deletion 2025-07-13 09:46:37,631 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747137_6313 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747137 2025-07-13 09:51:37,942 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747142_6318 src: /192.168.158.1:35284 dest: /192.168.158.4:9866 2025-07-13 09:51:37,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35284, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2086138212_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747142_6318, duration(ns): 22320675 2025-07-13 09:51:37,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747142_6318, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-13 09:51:40,643 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747142_6318 replica FinalizedReplica, blk_1073747142_6318, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747142 for deletion
2025-07-13 09:51:40,644 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747142_6318 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747142
2025-07-13 09:53:37,939 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747144_6320 src: /192.168.158.1:51986 dest: /192.168.158.4:9866
2025-07-13 09:53:37,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51986, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-50416136_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747144_6320, duration(ns): 24754900
2025-07-13 09:53:37,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747144_6320, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-13 09:53:40,647 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747144_6320 replica FinalizedReplica, blk_1073747144_6320, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747144 for deletion
2025-07-13 09:53:40,648 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747144_6320 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747144
2025-07-13 09:55:42,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747146_6322 src: /192.168.158.6:34006 dest: /192.168.158.4:9866
2025-07-13 09:55:42,988 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34006, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2100473010_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747146_6322, duration(ns): 20435544
2025-07-13 09:55:42,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747146_6322, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 09:55:49,651 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747146_6322 replica FinalizedReplica, blk_1073747146_6322, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747146 for deletion
2025-07-13 09:55:49,652 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747146_6322 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747146
2025-07-13 09:56:42,965 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747147_6323 src: /192.168.158.8:35596 dest: /192.168.158.4:9866
2025-07-13 09:56:42,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35596, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1903343435_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747147_6323, duration(ns): 16155487
2025-07-13 09:56:42,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747147_6323, type=LAST_IN_PIPELINE terminating
2025-07-13 09:56:49,655 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747147_6323 replica FinalizedReplica, blk_1073747147_6323, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747147 for deletion
2025-07-13 09:56:49,656 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747147_6323 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747147
2025-07-13 09:58:52,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747149_6325 src: /192.168.158.9:54868 dest: /192.168.158.4:9866
2025-07-13 09:58:52,981 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54868, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1727029584_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747149_6325, duration(ns): 18333349
2025-07-13 09:58:52,981 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747149_6325, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 09:58:55,659 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747149_6325 replica FinalizedReplica, blk_1073747149_6325, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747149 for deletion
2025-07-13 09:58:55,660 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747149_6325 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747149
2025-07-13 09:59:52,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747150_6326 src: /192.168.158.1:51958 dest: /192.168.158.4:9866
2025-07-13 09:59:52,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51958, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1580431452_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747150_6326, duration(ns): 23635341
2025-07-13 09:59:52,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747150_6326, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-13 09:59:55,660 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747150_6326 replica FinalizedReplica, blk_1073747150_6326, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747150 for deletion
2025-07-13 09:59:55,662 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747150_6326 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747150
2025-07-13 10:00:52,964 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747151_6327 src: /192.168.158.6:44318 dest: /192.168.158.4:9866
2025-07-13 10:00:52,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44318, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1472589532_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747151_6327, duration(ns): 19508149
2025-07-13 10:00:52,990 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747151_6327, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-13 10:00:55,661 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747151_6327 replica FinalizedReplica, blk_1073747151_6327, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747151 for deletion
2025-07-13 10:00:55,662 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747151_6327 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747151
2025-07-13 10:05:57,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747156_6332 src: /192.168.158.8:47218 dest: /192.168.158.4:9866
2025-07-13 10:05:58,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47218, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2146227001_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747156_6332, duration(ns): 17596923
2025-07-13 10:05:58,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747156_6332, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-13 10:06:04,672 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747156_6332 replica FinalizedReplica, blk_1073747156_6332, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747156 for deletion
2025-07-13 10:06:04,673 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747156_6332 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747156
2025-07-13 10:07:57,979 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747158_6334 src: /192.168.158.9:33774 dest: /192.168.158.4:9866
2025-07-13 10:07:57,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33774, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1406428533_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747158_6334, duration(ns): 15539479
2025-07-13 10:07:57,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747158_6334, type=LAST_IN_PIPELINE terminating
2025-07-13 10:08:01,676 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747158_6334 replica FinalizedReplica, blk_1073747158_6334, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747158 for deletion
2025-07-13 10:08:01,677 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747158_6334 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747158
2025-07-13 10:10:02,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747160_6336 src: /192.168.158.5:54394 dest: /192.168.158.4:9866
2025-07-13 10:10:03,000 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54394, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-91997940_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747160_6336, duration(ns): 16419448
2025-07-13 10:10:03,001 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747160_6336, type=LAST_IN_PIPELINE terminating
2025-07-13 10:10:10,679 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747160_6336 replica FinalizedReplica, blk_1073747160_6336, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747160 for deletion
2025-07-13 10:10:10,680 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747160_6336 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747160
2025-07-13 10:13:07,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747163_6339 src: /192.168.158.9:45522 dest: /192.168.158.4:9866
2025-07-13 10:13:08,014 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45522, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_991210430_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747163_6339, duration(ns): 20257533
2025-07-13 10:13:08,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747163_6339, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 10:13:10,683 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747163_6339 replica FinalizedReplica, blk_1073747163_6339, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747163 for deletion
2025-07-13 10:13:10,685 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747163_6339 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747163
2025-07-13 10:14:07,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747164_6340 src: /192.168.158.1:50964 dest: /192.168.158.4:9866
2025-07-13 10:14:08,008 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50964, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1794646960_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747164_6340, duration(ns): 23731349
2025-07-13 10:14:08,008 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747164_6340, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-13 10:14:10,686 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747164_6340 replica FinalizedReplica, blk_1073747164_6340, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747164 for deletion
2025-07-13 10:14:10,687 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747164_6340 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747164
2025-07-13 10:15:07,993 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747165_6341 src: /192.168.158.1:54330 dest: /192.168.158.4:9866
2025-07-13 10:15:08,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54330, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2129138993_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747165_6341, duration(ns): 22924078
2025-07-13 10:15:08,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747165_6341, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-13 10:15:10,686 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747165_6341 replica FinalizedReplica, blk_1073747165_6341, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747165 for deletion
2025-07-13 10:15:10,687 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747165_6341 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747165
2025-07-13 10:16:08,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747166_6342 src: /192.168.158.6:37166 dest: /192.168.158.4:9866
2025-07-13 10:16:08,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37166, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1471400590_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747166_6342, duration(ns): 18521830
2025-07-13 10:16:08,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747166_6342, type=LAST_IN_PIPELINE terminating
2025-07-13 10:16:10,689 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747166_6342 replica FinalizedReplica, blk_1073747166_6342, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747166 for deletion
2025-07-13 10:16:10,689 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747166_6342 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747166
2025-07-13 10:18:13,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747168_6344 src: /192.168.158.9:42774 dest: /192.168.158.4:9866
2025-07-13 10:18:13,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42774, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-912105870_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747168_6344, duration(ns): 15854908
2025-07-13 10:18:13,043 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747168_6344, type=LAST_IN_PIPELINE terminating
2025-07-13 10:18:16,694 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747168_6344 replica FinalizedReplica, blk_1073747168_6344, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747168 for deletion
2025-07-13 10:18:16,696 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747168_6344 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747168
2025-07-13 10:19:12,990 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747169_6345 src: /192.168.158.1:60556 dest: /192.168.158.4:9866
2025-07-13 10:19:13,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60556, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1392313280_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747169_6345, duration(ns): 20427867
2025-07-13 10:19:13,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747169_6345, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-13 10:19:16,696 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747169_6345 replica FinalizedReplica, blk_1073747169_6345, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747169 for deletion
2025-07-13 10:19:16,697 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747169_6345 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747169
2025-07-13 10:22:17,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747172_6348 src: /192.168.158.9:46134 dest: /192.168.158.4:9866
2025-07-13 10:22:18,021 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46134, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1962017481_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747172_6348, duration(ns): 18857628
2025-07-13 10:22:18,021 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747172_6348, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-13 10:22:22,704 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747172_6348 replica FinalizedReplica, blk_1073747172_6348, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747172 for deletion
2025-07-13 10:22:22,705 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747172_6348 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747172
2025-07-13 10:25:17,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747175_6351 src: /192.168.158.1:57504 dest: /192.168.158.4:9866
2025-07-13 10:25:18,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57504, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1826662326_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747175_6351, duration(ns): 22642700
2025-07-13 10:25:18,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747175_6351, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-13 10:25:25,706 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747175_6351 replica FinalizedReplica, blk_1073747175_6351, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747175 for deletion
2025-07-13 10:25:25,708 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747175_6351 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747175
2025-07-13 10:26:18,004 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747176_6352 src: /192.168.158.1:44192 dest: /192.168.158.4:9866
2025-07-13 10:26:18,034 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44192, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1446251523_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747176_6352, duration(ns): 21727830
2025-07-13 10:26:18,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747176_6352, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-13 10:26:22,711 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747176_6352 replica FinalizedReplica, blk_1073747176_6352, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747176 for deletion
2025-07-13 10:26:22,712 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747176_6352 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747176
2025-07-13 10:27:23,018 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747177_6353 src: /192.168.158.8:40988 dest: /192.168.158.4:9866
2025-07-13 10:27:23,043 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40988, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_862673923_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747177_6353, duration(ns): 19658468
2025-07-13 10:27:23,043 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747177_6353, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-13 10:27:28,710 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747177_6353 replica FinalizedReplica, blk_1073747177_6353, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747177 for deletion
2025-07-13 10:27:28,711 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747177_6353 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747177
2025-07-13 10:28:28,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747178_6354 src: /192.168.158.1:58780 dest: /192.168.158.4:9866
2025-07-13 10:28:28,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-415014768_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747178_6354, duration(ns): 22560220
2025-07-13 10:28:28,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747178_6354, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-13 10:28:31,712 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747178_6354 replica FinalizedReplica, blk_1073747178_6354, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747178 for deletion
2025-07-13 10:28:31,713 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747178_6354 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747178
2025-07-13 10:31:28,032 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747181_6357 src: /192.168.158.1:42668 dest: /192.168.158.4:9866
2025-07-13 10:31:28,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42668, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_997102976_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747181_6357, duration(ns): 24572573
2025-07-13 10:31:28,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747181_6357, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-13 10:31:31,721 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747181_6357 replica FinalizedReplica, blk_1073747181_6357, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747181 for deletion
2025-07-13 10:31:31,722 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747181_6357 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747181
2025-07-13 10:35:38,045 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747185_6361 src: /192.168.158.6:49826 dest: /192.168.158.4:9866
2025-07-13 10:35:38,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49826, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1338156547_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747185_6361, duration(ns): 20549958
2025-07-13 10:35:38,073 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747185_6361, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-13 10:35:40,734 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747185_6361 replica FinalizedReplica, blk_1073747185_6361, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747185 for deletion
2025-07-13 10:35:40,736 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747185_6361 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747185
2025-07-13 10:37:38,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747187_6363 src: /192.168.158.9:54052 dest: /192.168.158.4:9866
2025-07-13 10:37:38,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54052, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1117236852_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747187_6363, duration(ns): 15830428
2025-07-13 10:37:38,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747187_6363, type=LAST_IN_PIPELINE terminating
2025-07-13 10:37:40,742 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747187_6363 replica FinalizedReplica, blk_1073747187_6363, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747187 for deletion
2025-07-13 10:37:40,743 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747187_6363 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747187
2025-07-13 10:40:43,051 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747190_6366 src: /192.168.158.8:33096 dest: /192.168.158.4:9866
2025-07-13 10:40:43,076 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33096, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-696091445_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747190_6366, duration(ns): 20356658
2025-07-13 10:40:43,076 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747190_6366, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-13 10:40:49,755 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747190_6366 replica FinalizedReplica, blk_1073747190_6366, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747190 for deletion
2025-07-13 10:40:49,756 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747190_6366 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747190
2025-07-13 10:41:48,048 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747191_6367 src: /192.168.158.7:55396 dest: /192.168.158.4:9866
2025-07-13 10:41:48,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55396, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1815252488_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747191_6367, duration(ns): 12249090
2025-07-13 10:41:48,065 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747191_6367, type=LAST_IN_PIPELINE terminating
2025-07-13 10:41:52,755 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747191_6367 replica FinalizedReplica, blk_1073747191_6367, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747191 for deletion
2025-07-13 10:41:52,757 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747191_6367 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747191
2025-07-13 10:42:53,039 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747192_6368 src: /192.168.158.1:56720 dest: /192.168.158.4:9866
2025-07-13 10:42:53,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56720, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1092305704_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747192_6368, duration(ns): 23356085
2025-07-13 10:42:53,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747192_6368, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-13 10:42:58,758 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747192_6368 replica FinalizedReplica, blk_1073747192_6368, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747192 for deletion
2025-07-13 10:42:58,759 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747192_6368 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747192
2025-07-13 10:43:53,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747193_6369 src: /192.168.158.6:38492 dest: /192.168.158.4:9866
2025-07-13 10:43:53,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38492, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1672995514_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747193_6369, duration(ns): 18180333 2025-07-13 10:43:53,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747193_6369, type=LAST_IN_PIPELINE terminating 2025-07-13 10:43:55,758 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747193_6369 replica FinalizedReplica, blk_1073747193_6369, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747193 for deletion 2025-07-13 10:43:55,759 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747193_6369 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747193 2025-07-13 10:46:58,056 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747196_6372 src: /192.168.158.7:36760 dest: /192.168.158.4:9866 2025-07-13 10:46:58,081 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36760, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-106860829_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747196_6372, duration(ns): 19036813 2025-07-13 10:46:58,081 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747196_6372, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-13 10:47:04,764 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073747196_6372 replica FinalizedReplica, blk_1073747196_6372, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747196 for deletion 2025-07-13 10:47:04,765 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747196_6372 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747196 2025-07-13 10:48:58,048 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747198_6374 src: /192.168.158.5:38244 dest: /192.168.158.4:9866 2025-07-13 10:48:58,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38244, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1010471779_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747198_6374, duration(ns): 19056415 2025-07-13 10:48:58,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747198_6374, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-13 10:49:01,770 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747198_6374 replica FinalizedReplica, blk_1073747198_6374, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747198 for deletion 2025-07-13 10:49:01,771 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747198_6374 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073747198 2025-07-13 10:54:13,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747203_6379 src: /192.168.158.7:35218 dest: /192.168.158.4:9866 2025-07-13 10:54:13,087 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35218, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-473104672_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747203_6379, duration(ns): 16546755 2025-07-13 10:54:13,087 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747203_6379, type=LAST_IN_PIPELINE terminating 2025-07-13 10:54:16,787 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747203_6379 replica FinalizedReplica, blk_1073747203_6379, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747203 for deletion 2025-07-13 10:54:16,788 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747203_6379 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747203 2025-07-13 10:56:18,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747205_6381 src: /192.168.158.5:49184 dest: /192.168.158.4:9866 2025-07-13 10:56:18,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: 
src: /192.168.158.5:49184, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_955739698_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747205_6381, duration(ns): 15958057 2025-07-13 10:56:18,065 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747205_6381, type=LAST_IN_PIPELINE terminating 2025-07-13 10:56:22,790 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747205_6381 replica FinalizedReplica, blk_1073747205_6381, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747205 for deletion 2025-07-13 10:56:22,792 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747205_6381 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747205 2025-07-13 10:57:18,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747206_6382 src: /192.168.158.7:48628 dest: /192.168.158.4:9866 2025-07-13 10:57:18,096 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48628, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1815310991_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747206_6382, duration(ns): 22971754 2025-07-13 10:57:18,096 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747206_6382, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=1:[192.168.158.8:9866] terminating 2025-07-13 10:57:25,795 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747206_6382 replica FinalizedReplica, blk_1073747206_6382, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747206 for deletion 2025-07-13 10:57:25,796 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747206_6382 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747206 2025-07-13 10:59:23,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747208_6384 src: /192.168.158.7:36964 dest: /192.168.158.4:9866 2025-07-13 10:59:23,078 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36964, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1289203181_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747208_6384, duration(ns): 14268657 2025-07-13 10:59:23,078 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747208_6384, type=LAST_IN_PIPELINE terminating 2025-07-13 10:59:25,800 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747208_6384 replica FinalizedReplica, blk_1073747208_6384, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747208 for deletion 
2025-07-13 10:59:25,801 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747208_6384 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747208 2025-07-13 11:02:28,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747211_6387 src: /192.168.158.6:37954 dest: /192.168.158.4:9866 2025-07-13 11:02:28,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37954, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1056071978_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747211_6387, duration(ns): 16327330 2025-07-13 11:02:28,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747211_6387, type=LAST_IN_PIPELINE terminating 2025-07-13 11:02:31,802 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747211_6387 replica FinalizedReplica, blk_1073747211_6387, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747211 for deletion 2025-07-13 11:02:31,803 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747211_6387 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747211 2025-07-13 11:03:28,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747212_6388 src: /192.168.158.7:36526 dest: 
/192.168.158.4:9866 2025-07-13 11:03:28,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36526, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1324987751_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747212_6388, duration(ns): 16129446 2025-07-13 11:03:28,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747212_6388, type=LAST_IN_PIPELINE terminating 2025-07-13 11:03:34,804 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747212_6388 replica FinalizedReplica, blk_1073747212_6388, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747212 for deletion 2025-07-13 11:03:34,805 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747212_6388 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747212 2025-07-13 11:04:28,048 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747213_6389 src: /192.168.158.1:58602 dest: /192.168.158.4:9866 2025-07-13 11:04:28,078 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58602, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1233443645_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747213_6389, duration(ns): 21638462 2025-07-13 11:04:28,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073747213_6389, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-13 11:04:31,806 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747213_6389 replica FinalizedReplica, blk_1073747213_6389, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747213 for deletion 2025-07-13 11:04:31,808 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747213_6389 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747213 2025-07-13 11:06:28,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747215_6391 src: /192.168.158.8:57056 dest: /192.168.158.4:9866 2025-07-13 11:06:28,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57056, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-123525254_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747215_6391, duration(ns): 18367210 2025-07-13 11:06:28,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747215_6391, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-13 11:06:31,807 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747215_6391 replica FinalizedReplica, blk_1073747215_6391, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747215 for deletion 2025-07-13 11:06:31,808 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747215_6391 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747215 2025-07-13 11:09:28,078 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747218_6394 src: /192.168.158.1:47118 dest: /192.168.158.4:9866 2025-07-13 11:09:28,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47118, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_805507132_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747218_6394, duration(ns): 20826237 2025-07-13 11:09:28,108 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747218_6394, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-13 11:09:34,816 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747218_6394 replica FinalizedReplica, blk_1073747218_6394, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747218 for deletion 2025-07-13 11:09:34,817 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747218_6394 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747218 2025-07-13 11:11:33,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747220_6396 src: /192.168.158.1:49022 dest: /192.168.158.4:9866 2025-07-13 11:11:33,134 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49022, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_162806926_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747220_6396, duration(ns): 22550914 2025-07-13 11:11:33,134 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747220_6396, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-13 11:11:40,820 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747220_6396 replica FinalizedReplica, blk_1073747220_6396, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747220 for deletion 2025-07-13 11:11:40,821 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747220_6396 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747220 2025-07-13 11:12:33,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747221_6397 src: /192.168.158.8:60944 dest: /192.168.158.4:9866 2025-07-13 11:12:33,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.8:60944, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1704752842_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747221_6397, duration(ns): 15833039 2025-07-13 11:12:33,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747221_6397, type=LAST_IN_PIPELINE terminating 2025-07-13 11:12:40,824 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747221_6397 replica FinalizedReplica, blk_1073747221_6397, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747221 for deletion 2025-07-13 11:12:40,825 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747221_6397 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747221 2025-07-13 11:14:33,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747223_6399 src: /192.168.158.1:54082 dest: /192.168.158.4:9866 2025-07-13 11:14:33,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54082, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1469020207_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747223_6399, duration(ns): 24272163 2025-07-13 11:14:33,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747223_6399, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-13 11:14:37,828 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747223_6399 replica FinalizedReplica, blk_1073747223_6399, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747223 for deletion 2025-07-13 11:14:37,829 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747223_6399 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747223 2025-07-13 11:16:33,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747225_6401 src: /192.168.158.5:51266 dest: /192.168.158.4:9866 2025-07-13 11:16:33,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51266, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1976426211_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747225_6401, duration(ns): 16447305 2025-07-13 11:16:33,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747225_6401, type=LAST_IN_PIPELINE terminating 2025-07-13 11:16:37,834 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747225_6401 replica FinalizedReplica, blk_1073747225_6401, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747225 for deletion 2025-07-13 11:16:37,835 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747225_6401 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747225 2025-07-13 11:19:33,096 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747228_6404 src: /192.168.158.8:48586 dest: /192.168.158.4:9866 2025-07-13 11:19:33,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48586, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_299261399_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747228_6404, duration(ns): 20155838 2025-07-13 11:19:33,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747228_6404, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-13 11:19:40,836 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747228_6404 replica FinalizedReplica, blk_1073747228_6404, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747228 for deletion 2025-07-13 11:19:40,837 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747228_6404 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747228 
2025-07-13 11:20:33,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747229_6405 src: /192.168.158.7:58482 dest: /192.168.158.4:9866
2025-07-13 11:20:33,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58482, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-287272414_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747229_6405, duration(ns): 19058983
2025-07-13 11:20:33,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747229_6405, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-13 11:20:40,840 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747229_6405 replica FinalizedReplica, blk_1073747229_6405, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747229 for deletion
2025-07-13 11:20:40,841 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747229_6405 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747229
2025-07-13 11:21:33,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747230_6406 src: /192.168.158.7:55900 dest: /192.168.158.4:9866
2025-07-13 11:21:33,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55900, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1599625428_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747230_6406, duration(ns): 14275860
2025-07-13 11:21:33,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747230_6406, type=LAST_IN_PIPELINE terminating
2025-07-13 11:21:40,842 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747230_6406 replica FinalizedReplica, blk_1073747230_6406, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747230 for deletion
2025-07-13 11:21:40,843 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747230_6406 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747230
2025-07-13 11:22:33,087 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747231_6407 src: /192.168.158.9:50180 dest: /192.168.158.4:9866
2025-07-13 11:22:33,113 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50180, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-366453498_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747231_6407, duration(ns): 19204353
2025-07-13 11:22:33,113 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747231_6407, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 11:22:37,842 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747231_6407 replica FinalizedReplica, blk_1073747231_6407, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747231 for deletion
2025-07-13 11:22:37,843 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747231_6407 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747231
2025-07-13 11:34:43,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747243_6419 src: /192.168.158.5:45240 dest: /192.168.158.4:9866
2025-07-13 11:34:43,143 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45240, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-296766591_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747243_6419, duration(ns): 15966860
2025-07-13 11:34:43,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747243_6419, type=LAST_IN_PIPELINE terminating
2025-07-13 11:34:49,866 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747243_6419 replica FinalizedReplica, blk_1073747243_6419, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747243 for deletion
2025-07-13 11:34:49,867 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747243_6419 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747243
2025-07-13 11:35:43,169 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747244_6420 src: /192.168.158.8:50756 dest: /192.168.158.4:9866
2025-07-13 11:35:43,188 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50756, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_42524082_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747244_6420, duration(ns): 16407120
2025-07-13 11:35:43,188 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747244_6420, type=LAST_IN_PIPELINE terminating
2025-07-13 11:35:49,870 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747244_6420 replica FinalizedReplica, blk_1073747244_6420, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747244 for deletion
2025-07-13 11:35:49,871 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747244_6420 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747244
2025-07-13 11:36:13,268 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-13 11:36:48,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747245_6421 src: /192.168.158.1:56726 dest: /192.168.158.4:9866
2025-07-13 11:36:48,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56726, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1705210038_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747245_6421, duration(ns): 22216286
2025-07-13 11:36:48,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747245_6421, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-13 11:36:52,870 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747245_6421 replica FinalizedReplica, blk_1073747245_6421, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747245 for deletion
2025-07-13 11:36:52,871 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747245_6421 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747245
2025-07-13 11:37:19,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f34, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 3 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-13 11:37:19,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-13 11:38:58,134 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747247_6423 src: /192.168.158.6:53774 dest: /192.168.158.4:9866
2025-07-13 11:38:58,150 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53774, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1831567614_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747247_6423, duration(ns): 14670812
2025-07-13 11:38:58,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747247_6423, type=LAST_IN_PIPELINE terminating
2025-07-13 11:39:01,870 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747247_6423 replica FinalizedReplica, blk_1073747247_6423, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747247 for deletion
2025-07-13 11:39:01,872 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747247_6423 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747247
2025-07-13 11:39:58,136 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747248_6424 src: /192.168.158.7:36568 dest: /192.168.158.4:9866
2025-07-13 11:39:58,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36568, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1439971743_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747248_6424, duration(ns): 19625106
2025-07-13 11:39:58,163 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747248_6424, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-13 11:40:01,875 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747248_6424 replica FinalizedReplica, blk_1073747248_6424, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747248 for deletion
2025-07-13 11:40:01,876 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747248_6424 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747248
2025-07-13 11:42:08,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747250_6426 src: /192.168.158.5:40942 dest: /192.168.158.4:9866
2025-07-13 11:42:08,168 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40942, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_289799511_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747250_6426, duration(ns): 18366969
2025-07-13 11:42:08,168 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747250_6426, type=LAST_IN_PIPELINE terminating
2025-07-13 11:42:10,880 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747250_6426 replica FinalizedReplica, blk_1073747250_6426, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747250 for deletion
2025-07-13 11:42:10,881 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747250_6426 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747250
2025-07-13 11:43:08,129 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747251_6427 src: /192.168.158.7:35984 dest: /192.168.158.4:9866
2025-07-13 11:43:08,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35984, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1456984084_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747251_6427, duration(ns): 15578706
2025-07-13 11:43:08,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747251_6427, type=LAST_IN_PIPELINE terminating
2025-07-13 11:43:13,882 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747251_6427 replica FinalizedReplica, blk_1073747251_6427, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747251 for deletion
2025-07-13 11:43:13,883 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747251_6427 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747251
2025-07-13 11:44:08,123 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747252_6428 src: /192.168.158.1:37304 dest: /192.168.158.4:9866
2025-07-13 11:44:08,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37304, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-343482687_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747252_6428, duration(ns): 22517073
2025-07-13 11:44:08,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747252_6428, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-13 11:44:10,886 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747252_6428 replica FinalizedReplica, blk_1073747252_6428, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747252 for deletion
2025-07-13 11:44:10,887 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747252_6428 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747252
2025-07-13 11:45:08,135 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747253_6429 src: /192.168.158.1:50478 dest: /192.168.158.4:9866
2025-07-13 11:45:08,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50478, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_490084933_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747253_6429, duration(ns): 22951060
2025-07-13 11:45:08,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747253_6429, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-13 11:45:13,886 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747253_6429 replica FinalizedReplica, blk_1073747253_6429, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747253 for deletion
2025-07-13 11:45:13,887 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747253_6429 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747253
2025-07-13 11:49:18,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747257_6433 src: /192.168.158.8:35026 dest: /192.168.158.4:9866
2025-07-13 11:49:18,179 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35026, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1958695211_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747257_6433, duration(ns): 17656118
2025-07-13 11:49:18,179 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747257_6433, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-13 11:49:22,897 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747257_6433 replica FinalizedReplica, blk_1073747257_6433, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747257 for deletion
2025-07-13 11:49:22,898 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747257_6433 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747257
2025-07-13 11:53:18,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747261_6437 src: /192.168.158.9:52118 dest: /192.168.158.4:9866
2025-07-13 11:53:18,185 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52118, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-138490285_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747261_6437, duration(ns): 18135584
2025-07-13 11:53:18,185 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747261_6437, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 11:53:22,907 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747261_6437 replica FinalizedReplica, blk_1073747261_6437, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747261 for deletion
2025-07-13 11:53:22,908 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747261_6437 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747261
2025-07-13 11:54:18,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747262_6438 src: /192.168.158.1:45380 dest: /192.168.158.4:9866
2025-07-13 11:54:18,191 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45380, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1598189828_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747262_6438, duration(ns): 22040142
2025-07-13 11:54:18,191 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747262_6438, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-13 11:54:22,909 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747262_6438 replica FinalizedReplica, blk_1073747262_6438, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747262 for deletion
2025-07-13 11:54:22,910 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747262_6438 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747262
2025-07-13 11:59:28,150 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747267_6443 src: /192.168.158.1:52080 dest: /192.168.158.4:9866
2025-07-13 11:59:28,183 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52080, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_711027514_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747267_6443, duration(ns): 24189973
2025-07-13 11:59:28,183 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747267_6443, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-13 11:59:34,921 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747267_6443 replica FinalizedReplica, blk_1073747267_6443, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747267 for deletion
2025-07-13 11:59:34,922 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747267_6443 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747267
2025-07-13 12:00:33,152 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747268_6444 src: /192.168.158.1:54576 dest: /192.168.158.4:9866
2025-07-13 12:00:33,183 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54576, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_93732551_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747268_6444, duration(ns): 21995919
2025-07-13 12:00:33,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747268_6444, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-13 12:00:40,922 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747268_6444 replica FinalizedReplica, blk_1073747268_6444, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747268 for deletion
2025-07-13 12:00:40,923 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747268_6444 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747268
2025-07-13 12:03:38,169 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747271_6447 src: /192.168.158.1:47592 dest: /192.168.158.4:9866
2025-07-13 12:03:38,203 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47592, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1287883709_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747271_6447, duration(ns): 23602088
2025-07-13 12:03:38,203 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747271_6447, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-13 12:03:40,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747271_6447 replica FinalizedReplica, blk_1073747271_6447, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747271 for deletion
2025-07-13 12:03:40,930 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747271_6447 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747271
2025-07-13 12:05:38,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747273_6449 src: /192.168.158.1:39346 dest: /192.168.158.4:9866
2025-07-13 12:05:38,194 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39346, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-517149272_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747273_6449, duration(ns): 25303597
2025-07-13 12:05:38,194 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747273_6449, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-13 12:05:43,932 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747273_6449 replica FinalizedReplica, blk_1073747273_6449, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747273 for deletion
2025-07-13 12:05:43,933 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747273_6449 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747273
2025-07-13 12:10:48,168 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747278_6454 src: /192.168.158.9:40552 dest: /192.168.158.4:9866
2025-07-13 12:10:48,192 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40552, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-767382445_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747278_6454, duration(ns): 18346950
2025-07-13 12:10:48,192 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747278_6454, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 12:10:52,943 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747278_6454 replica FinalizedReplica, blk_1073747278_6454, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747278 for deletion
2025-07-13 12:10:52,944 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747278_6454 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747278
2025-07-13 12:17:08,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747284_6460 src: /192.168.158.6:52908 dest: /192.168.158.4:9866
2025-07-13 12:17:08,194 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52908, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-260534705_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747284_6460, duration(ns): 15003939
2025-07-13 12:17:08,194 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747284_6460, type=LAST_IN_PIPELINE terminating
2025-07-13 12:17:10,951 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747284_6460 replica FinalizedReplica, blk_1073747284_6460, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747284 for deletion
2025-07-13 12:17:10,952 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747284_6460 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747284
2025-07-13 12:18:08,176 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747285_6461 src: /192.168.158.8:57604 dest: /192.168.158.4:9866
2025-07-13 12:18:08,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57604, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-213791688_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747285_6461, duration(ns): 15166386
2025-07-13 12:18:08,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747285_6461, type=LAST_IN_PIPELINE terminating
2025-07-13 12:18:10,952 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747285_6461 replica FinalizedReplica, blk_1073747285_6461, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747285 for deletion
2025-07-13 12:18:10,954 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747285_6461 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747285
2025-07-13 12:19:08,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747286_6462 src: /192.168.158.1:39216 dest: /192.168.158.4:9866
2025-07-13 12:19:08,208 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39216, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-301288601_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747286_6462, duration(ns): 22324334
2025-07-13 12:19:08,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747286_6462, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-13 12:19:10,955 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747286_6462 replica FinalizedReplica, blk_1073747286_6462, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747286 for deletion
2025-07-13 12:19:10,956 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747286_6462 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747286
2025-07-13 12:20:13,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747287_6463 src: /192.168.158.8:56276 dest: /192.168.158.4:9866
2025-07-13 12:20:13,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56276, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2041973163_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747287_6463, duration(ns): 18251737
2025-07-13 12:20:13,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747287_6463, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-13 12:20:16,955 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747287_6463 replica FinalizedReplica, blk_1073747287_6463, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747287 for deletion
2025-07-13 12:20:16,957 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747287_6463 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747287
2025-07-13 12:24:23,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747291_6467 src: /192.168.158.6:43370 dest: /192.168.158.4:9866
2025-07-13 12:24:23,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43370, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1375744012_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747291_6467, duration(ns): 18742568
2025-07-13 12:24:23,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747291_6467, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 12:24:28,962 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747291_6467 replica FinalizedReplica, blk_1073747291_6467, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747291 for deletion
2025-07-13 12:24:28,963 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747291_6467 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747291
2025-07-13 12:26:23,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747293_6469 src: /192.168.158.1:54478 dest: /192.168.158.4:9866
2025-07-13 12:26:23,217 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54478, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-191297847_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747293_6469, duration(ns): 23208255
2025-07-13 12:26:23,217 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747293_6469, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-13 12:26:25,969 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747293_6469 replica FinalizedReplica, blk_1073747293_6469, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747293 for deletion
2025-07-13 12:26:25,970 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747293_6469 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747293
2025-07-13 12:28:23,188 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747295_6471 src: /192.168.158.9:56886 dest: /192.168.158.4:9866
2025-07-13 12:28:23,206 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56886, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1149329845_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073747295_6471, duration(ns): 15601898 2025-07-13 12:28:23,206 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747295_6471, type=LAST_IN_PIPELINE terminating 2025-07-13 12:28:25,974 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747295_6471 replica FinalizedReplica, blk_1073747295_6471, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747295 for deletion 2025-07-13 12:28:25,975 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747295_6471 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747295 2025-07-13 12:30:23,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747297_6473 src: /192.168.158.1:33916 dest: /192.168.158.4:9866 2025-07-13 12:30:23,225 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33916, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1030089063_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747297_6473, duration(ns): 23463397 2025-07-13 12:30:23,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747297_6473, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-13 12:30:25,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747297_6473 
replica FinalizedReplica, blk_1073747297_6473, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747297 for deletion 2025-07-13 12:30:25,982 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747297_6473 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747297 2025-07-13 12:32:23,206 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747299_6475 src: /192.168.158.6:36858 dest: /192.168.158.4:9866 2025-07-13 12:32:23,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36858, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_991500485_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747299_6475, duration(ns): 20308963 2025-07-13 12:32:23,232 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747299_6475, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-13 12:32:28,983 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747299_6475 replica FinalizedReplica, blk_1073747299_6475, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747299 for deletion 2025-07-13 12:32:28,984 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073747299_6475 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747299 2025-07-13 12:39:33,202 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747306_6482 src: /192.168.158.5:49970 dest: /192.168.158.4:9866 2025-07-13 12:39:33,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49970, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_643282725_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747306_6482, duration(ns): 19199055 2025-07-13 12:39:33,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747306_6482, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-13 12:39:37,998 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747306_6482 replica FinalizedReplica, blk_1073747306_6482, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747306 for deletion 2025-07-13 12:39:37,999 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747306_6482 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747306 2025-07-13 12:41:38,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747308_6484 src: /192.168.158.5:35422 dest: /192.168.158.4:9866 2025-07-13 12:41:38,238 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35422, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1671212576_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747308_6484, duration(ns): 16810469 2025-07-13 12:41:38,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747308_6484, type=LAST_IN_PIPELINE terminating 2025-07-13 12:41:41,000 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747308_6484 replica FinalizedReplica, blk_1073747308_6484, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747308 for deletion 2025-07-13 12:41:41,002 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747308_6484 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747308 2025-07-13 12:43:38,217 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747310_6486 src: /192.168.158.8:46850 dest: /192.168.158.4:9866 2025-07-13 12:43:38,242 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46850, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1586002616_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747310_6486, duration(ns): 20020113 2025-07-13 12:43:38,242 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073747310_6486, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-13 12:43:41,006 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747310_6486 replica FinalizedReplica, blk_1073747310_6486, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747310 for deletion 2025-07-13 12:43:41,007 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747310_6486 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747310 2025-07-13 12:47:38,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747314_6490 src: /192.168.158.1:47864 dest: /192.168.158.4:9866 2025-07-13 12:47:38,263 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47864, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_993634680_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747314_6490, duration(ns): 25023174 2025-07-13 12:47:38,263 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747314_6490, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-13 12:47:41,018 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747314_6490 replica FinalizedReplica, blk_1073747314_6490, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747314 for deletion 2025-07-13 12:47:41,019 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747314_6490 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747314 2025-07-13 12:48:43,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747315_6491 src: /192.168.158.7:40990 dest: /192.168.158.4:9866 2025-07-13 12:48:43,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40990, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_878289041_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747315_6491, duration(ns): 16407182 2025-07-13 12:48:43,255 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747315_6491, type=LAST_IN_PIPELINE terminating 2025-07-13 12:48:50,020 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747315_6491 replica FinalizedReplica, blk_1073747315_6491, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747315 for deletion 2025-07-13 12:48:50,021 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747315_6491 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747315 2025-07-13 
12:52:43,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747319_6495 src: /192.168.158.5:59920 dest: /192.168.158.4:9866 2025-07-13 12:52:43,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59920, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1504722967_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747319_6495, duration(ns): 15413735 2025-07-13 12:52:43,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747319_6495, type=LAST_IN_PIPELINE terminating 2025-07-13 12:52:47,027 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747319_6495 replica FinalizedReplica, blk_1073747319_6495, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747319 for deletion 2025-07-13 12:52:47,028 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747319_6495 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747319 2025-07-13 12:53:43,238 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747320_6496 src: /192.168.158.9:60144 dest: /192.168.158.4:9866 2025-07-13 12:53:43,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60144, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1121184069_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073747320_6496, duration(ns): 20270623 2025-07-13 12:53:43,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747320_6496, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-13 12:53:50,030 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747320_6496 replica FinalizedReplica, blk_1073747320_6496, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747320 for deletion 2025-07-13 12:53:50,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747320_6496 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747320 2025-07-13 12:55:43,240 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747322_6498 src: /192.168.158.1:44552 dest: /192.168.158.4:9866 2025-07-13 12:55:43,272 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44552, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-809803199_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747322_6498, duration(ns): 23045219 2025-07-13 12:55:43,272 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747322_6498, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-13 12:55:47,031 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747322_6498 replica FinalizedReplica, blk_1073747322_6498, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747322 for deletion 2025-07-13 12:55:47,032 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747322_6498 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747322 2025-07-13 12:57:43,244 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747324_6500 src: /192.168.158.6:35700 dest: /192.168.158.4:9866 2025-07-13 12:57:43,270 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35700, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_779422785_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747324_6500, duration(ns): 20558236 2025-07-13 12:57:43,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747324_6500, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-13 12:57:50,038 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747324_6500 replica FinalizedReplica, blk_1073747324_6500, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747324 for deletion 2025-07-13 12:57:50,039 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747324_6500 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747324 2025-07-13 12:58:43,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747325_6501 src: /192.168.158.1:33900 dest: /192.168.158.4:9866 2025-07-13 12:58:43,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33900, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1699326199_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747325_6501, duration(ns): 22654497 2025-07-13 12:58:43,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747325_6501, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-13 12:58:50,041 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747325_6501 replica FinalizedReplica, blk_1073747325_6501, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747325 for deletion 2025-07-13 12:58:50,042 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747325_6501 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747325 2025-07-13 12:59:43,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073747326_6502 src: /192.168.158.5:34846 dest: /192.168.158.4:9866 2025-07-13 12:59:43,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34846, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2127788022_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747326_6502, duration(ns): 18916769 2025-07-13 12:59:43,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747326_6502, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-13 12:59:47,043 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747326_6502 replica FinalizedReplica, blk_1073747326_6502, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747326 for deletion 2025-07-13 12:59:47,044 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747326_6502 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747326 2025-07-13 13:01:53,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747328_6504 src: /192.168.158.5:48118 dest: /192.168.158.4:9866 2025-07-13 13:01:53,284 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48118, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-810256442_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073747328_6504, duration(ns): 15891110 2025-07-13 13:01:53,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747328_6504, type=LAST_IN_PIPELINE terminating 2025-07-13 13:01:56,045 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747328_6504 replica FinalizedReplica, blk_1073747328_6504, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747328 for deletion 2025-07-13 13:01:56,046 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747328_6504 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747328 2025-07-13 13:16:18,283 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747342_6518 src: /192.168.158.5:53886 dest: /192.168.158.4:9866 2025-07-13 13:16:18,301 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53886, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-535543334_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747342_6518, duration(ns): 16516268 2025-07-13 13:16:18,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747342_6518, type=LAST_IN_PIPELINE terminating 2025-07-13 13:16:23,079 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747342_6518 replica FinalizedReplica, blk_1073747342_6518, FINALIZED getNumBytes() 
= 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747342 for deletion 2025-07-13 13:16:23,080 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747342_6518 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747342 2025-07-13 13:18:23,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747344_6520 src: /192.168.158.9:43920 dest: /192.168.158.4:9866 2025-07-13 13:18:23,304 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43920, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1194007423_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747344_6520, duration(ns): 14873878 2025-07-13 13:18:23,304 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747344_6520, type=LAST_IN_PIPELINE terminating 2025-07-13 13:18:29,082 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747344_6520 replica FinalizedReplica, blk_1073747344_6520, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747344 for deletion 2025-07-13 13:18:29,083 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747344_6520 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747344 2025-07-13 13:19:28,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747345_6521 src: /192.168.158.1:44612 dest: /192.168.158.4:9866 2025-07-13 13:19:28,324 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44612, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-103878112_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747345_6521, duration(ns): 22713831 2025-07-13 13:19:28,325 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747345_6521, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-13 13:19:32,086 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747345_6521 replica FinalizedReplica, blk_1073747345_6521, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747345 for deletion 2025-07-13 13:19:32,087 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747345_6521 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747345 2025-07-13 13:20:28,297 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747346_6522 src: /192.168.158.8:43116 dest: /192.168.158.4:9866 2025-07-13 13:20:28,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.8:43116, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1590778491_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747346_6522, duration(ns): 19629450 2025-07-13 13:20:28,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747346_6522, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-13 13:20:35,089 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747346_6522 replica FinalizedReplica, blk_1073747346_6522, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747346 for deletion 2025-07-13 13:20:35,090 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747346_6522 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747346 2025-07-13 13:22:28,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747348_6524 src: /192.168.158.9:51826 dest: /192.168.158.4:9866 2025-07-13 13:22:28,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51826, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_222599899_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747348_6524, duration(ns): 18965245 2025-07-13 13:22:28,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747348_6524, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-13 13:22:32,093 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747348_6524 replica FinalizedReplica, blk_1073747348_6524, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747348 for deletion 2025-07-13 13:22:32,094 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747348_6524 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747348 2025-07-13 13:24:28,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747350_6526 src: /192.168.158.6:42302 dest: /192.168.158.4:9866 2025-07-13 13:24:28,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42302, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_345101788_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747350_6526, duration(ns): 20534614 2025-07-13 13:24:28,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747350_6526, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-13 13:24:32,094 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747350_6526 replica FinalizedReplica, blk_1073747350_6526, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747350 for deletion 2025-07-13 13:24:32,095 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747350_6526 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747350 2025-07-13 13:25:28,301 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747351_6527 src: /192.168.158.1:60744 dest: /192.168.158.4:9866 2025-07-13 13:25:28,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60744, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1401360358_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747351_6527, duration(ns): 23567141 2025-07-13 13:25:28,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747351_6527, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-13 13:25:32,099 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747351_6527 replica FinalizedReplica, blk_1073747351_6527, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747351 for deletion 2025-07-13 13:25:32,100 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747351_6527 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747351 2025-07-13 13:26:28,301 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747352_6528 src: /192.168.158.8:53696 dest: /192.168.158.4:9866 2025-07-13 13:26:28,328 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53696, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_171414843_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747352_6528, duration(ns): 21881469 2025-07-13 13:26:28,328 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747352_6528, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-13 13:26:32,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747352_6528 replica FinalizedReplica, blk_1073747352_6528, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747352 for deletion 2025-07-13 13:26:32,103 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747352_6528 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747352 2025-07-13 13:28:38,304 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747354_6530 src: /192.168.158.1:34140 dest: /192.168.158.4:9866 2025-07-13 13:28:38,338 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34140, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1090521284_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747354_6530, duration(ns): 24431703 2025-07-13 13:28:38,338 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747354_6530, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-13 13:28:41,107 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747354_6530 replica FinalizedReplica, blk_1073747354_6530, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747354 for deletion 2025-07-13 13:28:41,108 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747354_6530 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747354 2025-07-13 13:29:38,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747355_6531 src: /192.168.158.1:33070 dest: /192.168.158.4:9866 2025-07-13 13:29:38,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33070, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2078720062_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747355_6531, duration(ns): 23480099 2025-07-13 13:29:38,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747355_6531, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-13 13:29:44,107 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747355_6531 replica FinalizedReplica, blk_1073747355_6531, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747355 for deletion 2025-07-13 13:29:44,109 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747355_6531 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747355 2025-07-13 13:32:38,318 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747358_6534 src: /192.168.158.1:44596 dest: /192.168.158.4:9866 2025-07-13 13:32:38,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44596, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1304286703_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747358_6534, duration(ns): 21864937 2025-07-13 13:32:38,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747358_6534, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-13 13:32:44,113 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747358_6534 replica FinalizedReplica, blk_1073747358_6534, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747358 for deletion 2025-07-13 13:32:44,115 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747358_6534 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747358 2025-07-13 13:33:38,330 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747359_6535 src: /192.168.158.9:54178 dest: /192.168.158.4:9866 2025-07-13 13:33:38,348 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54178, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1863695567_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747359_6535, duration(ns): 15327145 2025-07-13 13:33:38,348 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747359_6535, type=LAST_IN_PIPELINE terminating 2025-07-13 13:33:41,117 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747359_6535 replica FinalizedReplica, blk_1073747359_6535, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747359 for deletion 2025-07-13 13:33:41,118 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747359_6535 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747359 2025-07-13 13:37:43,320 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747363_6539 src: /192.168.158.7:33794 dest: /192.168.158.4:9866 2025-07-13 13:37:43,337 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33794, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1882714776_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747363_6539, duration(ns): 15078063 2025-07-13 13:37:43,338 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747363_6539, type=LAST_IN_PIPELINE terminating 2025-07-13 13:37:47,127 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747363_6539 replica FinalizedReplica, blk_1073747363_6539, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747363 for deletion 2025-07-13 13:37:47,128 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747363_6539 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747363 2025-07-13 13:39:48,332 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747365_6541 src: /192.168.158.7:54318 dest: /192.168.158.4:9866 2025-07-13 13:39:48,357 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54318, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-458122763_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073747365_6541, duration(ns): 20079135 2025-07-13 13:39:48,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747365_6541, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-13 13:39:53,130 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747365_6541 replica FinalizedReplica, blk_1073747365_6541, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747365 for deletion 2025-07-13 13:39:53,131 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747365_6541 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747365 2025-07-13 13:40:48,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747366_6542 src: /192.168.158.1:34694 dest: /192.168.158.4:9866 2025-07-13 13:40:48,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34694, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1575449927_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747366_6542, duration(ns): 25401645 2025-07-13 13:40:48,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747366_6542, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-13 13:40:53,131 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747366_6542 replica FinalizedReplica, blk_1073747366_6542, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747366 for deletion 2025-07-13 13:40:53,132 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747366_6542 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747366 2025-07-13 13:43:48,350 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747369_6545 src: /192.168.158.5:56744 dest: /192.168.158.4:9866 2025-07-13 13:43:48,374 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56744, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1537934571_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747369_6545, duration(ns): 19245443 2025-07-13 13:43:48,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747369_6545, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-13 13:43:53,134 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747369_6545 replica FinalizedReplica, blk_1073747369_6545, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747369 for deletion 2025-07-13 13:43:53,136 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747369_6545 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747369 2025-07-13 13:45:53,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747371_6547 src: /192.168.158.7:46972 dest: /192.168.158.4:9866 2025-07-13 13:45:53,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46972, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1213665757_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747371_6547, duration(ns): 19602690 2025-07-13 13:45:53,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747371_6547, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-13 13:45:56,140 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747371_6547 replica FinalizedReplica, blk_1073747371_6547, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747371 for deletion 2025-07-13 13:45:56,141 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747371_6547 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747371 2025-07-13 13:51:03,341 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747376_6552 src: 
/192.168.158.7:40420 dest: /192.168.158.4:9866 2025-07-13 13:51:03,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40420, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1150767917_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747376_6552, duration(ns): 19914758 2025-07-13 13:51:03,367 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747376_6552, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-13 13:51:08,154 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747376_6552 replica FinalizedReplica, blk_1073747376_6552, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747376 for deletion 2025-07-13 13:51:08,155 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747376_6552 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747376 2025-07-13 13:52:03,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747377_6553 src: /192.168.158.7:43762 dest: /192.168.158.4:9866 2025-07-13 13:52:03,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43762, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1094725983_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747377_6553, duration(ns): 19128227 2025-07-13 13:52:03,383 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747377_6553, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-13 13:52:11,155 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747377_6553 replica FinalizedReplica, blk_1073747377_6553, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747377 for deletion 2025-07-13 13:52:11,156 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747377_6553 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747377 2025-07-13 13:53:03,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747378_6554 src: /192.168.158.5:38944 dest: /192.168.158.4:9866 2025-07-13 13:53:03,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38944, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_436272205_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747378_6554, duration(ns): 17169161 2025-07-13 13:53:03,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747378_6554, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-13 13:53:11,158 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747378_6554 replica FinalizedReplica, blk_1073747378_6554, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 
56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747378 for deletion 2025-07-13 13:53:11,159 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747378_6554 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747378 2025-07-13 13:55:08,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747380_6556 src: /192.168.158.7:51118 dest: /192.168.158.4:9866 2025-07-13 13:55:08,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51118, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1322483207_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747380_6556, duration(ns): 15447397 2025-07-13 13:55:08,373 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747380_6556, type=LAST_IN_PIPELINE terminating 2025-07-13 13:55:14,161 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747380_6556 replica FinalizedReplica, blk_1073747380_6556, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747380 for deletion 2025-07-13 13:55:14,162 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747380_6556 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747380 2025-07-13 13:57:08,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747382_6558 src: /192.168.158.6:56518 dest: /192.168.158.4:9866 2025-07-13 13:57:08,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56518, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_810896730_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747382_6558, duration(ns): 20249687 2025-07-13 13:57:08,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747382_6558, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-13 13:57:14,166 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747382_6558 replica FinalizedReplica, blk_1073747382_6558, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747382 for deletion 2025-07-13 13:57:14,167 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747382_6558 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747382 2025-07-13 14:04:13,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747389_6565 src: /192.168.158.8:46516 dest: /192.168.158.4:9866 2025-07-13 14:04:13,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46516, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_617063390_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747389_6565, duration(ns): 15480603 2025-07-13 14:04:13,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747389_6565, type=LAST_IN_PIPELINE terminating 2025-07-13 14:04:17,180 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747389_6565 replica FinalizedReplica, blk_1073747389_6565, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747389 for deletion 2025-07-13 14:04:17,181 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747389_6565 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747389 2025-07-13 14:05:18,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747390_6566 src: /192.168.158.9:56748 dest: /192.168.158.4:9866 2025-07-13 14:05:18,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56748, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_10011839_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747390_6566, duration(ns): 19426669 2025-07-13 14:05:18,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747390_6566, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-13 
14:05:23,183 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747390_6566 replica FinalizedReplica, blk_1073747390_6566, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747390 for deletion
2025-07-13 14:05:23,184 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747390_6566 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747390
2025-07-13 14:07:18,371 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747392_6568 src: /192.168.158.8:37350 dest: /192.168.158.4:9866
2025-07-13 14:07:18,389 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37350, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1219474344_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747392_6568, duration(ns): 15607463
2025-07-13 14:07:18,390 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747392_6568, type=LAST_IN_PIPELINE terminating
2025-07-13 14:07:23,188 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747392_6568 replica FinalizedReplica, blk_1073747392_6568, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747392 for deletion
2025-07-13 14:07:23,189 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747392_6568 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747392
2025-07-13 14:08:18,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747393_6569 src: /192.168.158.6:49428 dest: /192.168.158.4:9866
2025-07-13 14:08:18,426 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49428, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1059534087_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747393_6569, duration(ns): 20242955
2025-07-13 14:08:18,426 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747393_6569, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 14:08:23,189 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747393_6569 replica FinalizedReplica, blk_1073747393_6569, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747393 for deletion
2025-07-13 14:08:23,190 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747393_6569 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747393
2025-07-13 14:10:18,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747395_6571 src: /192.168.158.6:40182 dest: /192.168.158.4:9866
2025-07-13 14:10:18,390 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40182, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1897290921_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747395_6571, duration(ns): 20459576
2025-07-13 14:10:18,390 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747395_6571, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 14:10:26,196 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747395_6571 replica FinalizedReplica, blk_1073747395_6571, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747395 for deletion
2025-07-13 14:10:26,197 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747395_6571 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747395
2025-07-13 14:12:23,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747397_6573 src: /192.168.158.1:49458 dest: /192.168.158.4:9866
2025-07-13 14:12:23,420 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49458, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1876992599_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747397_6573, duration(ns): 26104501
2025-07-13 14:12:23,420 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747397_6573, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-13 14:12:26,199 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747397_6573 replica FinalizedReplica, blk_1073747397_6573, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747397 for deletion
2025-07-13 14:12:26,200 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747397_6573 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747397
2025-07-13 14:13:28,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747398_6574 src: /192.168.158.5:33152 dest: /192.168.158.4:9866
2025-07-13 14:13:28,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33152, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_266043612_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747398_6574, duration(ns): 15204061
2025-07-13 14:13:28,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747398_6574, type=LAST_IN_PIPELINE terminating
2025-07-13 14:13:35,202 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747398_6574 replica FinalizedReplica, blk_1073747398_6574, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747398 for deletion
2025-07-13 14:13:35,203 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747398_6574 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747398
2025-07-13 14:14:28,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747399_6575 src: /192.168.158.9:43756 dest: /192.168.158.4:9866
2025-07-13 14:14:28,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43756, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1617838387_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747399_6575, duration(ns): 15848874
2025-07-13 14:14:28,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747399_6575, type=LAST_IN_PIPELINE terminating
2025-07-13 14:14:35,203 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747399_6575 replica FinalizedReplica, blk_1073747399_6575, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747399 for deletion
2025-07-13 14:14:35,205 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747399_6575 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747399
2025-07-13 14:16:28,376 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747401_6577 src: /192.168.158.1:54906 dest: /192.168.158.4:9866
2025-07-13 14:16:28,407 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54906, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1133198622_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747401_6577, duration(ns): 22171793
2025-07-13 14:16:28,408 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747401_6577, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-13 14:16:35,206 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747401_6577 replica FinalizedReplica, blk_1073747401_6577, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747401 for deletion
2025-07-13 14:16:35,207 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747401_6577 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747401
2025-07-13 14:20:33,376 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747405_6581 src: /192.168.158.9:37142 dest: /192.168.158.4:9866
2025-07-13 14:20:33,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37142, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2105462741_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747405_6581, duration(ns): 17871742
2025-07-13 14:20:33,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747405_6581, type=LAST_IN_PIPELINE terminating
2025-07-13 14:20:35,215 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747405_6581 replica FinalizedReplica, blk_1073747405_6581, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747405 for deletion
2025-07-13 14:20:35,216 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747405_6581 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747405
2025-07-13 14:21:38,373 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747406_6582 src: /192.168.158.1:45076 dest: /192.168.158.4:9866
2025-07-13 14:21:38,404 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45076, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1800532452_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747406_6582, duration(ns): 22347786
2025-07-13 14:21:38,404 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747406_6582, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-13 14:21:41,220 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747406_6582 replica FinalizedReplica, blk_1073747406_6582, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747406 for deletion
2025-07-13 14:21:41,221 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747406_6582 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747406
2025-07-13 14:23:43,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747408_6584 src: /192.168.158.1:48430 dest: /192.168.158.4:9866
2025-07-13 14:23:43,425 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48430, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1558808725_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747408_6584, duration(ns): 22747480
2025-07-13 14:23:43,425 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747408_6584, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-13 14:23:50,224 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747408_6584 replica FinalizedReplica, blk_1073747408_6584, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747408 for deletion
2025-07-13 14:23:50,225 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747408_6584 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747408
2025-07-13 14:24:48,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747409_6585 src: /192.168.158.1:59080 dest: /192.168.158.4:9866
2025-07-13 14:24:48,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59080, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_611396958_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747409_6585, duration(ns): 23554185
2025-07-13 14:24:48,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747409_6585, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-13 14:24:50,225 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747409_6585 replica FinalizedReplica, blk_1073747409_6585, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747409 for deletion
2025-07-13 14:24:50,228 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747409_6585 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747409
2025-07-13 14:30:58,412 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747415_6591 src: /192.168.158.1:51950 dest: /192.168.158.4:9866
2025-07-13 14:30:58,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51950, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2013235716_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747415_6591, duration(ns): 23563572
2025-07-13 14:30:58,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747415_6591, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-13 14:31:02,240 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747415_6591 replica FinalizedReplica, blk_1073747415_6591, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747415 for deletion
2025-07-13 14:31:02,241 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747415_6591 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747415
2025-07-13 14:34:08,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747418_6594 src: /192.168.158.1:51652 dest: /192.168.158.4:9866
2025-07-13 14:34:08,435 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51652, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1190757254_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747418_6594, duration(ns): 26129912
2025-07-13 14:34:08,436 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747418_6594, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-13 14:34:14,250 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747418_6594 replica FinalizedReplica, blk_1073747418_6594, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747418 for deletion
2025-07-13 14:34:14,252 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747418_6594 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747418
2025-07-13 14:35:08,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747419_6595 src: /192.168.158.8:38332 dest: /192.168.158.4:9866
2025-07-13 14:35:08,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38332, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_758310366_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747419_6595, duration(ns): 19205387
2025-07-13 14:35:08,419 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747419_6595, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 14:35:11,254 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747419_6595 replica FinalizedReplica, blk_1073747419_6595, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747419 for deletion
2025-07-13 14:35:11,255 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747419_6595 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747419
2025-07-13 14:36:08,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747420_6596 src: /192.168.158.6:38728 dest: /192.168.158.4:9866
2025-07-13 14:36:08,434 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38728, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1926631089_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747420_6596, duration(ns): 19932115
2025-07-13 14:36:08,434 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747420_6596, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-13 14:36:11,256 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747420_6596 replica FinalizedReplica, blk_1073747420_6596, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747420 for deletion
2025-07-13 14:36:11,257 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747420_6596 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747420
2025-07-13 14:37:13,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747421_6597 src: /192.168.158.9:59304 dest: /192.168.158.4:9866
2025-07-13 14:37:13,423 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59304, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1867564938_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747421_6597, duration(ns): 18233503
2025-07-13 14:37:13,424 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747421_6597, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 14:37:20,258 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747421_6597 replica FinalizedReplica, blk_1073747421_6597, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747421 for deletion
2025-07-13 14:37:20,259 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747421_6597 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747421
2025-07-13 14:40:18,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747424_6600 src: /192.168.158.1:51894 dest: /192.168.158.4:9866
2025-07-13 14:40:18,435 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51894, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_837789440_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747424_6600, duration(ns): 22852966
2025-07-13 14:40:18,435 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747424_6600, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-13 14:40:20,267 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747424_6600 replica FinalizedReplica, blk_1073747424_6600, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747424 for deletion
2025-07-13 14:40:20,268 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747424_6600 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747424
2025-07-13 14:41:18,407 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747425_6601 src: /192.168.158.7:49006 dest: /192.168.158.4:9866
2025-07-13 14:41:18,433 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49006, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_748328192_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747425_6601, duration(ns): 20530649
2025-07-13 14:41:18,434 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747425_6601, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-13 14:41:23,270 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747425_6601 replica FinalizedReplica, blk_1073747425_6601, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747425 for deletion
2025-07-13 14:41:23,271 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747425_6601 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747425
2025-07-13 14:43:28,407 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747427_6603 src: /192.168.158.1:35646 dest: /192.168.158.4:9866
2025-07-13 14:43:28,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35646, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_646682172_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747427_6603, duration(ns): 21540364
2025-07-13 14:43:28,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747427_6603, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-13 14:43:35,273 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747427_6603 replica FinalizedReplica, blk_1073747427_6603, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747427 for deletion
2025-07-13 14:43:35,274 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747427_6603 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747427
2025-07-13 14:44:28,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747428_6604 src: /192.168.158.1:52536 dest: /192.168.158.4:9866
2025-07-13 14:44:28,446 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52536, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1740105525_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747428_6604, duration(ns): 23175262
2025-07-13 14:44:28,446 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747428_6604, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-13 14:44:32,274 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747428_6604 replica FinalizedReplica, blk_1073747428_6604, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747428 for deletion
2025-07-13 14:44:32,275 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747428_6604 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747428
2025-07-13 14:47:33,419 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747431_6607 src: /192.168.158.8:57624 dest: /192.168.158.4:9866
2025-07-13 14:47:33,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57624, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1121090197_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747431_6607, duration(ns): 16724724
2025-07-13 14:47:33,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747431_6607, type=LAST_IN_PIPELINE terminating
2025-07-13 14:47:38,280 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747431_6607 replica FinalizedReplica, blk_1073747431_6607, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747431 for deletion
2025-07-13 14:47:38,281 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747431_6607 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747431
2025-07-13 14:50:33,422 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747434_6610 src: /192.168.158.5:36008 dest: /192.168.158.4:9866
2025-07-13 14:50:33,441 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36008, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-72621023_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747434_6610, duration(ns): 16822468
2025-07-13 14:50:33,441 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747434_6610, type=LAST_IN_PIPELINE terminating
2025-07-13 14:50:35,284 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747434_6610 replica FinalizedReplica, blk_1073747434_6610, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747434 for deletion
2025-07-13 14:50:35,285 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747434_6610 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747434
2025-07-13 14:51:33,426 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747435_6611 src: /192.168.158.8:43514 dest: /192.168.158.4:9866
2025-07-13 14:51:33,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43514, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_64553862_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747435_6611, duration(ns): 14632628
2025-07-13 14:51:33,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747435_6611, type=LAST_IN_PIPELINE terminating
2025-07-13 14:51:35,284 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747435_6611 replica FinalizedReplica, blk_1073747435_6611, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747435 for deletion
2025-07-13 14:51:35,286 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747435_6611 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747435
2025-07-13 14:52:33,433 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747436_6612 src: /192.168.158.8:47076 dest: /192.168.158.4:9866
2025-07-13 14:52:33,450 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47076, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1028337160_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747436_6612, duration(ns): 15685439
2025-07-13 14:52:33,450 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747436_6612, type=LAST_IN_PIPELINE terminating
2025-07-13 14:52:35,287 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747436_6612 replica FinalizedReplica, blk_1073747436_6612, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747436 for deletion
2025-07-13 14:52:35,288 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747436_6612 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747436
2025-07-13 14:53:33,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747437_6613 src: /192.168.158.1:52398 dest: /192.168.158.4:9866
2025-07-13 14:53:33,446 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52398, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-894378046_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747437_6613, duration(ns): 22753981
2025-07-13 14:53:33,447 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747437_6613, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-13 14:53:35,288 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747437_6613 replica FinalizedReplica, blk_1073747437_6613, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747437 for deletion
2025-07-13 14:53:35,290 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747437_6613 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747437
2025-07-13 14:56:38,424 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747440_6616 src: /192.168.158.1:40348 dest: /192.168.158.4:9866
2025-07-13 14:56:38,454 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40348, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1998760631_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747440_6616, duration(ns): 20201260
2025-07-13 14:56:38,454 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747440_6616, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-13 14:56:41,295 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747440_6616 replica FinalizedReplica, blk_1073747440_6616, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747440 for deletion
2025-07-13 14:56:41,297 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747440_6616 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747440
2025-07-13 14:58:38,428 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747442_6618 src: /192.168.158.9:58520 dest: /192.168.158.4:9866
2025-07-13 14:58:38,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58520, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_265765619_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073747442_6618, duration(ns): 14597114 2025-07-13 14:58:38,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747442_6618, type=LAST_IN_PIPELINE terminating 2025-07-13 14:58:41,297 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747442_6618 replica FinalizedReplica, blk_1073747442_6618, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747442 for deletion 2025-07-13 14:58:41,298 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747442_6618 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747442 2025-07-13 15:00:48,434 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747444_6620 src: /192.168.158.5:48056 dest: /192.168.158.4:9866 2025-07-13 15:00:48,460 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48056, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1466052516_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747444_6620, duration(ns): 19370184 2025-07-13 15:00:48,461 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747444_6620, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-13 15:00:50,299 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747444_6620 replica FinalizedReplica, 
blk_1073747444_6620, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747444 for deletion 2025-07-13 15:00:50,300 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747444_6620 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747444 2025-07-13 15:03:58,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747447_6623 src: /192.168.158.9:57916 dest: /192.168.158.4:9866 2025-07-13 15:03:58,460 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57916, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_612469540_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747447_6623, duration(ns): 17553954 2025-07-13 15:03:58,461 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747447_6623, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-13 15:04:05,306 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747447_6623 replica FinalizedReplica, blk_1073747447_6623, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747447 for deletion 2025-07-13 15:04:05,307 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 
blk_1073747447_6623 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747447 2025-07-13 15:05:03,448 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747448_6624 src: /192.168.158.6:52184 dest: /192.168.158.4:9866 2025-07-13 15:05:03,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52184, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-223543341_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747448_6624, duration(ns): 14689888 2025-07-13 15:05:03,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747448_6624, type=LAST_IN_PIPELINE terminating 2025-07-13 15:05:05,310 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747448_6624 replica FinalizedReplica, blk_1073747448_6624, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747448 for deletion 2025-07-13 15:05:05,311 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747448_6624 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747448 2025-07-13 15:07:08,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747450_6626 src: /192.168.158.6:34114 dest: /192.168.158.4:9866 2025-07-13 15:07:08,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34114, dest: /192.168.158.4:9866, 
bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1339699663_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747450_6626, duration(ns): 20368575 2025-07-13 15:07:08,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747450_6626, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-13 15:07:11,316 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747450_6626 replica FinalizedReplica, blk_1073747450_6626, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747450 for deletion 2025-07-13 15:07:11,317 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747450_6626 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747450 2025-07-13 15:09:13,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747452_6628 src: /192.168.158.1:46562 dest: /192.168.158.4:9866 2025-07-13 15:09:13,461 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46562, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_721166204_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747452_6628, duration(ns): 23716528 2025-07-13 15:09:13,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747452_6628, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-13 15:09:20,318 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747452_6628 replica FinalizedReplica, blk_1073747452_6628, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747452 for deletion 2025-07-13 15:09:20,319 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747452_6628 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747452 2025-07-13 15:10:13,475 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747453_6629 src: /192.168.158.6:50368 dest: /192.168.158.4:9866 2025-07-13 15:10:13,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50368, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_876300763_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747453_6629, duration(ns): 14804125 2025-07-13 15:10:13,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747453_6629, type=LAST_IN_PIPELINE terminating 2025-07-13 15:10:17,320 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747453_6629 replica FinalizedReplica, blk_1073747453_6629, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747453 for deletion 2025-07-13 15:10:17,321 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747453_6629 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747453 2025-07-13 15:11:13,468 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747454_6630 src: /192.168.158.8:34302 dest: /192.168.158.4:9866 2025-07-13 15:11:13,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34302, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-740810047_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747454_6630, duration(ns): 21939791 2025-07-13 15:11:13,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747454_6630, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-13 15:11:20,324 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747454_6630 replica FinalizedReplica, blk_1073747454_6630, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747454 for deletion 2025-07-13 15:11:20,325 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747454_6630 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747454 
2025-07-13 15:12:13,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747455_6631 src: /192.168.158.7:48640 dest: /192.168.158.4:9866 2025-07-13 15:12:13,466 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48640, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1001546115_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747455_6631, duration(ns): 15715472 2025-07-13 15:12:13,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747455_6631, type=LAST_IN_PIPELINE terminating 2025-07-13 15:12:17,325 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747455_6631 replica FinalizedReplica, blk_1073747455_6631, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747455 for deletion 2025-07-13 15:12:17,326 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747455_6631 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073747455 2025-07-13 15:13:13,450 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747456_6632 src: /192.168.158.9:57332 dest: /192.168.158.4:9866 2025-07-13 15:13:13,468 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57332, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_830924216_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073747456_6632, duration(ns): 15817664 2025-07-13 15:13:13,468 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747456_6632, type=LAST_IN_PIPELINE terminating 2025-07-13 15:13:17,328 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747456_6632 replica FinalizedReplica, blk_1073747456_6632, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747456 for deletion 2025-07-13 15:13:17,329 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747456_6632 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747456 2025-07-13 15:16:18,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747459_6635 src: /192.168.158.6:34642 dest: /192.168.158.4:9866 2025-07-13 15:16:18,474 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34642, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_241773501_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747459_6635, duration(ns): 19553828 2025-07-13 15:16:18,475 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747459_6635, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-13 15:16:20,331 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747459_6635 replica FinalizedReplica, 
blk_1073747459_6635, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747459 for deletion 2025-07-13 15:16:20,333 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747459_6635 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747459 2025-07-13 15:18:23,450 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747461_6637 src: /192.168.158.1:60502 dest: /192.168.158.4:9866 2025-07-13 15:18:23,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60502, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2048266703_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747461_6637, duration(ns): 21784412 2025-07-13 15:18:23,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747461_6637, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-13 15:18:26,333 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747461_6637 replica FinalizedReplica, blk_1073747461_6637, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747461 for deletion 2025-07-13 15:18:26,334 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073747461_6637 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747461 2025-07-13 15:19:23,453 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747462_6638 src: /192.168.158.1:54466 dest: /192.168.158.4:9866 2025-07-13 15:19:23,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54466, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_765698045_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747462_6638, duration(ns): 22635960 2025-07-13 15:19:23,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747462_6638, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-13 15:19:26,337 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747462_6638 replica FinalizedReplica, blk_1073747462_6638, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747462 for deletion 2025-07-13 15:19:26,338 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747462_6638 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747462 2025-07-13 15:20:28,454 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747463_6639 src: /192.168.158.1:59628 dest: /192.168.158.4:9866 2025-07-13 15:20:28,486 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59628, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-825421947_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747463_6639, duration(ns): 22814548 2025-07-13 15:20:28,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747463_6639, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-13 15:20:35,337 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747463_6639 replica FinalizedReplica, blk_1073747463_6639, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747463 for deletion 2025-07-13 15:20:35,339 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747463_6639 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747463 2025-07-13 15:21:28,463 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747464_6640 src: /192.168.158.9:51224 dest: /192.168.158.4:9866 2025-07-13 15:21:28,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51224, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-598803727_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747464_6640, duration(ns): 16853575 2025-07-13 15:21:28,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747464_6640, type=LAST_IN_PIPELINE terminating 2025-07-13 15:21:35,339 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747464_6640 replica FinalizedReplica, blk_1073747464_6640, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747464 for deletion 2025-07-13 15:21:35,341 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747464_6640 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747464 2025-07-13 15:22:28,463 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747465_6641 src: /192.168.158.6:36422 dest: /192.168.158.4:9866 2025-07-13 15:22:28,481 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36422, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1615244178_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747465_6641, duration(ns): 15866256 2025-07-13 15:22:28,481 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747465_6641, type=LAST_IN_PIPELINE terminating 2025-07-13 15:22:35,339 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747465_6641 replica FinalizedReplica, blk_1073747465_6641, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747465 for deletion 2025-07-13 15:22:35,341 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747465_6641 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747465 2025-07-13 15:23:28,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747466_6642 src: /192.168.158.6:35498 dest: /192.168.158.4:9866 2025-07-13 15:23:28,489 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35498, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_559027070_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747466_6642, duration(ns): 18609770 2025-07-13 15:23:28,490 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747466_6642, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-13 15:23:32,342 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747466_6642 replica FinalizedReplica, blk_1073747466_6642, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747466 for deletion 2025-07-13 15:23:32,343 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747466_6642 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747466 
2025-07-13 15:24:28,463 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747467_6643 src: /192.168.158.1:59118 dest: /192.168.158.4:9866
2025-07-13 15:24:28,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59118, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1588526684_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747467_6643, duration(ns): 24141945
2025-07-13 15:24:28,499 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747467_6643, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-13 15:24:35,344 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747467_6643 replica FinalizedReplica, blk_1073747467_6643, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747467 for deletion
2025-07-13 15:24:35,346 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747467_6643 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747467
2025-07-13 15:25:33,435 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747468_6644 src: /192.168.158.1:50104 dest: /192.168.158.4:9866
2025-07-13 15:25:33,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50104, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1042511960_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747468_6644, duration(ns): 23184472
2025-07-13 15:25:33,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747468_6644, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-13 15:25:35,345 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747468_6644 replica FinalizedReplica, blk_1073747468_6644, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747468 for deletion
2025-07-13 15:25:35,346 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747468_6644 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747468
2025-07-13 15:26:33,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747469_6645 src: /192.168.158.9:38994 dest: /192.168.158.4:9866
2025-07-13 15:26:33,487 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38994, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2011753749_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747469_6645, duration(ns): 19496641
2025-07-13 15:26:33,487 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747469_6645, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 15:26:35,347 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747469_6645 replica FinalizedReplica, blk_1073747469_6645, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747469 for deletion
2025-07-13 15:26:35,348 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747469_6645 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747469
2025-07-13 15:27:33,460 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747470_6646 src: /192.168.158.1:40548 dest: /192.168.158.4:9866
2025-07-13 15:27:33,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40548, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-986986208_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747470_6646, duration(ns): 25559009
2025-07-13 15:27:33,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747470_6646, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-13 15:27:35,351 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747470_6646 replica FinalizedReplica, blk_1073747470_6646, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747470 for deletion
2025-07-13 15:27:35,352 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747470_6646 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747470
2025-07-13 15:28:33,474 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747471_6647 src: /192.168.158.7:50476 dest: /192.168.158.4:9866
2025-07-13 15:28:33,492 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50476, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-38809194_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747471_6647, duration(ns): 15606532
2025-07-13 15:28:33,492 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747471_6647, type=LAST_IN_PIPELINE terminating
2025-07-13 15:28:35,355 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747471_6647 replica FinalizedReplica, blk_1073747471_6647, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747471 for deletion
2025-07-13 15:28:35,356 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747471_6647 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747471
2025-07-13 15:30:33,479 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747473_6649 src: /192.168.158.5:49800 dest: /192.168.158.4:9866
2025-07-13 15:30:33,496 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49800, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2037183648_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747473_6649, duration(ns): 14552207
2025-07-13 15:30:33,496 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747473_6649, type=LAST_IN_PIPELINE terminating
2025-07-13 15:30:38,360 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747473_6649 replica FinalizedReplica, blk_1073747473_6649, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747473 for deletion
2025-07-13 15:30:38,361 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747473_6649 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747473
2025-07-13 15:31:33,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747474_6650 src: /192.168.158.6:52520 dest: /192.168.158.4:9866
2025-07-13 15:31:33,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52520, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1124403870_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747474_6650, duration(ns): 19592020
2025-07-13 15:31:33,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747474_6650, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 15:31:38,361 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747474_6650 replica FinalizedReplica, blk_1073747474_6650, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747474 for deletion
2025-07-13 15:31:38,363 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747474_6650 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747474
2025-07-13 15:32:33,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747475_6651 src: /192.168.158.5:46676 dest: /192.168.158.4:9866
2025-07-13 15:32:33,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46676, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1042561810_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747475_6651, duration(ns): 16161526
2025-07-13 15:32:33,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747475_6651, type=LAST_IN_PIPELINE terminating
2025-07-13 15:32:35,363 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747475_6651 replica FinalizedReplica, blk_1073747475_6651, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747475 for deletion
2025-07-13 15:32:35,364 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747475_6651 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747475
2025-07-13 15:37:33,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747480_6656 src: /192.168.158.9:39086 dest: /192.168.158.4:9866
2025-07-13 15:37:33,506 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39086, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1349395673_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747480_6656, duration(ns): 18576178
2025-07-13 15:37:33,507 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747480_6656, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-13 15:37:35,373 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747480_6656 replica FinalizedReplica, blk_1073747480_6656, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747480 for deletion
2025-07-13 15:37:35,374 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747480_6656 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747480
2025-07-13 15:39:33,492 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747482_6658 src: /192.168.158.8:41390 dest: /192.168.158.4:9866
2025-07-13 15:39:33,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41390, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_717497407_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747482_6658, duration(ns): 22321829
2025-07-13 15:39:33,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747482_6658, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-13 15:39:38,380 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747482_6658 replica FinalizedReplica, blk_1073747482_6658, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747482 for deletion
2025-07-13 15:39:38,381 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747482_6658 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747482
2025-07-13 15:41:38,488 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747484_6660 src: /192.168.158.5:55488 dest: /192.168.158.4:9866
2025-07-13 15:41:38,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55488, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1760243511_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747484_6660, duration(ns): 20065907
2025-07-13 15:41:38,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747484_6660, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-13 15:41:44,387 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747484_6660 replica FinalizedReplica, blk_1073747484_6660, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747484 for deletion
2025-07-13 15:41:44,388 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747484_6660 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747484
2025-07-13 15:42:38,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747485_6661 src: /192.168.158.8:48092 dest: /192.168.158.4:9866
2025-07-13 15:42:38,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48092, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1301905383_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747485_6661, duration(ns): 17982572
2025-07-13 15:42:38,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747485_6661, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-13 15:42:41,388 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747485_6661 replica FinalizedReplica, blk_1073747485_6661, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747485 for deletion
2025-07-13 15:42:41,389 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747485_6661 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747485
2025-07-13 15:44:43,488 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747487_6663 src: /192.168.158.1:33592 dest: /192.168.158.4:9866
2025-07-13 15:44:43,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33592, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1048289990_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747487_6663, duration(ns): 23532856
2025-07-13 15:44:43,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747487_6663, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-13 15:44:47,393 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747487_6663 replica FinalizedReplica, blk_1073747487_6663, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747487 for deletion
2025-07-13 15:44:47,394 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747487_6663 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747487
2025-07-13 15:45:43,496 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747488_6664 src: /192.168.158.7:59508 dest: /192.168.158.4:9866
2025-07-13 15:45:43,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59508, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1387050723_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747488_6664, duration(ns): 16197594
2025-07-13 15:45:43,515 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747488_6664, type=LAST_IN_PIPELINE terminating
2025-07-13 15:45:47,395 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747488_6664 replica FinalizedReplica, blk_1073747488_6664, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747488 for deletion
2025-07-13 15:45:47,396 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747488_6664 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747488
2025-07-13 15:50:48,487 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747493_6669 src: /192.168.158.7:47078 dest: /192.168.158.4:9866
2025-07-13 15:50:48,513 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47078, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2008450315_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747493_6669, duration(ns): 20742931
2025-07-13 15:50:48,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747493_6669, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-13 15:50:53,403 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747493_6669 replica FinalizedReplica, blk_1073747493_6669, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747493 for deletion
2025-07-13 15:50:53,404 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747493_6669 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747493
2025-07-13 15:51:48,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747494_6670 src: /192.168.158.1:37426 dest: /192.168.158.4:9866
2025-07-13 15:51:48,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37426, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1719187852_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747494_6670, duration(ns): 24599990
2025-07-13 15:51:48,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747494_6670, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-13 15:51:53,406 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747494_6670 replica FinalizedReplica, blk_1073747494_6670, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747494 for deletion
2025-07-13 15:51:53,407 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747494_6670 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747494
2025-07-13 15:52:48,489 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747495_6671 src: /192.168.158.1:49658 dest: /192.168.158.4:9866
2025-07-13 15:52:48,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49658, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1251077568_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747495_6671, duration(ns): 22107249
2025-07-13 15:52:48,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747495_6671, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-13 15:52:50,405 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747495_6671 replica FinalizedReplica, blk_1073747495_6671, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747495 for deletion
2025-07-13 15:52:50,407 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747495_6671 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747495
2025-07-13 15:57:58,534 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747500_6676 src: /192.168.158.9:58674 dest: /192.168.158.4:9866
2025-07-13 15:57:58,552 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58674, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_17146171_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747500_6676, duration(ns): 15413114
2025-07-13 15:57:58,552 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747500_6676, type=LAST_IN_PIPELINE terminating
2025-07-13 15:58:05,416 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747500_6676 replica FinalizedReplica, blk_1073747500_6676, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747500 for deletion
2025-07-13 15:58:05,418 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747500_6676 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747500
2025-07-13 15:58:58,551 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747501_6677 src: /192.168.158.7:48286 dest: /192.168.158.4:9866
2025-07-13 15:58:58,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48286, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_479674083_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747501_6677, duration(ns): 17301114
2025-07-13 15:58:58,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747501_6677, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 15:59:02,421 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747501_6677 replica FinalizedReplica, blk_1073747501_6677, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747501 for deletion
2025-07-13 15:59:02,422 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747501_6677 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747501
2025-07-13 15:59:58,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747502_6678 src: /192.168.158.6:46242 dest: /192.168.158.4:9866
2025-07-13 15:59:58,540 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46242, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_837347063_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747502_6678, duration(ns): 13645494
2025-07-13 15:59:58,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747502_6678, type=LAST_IN_PIPELINE terminating
2025-07-13 16:00:05,423 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747502_6678 replica FinalizedReplica, blk_1073747502_6678, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747502 for deletion
2025-07-13 16:00:05,424 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747502_6678 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747502
2025-07-13 16:01:58,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747504_6680 src: /192.168.158.1:38684 dest: /192.168.158.4:9866
2025-07-13 16:01:58,544 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38684, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-610204576_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747504_6680, duration(ns): 23253284
2025-07-13 16:01:58,544 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747504_6680, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-13 16:02:05,426 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747504_6680 replica FinalizedReplica, blk_1073747504_6680, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747504 for deletion
2025-07-13 16:02:05,427 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747504_6680 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747504
2025-07-13 16:05:03,515 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747507_6683 src: /192.168.158.1:57044 dest: /192.168.158.4:9866
2025-07-13 16:05:03,546 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57044, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1203734182_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747507_6683, duration(ns): 22034892
2025-07-13 16:05:03,546 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747507_6683, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-13 16:05:05,434 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747507_6683 replica FinalizedReplica, blk_1073747507_6683, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747507 for deletion
2025-07-13 16:05:05,435 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747507_6683 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747507
2025-07-13 16:09:13,558 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747511_6687 src: /192.168.158.6:59456 dest: /192.168.158.4:9866
2025-07-13 16:09:13,577 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59456, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1649685260_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747511_6687, duration(ns): 16702392
2025-07-13 16:09:13,577 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747511_6687, type=LAST_IN_PIPELINE terminating
2025-07-13 16:09:17,441 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747511_6687 replica FinalizedReplica, blk_1073747511_6687, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747511 for deletion
2025-07-13 16:09:17,442 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747511_6687 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747511
2025-07-13 16:11:13,560 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747513_6689 src: /192.168.158.7:35072 dest: /192.168.158.4:9866
2025-07-13 16:11:13,577 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35072, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-375045975_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747513_6689, duration(ns): 15175442
2025-07-13 16:11:13,577 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747513_6689, type=LAST_IN_PIPELINE terminating
2025-07-13 16:11:20,445 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747513_6689 replica FinalizedReplica, blk_1073747513_6689, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747513 for deletion
2025-07-13 16:11:20,446 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747513_6689 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747513
2025-07-13 16:12:13,546 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747514_6690 src: /192.168.158.5:38672 dest: /192.168.158.4:9866
2025-07-13 16:12:13,570 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2028706290_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747514_6690, duration(ns): 18935007
2025-07-13 16:12:13,570 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747514_6690, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-13 16:12:17,447 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747514_6690 replica FinalizedReplica, blk_1073747514_6690, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747514 for deletion
2025-07-13 16:12:17,448 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747514_6690 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747514
2025-07-13 16:13:18,560 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747515_6691 src: /192.168.158.6:37014 dest: /192.168.158.4:9866
2025-07-13 16:13:18,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37014, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1448832545_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747515_6691, duration(ns): 16638432
2025-07-13 16:13:18,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747515_6691, type=LAST_IN_PIPELINE terminating
2025-07-13 16:13:20,450 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747515_6691 replica FinalizedReplica, blk_1073747515_6691, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747515 for deletion
2025-07-13 16:13:20,451 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747515_6691 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747515
2025-07-13 16:14:23,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747516_6692 src: /192.168.158.1:51126 dest: /192.168.158.4:9866
2025-07-13 16:14:23,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51126, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1556341472_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747516_6692, duration(ns): 22184171
2025-07-13 16:14:23,546 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747516_6692, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-13 16:14:26,455 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747516_6692 replica FinalizedReplica, blk_1073747516_6692, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747516 for deletion
2025-07-13 16:14:26,456 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073747516_6692 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747516 2025-07-13 16:17:23,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747519_6695 src: /192.168.158.5:46052 dest: /192.168.158.4:9866 2025-07-13 16:17:23,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46052, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1049110113_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747519_6695, duration(ns): 17164329 2025-07-13 16:17:23,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747519_6695, type=LAST_IN_PIPELINE terminating 2025-07-13 16:17:26,461 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747519_6695 replica FinalizedReplica, blk_1073747519_6695, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747519 for deletion 2025-07-13 16:17:26,462 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747519_6695 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747519 2025-07-13 16:19:23,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747521_6697 src: /192.168.158.9:49602 dest: /192.168.158.4:9866 2025-07-13 16:19:23,587 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.9:49602, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1645850191_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747521_6697, duration(ns): 16190961 2025-07-13 16:19:23,587 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747521_6697, type=LAST_IN_PIPELINE terminating 2025-07-13 16:19:29,465 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747521_6697 replica FinalizedReplica, blk_1073747521_6697, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747521 for deletion 2025-07-13 16:19:29,467 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747521_6697 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747521 2025-07-13 16:21:23,576 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747523_6699 src: /192.168.158.6:46966 dest: /192.168.158.4:9866 2025-07-13 16:21:23,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46966, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1406389088_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747523_6699, duration(ns): 17349573 2025-07-13 16:21:23,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747523_6699, type=LAST_IN_PIPELINE terminating 2025-07-13 16:21:29,472 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747523_6699 replica FinalizedReplica, blk_1073747523_6699, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747523 for deletion 2025-07-13 16:21:29,473 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747523_6699 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747523 2025-07-13 16:24:23,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747526_6702 src: /192.168.158.1:59812 dest: /192.168.158.4:9866 2025-07-13 16:24:23,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59812, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1018562917_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747526_6702, duration(ns): 23463668 2025-07-13 16:24:23,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747526_6702, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-13 16:24:26,478 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747526_6702 replica FinalizedReplica, blk_1073747526_6702, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747526 for deletion 
2025-07-13 16:24:26,479 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747526_6702 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747526 2025-07-13 16:28:28,594 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747530_6706 src: /192.168.158.8:47810 dest: /192.168.158.4:9866 2025-07-13 16:28:28,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47810, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1216814990_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747530_6706, duration(ns): 21497582 2025-07-13 16:28:28,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747530_6706, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-13 16:28:32,487 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747530_6706 replica FinalizedReplica, blk_1073747530_6706, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747530 for deletion 2025-07-13 16:28:32,489 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747530_6706 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747530 2025-07-13 16:31:33,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073747533_6709 src: /192.168.158.9:35540 dest: /192.168.158.4:9866 2025-07-13 16:31:33,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35540, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_821107696_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747533_6709, duration(ns): 16906281 2025-07-13 16:31:33,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747533_6709, type=LAST_IN_PIPELINE terminating 2025-07-13 16:31:38,493 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747533_6709 replica FinalizedReplica, blk_1073747533_6709, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747533 for deletion 2025-07-13 16:31:38,495 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747533_6709 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747533 2025-07-13 16:32:33,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747534_6710 src: /192.168.158.1:60604 dest: /192.168.158.4:9866 2025-07-13 16:32:33,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60604, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1445153025_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747534_6710, duration(ns): 21922105 
2025-07-13 16:32:33,603 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747534_6710, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-13 16:32:38,495 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747534_6710 replica FinalizedReplica, blk_1073747534_6710, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747534 for deletion 2025-07-13 16:32:38,497 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747534_6710 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747534 2025-07-13 16:34:38,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747536_6712 src: /192.168.158.8:55364 dest: /192.168.158.4:9866 2025-07-13 16:34:38,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55364, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_995512722_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747536_6712, duration(ns): 19014333 2025-07-13 16:34:38,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747536_6712, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-13 16:34:41,499 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747536_6712 replica FinalizedReplica, blk_1073747536_6712, 
FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747536 for deletion 2025-07-13 16:34:41,500 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747536_6712 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747536 2025-07-13 16:38:48,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747540_6716 src: /192.168.158.7:35640 dest: /192.168.158.4:9866 2025-07-13 16:38:48,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35640, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1920250688_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747540_6716, duration(ns): 15264599 2025-07-13 16:38:48,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747540_6716, type=LAST_IN_PIPELINE terminating 2025-07-13 16:38:50,509 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747540_6716 replica FinalizedReplica, blk_1073747540_6716, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747540 for deletion 2025-07-13 16:38:50,510 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747540_6716 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747540 2025-07-13 16:39:53,584 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747541_6717 src: /192.168.158.8:57474 dest: /192.168.158.4:9866 2025-07-13 16:39:53,610 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57474, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1446337302_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747541_6717, duration(ns): 20175476 2025-07-13 16:39:53,611 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747541_6717, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-13 16:39:59,512 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747541_6717 replica FinalizedReplica, blk_1073747541_6717, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747541 for deletion 2025-07-13 16:39:59,513 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747541_6717 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747541 2025-07-13 16:42:58,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747544_6720 src: /192.168.158.6:48994 dest: /192.168.158.4:9866 2025-07-13 16:42:58,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48994, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2088611610_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747544_6720, duration(ns): 15308167 2025-07-13 16:42:58,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747544_6720, type=LAST_IN_PIPELINE terminating 2025-07-13 16:43:02,517 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747544_6720 replica FinalizedReplica, blk_1073747544_6720, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747544 for deletion 2025-07-13 16:43:02,518 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747544_6720 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747544 2025-07-13 16:43:58,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747545_6721 src: /192.168.158.8:47038 dest: /192.168.158.4:9866 2025-07-13 16:43:58,610 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47038, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1181983962_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747545_6721, duration(ns): 15077995 2025-07-13 16:43:58,610 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747545_6721, type=LAST_IN_PIPELINE terminating 2025-07-13 16:44:05,517 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747545_6721 replica FinalizedReplica, blk_1073747545_6721, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747545 for deletion 2025-07-13 16:44:05,518 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747545_6721 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747545 2025-07-13 16:45:03,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747546_6722 src: /192.168.158.8:59998 dest: /192.168.158.4:9866 2025-07-13 16:45:03,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59998, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_907423677_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747546_6722, duration(ns): 18172552 2025-07-13 16:45:03,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747546_6722, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-13 16:45:08,518 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747546_6722 replica FinalizedReplica, blk_1073747546_6722, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747546 for deletion 2025-07-13 16:45:08,520 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747546_6722 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747546 2025-07-13 16:46:03,591 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747547_6723 src: /192.168.158.1:47032 dest: /192.168.158.4:9866 2025-07-13 16:46:03,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47032, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2132959215_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747547_6723, duration(ns): 26509305 2025-07-13 16:46:03,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747547_6723, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-13 16:46:08,523 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747547_6723 replica FinalizedReplica, blk_1073747547_6723, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747547 for deletion 2025-07-13 16:46:08,524 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747547_6723 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747547 2025-07-13 16:47:03,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073747548_6724 src: /192.168.158.9:49562 dest: /192.168.158.4:9866 2025-07-13 16:47:03,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49562, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_882557440_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747548_6724, duration(ns): 16588007 2025-07-13 16:47:03,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747548_6724, type=LAST_IN_PIPELINE terminating 2025-07-13 16:47:05,525 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747548_6724 replica FinalizedReplica, blk_1073747548_6724, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747548 for deletion 2025-07-13 16:47:05,527 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747548_6724 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747548 2025-07-13 16:49:08,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747550_6726 src: /192.168.158.1:44542 dest: /192.168.158.4:9866 2025-07-13 16:49:08,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44542, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2004816326_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747550_6726, duration(ns): 26439964 
2025-07-13 16:49:08,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747550_6726, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-13 16:49:11,528 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747550_6726 replica FinalizedReplica, blk_1073747550_6726, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747550 for deletion 2025-07-13 16:49:11,529 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747550_6726 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747550 2025-07-13 16:52:08,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747553_6729 src: /192.168.158.1:44908 dest: /192.168.158.4:9866 2025-07-13 16:52:08,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44908, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_504506185_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747553_6729, duration(ns): 23469939 2025-07-13 16:52:08,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747553_6729, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-13 16:52:14,537 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747553_6729 replica FinalizedReplica, 
blk_1073747553_6729, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747553 for deletion
2025-07-13 16:52:14,538 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747553_6729 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747553
2025-07-13 16:56:08,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747557_6733 src: /192.168.158.6:34014 dest: /192.168.158.4:9866
2025-07-13 16:56:08,655 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34014, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_789879113_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747557_6733, duration(ns): 19700098
2025-07-13 16:56:08,655 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747557_6733, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-13 16:56:11,546 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747557_6733 replica FinalizedReplica, blk_1073747557_6733, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747557 for deletion
2025-07-13 16:56:11,548 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747557_6733 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747557
2025-07-13 17:00:08,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747561_6737 src: /192.168.158.5:58748 dest: /192.168.158.4:9866
2025-07-13 17:00:08,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58748, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2039644434_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747561_6737, duration(ns): 19225699
2025-07-13 17:00:08,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747561_6737, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 17:00:11,553 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747561_6737 replica FinalizedReplica, blk_1073747561_6737, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747561 for deletion
2025-07-13 17:00:11,554 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747561_6737 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747561
2025-07-13 17:02:08,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747563_6739 src: /192.168.158.1:49672 dest: /192.168.158.4:9866
2025-07-13 17:02:08,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1551300066_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747563_6739, duration(ns): 21880185
2025-07-13 17:02:08,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747563_6739, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-13 17:02:11,556 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747563_6739 replica FinalizedReplica, blk_1073747563_6739, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747563 for deletion
2025-07-13 17:02:11,557 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747563_6739 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747563
2025-07-13 17:04:18,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747565_6741 src: /192.168.158.9:36284 dest: /192.168.158.4:9866
2025-07-13 17:04:18,663 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36284, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1582173878_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747565_6741, duration(ns): 16754606
2025-07-13 17:04:18,663 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747565_6741, type=LAST_IN_PIPELINE terminating
2025-07-13 17:04:23,558 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747565_6741 replica FinalizedReplica, blk_1073747565_6741, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747565 for deletion
2025-07-13 17:04:23,559 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747565_6741 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747565
2025-07-13 17:08:18,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747569_6745 src: /192.168.158.8:42958 dest: /192.168.158.4:9866
2025-07-13 17:08:18,654 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42958, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-76883965_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747569_6745, duration(ns): 19032292
2025-07-13 17:08:18,654 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747569_6745, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 17:08:20,570 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747569_6745 replica FinalizedReplica, blk_1073747569_6745, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747569 for deletion
2025-07-13 17:08:20,571 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747569_6745 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747569
2025-07-13 17:09:18,646 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747570_6746 src: /192.168.158.9:59890 dest: /192.168.158.4:9866
2025-07-13 17:09:18,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59890, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1393665621_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747570_6746, duration(ns): 21977432
2025-07-13 17:09:18,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747570_6746, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-13 17:09:20,573 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747570_6746 replica FinalizedReplica, blk_1073747570_6746, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747570 for deletion
2025-07-13 17:09:20,574 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747570_6746 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747570
2025-07-13 17:11:23,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747572_6748 src: /192.168.158.1:36340 dest: /192.168.158.4:9866
2025-07-13 17:11:23,703 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36340, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1764941896_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747572_6748, duration(ns): 24523665
2025-07-13 17:11:23,703 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747572_6748, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-13 17:11:26,575 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747572_6748 replica FinalizedReplica, blk_1073747572_6748, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747572 for deletion
2025-07-13 17:11:26,576 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747572_6748 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747572
2025-07-13 17:13:23,675 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747574_6750 src: /192.168.158.7:43214 dest: /192.168.158.4:9866
2025-07-13 17:13:23,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43214, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-208018868_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747574_6750, duration(ns): 16241485
2025-07-13 17:13:23,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747574_6750, type=LAST_IN_PIPELINE terminating
2025-07-13 17:13:29,582 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747574_6750 replica FinalizedReplica, blk_1073747574_6750, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747574 for deletion
2025-07-13 17:13:29,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747574_6750 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747574
2025-07-13 17:17:38,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747578_6754 src: /192.168.158.1:44904 dest: /192.168.158.4:9866
2025-07-13 17:17:38,690 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44904, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1972188919_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747578_6754, duration(ns): 22802068
2025-07-13 17:17:38,690 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747578_6754, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-13 17:17:41,596 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747578_6754 replica FinalizedReplica, blk_1073747578_6754, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747578 for deletion
2025-07-13 17:17:41,597 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747578_6754 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747578
2025-07-13 17:18:43,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747579_6755 src: /192.168.158.1:39866 dest: /192.168.158.4:9866
2025-07-13 17:18:43,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39866, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_868579174_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747579_6755, duration(ns): 21695589
2025-07-13 17:18:43,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747579_6755, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-13 17:18:50,598 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747579_6755 replica FinalizedReplica, blk_1073747579_6755, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747579 for deletion
2025-07-13 17:18:50,599 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747579_6755 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747579
2025-07-13 17:19:43,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747580_6756 src: /192.168.158.7:59540 dest: /192.168.158.4:9866
2025-07-13 17:19:43,690 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59540, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1930846352_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747580_6756, duration(ns): 20072088
2025-07-13 17:19:43,690 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747580_6756, type=LAST_IN_PIPELINE terminating
2025-07-13 17:19:47,599 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747580_6756 replica FinalizedReplica, blk_1073747580_6756, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747580 for deletion
2025-07-13 17:19:47,601 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747580_6756 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747580
2025-07-13 17:20:43,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747581_6757 src: /192.168.158.9:51758 dest: /192.168.158.4:9866
2025-07-13 17:20:43,705 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51758, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1616641252_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747581_6757, duration(ns): 19734582
2025-07-13 17:20:43,705 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747581_6757, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 17:20:50,602 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747581_6757 replica FinalizedReplica, blk_1073747581_6757, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747581 for deletion
2025-07-13 17:20:50,603 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747581_6757 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747581
2025-07-13 17:22:48,665 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747583_6759 src: /192.168.158.9:38322 dest: /192.168.158.4:9866
2025-07-13 17:22:48,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38322, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_722191265_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747583_6759, duration(ns): 18452927
2025-07-13 17:22:48,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747583_6759, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 17:22:50,606 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747583_6759 replica FinalizedReplica, blk_1073747583_6759, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747583 for deletion
2025-07-13 17:22:50,607 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747583_6759 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747583
2025-07-13 17:23:48,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747584_6760 src: /192.168.158.5:52130 dest: /192.168.158.4:9866
2025-07-13 17:23:48,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52130, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2133280853_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747584_6760, duration(ns): 19171626
2025-07-13 17:23:48,708 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747584_6760, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-13 17:23:53,607 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747584_6760 replica FinalizedReplica, blk_1073747584_6760, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747584 for deletion
2025-07-13 17:23:53,608 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747584_6760 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747584
2025-07-13 17:24:48,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747585_6761 src: /192.168.158.1:48018 dest: /192.168.158.4:9866
2025-07-13 17:24:48,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48018, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1304313970_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747585_6761, duration(ns): 23121650
2025-07-13 17:24:48,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747585_6761, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-13 17:24:53,610 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747585_6761 replica FinalizedReplica, blk_1073747585_6761, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747585 for deletion
2025-07-13 17:24:53,611 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747585_6761 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747585
2025-07-13 17:28:53,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747589_6765 src: /192.168.158.1:41092 dest: /192.168.158.4:9866
2025-07-13 17:28:53,699 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41092, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_159554087_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747589_6765, duration(ns): 22331377
2025-07-13 17:28:53,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747589_6765, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-13 17:28:56,616 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747589_6765 replica FinalizedReplica, blk_1073747589_6765, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747589 for deletion
2025-07-13 17:28:56,617 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747589_6765 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747589
2025-07-13 17:30:58,671 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747591_6767 src: /192.168.158.1:43138 dest: /192.168.158.4:9866
2025-07-13 17:30:58,704 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43138, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-730839503_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747591_6767, duration(ns): 24340282
2025-07-13 17:30:58,705 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747591_6767, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-13 17:31:05,620 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747591_6767 replica FinalizedReplica, blk_1073747591_6767, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747591 for deletion
2025-07-13 17:31:05,622 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747591_6767 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747591
2025-07-13 17:32:58,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747593_6769 src: /192.168.158.8:54100 dest: /192.168.158.4:9866
2025-07-13 17:32:58,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54100, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1535414071_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747593_6769, duration(ns): 19050375
2025-07-13 17:32:58,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747593_6769, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 17:33:02,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747593_6769 replica FinalizedReplica, blk_1073747593_6769, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747593 for deletion
2025-07-13 17:33:02,626 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747593_6769 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747593
2025-07-13 17:34:03,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747594_6770 src: /192.168.158.1:45854 dest: /192.168.158.4:9866
2025-07-13 17:34:03,708 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45854, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2123257318_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747594_6770, duration(ns): 22344850
2025-07-13 17:34:03,708 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747594_6770, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-13 17:34:05,628 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747594_6770 replica FinalizedReplica, blk_1073747594_6770, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747594 for deletion
2025-07-13 17:34:05,629 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747594_6770 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747594
2025-07-13 17:36:13,268 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-13 17:37:03,703 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747597_6773 src: /192.168.158.8:35608 dest: /192.168.158.4:9866
2025-07-13 17:37:03,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35608, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1246551477_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747597_6773, duration(ns): 17182381
2025-07-13 17:37:03,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747597_6773, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-13 17:37:08,632 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747597_6773 replica FinalizedReplica, blk_1073747597_6773, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747597 for deletion
2025-07-13 17:37:08,633 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747597_6773 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747597
2025-07-13 17:37:20,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f35, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 3 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-13 17:37:20,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-13 17:38:03,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747598_6774 src: /192.168.158.5:39468 dest: /192.168.158.4:9866
2025-07-13 17:38:03,710 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39468, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1227298141_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747598_6774, duration(ns): 18042807
2025-07-13 17:38:03,710 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747598_6774, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 17:38:08,633 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747598_6774 replica FinalizedReplica, blk_1073747598_6774, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747598 for deletion
2025-07-13 17:38:08,635 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747598_6774 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747598
2025-07-13 17:39:08,697 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747599_6775 src: /192.168.158.8:44858 dest: /192.168.158.4:9866
2025-07-13 17:39:08,713 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44858, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1541974403_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747599_6775, duration(ns): 14705930
2025-07-13 17:39:08,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747599_6775, type=LAST_IN_PIPELINE terminating
2025-07-13 17:39:14,637 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747599_6775 replica FinalizedReplica, blk_1073747599_6775, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747599 for deletion
2025-07-13 17:39:14,638 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747599_6775 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747599
2025-07-13 17:42:08,691 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747602_6778 src: /192.168.158.1:39610 dest: /192.168.158.4:9866
2025-07-13 17:42:08,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39610, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1310708390_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747602_6778, duration(ns): 22680716
2025-07-13 17:42:08,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747602_6778, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-13 17:42:11,643 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747602_6778 replica FinalizedReplica, blk_1073747602_6778, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747602 for deletion
2025-07-13 17:42:11,644 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747602_6778 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747602
2025-07-13 17:45:13,686 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747605_6781 src: /192.168.158.1:51642 dest: /192.168.158.4:9866
2025-07-13 17:45:13,720 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51642, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1106128393_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747605_6781, duration(ns): 25008326
2025-07-13 17:45:13,720 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747605_6781, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-13 17:45:17,647 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747605_6781 replica FinalizedReplica, blk_1073747605_6781, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747605 for deletion
2025-07-13 17:45:17,649 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747605_6781 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747605
2025-07-13 17:46:13,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747606_6782 src: /192.168.158.9:38810 dest: /192.168.158.4:9866
2025-07-13 17:46:13,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38810, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1553479382_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747606_6782, duration(ns): 20028071
2025-07-13 17:46:13,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747606_6782, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 17:46:17,651 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747606_6782 replica FinalizedReplica, blk_1073747606_6782, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747606 for deletion
2025-07-13 17:46:17,652 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747606_6782 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747606
2025-07-13 17:47:13,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747607_6783 src: /192.168.158.9:45718 dest: /192.168.158.4:9866
2025-07-13 17:47:13,720 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45718, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_111982452_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747607_6783, duration(ns): 15350121
2025-07-13 17:47:13,720 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747607_6783, type=LAST_IN_PIPELINE terminating
2025-07-13 17:47:17,654 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747607_6783 replica FinalizedReplica, blk_1073747607_6783, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747607 for deletion
2025-07-13 17:47:17,655 INFO
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747607_6783 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747607 2025-07-13 17:49:23,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747609_6785 src: /192.168.158.5:40246 dest: /192.168.158.4:9866 2025-07-13 17:49:23,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40246, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1018930069_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747609_6785, duration(ns): 18588637 2025-07-13 17:49:23,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747609_6785, type=LAST_IN_PIPELINE terminating 2025-07-13 17:49:26,660 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747609_6785 replica FinalizedReplica, blk_1073747609_6785, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747609 for deletion 2025-07-13 17:49:26,661 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747609_6785 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747609 2025-07-13 17:54:28,739 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747614_6790 src: /192.168.158.8:38778 dest: /192.168.158.4:9866 2025-07-13 
17:54:28,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38778, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2142165499_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747614_6790, duration(ns): 15780952 2025-07-13 17:54:28,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747614_6790, type=LAST_IN_PIPELINE terminating 2025-07-13 17:54:32,671 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747614_6790 replica FinalizedReplica, blk_1073747614_6790, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747614 for deletion 2025-07-13 17:54:32,672 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747614_6790 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747614 2025-07-13 17:55:33,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747615_6791 src: /192.168.158.1:60306 dest: /192.168.158.4:9866 2025-07-13 17:55:33,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60306, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1169764112_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747615_6791, duration(ns): 22930697 2025-07-13 17:55:33,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073747615_6791, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-13 17:55:38,674 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747615_6791 replica FinalizedReplica, blk_1073747615_6791, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747615 for deletion 2025-07-13 17:55:38,675 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747615_6791 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747615 2025-07-13 17:57:33,720 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747617_6793 src: /192.168.158.7:34366 dest: /192.168.158.4:9866 2025-07-13 17:57:33,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34366, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1622492421_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747617_6793, duration(ns): 18060487 2025-07-13 17:57:33,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747617_6793, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-13 17:57:38,679 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747617_6793 replica FinalizedReplica, blk_1073747617_6793, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747617 for deletion 2025-07-13 17:57:38,680 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747617_6793 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747617 2025-07-13 18:00:43,739 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747620_6796 src: /192.168.158.9:41202 dest: /192.168.158.4:9866 2025-07-13 18:00:43,756 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41202, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_398878591_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747620_6796, duration(ns): 15620802 2025-07-13 18:00:43,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747620_6796, type=LAST_IN_PIPELINE terminating 2025-07-13 18:00:47,684 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747620_6796 replica FinalizedReplica, blk_1073747620_6796, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747620 for deletion 2025-07-13 18:00:47,686 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747620_6796 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747620 2025-07-13 
18:01:43,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747621_6797 src: /192.168.158.9:43012 dest: /192.168.158.4:9866 2025-07-13 18:01:43,755 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43012, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1958005835_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747621_6797, duration(ns): 18406824 2025-07-13 18:01:43,755 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747621_6797, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-13 18:01:47,689 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747621_6797 replica FinalizedReplica, blk_1073747621_6797, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747621 for deletion 2025-07-13 18:01:47,690 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747621_6797 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747621 2025-07-13 18:02:43,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747622_6798 src: /192.168.158.5:52312 dest: /192.168.158.4:9866 2025-07-13 18:02:43,760 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52312, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1252570667_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747622_6798, duration(ns): 17682273 2025-07-13 18:02:43,760 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747622_6798, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-13 18:02:50,691 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747622_6798 replica FinalizedReplica, blk_1073747622_6798, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747622 for deletion 2025-07-13 18:02:50,692 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747622_6798 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747622 2025-07-13 18:03:48,739 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747623_6799 src: /192.168.158.5:33024 dest: /192.168.158.4:9866 2025-07-13 18:03:48,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33024, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1695934229_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747623_6799, duration(ns): 16420739 2025-07-13 18:03:48,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747623_6799, type=LAST_IN_PIPELINE terminating 2025-07-13 18:03:50,692 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073747623_6799 replica FinalizedReplica, blk_1073747623_6799, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747623 for deletion 2025-07-13 18:03:50,693 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747623_6799 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747623 2025-07-13 18:04:48,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747624_6800 src: /192.168.158.8:35142 dest: /192.168.158.4:9866 2025-07-13 18:04:48,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35142, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-384267924_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747624_6800, duration(ns): 16050426 2025-07-13 18:04:48,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747624_6800, type=LAST_IN_PIPELINE terminating 2025-07-13 18:04:53,692 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747624_6800 replica FinalizedReplica, blk_1073747624_6800, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747624 for deletion 2025-07-13 18:04:53,693 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073747624_6800 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747624 2025-07-13 18:06:48,720 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747626_6802 src: /192.168.158.8:37438 dest: /192.168.158.4:9866 2025-07-13 18:06:48,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37438, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-217778884_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747626_6802, duration(ns): 15576369 2025-07-13 18:06:48,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747626_6802, type=LAST_IN_PIPELINE terminating 2025-07-13 18:06:50,695 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747626_6802 replica FinalizedReplica, blk_1073747626_6802, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747626 for deletion 2025-07-13 18:06:50,697 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747626_6802 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747626 2025-07-13 18:07:48,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747627_6803 src: /192.168.158.9:52230 dest: /192.168.158.4:9866 2025-07-13 18:07:48,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.9:52230, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1766993552_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747627_6803, duration(ns): 18694234 2025-07-13 18:07:48,746 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747627_6803, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-13 18:07:50,698 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747627_6803 replica FinalizedReplica, blk_1073747627_6803, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747627 for deletion 2025-07-13 18:07:50,699 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747627_6803 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747627 2025-07-13 18:08:48,718 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747628_6804 src: /192.168.158.1:53576 dest: /192.168.158.4:9866 2025-07-13 18:08:48,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53576, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1842098533_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747628_6804, duration(ns): 23145618 2025-07-13 18:08:48,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747628_6804, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-13 18:08:50,698 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747628_6804 replica FinalizedReplica, blk_1073747628_6804, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747628 for deletion 2025-07-13 18:08:50,699 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747628_6804 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747628 2025-07-13 18:09:53,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747629_6805 src: /192.168.158.1:50678 dest: /192.168.158.4:9866 2025-07-13 18:09:53,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50678, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1609326544_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747629_6805, duration(ns): 23326588 2025-07-13 18:09:53,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747629_6805, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-13 18:09:59,699 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747629_6805 replica FinalizedReplica, blk_1073747629_6805, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747629 for deletion 2025-07-13 18:09:59,700 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747629_6805 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747629 2025-07-13 18:10:53,728 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747630_6806 src: /192.168.158.6:38620 dest: /192.168.158.4:9866 2025-07-13 18:10:53,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38620, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-360744727_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747630_6806, duration(ns): 17112130 2025-07-13 18:10:53,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747630_6806, type=LAST_IN_PIPELINE terminating 2025-07-13 18:10:59,701 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747630_6806 replica FinalizedReplica, blk_1073747630_6806, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747630 for deletion 2025-07-13 18:10:59,702 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747630_6806 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747630 2025-07-13 18:11:58,721 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747631_6807 src: /192.168.158.1:51816 dest: /192.168.158.4:9866 2025-07-13 18:11:58,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51816, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1760909458_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747631_6807, duration(ns): 22836067 2025-07-13 18:11:58,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747631_6807, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-13 18:12:02,702 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747631_6807 replica FinalizedReplica, blk_1073747631_6807, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747631 for deletion 2025-07-13 18:12:02,703 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747631_6807 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747631 2025-07-13 18:12:58,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747632_6808 src: /192.168.158.7:58102 dest: /192.168.158.4:9866 2025-07-13 18:12:58,756 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58102, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_324500208_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747632_6808, duration(ns): 19980955 2025-07-13 18:12:58,756 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747632_6808, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-13 18:13:02,704 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747632_6808 replica FinalizedReplica, blk_1073747632_6808, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747632 for deletion 2025-07-13 18:13:02,705 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747632_6808 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747632 2025-07-13 18:13:58,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747633_6809 src: /192.168.158.5:33990 dest: /192.168.158.4:9866 2025-07-13 18:13:58,780 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33990, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1836568775_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747633_6809, duration(ns): 17817317 2025-07-13 18:13:58,780 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747633_6809, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-13 18:14:02,706 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747633_6809 replica FinalizedReplica, blk_1073747633_6809, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747633 for deletion
2025-07-13 18:14:02,707 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747633_6809 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747633
2025-07-13 18:14:58,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747634_6810 src: /192.168.158.5:32872 dest: /192.168.158.4:9866
2025-07-13 18:14:58,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:32872, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_366374493_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747634_6810, duration(ns): 20091950
2025-07-13 18:14:58,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747634_6810, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 18:15:02,707 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747634_6810 replica FinalizedReplica, blk_1073747634_6810, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747634 for deletion
2025-07-13 18:15:02,709 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747634_6810 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747634
2025-07-13 18:19:03,736 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747638_6814 src: /192.168.158.1:38726 dest: /192.168.158.4:9866
2025-07-13 18:19:03,768 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38726, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-999467271_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747638_6814, duration(ns): 23423573
2025-07-13 18:19:03,768 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747638_6814, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-13 18:19:05,716 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747638_6814 replica FinalizedReplica, blk_1073747638_6814, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747638 for deletion
2025-07-13 18:19:05,717 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747638_6814 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747638
2025-07-13 18:20:03,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747639_6815 src: /192.168.158.9:53604 dest: /192.168.158.4:9866
2025-07-13 18:20:03,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53604, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1962938337_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747639_6815, duration(ns): 20987198
2025-07-13 18:20:03,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747639_6815, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-13 18:20:05,717 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747639_6815 replica FinalizedReplica, blk_1073747639_6815, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747639 for deletion
2025-07-13 18:20:05,718 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747639_6815 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747639
2025-07-13 18:22:03,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747641_6817 src: /192.168.158.8:48406 dest: /192.168.158.4:9866
2025-07-13 18:22:03,775 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48406, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_696833335_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747641_6817, duration(ns): 16336287
2025-07-13 18:22:03,775 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747641_6817, type=LAST_IN_PIPELINE terminating
2025-07-13 18:22:08,719 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747641_6817 replica FinalizedReplica, blk_1073747641_6817, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747641 for deletion
2025-07-13 18:22:08,720 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747641_6817 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747641
2025-07-13 18:23:03,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747642_6818 src: /192.168.158.7:60852 dest: /192.168.158.4:9866
2025-07-13 18:23:03,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60852, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1216472353_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747642_6818, duration(ns): 17511571
2025-07-13 18:23:03,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747642_6818, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-13 18:23:08,718 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747642_6818 replica FinalizedReplica, blk_1073747642_6818, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747642 for deletion
2025-07-13 18:23:08,719 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747642_6818 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747642
2025-07-13 18:26:13,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747645_6821 src: /192.168.158.6:38086 dest: /192.168.158.4:9866
2025-07-13 18:26:13,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38086, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1284491266_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747645_6821, duration(ns): 17889604
2025-07-13 18:26:13,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747645_6821, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-13 18:26:17,724 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747645_6821 replica FinalizedReplica, blk_1073747645_6821, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747645 for deletion
2025-07-13 18:26:17,725 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747645_6821 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747645
2025-07-13 18:29:18,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747648_6824 src: /192.168.158.8:43838 dest: /192.168.158.4:9866
2025-07-13 18:29:18,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43838, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1837044772_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747648_6824, duration(ns): 19709575
2025-07-13 18:29:18,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747648_6824, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-13 18:29:20,727 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747648_6824 replica FinalizedReplica, blk_1073747648_6824, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747648 for deletion
2025-07-13 18:29:20,728 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747648_6824 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747648
2025-07-13 18:32:28,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747651_6827 src: /192.168.158.7:41004 dest: /192.168.158.4:9866
2025-07-13 18:32:28,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41004, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_316927614_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747651_6827, duration(ns): 14666906
2025-07-13 18:32:28,786 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747651_6827, type=LAST_IN_PIPELINE terminating
2025-07-13 18:32:32,731 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747651_6827 replica FinalizedReplica, blk_1073747651_6827, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747651 for deletion
2025-07-13 18:32:32,732 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747651_6827 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747651
2025-07-13 18:33:28,789 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747652_6828 src: /192.168.158.1:58652 dest: /192.168.158.4:9866
2025-07-13 18:33:28,823 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58652, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1427043934_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747652_6828, duration(ns): 25191866
2025-07-13 18:33:28,823 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747652_6828, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-13 18:33:32,734 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747652_6828 replica FinalizedReplica, blk_1073747652_6828, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747652 for deletion
2025-07-13 18:33:32,735 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747652_6828 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747652
2025-07-13 18:34:28,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747653_6829 src: /192.168.158.9:35374 dest: /192.168.158.4:9866
2025-07-13 18:34:28,795 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35374, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1179562517_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747653_6829, duration(ns): 16326147
2025-07-13 18:34:28,795 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747653_6829, type=LAST_IN_PIPELINE terminating
2025-07-13 18:34:32,736 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747653_6829 replica FinalizedReplica, blk_1073747653_6829, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747653 for deletion
2025-07-13 18:34:32,738 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747653_6829 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747653
2025-07-13 18:35:28,780 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747654_6830 src: /192.168.158.6:47082 dest: /192.168.158.4:9866
2025-07-13 18:35:28,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47082, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-430356702_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747654_6830, duration(ns): 19292545
2025-07-13 18:35:28,805 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747654_6830, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 18:35:32,742 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747654_6830 replica FinalizedReplica, blk_1073747654_6830, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747654 for deletion
2025-07-13 18:35:32,743 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747654_6830 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747654
2025-07-13 18:37:33,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747656_6832 src: /192.168.158.8:33494 dest: /192.168.158.4:9866
2025-07-13 18:37:33,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33494, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-317605599_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747656_6832, duration(ns): 21237349
2025-07-13 18:37:33,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747656_6832, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 18:37:38,746 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747656_6832 replica FinalizedReplica, blk_1073747656_6832, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747656 for deletion
2025-07-13 18:37:38,747 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747656_6832 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747656
2025-07-13 18:38:33,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747657_6833 src: /192.168.158.7:55640 dest: /192.168.158.4:9866
2025-07-13 18:38:33,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55640, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_679304474_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747657_6833, duration(ns): 17856941
2025-07-13 18:38:33,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747657_6833, type=LAST_IN_PIPELINE terminating
2025-07-13 18:38:38,747 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747657_6833 replica FinalizedReplica, blk_1073747657_6833, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747657 for deletion
2025-07-13 18:38:38,749 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747657_6833 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747657
2025-07-13 18:39:33,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747658_6834 src: /192.168.158.8:42504 dest: /192.168.158.4:9866
2025-07-13 18:39:33,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42504, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-289341451_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747658_6834, duration(ns): 17165249
2025-07-13 18:39:33,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747658_6834, type=LAST_IN_PIPELINE terminating
2025-07-13 18:39:35,751 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747658_6834 replica FinalizedReplica, blk_1073747658_6834, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747658 for deletion
2025-07-13 18:39:35,752 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747658_6834 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747658
2025-07-13 18:40:33,807 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747659_6835 src: /192.168.158.9:57104 dest: /192.168.158.4:9866
2025-07-13 18:40:33,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57104, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1426182741_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747659_6835, duration(ns): 15539140
2025-07-13 18:40:33,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747659_6835, type=LAST_IN_PIPELINE terminating
2025-07-13 18:40:38,754 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747659_6835 replica FinalizedReplica, blk_1073747659_6835, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747659 for deletion
2025-07-13 18:40:38,755 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747659_6835 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747659
2025-07-13 18:41:38,790 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747660_6836 src: /192.168.158.1:60036 dest: /192.168.158.4:9866
2025-07-13 18:41:38,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60036, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1316076416_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747660_6836, duration(ns): 25915173
2025-07-13 18:41:38,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747660_6836, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-13 18:41:41,754 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747660_6836 replica FinalizedReplica, blk_1073747660_6836, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747660 for deletion
2025-07-13 18:41:41,755 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747660_6836 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747660
2025-07-13 18:42:38,773 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747661_6837 src: /192.168.158.1:39282 dest: /192.168.158.4:9866
2025-07-13 18:42:38,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39282, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1059395952_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747661_6837, duration(ns): 19686500
2025-07-13 18:42:38,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747661_6837, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-13 18:42:41,757 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747661_6837 replica FinalizedReplica, blk_1073747661_6837, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747661 for deletion
2025-07-13 18:42:41,758 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747661_6837 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747661
2025-07-13 18:43:38,782 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747662_6838 src: /192.168.158.8:56534 dest: /192.168.158.4:9866
2025-07-13 18:43:38,807 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56534, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1576923162_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747662_6838, duration(ns): 20220784
2025-07-13 18:43:38,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747662_6838, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 18:43:41,762 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747662_6838 replica FinalizedReplica, blk_1073747662_6838, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747662 for deletion
2025-07-13 18:43:41,763 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747662_6838 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747662
2025-07-13 18:45:38,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747664_6840 src: /192.168.158.6:39718 dest: /192.168.158.4:9866
2025-07-13 18:45:38,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39718, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_423456550_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747664_6840, duration(ns): 18552178
2025-07-13 18:45:38,812 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747664_6840, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-13 18:45:44,766 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747664_6840 replica FinalizedReplica, blk_1073747664_6840, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747664 for deletion
2025-07-13 18:45:44,767 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747664_6840 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747664
2025-07-13 18:46:38,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747665_6841 src: /192.168.158.1:59622 dest: /192.168.158.4:9866
2025-07-13 18:46:38,821 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59622, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1497148093_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747665_6841, duration(ns): 23918875
2025-07-13 18:46:38,821 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747665_6841, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-13 18:46:44,769 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747665_6841 replica FinalizedReplica, blk_1073747665_6841, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747665 for deletion
2025-07-13 18:46:44,770 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747665_6841 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747665
2025-07-13 18:47:38,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747666_6842 src: /192.168.158.5:57858 dest: /192.168.158.4:9866
2025-07-13 18:47:38,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57858, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1093261901_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747666_6842, duration(ns): 20267479
2025-07-13 18:47:38,814 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747666_6842, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-13 18:47:41,770 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747666_6842 replica FinalizedReplica, blk_1073747666_6842, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747666 for deletion
2025-07-13 18:47:41,771 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747666_6842 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747666
2025-07-13 18:50:38,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747669_6845 src: /192.168.158.1:40678 dest: /192.168.158.4:9866
2025-07-13 18:50:38,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40678, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1985072445_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747669_6845, duration(ns): 24197158
2025-07-13 18:50:38,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747669_6845, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-13 18:50:44,779 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747669_6845 replica FinalizedReplica, blk_1073747669_6845, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747669 for deletion
2025-07-13 18:50:44,780 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747669_6845 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747669
2025-07-13 18:51:38,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747670_6846 src: /192.168.158.1:45312 dest: /192.168.158.4:9866
2025-07-13 18:51:38,834 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45312, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_311916186_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747670_6846, duration(ns): 25844476
2025-07-13 18:51:38,834 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747670_6846, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-13 18:51:41,784 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747670_6846 replica FinalizedReplica, blk_1073747670_6846, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747670 for deletion
2025-07-13 18:51:41,786 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747670_6846 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747670
2025-07-13 18:53:48,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747672_6848 src: /192.168.158.1:46278 dest: /192.168.158.4:9866
2025-07-13 18:53:48,865 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46278, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_252400012_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747672_6848, duration(ns): 24797654
2025-07-13 18:53:48,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747672_6848, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-13 18:53:50,790 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747672_6848 replica FinalizedReplica, blk_1073747672_6848, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747672 for deletion
2025-07-13 18:53:50,792 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747672_6848 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747672
2025-07-13 18:54:48,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747673_6849 src: /192.168.158.8:33778 dest: /192.168.158.4:9866
2025-07-13 18:54:48,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33778, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-321053155_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747673_6849, duration(ns): 17774614
2025-07-13 18:54:48,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747673_6849, type=LAST_IN_PIPELINE terminating
2025-07-13 18:54:50,792 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747673_6849 replica FinalizedReplica, blk_1073747673_6849, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747673 for deletion
2025-07-13 18:54:50,793 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747673_6849 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747673
2025-07-13 18:57:48,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747676_6852 src: /192.168.158.9:45446 dest: /192.168.158.4:9866
2025-07-13 18:57:48,848 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45446, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-880783437_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747676_6852, duration(ns): 15589694
2025-07-13 18:57:48,848 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747676_6852, type=LAST_IN_PIPELINE terminating
2025-07-13 18:57:50,796 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747676_6852 replica FinalizedReplica, blk_1073747676_6852, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747676 for deletion
2025-07-13 18:57:50,797 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747676_6852 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747676
2025-07-13 18:59:58,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747678_6854 src: /192.168.158.1:42892 dest: /192.168.158.4:9866
2025-07-13 18:59:58,834 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42892, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1433785970_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073747678_6854, duration(ns): 21424385 2025-07-13 18:59:58,834 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747678_6854, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-13 19:00:05,799 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747678_6854 replica FinalizedReplica, blk_1073747678_6854, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747678 for deletion 2025-07-13 19:00:05,800 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747678_6854 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747678 2025-07-13 19:00:58,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747679_6855 src: /192.168.158.9:43322 dest: /192.168.158.4:9866 2025-07-13 19:00:58,841 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43322, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1799378526_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747679_6855, duration(ns): 18249199 2025-07-13 19:00:58,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747679_6855, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-13 19:01:02,800 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747679_6855 replica FinalizedReplica, blk_1073747679_6855, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747679 for deletion 2025-07-13 19:01:02,801 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747679_6855 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747679 2025-07-13 19:01:58,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747680_6856 src: /192.168.158.5:33194 dest: /192.168.158.4:9866 2025-07-13 19:01:58,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33194, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-833486968_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747680_6856, duration(ns): 19387922 2025-07-13 19:01:58,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747680_6856, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-13 19:02:02,803 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747680_6856 replica FinalizedReplica, blk_1073747680_6856, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747680 for deletion 2025-07-13 19:02:02,804 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747680_6856 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747680 2025-07-13 19:02:58,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747681_6857 src: /192.168.158.9:37698 dest: /192.168.158.4:9866 2025-07-13 19:02:58,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37698, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_263733578_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747681_6857, duration(ns): 19421012 2025-07-13 19:02:58,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747681_6857, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-13 19:03:02,805 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747681_6857 replica FinalizedReplica, blk_1073747681_6857, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747681 for deletion 2025-07-13 19:03:02,806 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747681_6857 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747681 2025-07-13 19:04:03,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747682_6858 src: 
/192.168.158.8:57636 dest: /192.168.158.4:9866 2025-07-13 19:04:03,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57636, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_379773718_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747682_6858, duration(ns): 18455192 2025-07-13 19:04:03,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747682_6858, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-13 19:04:08,806 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747682_6858 replica FinalizedReplica, blk_1073747682_6858, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747682 for deletion 2025-07-13 19:04:08,808 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747682_6858 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747682 2025-07-13 19:06:03,846 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747684_6860 src: /192.168.158.9:47330 dest: /192.168.158.4:9866 2025-07-13 19:06:03,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47330, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1721013769_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747684_6860, duration(ns): 15744646 2025-07-13 19:06:03,864 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747684_6860, type=LAST_IN_PIPELINE terminating 2025-07-13 19:06:05,810 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747684_6860 replica FinalizedReplica, blk_1073747684_6860, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747684 for deletion 2025-07-13 19:06:05,811 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747684_6860 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747684 2025-07-13 19:09:08,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747687_6863 src: /192.168.158.7:34676 dest: /192.168.158.4:9866 2025-07-13 19:09:08,880 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34676, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_936163787_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747687_6863, duration(ns): 19410820 2025-07-13 19:09:08,880 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747687_6863, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-13 19:09:14,815 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747687_6863 replica FinalizedReplica, blk_1073747687_6863, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747687 for deletion 2025-07-13 19:09:14,816 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747687_6863 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747687 2025-07-13 19:11:18,837 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747689_6865 src: /192.168.158.5:55500 dest: /192.168.158.4:9866 2025-07-13 19:11:18,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55500, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1627988544_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747689_6865, duration(ns): 19210149 2025-07-13 19:11:18,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747689_6865, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-13 19:11:20,823 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747689_6865 replica FinalizedReplica, blk_1073747689_6865, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747689 for deletion 2025-07-13 19:11:20,824 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747689_6865 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747689 2025-07-13 19:13:23,834 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747691_6867 src: /192.168.158.1:48382 dest: /192.168.158.4:9866 2025-07-13 19:13:23,868 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48382, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_901378363_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747691_6867, duration(ns): 25481821 2025-07-13 19:13:23,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747691_6867, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-13 19:13:26,833 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747691_6867 replica FinalizedReplica, blk_1073747691_6867, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747691 for deletion 2025-07-13 19:13:26,834 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747691_6867 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747691 2025-07-13 19:14:28,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747692_6868 src: /192.168.158.7:57548 dest: /192.168.158.4:9866 2025-07-13 19:14:28,867 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.7:57548, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1983800171_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747692_6868, duration(ns): 19995878 2025-07-13 19:14:28,867 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747692_6868, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-13 19:14:32,833 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747692_6868 replica FinalizedReplica, blk_1073747692_6868, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747692 for deletion 2025-07-13 19:14:32,834 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747692_6868 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747692 2025-07-13 19:18:43,856 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747696_6872 src: /192.168.158.5:57274 dest: /192.168.158.4:9866 2025-07-13 19:18:43,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57274, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2071982086_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747696_6872, duration(ns): 16107843 2025-07-13 19:18:43,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747696_6872, 
type=LAST_IN_PIPELINE terminating 2025-07-13 19:18:47,838 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747696_6872 replica FinalizedReplica, blk_1073747696_6872, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747696 for deletion 2025-07-13 19:18:47,839 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747696_6872 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747696 2025-07-13 19:22:53,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747700_6876 src: /192.168.158.9:47470 dest: /192.168.158.4:9866 2025-07-13 19:22:53,868 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47470, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-355741692_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747700_6876, duration(ns): 15341745 2025-07-13 19:22:53,868 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747700_6876, type=LAST_IN_PIPELINE terminating 2025-07-13 19:22:59,846 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747700_6876 replica FinalizedReplica, blk_1073747700_6876, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747700 for deletion 2025-07-13 
19:22:59,847 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747700_6876 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747700 2025-07-13 19:23:53,885 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747701_6877 src: /192.168.158.1:58000 dest: /192.168.158.4:9866 2025-07-13 19:23:53,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58000, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1031338240_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747701_6877, duration(ns): 20823172 2025-07-13 19:23:53,915 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747701_6877, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-13 19:23:59,850 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747701_6877 replica FinalizedReplica, blk_1073747701_6877, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747701 for deletion 2025-07-13 19:23:59,851 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747701_6877 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747701 2025-07-13 19:24:53,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073747702_6878 src: /192.168.158.9:43768 dest: /192.168.158.4:9866 2025-07-13 19:24:53,876 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43768, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-775001747_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747702_6878, duration(ns): 21172667 2025-07-13 19:24:53,876 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747702_6878, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-13 19:24:56,849 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747702_6878 replica FinalizedReplica, blk_1073747702_6878, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747702 for deletion 2025-07-13 19:24:56,851 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747702_6878 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747702 2025-07-13 19:26:53,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747704_6880 src: /192.168.158.1:60852 dest: /192.168.158.4:9866 2025-07-13 19:26:53,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60852, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_701283404_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073747704_6880, duration(ns): 24516425 2025-07-13 19:26:53,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747704_6880, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-13 19:26:56,855 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747704_6880 replica FinalizedReplica, blk_1073747704_6880, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747704 for deletion 2025-07-13 19:26:56,856 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747704_6880 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747704 2025-07-13 19:28:58,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747706_6882 src: /192.168.158.9:48410 dest: /192.168.158.4:9866 2025-07-13 19:28:58,875 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48410, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1232608399_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747706_6882, duration(ns): 17548734 2025-07-13 19:28:58,876 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747706_6882, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-13 19:29:02,857 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747706_6882 replica FinalizedReplica, blk_1073747706_6882, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747706 for deletion 2025-07-13 19:29:02,858 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747706_6882 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747706 2025-07-13 19:30:58,865 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747708_6884 src: /192.168.158.8:55088 dest: /192.168.158.4:9866 2025-07-13 19:30:58,893 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55088, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-950379905_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747708_6884, duration(ns): 20821213 2025-07-13 19:30:58,893 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747708_6884, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-13 19:31:02,859 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747708_6884 replica FinalizedReplica, blk_1073747708_6884, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747708 for deletion 2025-07-13 19:31:02,860 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747708_6884 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747708
2025-07-13 19:32:58,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747710_6886 src: /192.168.158.1:47090 dest: /192.168.158.4:9866
2025-07-13 19:32:58,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47090, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-646911355_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747710_6886, duration(ns): 24182195
2025-07-13 19:32:58,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747710_6886, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-13 19:33:02,861 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747710_6886 replica FinalizedReplica, blk_1073747710_6886, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747710 for deletion
2025-07-13 19:33:02,863 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747710_6886 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747710
2025-07-13 19:33:58,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747711_6887 src: /192.168.158.9:50486 dest: /192.168.158.4:9866
2025-07-13 19:33:58,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50486, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_632841336_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747711_6887, duration(ns): 18589646
2025-07-13 19:33:58,893 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747711_6887, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-13 19:34:05,866 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747711_6887 replica FinalizedReplica, blk_1073747711_6887, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747711 for deletion
2025-07-13 19:34:05,868 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747711_6887 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073747711
2025-07-13 19:34:58,876 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747712_6888 src: /192.168.158.5:54542 dest: /192.168.158.4:9866
2025-07-13 19:34:58,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54542, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-802108519_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747712_6888, duration(ns): 20049637
2025-07-13 19:34:58,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747712_6888, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-13 19:35:02,867 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747712_6888 replica FinalizedReplica, blk_1073747712_6888, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747712 for deletion
2025-07-13 19:35:02,868 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747712_6888 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747712
2025-07-13 19:35:58,876 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747713_6889 src: /192.168.158.9:40128 dest: /192.168.158.4:9866
2025-07-13 19:35:58,894 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40128, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_595376526_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747713_6889, duration(ns): 15408225
2025-07-13 19:35:58,894 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747713_6889, type=LAST_IN_PIPELINE terminating
2025-07-13 19:36:02,871 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747713_6889 replica FinalizedReplica, blk_1073747713_6889, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747713 for deletion
2025-07-13 19:36:02,872 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747713_6889 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747713
2025-07-13 19:36:58,896 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747714_6890 src: /192.168.158.8:59984 dest: /192.168.158.4:9866
2025-07-13 19:36:58,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59984, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1283846660_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747714_6890, duration(ns): 13291136
2025-07-13 19:36:58,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747714_6890, type=LAST_IN_PIPELINE terminating
2025-07-13 19:37:02,872 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747714_6890 replica FinalizedReplica, blk_1073747714_6890, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747714 for deletion
2025-07-13 19:37:02,873 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747714_6890 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747714
2025-07-13 19:39:08,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747716_6892 src: /192.168.158.7:33768 dest: /192.168.158.4:9866
2025-07-13 19:39:08,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33768, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_263570266_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747716_6892, duration(ns): 17005409
2025-07-13 19:39:08,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747716_6892, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-13 19:39:11,877 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747716_6892 replica FinalizedReplica, blk_1073747716_6892, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747716 for deletion
2025-07-13 19:39:11,878 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747716_6892 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747716
2025-07-13 19:44:18,896 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747721_6897 src: /192.168.158.6:40512 dest: /192.168.158.4:9866
2025-07-13 19:44:18,911 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40512, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-489427005_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747721_6897, duration(ns): 12979387
2025-07-13 19:44:18,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747721_6897, type=LAST_IN_PIPELINE terminating
2025-07-13 19:44:23,890 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747721_6897 replica FinalizedReplica, blk_1073747721_6897, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747721 for deletion
2025-07-13 19:44:23,891 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747721_6897 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747721
2025-07-13 19:46:23,878 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747723_6899 src: /192.168.158.1:39092 dest: /192.168.158.4:9866
2025-07-13 19:46:23,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39092, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864230156_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747723_6899, duration(ns): 23217022
2025-07-13 19:46:23,911 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747723_6899, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-13 19:46:26,895 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747723_6899 replica FinalizedReplica, blk_1073747723_6899, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747723 for deletion
2025-07-13 19:46:26,896 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747723_6899 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747723
2025-07-13 19:47:23,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747724_6900 src: /192.168.158.7:35416 dest: /192.168.158.4:9866
2025-07-13 19:47:23,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35416, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1151007648_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747724_6900, duration(ns): 20621352
2025-07-13 19:47:23,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747724_6900, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 19:47:29,899 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747724_6900 replica FinalizedReplica, blk_1073747724_6900, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747724 for deletion
2025-07-13 19:47:29,900 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747724_6900 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747724
2025-07-13 19:48:23,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747725_6901 src: /192.168.158.1:45876 dest: /192.168.158.4:9866
2025-07-13 19:48:23,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45876, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_338818923_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747725_6901, duration(ns): 21901458
2025-07-13 19:48:23,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747725_6901, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-13 19:48:26,903 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747725_6901 replica FinalizedReplica, blk_1073747725_6901, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747725 for deletion
2025-07-13 19:48:26,904 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747725_6901 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747725
2025-07-13 19:52:28,907 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747729_6905 src: /192.168.158.1:47324 dest: /192.168.158.4:9866
2025-07-13 19:52:28,940 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47324, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1448934157_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747729_6905, duration(ns): 24165988
2025-07-13 19:52:28,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747729_6905, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-13 19:52:32,912 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747729_6905 replica FinalizedReplica, blk_1073747729_6905, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747729 for deletion
2025-07-13 19:52:32,913 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747729_6905 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747729
2025-07-13 19:55:28,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747732_6908 src: /192.168.158.5:34742 dest: /192.168.158.4:9866
2025-07-13 19:55:28,970 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34742, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1125133475_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747732_6908, duration(ns): 20885630
2025-07-13 19:55:28,970 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747732_6908, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 19:55:32,919 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747732_6908 replica FinalizedReplica, blk_1073747732_6908, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747732 for deletion
2025-07-13 19:55:32,920 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747732_6908 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747732
2025-07-13 19:56:28,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747733_6909 src: /192.168.158.5:55144 dest: /192.168.158.4:9866
2025-07-13 19:56:28,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55144, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1036770218_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747733_6909, duration(ns): 15027486
2025-07-13 19:56:28,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747733_6909, type=LAST_IN_PIPELINE terminating
2025-07-13 19:56:32,923 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747733_6909 replica FinalizedReplica, blk_1073747733_6909, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747733 for deletion
2025-07-13 19:56:32,924 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747733_6909 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747733
2025-07-13 19:57:33,911 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747734_6910 src: /192.168.158.1:44424 dest: /192.168.158.4:9866
2025-07-13 19:57:33,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44424, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_465298715_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747734_6910, duration(ns): 21515697
2025-07-13 19:57:33,942 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747734_6910, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-13 19:57:35,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747734_6910 replica FinalizedReplica, blk_1073747734_6910, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747734 for deletion
2025-07-13 19:57:35,929 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747734_6910 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747734
2025-07-13 20:01:38,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747738_6914 src: /192.168.158.1:32832 dest: /192.168.158.4:9866
2025-07-13 20:01:38,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:32832, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_761879859_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747738_6914, duration(ns): 20197616
2025-07-13 20:01:38,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747738_6914, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-13 20:01:44,935 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747738_6914 replica FinalizedReplica, blk_1073747738_6914, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747738 for deletion
2025-07-13 20:01:44,936 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747738_6914 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747738
2025-07-13 20:03:43,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747740_6916 src: /192.168.158.6:50096 dest: /192.168.158.4:9866
2025-07-13 20:03:43,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50096, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1905175824_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747740_6916, duration(ns): 20068552
2025-07-13 20:03:43,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747740_6916, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-13 20:03:47,938 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747740_6916 replica FinalizedReplica, blk_1073747740_6916, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747740 for deletion
2025-07-13 20:03:47,939 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747740_6916 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747740
2025-07-13 20:05:43,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747742_6918 src: /192.168.158.6:53532 dest: /192.168.158.4:9866
2025-07-13 20:05:43,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53532, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_151596889_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747742_6918, duration(ns): 16857865
2025-07-13 20:05:43,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747742_6918, type=LAST_IN_PIPELINE terminating
2025-07-13 20:05:47,942 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747742_6918 replica FinalizedReplica, blk_1073747742_6918, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747742 for deletion
2025-07-13 20:05:47,944 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747742_6918 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747742
2025-07-13 20:06:43,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747743_6919 src: /192.168.158.1:49514 dest: /192.168.158.4:9866
2025-07-13 20:06:43,953 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49514, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2091053667_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747743_6919, duration(ns): 27802369
2025-07-13 20:06:43,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747743_6919, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-13 20:06:47,947 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747743_6919 replica FinalizedReplica, blk_1073747743_6919, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747743 for deletion
2025-07-13 20:06:47,948 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747743_6919 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747743
2025-07-13 20:08:48,913 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747745_6921 src: /192.168.158.1:37962 dest: /192.168.158.4:9866
2025-07-13 20:08:48,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37962, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_229696656_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747745_6921, duration(ns): 25446396
2025-07-13 20:08:48,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747745_6921, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-13 20:08:53,955 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747745_6921 replica FinalizedReplica, blk_1073747745_6921, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747745 for deletion
2025-07-13 20:08:53,956 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747745_6921 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747745
2025-07-13 20:09:48,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747746_6922 src: /192.168.158.1:54840 dest: /192.168.158.4:9866
2025-07-13 20:09:48,949 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54840, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1948050262_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747746_6922, duration(ns): 23486694
2025-07-13 20:09:48,949 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747746_6922, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-13 20:09:50,959 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747746_6922 replica FinalizedReplica, blk_1073747746_6922, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747746 for deletion
2025-07-13 20:09:50,960 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747746_6922 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747746
2025-07-13 20:10:48,919 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747747_6923 src: /192.168.158.8:33354 dest: /192.168.158.4:9866
2025-07-13 20:10:48,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33354, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1177981227_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747747_6923, duration(ns): 15934201
2025-07-13 20:10:48,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747747_6923, type=LAST_IN_PIPELINE terminating
2025-07-13 20:10:50,962 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747747_6923 replica FinalizedReplica, blk_1073747747_6923, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747747 for deletion
2025-07-13 20:10:50,963 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747747_6923 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747747
2025-07-13 20:13:53,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747750_6926 src: /192.168.158.1:45846 dest: /192.168.158.4:9866
2025-07-13 20:13:53,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45846, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_527490640_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747750_6926, duration(ns): 23822149
2025-07-13 20:13:53,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747750_6926, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-13 20:13:56,969 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747750_6926 replica FinalizedReplica, blk_1073747750_6926, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747750 for deletion
2025-07-13 20:13:56,970 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747750_6926 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747750
2025-07-13 20:14:53,924 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747751_6927 src: /192.168.158.1:37790 dest: /192.168.158.4:9866
2025-07-13 20:14:53,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37790, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1787537486_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747751_6927, duration(ns): 21879759
2025-07-13 20:14:53,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747751_6927, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-13 20:14:56,971 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747751_6927 replica FinalizedReplica, blk_1073747751_6927, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747751 for deletion
2025-07-13 20:14:56,972 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747751_6927 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747751
2025-07-13 20:16:58,928 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747753_6929 src: /192.168.158.5:36664 dest: /192.168.158.4:9866
2025-07-13 20:16:58,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36664, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-824673525_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747753_6929, duration(ns): 15327234
2025-07-13 20:16:58,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747753_6929, type=LAST_IN_PIPELINE terminating
2025-07-13 20:17:02,998 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747753_6929 replica FinalizedReplica, blk_1073747753_6929, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747753 for deletion
2025-07-13 20:17:02,999 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747753_6929 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747753
2025-07-13 20:17:58,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747754_6930 src: /192.168.158.1:48514 dest: /192.168.158.4:9866
2025-07-13 20:17:58,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48514, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-810926800_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747754_6930, duration(ns): 23132795
2025-07-13 20:17:58,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747754_6930, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-13 20:18:02,975 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747754_6930 replica FinalizedReplica, blk_1073747754_6930, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747754 for deletion
2025-07-13 20:18:02,976 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747754_6930 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747754
2025-07-13 20:18:58,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747755_6931 src: /192.168.158.5:46900 dest: /192.168.158.4:9866
2025-07-13 20:18:58,953 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46900, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2084994871_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747755_6931, duration(ns): 14499936
2025-07-13 20:18:58,953 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747755_6931, type=LAST_IN_PIPELINE terminating
2025-07-13 20:19:02,977 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747755_6931 replica FinalizedReplica, blk_1073747755_6931, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747755 for deletion
2025-07-13 20:19:02,978 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747755_6931 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747755
2025-07-13 20:20:03,935 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747756_6932 src: /192.168.158.7:43816 dest: /192.168.158.4:9866
2025-07-13 20:20:03,952 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43816, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1649562600_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747756_6932, duration(ns): 15001609
2025-07-13 20:20:03,952 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747756_6932, type=LAST_IN_PIPELINE terminating
2025-07-13 20:20:05,978 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747756_6932 replica
FinalizedReplica, blk_1073747756_6932, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747756 for deletion 2025-07-13 20:20:05,979 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747756_6932 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747756 2025-07-13 20:25:18,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747761_6937 src: /192.168.158.9:39956 dest: /192.168.158.4:9866 2025-07-13 20:25:18,953 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39956, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1541540453_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747761_6937, duration(ns): 16474091 2025-07-13 20:25:18,953 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747761_6937, type=LAST_IN_PIPELINE terminating 2025-07-13 20:25:20,992 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747761_6937 replica FinalizedReplica, blk_1073747761_6937, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747761 for deletion 2025-07-13 20:25:20,993 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747761_6937 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747761 2025-07-13 20:26:18,950 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747762_6938 src: /192.168.158.1:59502 dest: /192.168.158.4:9866 2025-07-13 20:26:18,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59502, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1128552690_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747762_6938, duration(ns): 21289034 2025-07-13 20:26:18,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747762_6938, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-13 20:26:23,994 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747762_6938 replica FinalizedReplica, blk_1073747762_6938, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747762 for deletion 2025-07-13 20:26:23,995 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747762_6938 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747762 2025-07-13 20:28:18,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747764_6940 src: /192.168.158.5:49702 dest: /192.168.158.4:9866 2025-07-13 20:28:18,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.5:49702, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1593399214_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747764_6940, duration(ns): 18604477 2025-07-13 20:28:18,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747764_6940, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-13 20:28:24,001 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747764_6940 replica FinalizedReplica, blk_1073747764_6940, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747764 for deletion 2025-07-13 20:28:24,002 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747764_6940 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747764 2025-07-13 20:29:23,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747765_6941 src: /192.168.158.6:40692 dest: /192.168.158.4:9866 2025-07-13 20:29:23,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40692, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1808376530_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747765_6941, duration(ns): 19065381 2025-07-13 20:29:23,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747765_6941, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-13 20:29:30,002 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747765_6941 replica FinalizedReplica, blk_1073747765_6941, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747765 for deletion 2025-07-13 20:29:30,003 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747765_6941 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747765 2025-07-13 20:30:23,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747766_6942 src: /192.168.158.6:51188 dest: /192.168.158.4:9866 2025-07-13 20:30:23,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51188, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1204089911_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747766_6942, duration(ns): 15534339 2025-07-13 20:30:23,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747766_6942, type=LAST_IN_PIPELINE terminating 2025-07-13 20:30:27,004 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747766_6942 replica FinalizedReplica, blk_1073747766_6942, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747766 for deletion 2025-07-13 20:30:27,005 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747766_6942 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747766 2025-07-13 20:31:28,942 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747767_6943 src: /192.168.158.7:34876 dest: /192.168.158.4:9866 2025-07-13 20:31:28,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34876, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1963830916_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747767_6943, duration(ns): 15261629 2025-07-13 20:31:28,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747767_6943, type=LAST_IN_PIPELINE terminating 2025-07-13 20:31:33,006 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747767_6943 replica FinalizedReplica, blk_1073747767_6943, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747767 for deletion 2025-07-13 20:31:33,007 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747767_6943 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747767 2025-07-13 20:33:33,981 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747769_6945 src: /192.168.158.8:35026 dest: /192.168.158.4:9866 2025-07-13 20:33:34,001 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35026, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1957235298_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747769_6945, duration(ns): 17426480 2025-07-13 20:33:34,001 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747769_6945, type=LAST_IN_PIPELINE terminating 2025-07-13 20:33:36,011 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747769_6945 replica FinalizedReplica, blk_1073747769_6945, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747769 for deletion 2025-07-13 20:33:36,012 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747769_6945 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747769 2025-07-13 20:34:33,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747770_6946 src: /192.168.158.8:42186 dest: /192.168.158.4:9866 2025-07-13 20:34:34,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42186, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1094503526_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073747770_6946, duration(ns): 17022433 2025-07-13 20:34:34,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747770_6946, type=LAST_IN_PIPELINE terminating 2025-07-13 20:34:39,012 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747770_6946 replica FinalizedReplica, blk_1073747770_6946, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747770 for deletion 2025-07-13 20:34:39,013 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747770_6946 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747770 2025-07-13 20:35:33,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747771_6947 src: /192.168.158.5:41856 dest: /192.168.158.4:9866 2025-07-13 20:35:34,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41856, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-197380382_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747771_6947, duration(ns): 18216414 2025-07-13 20:35:34,008 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747771_6947, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-13 20:35:39,013 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747771_6947 replica FinalizedReplica, 
blk_1073747771_6947, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747771 for deletion 2025-07-13 20:35:39,015 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747771_6947 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747771 2025-07-13 20:37:33,988 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747773_6949 src: /192.168.158.7:54932 dest: /192.168.158.4:9866 2025-07-13 20:37:34,005 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54932, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_950467185_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747773_6949, duration(ns): 15381093 2025-07-13 20:37:34,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747773_6949, type=LAST_IN_PIPELINE terminating 2025-07-13 20:37:36,018 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747773_6949 replica FinalizedReplica, blk_1073747773_6949, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747773 for deletion 2025-07-13 20:37:36,019 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747773_6949 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747773 2025-07-13 20:40:43,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747776_6952 src: /192.168.158.9:58610 dest: /192.168.158.4:9866 2025-07-13 20:40:43,996 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58610, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1309343489_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747776_6952, duration(ns): 16899577 2025-07-13 20:40:43,996 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747776_6952, type=LAST_IN_PIPELINE terminating 2025-07-13 20:40:48,024 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747776_6952 replica FinalizedReplica, blk_1073747776_6952, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747776 for deletion 2025-07-13 20:40:48,025 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747776_6952 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747776 2025-07-13 20:41:43,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747777_6953 src: /192.168.158.5:51304 dest: /192.168.158.4:9866 2025-07-13 20:41:44,005 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51304, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-188304136_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747777_6953, duration(ns): 17301115 2025-07-13 20:41:44,005 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747777_6953, type=LAST_IN_PIPELINE terminating 2025-07-13 20:41:48,025 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747777_6953 replica FinalizedReplica, blk_1073747777_6953, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747777 for deletion 2025-07-13 20:41:48,027 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747777_6953 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747777 2025-07-13 20:42:43,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747778_6954 src: /192.168.158.9:36336 dest: /192.168.158.4:9866 2025-07-13 20:42:44,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1583064397_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747778_6954, duration(ns): 17008755 2025-07-13 20:42:44,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747778_6954, type=LAST_IN_PIPELINE terminating 2025-07-13 20:42:48,028 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747778_6954 replica FinalizedReplica, blk_1073747778_6954, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747778 for deletion 2025-07-13 20:42:48,029 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747778_6954 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747778 2025-07-13 20:43:43,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747779_6955 src: /192.168.158.6:50340 dest: /192.168.158.4:9866 2025-07-13 20:43:44,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50340, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1947826395_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747779_6955, duration(ns): 21298361 2025-07-13 20:43:44,021 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747779_6955, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-13 20:43:51,030 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747779_6955 replica FinalizedReplica, blk_1073747779_6955, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747779 for deletion 2025-07-13 20:43:51,031 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747779_6955 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747779 2025-07-13 20:45:43,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747781_6957 src: /192.168.158.6:48048 dest: /192.168.158.4:9866 2025-07-13 20:45:43,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48048, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1834206680_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747781_6957, duration(ns): 17054948 2025-07-13 20:45:43,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747781_6957, type=LAST_IN_PIPELINE terminating 2025-07-13 20:45:48,037 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747781_6957 replica FinalizedReplica, blk_1073747781_6957, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747781 for deletion 2025-07-13 20:45:48,039 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747781_6957 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747781 2025-07-13 20:51:48,990 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747787_6963 src: /192.168.158.1:44072 dest: /192.168.158.4:9866 2025-07-13 
20:51:49,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44072, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1849946457_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747787_6963, duration(ns): 26136080 2025-07-13 20:51:49,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747787_6963, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-13 20:51:51,053 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747787_6963 replica FinalizedReplica, blk_1073747787_6963, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747787 for deletion 2025-07-13 20:51:51,054 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747787_6963 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747787 2025-07-13 20:53:54,005 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747789_6965 src: /192.168.158.8:37296 dest: /192.168.158.4:9866 2025-07-13 20:53:54,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37296, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2013079110_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747789_6965, duration(ns): 18895219 2025-07-13 20:53:54,030 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747789_6965, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-13 20:53:57,057 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747789_6965 replica FinalizedReplica, blk_1073747789_6965, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747789 for deletion
2025-07-13 20:53:57,058 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747789_6965 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747789
2025-07-13 20:54:53,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747790_6966 src: /192.168.158.1:53600 dest: /192.168.158.4:9866
2025-07-13 20:54:54,012 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53600, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-374004349_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747790_6966, duration(ns): 24594499
2025-07-13 20:54:54,012 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747790_6966, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-13 20:55:00,061 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747790_6966 replica FinalizedReplica, blk_1073747790_6966, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747790 for deletion
2025-07-13 20:55:00,062 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747790_6966 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747790
2025-07-13 20:56:53,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747792_6968 src: /192.168.158.6:51764 dest: /192.168.158.4:9866
2025-07-13 20:56:54,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51764, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1032158354_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747792_6968, duration(ns): 17607956
2025-07-13 20:56:54,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747792_6968, type=LAST_IN_PIPELINE terminating
2025-07-13 20:57:00,066 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747792_6968 replica FinalizedReplica, blk_1073747792_6968, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747792 for deletion
2025-07-13 20:57:00,067 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747792_6968 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747792
2025-07-13 20:58:58,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747794_6970 src: /192.168.158.5:53584 dest: /192.168.158.4:9866
2025-07-13 20:58:59,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53584, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1268885059_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747794_6970, duration(ns): 19365764
2025-07-13 20:58:59,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747794_6970, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 20:59:03,068 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747794_6970 replica FinalizedReplica, blk_1073747794_6970, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747794 for deletion
2025-07-13 20:59:03,069 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747794_6970 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747794
2025-07-13 20:59:58,990 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747795_6971 src: /192.168.158.7:58142 dest: /192.168.158.4:9866
2025-07-13 20:59:59,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58142, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-619246597_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747795_6971, duration(ns): 14855689
2025-07-13 20:59:59,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747795_6971, type=LAST_IN_PIPELINE terminating
2025-07-13 21:00:03,069 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747795_6971 replica FinalizedReplica, blk_1073747795_6971, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747795 for deletion
2025-07-13 21:00:03,071 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747795_6971 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747795
2025-07-13 21:02:58,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747798_6974 src: /192.168.158.7:59512 dest: /192.168.158.4:9866
2025-07-13 21:02:59,018 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59512, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1526496541_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747798_6974, duration(ns): 16898865
2025-07-13 21:02:59,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747798_6974, type=LAST_IN_PIPELINE terminating
2025-07-13 21:03:03,076 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747798_6974 replica FinalizedReplica, blk_1073747798_6974, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747798 for deletion
2025-07-13 21:03:03,077 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747798_6974 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747798
2025-07-13 21:09:14,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747804_6980 src: /192.168.158.6:42994 dest: /192.168.158.4:9866
2025-07-13 21:09:14,056 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42994, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1483117900_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747804_6980, duration(ns): 19990930
2025-07-13 21:09:14,057 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747804_6980, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-13 21:09:18,086 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747804_6980 replica FinalizedReplica, blk_1073747804_6980, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747804 for deletion
2025-07-13 21:09:18,087 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747804_6980 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747804
2025-07-13 21:11:14,043 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747806_6982 src: /192.168.158.9:47814 dest: /192.168.158.4:9866
2025-07-13 21:11:14,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47814, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1638437432_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747806_6982, duration(ns): 19874996
2025-07-13 21:11:14,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747806_6982, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 21:11:18,090 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747806_6982 replica FinalizedReplica, blk_1073747806_6982, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747806 for deletion
2025-07-13 21:11:18,091 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747806_6982 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747806
2025-07-13 21:12:14,053 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747807_6983 src: /192.168.158.1:50220 dest: /192.168.158.4:9866
2025-07-13 21:12:14,086 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50220, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1239205322_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747807_6983, duration(ns): 23094134
2025-07-13 21:12:14,086 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747807_6983, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-13 21:12:18,092 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747807_6983 replica FinalizedReplica, blk_1073747807_6983, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747807 for deletion
2025-07-13 21:12:18,093 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747807_6983 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747807
2025-07-13 21:14:14,056 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747809_6985 src: /192.168.158.5:54774 dest: /192.168.158.4:9866
2025-07-13 21:14:14,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54774, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_282673156_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747809_6985, duration(ns): 17650216
2025-07-13 21:14:14,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747809_6985, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 21:14:18,095 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747809_6985 replica FinalizedReplica, blk_1073747809_6985, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747809 for deletion
2025-07-13 21:14:18,096 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747809_6985 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747809
2025-07-13 21:15:19,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747810_6986 src: /192.168.158.1:57398 dest: /192.168.158.4:9866
2025-07-13 21:15:19,073 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57398, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2007146193_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747810_6986, duration(ns): 23309289
2025-07-13 21:15:19,073 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747810_6986, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-13 21:15:24,097 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747810_6986 replica FinalizedReplica, blk_1073747810_6986, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747810 for deletion
2025-07-13 21:15:24,098 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747810_6986 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747810
2025-07-13 21:16:19,053 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747811_6987 src: /192.168.158.1:54224 dest: /192.168.158.4:9866
2025-07-13 21:16:19,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54224, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_825883331_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747811_6987, duration(ns): 21405180
2025-07-13 21:16:19,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747811_6987, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-13 21:16:21,098 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747811_6987 replica FinalizedReplica, blk_1073747811_6987, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747811 for deletion
2025-07-13 21:16:21,099 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747811_6987 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747811
2025-07-13 21:18:19,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747813_6989 src: /192.168.158.5:47230 dest: /192.168.158.4:9866
2025-07-13 21:18:19,081 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47230, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_28025342_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747813_6989, duration(ns): 16342858
2025-07-13 21:18:19,081 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747813_6989, type=LAST_IN_PIPELINE terminating
2025-07-13 21:18:21,100 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747813_6989 replica FinalizedReplica, blk_1073747813_6989, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747813 for deletion
2025-07-13 21:18:21,101 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747813_6989 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747813
2025-07-13 21:20:24,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747815_6991 src: /192.168.158.6:37990 dest: /192.168.158.4:9866
2025-07-13 21:20:24,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37990, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2079716058_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747815_6991, duration(ns): 17126377
2025-07-13 21:20:24,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747815_6991, type=LAST_IN_PIPELINE terminating
2025-07-13 21:20:30,105 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747815_6991 replica FinalizedReplica, blk_1073747815_6991, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747815 for deletion
2025-07-13 21:20:30,106 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747815_6991 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747815
2025-07-13 21:23:29,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747818_6994 src: /192.168.158.8:57010 dest: /192.168.158.4:9866
2025-07-13 21:23:29,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57010, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-925892527_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747818_6994, duration(ns): 17914886
2025-07-13 21:23:29,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747818_6994, type=LAST_IN_PIPELINE terminating
2025-07-13 21:23:36,110 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747818_6994 replica FinalizedReplica, blk_1073747818_6994, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747818 for deletion
2025-07-13 21:23:36,112 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747818_6994 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747818
2025-07-13 21:26:39,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747821_6997 src: /192.168.158.9:38442 dest: /192.168.158.4:9866
2025-07-13 21:26:39,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38442, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-114586033_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747821_6997, duration(ns): 20164001
2025-07-13 21:26:39,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747821_6997, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 21:26:42,116 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747821_6997 replica FinalizedReplica, blk_1073747821_6997, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747821 for deletion
2025-07-13 21:26:42,117 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747821_6997 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747821
2025-07-13 21:28:44,087 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747823_6999 src: /192.168.158.7:39574 dest: /192.168.158.4:9866
2025-07-13 21:28:44,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39574, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236240_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747823_6999, duration(ns): 15809588
2025-07-13 21:28:44,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747823_6999, type=LAST_IN_PIPELINE terminating
2025-07-13 21:28:48,121 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747823_6999 replica FinalizedReplica, blk_1073747823_6999, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747823 for deletion
2025-07-13 21:28:48,122 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747823_6999 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747823
2025-07-13 21:30:49,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747825_7001 src: /192.168.158.1:50770 dest: /192.168.158.4:9866
2025-07-13 21:30:49,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50770, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-337284569_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747825_7001, duration(ns): 26010770
2025-07-13 21:30:49,098 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747825_7001, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-13 21:30:51,124 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747825_7001 replica FinalizedReplica, blk_1073747825_7001, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747825 for deletion
2025-07-13 21:30:51,126 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747825_7001 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747825
2025-07-13 21:33:59,081 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747828_7004 src: /192.168.158.9:39540 dest: /192.168.158.4:9866
2025-07-13 21:33:59,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39540, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1295296843_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747828_7004, duration(ns): 19080951
2025-07-13 21:33:59,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747828_7004, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-13 21:34:03,135 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747828_7004 replica FinalizedReplica, blk_1073747828_7004, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747828 for deletion
2025-07-13 21:34:03,136 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747828_7004 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747828
2025-07-13 21:35:04,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747829_7005 src: /192.168.158.5:37862 dest: /192.168.158.4:9866
2025-07-13 21:35:04,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37862, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-653002791_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747829_7005, duration(ns): 17758175
2025-07-13 21:35:04,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747829_7005, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 21:35:06,138 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747829_7005 replica FinalizedReplica, blk_1073747829_7005, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747829 for deletion
2025-07-13 21:35:06,139 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747829_7005 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747829
2025-07-13 21:36:04,082 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747830_7006 src: /192.168.158.9:36392 dest: /192.168.158.4:9866
2025-07-13 21:36:04,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36392, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1815718676_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747830_7006, duration(ns): 15863559
2025-07-13 21:36:04,100 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747830_7006, type=LAST_IN_PIPELINE terminating
2025-07-13 21:36:06,139 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747830_7006 replica FinalizedReplica, blk_1073747830_7006, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747830 for deletion
2025-07-13 21:36:06,141 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747830_7006 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747830
2025-07-13 21:37:04,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747831_7007 src: /192.168.158.8:36820 dest: /192.168.158.4:9866
2025-07-13 21:37:04,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36820, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_844403667_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747831_7007, duration(ns): 14815166
2025-07-13 21:37:04,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747831_7007, type=LAST_IN_PIPELINE terminating
2025-07-13 21:37:06,140 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747831_7007 replica FinalizedReplica, blk_1073747831_7007, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747831 for deletion
2025-07-13 21:37:06,141 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747831_7007 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747831
2025-07-13 21:40:04,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747834_7010 src: /192.168.158.9:42832 dest: /192.168.158.4:9866
2025-07-13 21:40:04,096 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42832, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2044088547_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747834_7010, duration(ns): 21150631
2025-07-13 21:40:04,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747834_7010, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 21:40:09,146 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747834_7010 replica FinalizedReplica, blk_1073747834_7010, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747834 for deletion
2025-07-13 21:40:09,147 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747834_7010 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747834
2025-07-13 21:43:09,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747837_7013 src: /192.168.158.8:57878 dest: /192.168.158.4:9866
2025-07-13 21:43:09,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57878, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-896041786_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747837_7013, duration(ns): 17187497
2025-07-13 21:43:09,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747837_7013, type=LAST_IN_PIPELINE terminating
2025-07-13 21:43:15,156 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747837_7013 replica FinalizedReplica, blk_1073747837_7013, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747837 for deletion
2025-07-13 21:43:15,157 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747837_7013 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747837
2025-07-13 21:44:14,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747838_7014 src: /192.168.158.7:45668 dest: /192.168.158.4:9866
2025-07-13 21:44:14,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45668, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1088770436_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747838_7014, duration(ns): 17319132
2025-07-13 21:44:14,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747838_7014, type=LAST_IN_PIPELINE terminating
2025-07-13 21:44:15,158 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747838_7014 replica FinalizedReplica, blk_1073747838_7014, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747838 for deletion
2025-07-13 21:44:15,159 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747838_7014 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747838
2025-07-13 21:49:29,086 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747843_7019 src: /192.168.158.9:56546 dest: /192.168.158.4:9866
2025-07-13 21:49:29,111 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56546, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1713883567_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747843_7019, duration(ns): 19510997
2025-07-13 21:49:29,111 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747843_7019, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 21:49:30,174 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747843_7019 replica FinalizedReplica, blk_1073747843_7019, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747843 for deletion
2025-07-13 21:49:30,175 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747843_7019 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747843
2025-07-13 21:54:34,078 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747848_7024 src: /192.168.158.5:57508 dest: /192.168.158.4:9866
2025-07-13 21:54:34,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57508, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1321298323_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747848_7024, duration(ns): 19724707
2025-07-13 21:54:34,104 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747848_7024, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-13 21:54:39,189 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747848_7024 replica FinalizedReplica, blk_1073747848_7024, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747848 for deletion
2025-07-13 21:54:39,190 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747848_7024 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747848
2025-07-13 21:56:34,101 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747850_7026 src: /192.168.158.9:59048 dest: /192.168.158.4:9866
2025-07-13 21:56:34,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59048, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-45822111_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747850_7026, duration(ns): 17480708
2025-07-13 21:56:34,121 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747850_7026, type=LAST_IN_PIPELINE terminating 2025-07-13 21:56:36,191 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747850_7026 replica FinalizedReplica, blk_1073747850_7026, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747850 for deletion 2025-07-13 21:56:36,193 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747850_7026 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747850 2025-07-13 21:57:39,086 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747851_7027 src: /192.168.158.1:37574 dest: /192.168.158.4:9866 2025-07-13 21:57:39,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37574, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_308517834_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747851_7027, duration(ns): 23489832 2025-07-13 21:57:39,119 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747851_7027, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-13 21:57:42,195 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747851_7027 replica FinalizedReplica, blk_1073747851_7027, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 
getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747851 for deletion 2025-07-13 21:57:42,196 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747851_7027 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747851 2025-07-13 21:58:39,114 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747852_7028 src: /192.168.158.1:46654 dest: /192.168.158.4:9866 2025-07-13 21:58:39,145 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46654, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1450161393_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747852_7028, duration(ns): 22355572 2025-07-13 21:58:39,145 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747852_7028, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-13 21:58:42,195 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747852_7028 replica FinalizedReplica, blk_1073747852_7028, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747852 for deletion 2025-07-13 21:58:42,196 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747852_7028 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747852 2025-07-13 21:59:39,104 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747853_7029 src: /192.168.158.6:60568 dest: /192.168.158.4:9866 2025-07-13 21:59:39,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60568, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1084469596_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747853_7029, duration(ns): 16045974 2025-07-13 21:59:39,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747853_7029, type=LAST_IN_PIPELINE terminating 2025-07-13 21:59:42,196 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747853_7029 replica FinalizedReplica, blk_1073747853_7029, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747853 for deletion 2025-07-13 21:59:42,198 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747853_7029 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747853 2025-07-13 22:02:49,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747856_7032 src: /192.168.158.8:38546 dest: /192.168.158.4:9866 2025-07-13 22:02:49,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38546, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2113792529_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747856_7032, duration(ns): 14917140 2025-07-13 22:02:49,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747856_7032, type=LAST_IN_PIPELINE terminating 2025-07-13 22:02:51,203 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747856_7032 replica FinalizedReplica, blk_1073747856_7032, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747856 for deletion 2025-07-13 22:02:51,204 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747856_7032 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747856 2025-07-13 22:05:49,129 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747859_7035 src: /192.168.158.6:39626 dest: /192.168.158.4:9866 2025-07-13 22:05:49,152 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39626, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-478057327_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747859_7035, duration(ns): 20237263 2025-07-13 22:05:49,152 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747859_7035, type=LAST_IN_PIPELINE terminating 2025-07-13 22:05:51,210 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747859_7035 replica FinalizedReplica, blk_1073747859_7035, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747859 for deletion 2025-07-13 22:05:51,211 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747859_7035 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747859 2025-07-13 22:06:54,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747860_7036 src: /192.168.158.8:46194 dest: /192.168.158.4:9866 2025-07-13 22:06:54,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46194, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1971767535_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747860_7036, duration(ns): 20413679 2025-07-13 22:06:54,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747860_7036, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-13 22:06:57,214 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747860_7036 replica FinalizedReplica, blk_1073747860_7036, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747860 for deletion 2025-07-13 22:06:57,215 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747860_7036 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747860 2025-07-13 22:10:54,119 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747864_7040 src: /192.168.158.6:43940 dest: /192.168.158.4:9866 2025-07-13 22:10:54,145 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43940, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1617520003_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747864_7040, duration(ns): 20515849 2025-07-13 22:10:54,145 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747864_7040, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-13 22:10:57,221 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747864_7040 replica FinalizedReplica, blk_1073747864_7040, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747864 for deletion 2025-07-13 22:10:57,223 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747864_7040 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747864 2025-07-13 22:11:54,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747865_7041 src: 
/192.168.158.8:46142 dest: /192.168.158.4:9866 2025-07-13 22:11:54,141 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46142, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_884417984_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747865_7041, duration(ns): 18596612 2025-07-13 22:11:54,141 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747865_7041, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-13 22:11:57,226 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747865_7041 replica FinalizedReplica, blk_1073747865_7041, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747865 for deletion 2025-07-13 22:11:57,227 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747865_7041 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747865 2025-07-13 22:12:54,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747866_7042 src: /192.168.158.9:36648 dest: /192.168.158.4:9866 2025-07-13 22:12:54,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36648, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1446946286_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747866_7042, duration(ns): 20387292 2025-07-13 22:12:54,142 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747866_7042, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-13 22:13:00,227 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747866_7042 replica FinalizedReplica, blk_1073747866_7042, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747866 for deletion 2025-07-13 22:13:00,228 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747866_7042 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747866 2025-07-13 22:13:54,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747867_7043 src: /192.168.158.6:43120 dest: /192.168.158.4:9866 2025-07-13 22:13:54,143 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43120, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1256918091_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747867_7043, duration(ns): 19042326 2025-07-13 22:13:54,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747867_7043, type=LAST_IN_PIPELINE terminating 2025-07-13 22:13:57,229 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747867_7043 replica FinalizedReplica, blk_1073747867_7043, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747867 for deletion 2025-07-13 22:13:57,230 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747867_7043 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747867 2025-07-13 22:15:54,140 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747869_7045 src: /192.168.158.9:37982 dest: /192.168.158.4:9866 2025-07-13 22:15:54,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37982, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_180776545_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747869_7045, duration(ns): 18732104 2025-07-13 22:15:54,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747869_7045, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-13 22:16:00,235 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747869_7045 replica FinalizedReplica, blk_1073747869_7045, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747869 for deletion 2025-07-13 22:16:00,236 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747869_7045 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747869 2025-07-13 22:16:54,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747870_7046 src: /192.168.158.1:59988 dest: /192.168.158.4:9866 2025-07-13 22:16:54,165 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59988, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1088541032_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747870_7046, duration(ns): 22926142 2025-07-13 22:16:54,165 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747870_7046, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-13 22:17:00,239 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747870_7046 replica FinalizedReplica, blk_1073747870_7046, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747870 for deletion 2025-07-13 22:17:00,240 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747870_7046 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747870 2025-07-13 22:18:59,135 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747872_7048 src: /192.168.158.5:34764 dest: /192.168.158.4:9866 2025-07-13 22:18:59,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.5:34764, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-452719954_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747872_7048, duration(ns): 20111221 2025-07-13 22:18:59,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747872_7048, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-13 22:19:03,243 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747872_7048 replica FinalizedReplica, blk_1073747872_7048, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747872 for deletion 2025-07-13 22:19:03,244 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747872_7048 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747872 2025-07-13 22:22:04,148 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747875_7051 src: /192.168.158.8:52024 dest: /192.168.158.4:9866 2025-07-13 22:22:04,166 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52024, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_410098734_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747875_7051, duration(ns): 16157551 2025-07-13 22:22:04,166 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747875_7051, 
type=LAST_IN_PIPELINE terminating 2025-07-13 22:22:09,251 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747875_7051 replica FinalizedReplica, blk_1073747875_7051, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747875 for deletion 2025-07-13 22:22:09,252 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747875_7051 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747875 2025-07-13 22:23:04,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747876_7052 src: /192.168.158.8:42780 dest: /192.168.158.4:9866 2025-07-13 22:23:04,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1510456020_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747876_7052, duration(ns): 16055983 2025-07-13 22:23:04,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747876_7052, type=LAST_IN_PIPELINE terminating 2025-07-13 22:23:09,253 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747876_7052 replica FinalizedReplica, blk_1073747876_7052, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747876 for deletion 2025-07-13 
22:23:09,254 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747876_7052 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747876 2025-07-13 22:24:04,140 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747877_7053 src: /192.168.158.1:59740 dest: /192.168.158.4:9866 2025-07-13 22:24:04,170 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59740, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_421978481_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747877_7053, duration(ns): 20660504 2025-07-13 22:24:04,170 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747877_7053, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-13 22:24:06,254 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747877_7053 replica FinalizedReplica, blk_1073747877_7053, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747877 for deletion 2025-07-13 22:24:06,255 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747877_7053 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747877 2025-07-13 22:25:04,141 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073747878_7054 src: /192.168.158.8:45982 dest: /192.168.158.4:9866
2025-07-13 22:25:04,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45982, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2017644091_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747878_7054, duration(ns): 20517761
2025-07-13 22:25:04,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747878_7054, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 22:25:06,257 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747878_7054 replica FinalizedReplica, blk_1073747878_7054, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747878 for deletion
2025-07-13 22:25:06,258 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747878_7054 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747878
2025-07-13 22:26:04,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747879_7055 src: /192.168.158.1:49364 dest: /192.168.158.4:9866
2025-07-13 22:26:04,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49364, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-949511063_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747879_7055, duration(ns): 23139956
2025-07-13 22:26:04,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747879_7055, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-13 22:26:09,259 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747879_7055 replica FinalizedReplica, blk_1073747879_7055, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747879 for deletion
2025-07-13 22:26:09,260 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747879_7055 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747879
2025-07-13 22:27:04,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747880_7056 src: /192.168.158.1:53602 dest: /192.168.158.4:9866
2025-07-13 22:27:04,179 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53602, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1367628562_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747880_7056, duration(ns): 19852703
2025-07-13 22:27:04,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747880_7056, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-13 22:27:06,261 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747880_7056 replica FinalizedReplica, blk_1073747880_7056, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747880 for deletion
2025-07-13 22:27:06,262 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747880_7056 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747880
2025-07-13 22:28:04,148 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747881_7057 src: /192.168.158.7:34190 dest: /192.168.158.4:9866
2025-07-13 22:28:04,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34190, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1066663423_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747881_7057, duration(ns): 16938182
2025-07-13 22:28:04,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747881_7057, type=LAST_IN_PIPELINE terminating
2025-07-13 22:28:06,263 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747881_7057 replica FinalizedReplica, blk_1073747881_7057, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747881 for deletion
2025-07-13 22:28:06,264 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747881_7057 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747881
2025-07-13 22:35:14,159 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747888_7064 src: /192.168.158.7:51054 dest: /192.168.158.4:9866
2025-07-13 22:35:14,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51054, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1861344175_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747888_7064, duration(ns): 19740586
2025-07-13 22:35:14,185 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747888_7064, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-13 22:35:15,271 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747888_7064 replica FinalizedReplica, blk_1073747888_7064, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747888 for deletion
2025-07-13 22:35:15,272 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747888_7064 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747888
2025-07-13 22:36:19,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747889_7065 src: /192.168.158.1:38606 dest: /192.168.158.4:9866
2025-07-13 22:36:19,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38606, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2140817279_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747889_7065, duration(ns): 22722494
2025-07-13 22:36:19,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747889_7065, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-13 22:36:21,273 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747889_7065 replica FinalizedReplica, blk_1073747889_7065, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747889 for deletion
2025-07-13 22:36:21,274 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747889_7065 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747889
2025-07-13 22:39:24,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747892_7068 src: /192.168.158.7:53472 dest: /192.168.158.4:9866
2025-07-13 22:39:24,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1293004069_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747892_7068, duration(ns): 15648801
2025-07-13 22:39:24,179 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747892_7068, type=LAST_IN_PIPELINE terminating
2025-07-13 22:39:27,277 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747892_7068 replica FinalizedReplica, blk_1073747892_7068, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747892 for deletion
2025-07-13 22:39:27,278 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747892_7068 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747892
2025-07-13 22:40:29,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747893_7069 src: /192.168.158.1:45140 dest: /192.168.158.4:9866
2025-07-13 22:40:29,186 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45140, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-850368610_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747893_7069, duration(ns): 26072100
2025-07-13 22:40:29,186 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747893_7069, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-13 22:40:33,278 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747893_7069 replica FinalizedReplica, blk_1073747893_7069, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747893 for deletion
2025-07-13 22:40:33,279 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747893_7069 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747893
2025-07-13 22:45:34,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747898_7074 src: /192.168.158.5:53658 dest: /192.168.158.4:9866
2025-07-13 22:45:34,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53658, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1435892735_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747898_7074, duration(ns): 20782591
2025-07-13 22:45:34,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747898_7074, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 22:45:36,282 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747898_7074 replica FinalizedReplica, blk_1073747898_7074, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747898 for deletion
2025-07-13 22:45:36,283 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747898_7074 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747898
2025-07-13 22:48:39,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747901_7077 src: /192.168.158.1:47360 dest: /192.168.158.4:9866
2025-07-13 22:48:39,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47360, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2044078754_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747901_7077, duration(ns): 21948315
2025-07-13 22:48:39,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747901_7077, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-13 22:48:42,284 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747901_7077 replica FinalizedReplica, blk_1073747901_7077, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747901 for deletion
2025-07-13 22:48:42,285 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747901_7077 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747901
2025-07-13 22:49:44,170 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747902_7078 src: /192.168.158.9:48114 dest: /192.168.158.4:9866
2025-07-13 22:49:44,188 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48114, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1473378215_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747902_7078, duration(ns): 16703325
2025-07-13 22:49:44,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747902_7078, type=LAST_IN_PIPELINE terminating
2025-07-13 22:49:48,288 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747902_7078 replica FinalizedReplica, blk_1073747902_7078, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747902 for deletion
2025-07-13 22:49:48,289 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747902_7078 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747902
2025-07-13 22:52:54,175 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747905_7081 src: /192.168.158.9:43274 dest: /192.168.158.4:9866
2025-07-13 22:52:54,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43274, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1946228182_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747905_7081, duration(ns): 17893826
2025-07-13 22:52:54,199 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747905_7081, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 22:53:00,290 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747905_7081 replica FinalizedReplica, blk_1073747905_7081, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747905 for deletion
2025-07-13 22:53:00,292 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747905_7081 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747905
2025-07-13 22:56:04,176 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747908_7084 src: /192.168.158.6:33680 dest: /192.168.158.4:9866
2025-07-13 22:56:04,205 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33680, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-154702452_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747908_7084, duration(ns): 22221735
2025-07-13 22:56:04,205 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747908_7084, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-13 22:56:06,297 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747908_7084 replica FinalizedReplica, blk_1073747908_7084, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747908 for deletion
2025-07-13 22:56:06,298 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747908_7084 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747908
2025-07-13 22:57:04,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747909_7085 src: /192.168.158.5:54806 dest: /192.168.158.4:9866
2025-07-13 22:57:04,203 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54806, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1728899396_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747909_7085, duration(ns): 19875921
2025-07-13 22:57:04,204 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747909_7085, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-13 22:57:06,299 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747909_7085 replica FinalizedReplica, blk_1073747909_7085, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747909 for deletion
2025-07-13 22:57:06,300 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747909_7085 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747909
2025-07-13 23:00:04,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747912_7088 src: /192.168.158.5:48436 dest: /192.168.158.4:9866
2025-07-13 23:00:04,212 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48436, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1648981770_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747912_7088, duration(ns): 16943750
2025-07-13 23:00:04,212 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747912_7088, type=LAST_IN_PIPELINE terminating
2025-07-13 23:00:06,304 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747912_7088 replica FinalizedReplica, blk_1073747912_7088, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747912 for deletion
2025-07-13 23:00:06,305 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747912_7088 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747912
2025-07-13 23:01:04,185 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747913_7089 src: /192.168.158.5:50840 dest: /192.168.158.4:9866
2025-07-13 23:01:04,204 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50840, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2745580_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747913_7089, duration(ns): 17369836
2025-07-13 23:01:04,204 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747913_7089, type=LAST_IN_PIPELINE terminating
2025-07-13 23:01:06,304 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747913_7089 replica FinalizedReplica, blk_1073747913_7089, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747913 for deletion
2025-07-13 23:01:06,306 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747913_7089 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747913
2025-07-13 23:03:04,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747915_7091 src: /192.168.158.1:47340 dest: /192.168.158.4:9866
2025-07-13 23:03:04,217 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47340, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1672084236_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747915_7091, duration(ns): 21547972
2025-07-13 23:03:04,217 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747915_7091, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-13 23:03:06,312 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747915_7091 replica FinalizedReplica, blk_1073747915_7091, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747915 for deletion
2025-07-13 23:03:06,313 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747915_7091 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747915
2025-07-13 23:04:09,194 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747916_7092 src: /192.168.158.6:47012 dest: /192.168.158.4:9866
2025-07-13 23:04:09,223 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47012, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-524173611_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747916_7092, duration(ns): 23228840
2025-07-13 23:04:09,223 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747916_7092, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 23:04:15,313 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747916_7092 replica FinalizedReplica, blk_1073747916_7092, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747916 for deletion
2025-07-13 23:04:15,314 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747916_7092 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747916
2025-07-13 23:07:14,203 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747919_7095 src: /192.168.158.8:58934 dest: /192.168.158.4:9866
2025-07-13 23:07:14,228 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58934, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1799325803_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747919_7095, duration(ns): 19459934
2025-07-13 23:07:14,228 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747919_7095, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-13 23:07:15,319 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747919_7095 replica FinalizedReplica, blk_1073747919_7095, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747919 for deletion
2025-07-13 23:07:15,320 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747919_7095 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747919
2025-07-13 23:08:19,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747920_7096 src: /192.168.158.8:40892 dest: /192.168.158.4:9866
2025-07-13 23:08:19,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40892, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-203898195_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747920_7096, duration(ns): 19471664
2025-07-13 23:08:19,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747920_7096, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 23:08:24,321 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747920_7096 replica FinalizedReplica, blk_1073747920_7096, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747920 for deletion
2025-07-13 23:08:24,322 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747920_7096 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747920
2025-07-13 23:10:24,185 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747922_7098 src: /192.168.158.1:40276 dest: /192.168.158.4:9866
2025-07-13 23:10:24,216 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40276, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1482952100_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747922_7098, duration(ns): 23200662
2025-07-13 23:10:24,217 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747922_7098, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-13 23:10:30,323 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747922_7098 replica FinalizedReplica, blk_1073747922_7098, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747922 for deletion
2025-07-13 23:10:30,324 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747922_7098 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747922
2025-07-13 23:11:29,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747923_7099 src: /192.168.158.7:40240 dest: /192.168.158.4:9866
2025-07-13 23:11:29,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40240, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-192971305_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747923_7099, duration(ns): 17925988
2025-07-13 23:11:29,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747923_7099, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 23:11:30,324 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747923_7099 replica FinalizedReplica, blk_1073747923_7099, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747923 for deletion
2025-07-13 23:11:30,326 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747923_7099 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747923
2025-07-13 23:14:34,215 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747926_7102 src: /192.168.158.9:46000 dest: /192.168.158.4:9866
2025-07-13 23:14:34,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46000, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-567123508_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747926_7102, duration(ns): 16984989
2025-07-13 23:14:34,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747926_7102, type=LAST_IN_PIPELINE terminating
2025-07-13 23:14:39,334 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747926_7102 replica FinalizedReplica, blk_1073747926_7102, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747926 for deletion
2025-07-13 23:14:39,335 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747926_7102 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747926
2025-07-13 23:17:34,195 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747929_7105 src: /192.168.158.5:51214 dest: /192.168.158.4:9866
2025-07-13 23:17:34,221 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51214, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-955189046_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747929_7105, duration(ns): 21194840
2025-07-13 23:17:34,222 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747929_7105, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-13 23:17:36,338 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747929_7105 replica FinalizedReplica, blk_1073747929_7105, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747929 for deletion
2025-07-13 23:17:36,340 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747929_7105 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747929
2025-07-13 23:18:34,216 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747930_7106 src: /192.168.158.9:34896 dest: /192.168.158.4:9866
2025-07-13 23:18:34,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34896, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1889582927_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747930_7106, duration(ns): 18677275
2025-07-13 23:18:34,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747930_7106, type=LAST_IN_PIPELINE terminating
2025-07-13 23:18:39,348 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747930_7106 replica FinalizedReplica, blk_1073747930_7106, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747930 for deletion
2025-07-13 23:18:39,350 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747930_7106 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747930
2025-07-13 23:19:34,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747931_7107 src: /192.168.158.8:33776 dest: /192.168.158.4:9866
2025-07-13 23:19:34,235 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33776, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1239463310_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747931_7107, duration(ns): 19036163
2025-07-13 23:19:34,235 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747931_7107, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-13 23:19:36,352 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747931_7107 replica FinalizedReplica, blk_1073747931_7107, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747931 for deletion
2025-07-13 23:19:36,353 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747931_7107 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747931
2025-07-13 23:20:39,222 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747932_7108 src: /192.168.158.7:58274 dest: /192.168.158.4:9866
2025-07-13 23:20:39,240 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58274, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1978821633_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747932_7108, duration(ns): 16750382
2025-07-13 23:20:39,241 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747932_7108, type=LAST_IN_PIPELINE terminating
2025-07-13 23:20:45,353 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747932_7108 replica FinalizedReplica, blk_1073747932_7108, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747932 for deletion
2025-07-13 23:20:45,354 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073747932_7108 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747932 2025-07-13 23:22:44,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747934_7110 src: /192.168.158.5:60740 dest: /192.168.158.4:9866 2025-07-13 23:22:44,235 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60740, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_898361497_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747934_7110, duration(ns): 17840908 2025-07-13 23:22:44,235 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747934_7110, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-13 23:22:45,354 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747934_7110 replica FinalizedReplica, blk_1073747934_7110, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747934 for deletion 2025-07-13 23:22:45,355 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747934_7110 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747934 2025-07-13 23:25:49,225 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747937_7113 src: /192.168.158.6:45856 dest: /192.168.158.4:9866 2025-07-13 23:25:49,249 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45856, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1799933953_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747937_7113, duration(ns): 17796604 2025-07-13 23:25:49,249 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747937_7113, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-13 23:25:51,359 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747937_7113 replica FinalizedReplica, blk_1073747937_7113, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747937 for deletion 2025-07-13 23:25:51,360 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747937_7113 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747937 2025-07-13 23:26:49,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747938_7114 src: /192.168.158.1:47422 dest: /192.168.158.4:9866 2025-07-13 23:26:49,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47422, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-466467296_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747938_7114, duration(ns): 23066181 2025-07-13 23:26:49,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073747938_7114, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-13 23:26:51,362 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747938_7114 replica FinalizedReplica, blk_1073747938_7114, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747938 for deletion 2025-07-13 23:26:51,363 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747938_7114 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747938 2025-07-13 23:27:54,222 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747939_7115 src: /192.168.158.7:41662 dest: /192.168.158.4:9866 2025-07-13 23:27:54,241 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41662, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-725719711_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747939_7115, duration(ns): 16730346 2025-07-13 23:27:54,242 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747939_7115, type=LAST_IN_PIPELINE terminating 2025-07-13 23:28:00,366 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747939_7115 replica FinalizedReplica, blk_1073747939_7115, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747939 for deletion 2025-07-13 23:28:00,367 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747939_7115 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747939 2025-07-13 23:30:59,230 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747942_7118 src: /192.168.158.7:56604 dest: /192.168.158.4:9866 2025-07-13 23:30:59,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56604, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1773033131_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747942_7118, duration(ns): 21415444 2025-07-13 23:30:59,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747942_7118, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-13 23:31:00,372 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747942_7118 replica FinalizedReplica, blk_1073747942_7118, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747942 for deletion 2025-07-13 23:31:00,373 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747942_7118 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747942 
2025-07-13 23:31:59,228 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747943_7119 src: /192.168.158.1:49452 dest: /192.168.158.4:9866 2025-07-13 23:31:59,259 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49452, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1221745002_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747943_7119, duration(ns): 22519253 2025-07-13 23:31:59,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747943_7119, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-13 23:32:03,373 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747943_7119 replica FinalizedReplica, blk_1073747943_7119, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747943 for deletion 2025-07-13 23:32:03,374 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747943_7119 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747943 2025-07-13 23:34:04,228 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747945_7121 src: /192.168.158.1:41224 dest: /192.168.158.4:9866 2025-07-13 23:34:04,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41224, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1685013248_106, 
offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747945_7121, duration(ns): 23061739 2025-07-13 23:34:04,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747945_7121, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-13 23:34:09,374 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747945_7121 replica FinalizedReplica, blk_1073747945_7121, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747945 for deletion 2025-07-13 23:34:09,375 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747945_7121 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747945 2025-07-13 23:36:09,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747947_7123 src: /192.168.158.1:34924 dest: /192.168.158.4:9866 2025-07-13 23:36:09,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34924, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-588034677_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747947_7123, duration(ns): 20390501 2025-07-13 23:36:09,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747947_7123, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-13 
23:36:12,381 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747947_7123 replica FinalizedReplica, blk_1073747947_7123, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747947 for deletion 2025-07-13 23:36:12,382 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747947_7123 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747947 2025-07-13 23:36:13,268 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0 2025-07-13 23:37:21,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f36, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 1 msec to generate and 3 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5. 
2025-07-13 23:37:21,387 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360 2025-07-13 23:39:19,247 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747950_7126 src: /192.168.158.6:59430 dest: /192.168.158.4:9866 2025-07-13 23:39:19,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59430, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1986329265_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747950_7126, duration(ns): 17861882 2025-07-13 23:39:19,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747950_7126, type=LAST_IN_PIPELINE terminating 2025-07-13 23:39:21,392 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747950_7126 replica FinalizedReplica, blk_1073747950_7126, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747950 for deletion 2025-07-13 23:39:21,393 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747950_7126 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747950 2025-07-13 23:41:24,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747952_7128 src: /192.168.158.8:40486 dest: /192.168.158.4:9866 2025-07-13 23:41:24,283 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40486, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1104966372_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747952_7128, duration(ns): 21369470 2025-07-13 23:41:24,283 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747952_7128, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-13 23:41:30,393 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747952_7128 replica FinalizedReplica, blk_1073747952_7128, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747952 for deletion 2025-07-13 23:41:30,394 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747952_7128 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747952 2025-07-13 23:43:24,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747954_7130 src: /192.168.158.5:43518 dest: /192.168.158.4:9866 2025-07-13 23:43:24,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:43518, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1063780505_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747954_7130, duration(ns): 16292370 2025-07-13 23:43:24,304 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747954_7130, type=LAST_IN_PIPELINE terminating 
2025-07-13 23:43:30,396 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747954_7130 replica FinalizedReplica, blk_1073747954_7130, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747954 for deletion 2025-07-13 23:43:30,397 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747954_7130 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747954 2025-07-13 23:45:29,249 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747956_7132 src: /192.168.158.1:47568 dest: /192.168.158.4:9866 2025-07-13 23:45:29,280 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47568, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-98668315_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747956_7132, duration(ns): 21850752 2025-07-13 23:45:29,280 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747956_7132, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-13 23:45:30,399 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747956_7132 replica FinalizedReplica, blk_1073747956_7132, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747956 for deletion 2025-07-13 23:45:30,400 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747956_7132 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747956 2025-07-13 23:46:29,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747957_7133 src: /192.168.158.7:54218 dest: /192.168.158.4:9866 2025-07-13 23:46:29,289 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54218, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-178176981_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747957_7133, duration(ns): 19138070 2025-07-13 23:46:29,289 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747957_7133, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-13 23:46:30,404 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747957_7133 replica FinalizedReplica, blk_1073747957_7133, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747957 for deletion 2025-07-13 23:46:30,405 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747957_7133 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747957 
2025-07-13 23:47:29,280 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747958_7134 src: /192.168.158.8:60658 dest: /192.168.158.4:9866 2025-07-13 23:47:29,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60658, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_861980525_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747958_7134, duration(ns): 20231275 2025-07-13 23:47:29,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747958_7134, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-13 23:47:33,406 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747958_7134 replica FinalizedReplica, blk_1073747958_7134, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747958 for deletion 2025-07-13 23:47:33,407 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747958_7134 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747958 2025-07-13 23:48:29,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747959_7135 src: /192.168.158.9:59430 dest: /192.168.158.4:9866 2025-07-13 23:48:29,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59430, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1962405767_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747959_7135, duration(ns): 15175080 2025-07-13 23:48:29,323 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747959_7135, type=LAST_IN_PIPELINE terminating 2025-07-13 23:48:30,406 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747959_7135 replica FinalizedReplica, blk_1073747959_7135, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747959 for deletion 2025-07-13 23:48:30,408 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747959_7135 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747959 2025-07-13 23:51:34,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747962_7138 src: /192.168.158.1:60516 dest: /192.168.158.4:9866 2025-07-13 23:51:34,295 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60516, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-801681513_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747962_7138, duration(ns): 22569388 2025-07-13 23:51:34,295 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747962_7138, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-13 23:51:36,414 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747962_7138 replica FinalizedReplica, blk_1073747962_7138, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747962 for deletion
2025-07-13 23:51:36,415 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747962_7138 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747962
2025-07-13 23:52:34,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747963_7139 src: /192.168.158.1:43368 dest: /192.168.158.4:9866
2025-07-13 23:52:34,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43368, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-152020094_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747963_7139, duration(ns): 25588593
2025-07-13 23:52:34,304 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747963_7139, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-13 23:52:36,415 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747963_7139 replica FinalizedReplica, blk_1073747963_7139, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747963 for deletion
2025-07-13 23:52:36,417 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747963_7139 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747963
2025-07-13 23:53:34,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747964_7140 src: /192.168.158.9:55342 dest: /192.168.158.4:9866
2025-07-13 23:53:34,301 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55342, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_153061738_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747964_7140, duration(ns): 16158873
2025-07-13 23:53:34,301 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747964_7140, type=LAST_IN_PIPELINE terminating
2025-07-13 23:53:36,416 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747964_7140 replica FinalizedReplica, blk_1073747964_7140, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747964 for deletion
2025-07-13 23:53:36,417 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747964_7140 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747964
2025-07-13 23:54:34,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747965_7141 src: /192.168.158.9:38120 dest: /192.168.158.4:9866
2025-07-13 23:54:34,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38120, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-263842900_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747965_7141, duration(ns): 19494809
2025-07-13 23:54:34,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747965_7141, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-13 23:54:36,418 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747965_7141 replica FinalizedReplica, blk_1073747965_7141, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747965 for deletion
2025-07-13 23:54:36,419 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747965_7141 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747965
2025-07-13 23:55:34,288 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747966_7142 src: /192.168.158.8:56968 dest: /192.168.158.4:9866
2025-07-13 23:55:34,312 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56968, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-93342448_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747966_7142, duration(ns): 18739947
2025-07-13 23:55:34,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747966_7142, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-13 23:55:39,420 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747966_7142 replica FinalizedReplica, blk_1073747966_7142, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747966 for deletion
2025-07-13 23:55:39,421 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747966_7142 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747966
2025-07-13 23:56:34,280 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747967_7143 src: /192.168.158.9:35026 dest: /192.168.158.4:9866
2025-07-13 23:56:34,304 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35026, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1631521898_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747967_7143, duration(ns): 18312000
2025-07-13 23:56:34,304 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747967_7143, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-13 23:56:36,423 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747967_7143 replica FinalizedReplica, blk_1073747967_7143, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747967 for deletion
2025-07-13 23:56:36,424 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747967_7143 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073747967
2025-07-13 23:57:34,304 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747968_7144 src: /192.168.158.9:44386 dest: /192.168.158.4:9866
2025-07-13 23:57:34,323 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44386, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_358606485_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747968_7144, duration(ns): 16150916
2025-07-13 23:57:34,323 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747968_7144, type=LAST_IN_PIPELINE terminating
2025-07-13 23:57:39,423 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747968_7144 replica FinalizedReplica, blk_1073747968_7144, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073747968 for deletion
2025-07-13 23:57:39,424 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747968_7144 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073747968
2025-07-13 23:58:34,280 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747969_7145 src: /192.168.158.1:36518 dest: /192.168.158.4:9866
2025-07-13 23:58:34,314 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36518, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1451584905_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747969_7145, duration(ns): 25866374
2025-07-13 23:58:34,314 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747969_7145, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-13 23:58:36,430 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747969_7145 replica FinalizedReplica, blk_1073747969_7145, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073747969 for deletion
2025-07-13 23:58:36,431 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747969_7145 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073747969
2025-07-14 00:02:44,297 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747973_7149 src: /192.168.158.1:57772 dest: /192.168.158.4:9866
2025-07-14 00:02:44,328 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57772, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-204246657_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747973_7149, duration(ns): 23116420
2025-07-14 00:02:44,328 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747973_7149, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-14 00:02:48,441 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747973_7149 replica FinalizedReplica, blk_1073747973_7149, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073747973 for deletion
2025-07-14 00:02:48,442 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747973_7149 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073747973
2025-07-14 00:04:44,300 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747975_7151 src: /192.168.158.6:49480 dest: /192.168.158.4:9866
2025-07-14 00:04:44,326 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49480, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_666130638_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747975_7151, duration(ns): 20119341
2025-07-14 00:04:44,326 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747975_7151, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 00:04:45,449 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747975_7151 replica FinalizedReplica, blk_1073747975_7151, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073747975 for deletion
2025-07-14 00:04:45,450 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747975_7151 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073747975
2025-07-14 00:07:44,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747978_7154 src: /192.168.158.6:49058 dest: /192.168.158.4:9866
2025-07-14 00:07:44,324 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49058, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-46047735_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747978_7154, duration(ns): 16219803
2025-07-14 00:07:44,325 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747978_7154, type=LAST_IN_PIPELINE terminating
2025-07-14 00:07:45,455 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747978_7154 replica FinalizedReplica, blk_1073747978_7154, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073747978 for deletion
2025-07-14 00:07:45,457 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747978_7154 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073747978
2025-07-14 00:09:44,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747980_7156 src: /192.168.158.1:33814 dest: /192.168.158.4:9866
2025-07-14 00:09:44,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33814, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1820586998_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747980_7156, duration(ns): 24979221
2025-07-14 00:09:44,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747980_7156, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-14 00:09:48,463 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747980_7156 replica FinalizedReplica, blk_1073747980_7156, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073747980 for deletion
2025-07-14 00:09:48,464 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747980_7156 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073747980
2025-07-14 00:10:49,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747981_7157 src: /192.168.158.6:43580 dest: /192.168.158.4:9866
2025-07-14 00:10:49,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43580, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1910684856_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747981_7157, duration(ns): 19090734
2025-07-14 00:10:49,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747981_7157, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 00:10:51,464 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747981_7157 replica FinalizedReplica, blk_1073747981_7157, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073747981 for deletion
2025-07-14 00:10:51,465 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747981_7157 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073747981
2025-07-14 00:17:49,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747988_7164 src: /192.168.158.1:39340 dest: /192.168.158.4:9866
2025-07-14 00:17:49,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39340, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_221623802_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747988_7164, duration(ns): 24016408
2025-07-14 00:17:49,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747988_7164, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-14 00:17:54,482 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747988_7164 replica FinalizedReplica, blk_1073747988_7164, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073747988 for deletion
2025-07-14 00:17:54,483 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747988_7164 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073747988
2025-07-14 00:18:49,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747989_7165 src: /192.168.158.1:45198 dest: /192.168.158.4:9866
2025-07-14 00:18:49,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45198, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1172611958_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747989_7165, duration(ns): 24773635
2025-07-14 00:18:49,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747989_7165, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-14 00:18:51,487 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747989_7165 replica FinalizedReplica, blk_1073747989_7165, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073747989 for deletion
2025-07-14 00:18:51,488 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747989_7165 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073747989
2025-07-14 00:19:54,319 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747990_7166 src: /192.168.158.8:49446 dest: /192.168.158.4:9866
2025-07-14 00:19:54,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49446, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-906477099_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747990_7166, duration(ns): 19485155
2025-07-14 00:19:54,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747990_7166, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 00:19:57,487 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747990_7166 replica FinalizedReplica, blk_1073747990_7166, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073747990 for deletion
2025-07-14 00:19:57,489 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747990_7166 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073747990
2025-07-14 00:25:54,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747996_7172 src: /192.168.158.5:45688 dest: /192.168.158.4:9866
2025-07-14 00:25:54,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45688, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1870297610_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747996_7172, duration(ns): 19100418
2025-07-14 00:25:54,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747996_7172, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-14 00:25:57,495 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747996_7172 replica FinalizedReplica, blk_1073747996_7172, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073747996 for deletion
2025-07-14 00:25:57,496 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747996_7172 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073747996
2025-07-14 00:26:54,337 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747997_7173 src: /192.168.158.1:60918 dest: /192.168.158.4:9866
2025-07-14 00:26:54,367 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60918, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-271081400_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747997_7173, duration(ns): 20413229
2025-07-14 00:26:54,367 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747997_7173, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-14 00:26:57,495 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747997_7173 replica FinalizedReplica, blk_1073747997_7173, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073747997 for deletion
2025-07-14 00:26:57,496 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747997_7173 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073747997
2025-07-14 00:27:54,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073747998_7174 src: /192.168.158.7:57092 dest: /192.168.158.4:9866
2025-07-14 00:27:54,353 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57092, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1229774170_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073747998_7174, duration(ns): 14949094
2025-07-14 00:27:54,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073747998_7174, type=LAST_IN_PIPELINE terminating
2025-07-14 00:28:00,496 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073747998_7174 replica FinalizedReplica, blk_1073747998_7174, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073747998 for deletion
2025-07-14 00:28:00,497 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073747998_7174 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073747998
2025-07-14 00:30:54,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748001_7177 src: /192.168.158.8:49916 dest: /192.168.158.4:9866
2025-07-14 00:30:54,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49916, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1664680367_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748001_7177, duration(ns): 15592242
2025-07-14 00:30:54,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748001_7177, type=LAST_IN_PIPELINE terminating
2025-07-14 00:30:57,498 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748001_7177 replica FinalizedReplica, blk_1073748001_7177, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748001 for deletion
2025-07-14 00:30:57,499 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748001_7177 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748001
2025-07-14 00:31:54,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748002_7178 src: /192.168.158.8:39138 dest: /192.168.158.4:9866
2025-07-14 00:31:54,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39138, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-316475406_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748002_7178, duration(ns): 16700618
2025-07-14 00:31:54,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748002_7178, type=LAST_IN_PIPELINE terminating
2025-07-14 00:32:00,498 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748002_7178 replica FinalizedReplica, blk_1073748002_7178, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748002 for deletion
2025-07-14 00:32:00,499 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748002_7178 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748002
2025-07-14 00:32:54,340 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748003_7179 src: /192.168.158.6:37758 dest: /192.168.158.4:9866
2025-07-14 00:32:54,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37758, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_729732659_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748003_7179, duration(ns): 16056359
2025-07-14 00:32:54,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748003_7179, type=LAST_IN_PIPELINE terminating
2025-07-14 00:33:00,499 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748003_7179 replica FinalizedReplica, blk_1073748003_7179, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748003 for deletion
2025-07-14 00:33:00,501 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748003_7179 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748003
2025-07-14 00:35:04,335 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748005_7181 src: /192.168.158.9:50444 dest: /192.168.158.4:9866
2025-07-14 00:35:04,361 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50444, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-476225305_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748005_7181, duration(ns): 20949449
2025-07-14 00:35:04,362 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748005_7181, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-14 00:35:09,504 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748005_7181 replica FinalizedReplica, blk_1073748005_7181, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748005 for deletion
2025-07-14 00:35:09,505 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748005_7181 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748005
2025-07-14 00:36:04,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748006_7182 src: /192.168.158.5:32920 dest: /192.168.158.4:9866
2025-07-14 00:36:04,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:32920, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1336732522_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748006_7182, duration(ns): 17624853
2025-07-14 00:36:04,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748006_7182, type=LAST_IN_PIPELINE terminating
2025-07-14 00:36:06,503 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748006_7182 replica FinalizedReplica, blk_1073748006_7182, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748006 for deletion
2025-07-14 00:36:06,504 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748006_7182 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748006
2025-07-14 00:37:04,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748007_7183 src: /192.168.158.7:46078 dest: /192.168.158.4:9866
2025-07-14 00:37:04,390 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46078, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1231939824_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748007_7183, duration(ns): 20160152
2025-07-14 00:37:04,390 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748007_7183, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 00:37:09,506 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748007_7183 replica FinalizedReplica, blk_1073748007_7183, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748007 for deletion
2025-07-14 00:37:09,508 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748007_7183 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748007
2025-07-14 00:40:04,348 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748010_7186 src: /192.168.158.1:55240 dest: /192.168.158.4:9866
2025-07-14 00:40:04,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55240, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_788423249_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748010_7186, duration(ns): 24662899
2025-07-14 00:40:04,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748010_7186, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-14 00:40:09,512 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748010_7186 replica FinalizedReplica, blk_1073748010_7186, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748010 for deletion
2025-07-14 00:40:09,513 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748010_7186 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748010
2025-07-14 00:41:04,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748011_7187
src: /192.168.158.1:34968 dest: /192.168.158.4:9866 2025-07-14 00:41:04,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34968, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1064686638_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748011_7187, duration(ns): 21897923 2025-07-14 00:41:04,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748011_7187, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-14 00:41:09,513 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748011_7187 replica FinalizedReplica, blk_1073748011_7187, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748011 for deletion 2025-07-14 00:41:09,515 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748011_7187 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748011 2025-07-14 00:45:04,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748015_7191 src: /192.168.158.6:48192 dest: /192.168.158.4:9866 2025-07-14 00:45:04,389 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48192, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-842003980_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748015_7191, duration(ns): 15819127 
2025-07-14 00:45:04,390 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748015_7191, type=LAST_IN_PIPELINE terminating 2025-07-14 00:45:06,524 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748015_7191 replica FinalizedReplica, blk_1073748015_7191, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748015 for deletion 2025-07-14 00:45:06,526 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748015_7191 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748015 2025-07-14 00:47:04,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748017_7193 src: /192.168.158.8:56782 dest: /192.168.158.4:9866 2025-07-14 00:47:04,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56782, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2053732591_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748017_7193, duration(ns): 19666406 2025-07-14 00:47:04,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748017_7193, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-14 00:47:09,533 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748017_7193 replica FinalizedReplica, blk_1073748017_7193, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748017 for deletion 2025-07-14 00:47:09,534 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748017_7193 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748017 2025-07-14 00:48:04,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748018_7194 src: /192.168.158.5:47850 dest: /192.168.158.4:9866 2025-07-14 00:48:04,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47850, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_442011014_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748018_7194, duration(ns): 19571888 2025-07-14 00:48:04,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748018_7194, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-14 00:48:06,534 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748018_7194 replica FinalizedReplica, blk_1073748018_7194, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748018 for deletion 2025-07-14 00:48:06,535 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748018_7194 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748018 2025-07-14 00:49:09,357 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748019_7195 src: /192.168.158.5:52540 dest: /192.168.158.4:9866 2025-07-14 00:49:09,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52540, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-226624951_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748019_7195, duration(ns): 22205755 2025-07-14 00:49:09,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748019_7195, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-14 00:49:15,534 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748019_7195 replica FinalizedReplica, blk_1073748019_7195, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748019 for deletion 2025-07-14 00:49:15,535 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748019_7195 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748019 2025-07-14 00:50:09,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748020_7196 src: /192.168.158.6:41780 dest: /192.168.158.4:9866 2025-07-14 00:50:09,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41780, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-134640279_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748020_7196, duration(ns): 15330141 2025-07-14 00:50:09,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748020_7196, type=LAST_IN_PIPELINE terminating 2025-07-14 00:50:12,534 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748020_7196 replica FinalizedReplica, blk_1073748020_7196, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748020 for deletion 2025-07-14 00:50:12,535 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748020_7196 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748020 2025-07-14 00:51:09,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748021_7197 src: /192.168.158.9:47790 dest: /192.168.158.4:9866 2025-07-14 00:51:09,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47790, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-311517201_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748021_7197, duration(ns): 19603584 2025-07-14 00:51:09,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748021_7197, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 
2025-07-14 00:51:12,537 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748021_7197 replica FinalizedReplica, blk_1073748021_7197, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748021 for deletion 2025-07-14 00:51:12,538 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748021_7197 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748021 2025-07-14 00:55:14,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748025_7201 src: /192.168.158.1:34428 dest: /192.168.158.4:9866 2025-07-14 00:55:14,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34428, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_99678189_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748025_7201, duration(ns): 22217760 2025-07-14 00:55:14,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748025_7201, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-14 00:55:15,543 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748025_7201 replica FinalizedReplica, blk_1073748025_7201, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748025 for deletion 2025-07-14 00:55:15,544 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748025_7201 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748025 2025-07-14 00:56:19,371 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748026_7202 src: /192.168.158.9:32954 dest: /192.168.158.4:9866 2025-07-14 00:56:19,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:32954, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1316342700_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748026_7202, duration(ns): 20801669 2025-07-14 00:56:19,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748026_7202, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-14 00:56:24,546 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748026_7202 replica FinalizedReplica, blk_1073748026_7202, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748026 for deletion 2025-07-14 00:56:24,547 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748026_7202 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748026 
2025-07-14 00:57:24,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748027_7203 src: /192.168.158.9:60636 dest: /192.168.158.4:9866 2025-07-14 00:57:24,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60636, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_727386276_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748027_7203, duration(ns): 15330466 2025-07-14 00:57:24,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748027_7203, type=LAST_IN_PIPELINE terminating 2025-07-14 00:57:27,548 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748027_7203 replica FinalizedReplica, blk_1073748027_7203, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748027 for deletion 2025-07-14 00:57:27,549 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748027_7203 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748027 2025-07-14 00:58:24,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748028_7204 src: /192.168.158.1:45222 dest: /192.168.158.4:9866 2025-07-14 00:58:24,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45222, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-595734460_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073748028_7204, duration(ns): 23550970 2025-07-14 00:58:24,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748028_7204, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-14 00:58:27,548 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748028_7204 replica FinalizedReplica, blk_1073748028_7204, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748028 for deletion 2025-07-14 00:58:27,549 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748028_7204 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748028 2025-07-14 01:04:29,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748034_7210 src: /192.168.158.7:37200 dest: /192.168.158.4:9866 2025-07-14 01:04:29,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37200, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1496760294_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748034_7210, duration(ns): 17504908 2025-07-14 01:04:29,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748034_7210, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-14 01:04:30,557 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748034_7210 replica FinalizedReplica, blk_1073748034_7210, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748034 for deletion 2025-07-14 01:04:30,559 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748034_7210 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748034 2025-07-14 01:05:29,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748035_7211 src: /192.168.158.1:43904 dest: /192.168.158.4:9866 2025-07-14 01:05:29,425 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43904, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-378208549_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748035_7211, duration(ns): 20565734 2025-07-14 01:05:29,425 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748035_7211, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-14 01:05:30,561 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748035_7211 replica FinalizedReplica, blk_1073748035_7211, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748035 for deletion 2025-07-14 
01:05:30,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748035_7211 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748035 2025-07-14 01:09:39,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748039_7215 src: /192.168.158.8:60604 dest: /192.168.158.4:9866 2025-07-14 01:09:39,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60604, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-310234995_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748039_7215, duration(ns): 18472416 2025-07-14 01:09:39,407 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748039_7215, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-14 01:09:42,574 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748039_7215 replica FinalizedReplica, blk_1073748039_7215, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748039 for deletion 2025-07-14 01:09:42,575 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748039_7215 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748039 2025-07-14 01:10:44,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748040_7216 
src: /192.168.158.5:52846 dest: /192.168.158.4:9866 2025-07-14 01:10:44,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52846, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2024025590_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748040_7216, duration(ns): 16180081 2025-07-14 01:10:44,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748040_7216, type=LAST_IN_PIPELINE terminating 2025-07-14 01:10:45,575 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748040_7216 replica FinalizedReplica, blk_1073748040_7216, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748040 for deletion 2025-07-14 01:10:45,576 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748040_7216 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748040 2025-07-14 01:11:44,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748041_7217 src: /192.168.158.7:53844 dest: /192.168.158.4:9866 2025-07-14 01:11:44,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53844, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2113778420_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748041_7217, duration(ns): 17583209 2025-07-14 01:11:44,414 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748041_7217, type=LAST_IN_PIPELINE terminating 2025-07-14 01:11:45,577 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748041_7217 replica FinalizedReplica, blk_1073748041_7217, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748041 for deletion 2025-07-14 01:11:45,578 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748041_7217 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748041 2025-07-14 01:13:44,391 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748043_7219 src: /192.168.158.6:35176 dest: /192.168.158.4:9866 2025-07-14 01:13:44,408 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35176, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_420592156_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748043_7219, duration(ns): 15129293 2025-07-14 01:13:44,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748043_7219, type=LAST_IN_PIPELINE terminating 2025-07-14 01:13:45,578 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748043_7219 replica FinalizedReplica, blk_1073748043_7219, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748043 for deletion 2025-07-14 01:13:45,579 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748043_7219 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748043 2025-07-14 01:15:44,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748045_7221 src: /192.168.158.7:35522 dest: /192.168.158.4:9866 2025-07-14 01:15:44,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35522, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-601205505_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748045_7221, duration(ns): 15748127 2025-07-14 01:15:44,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748045_7221, type=LAST_IN_PIPELINE terminating 2025-07-14 01:15:45,582 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748045_7221 replica FinalizedReplica, blk_1073748045_7221, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748045 for deletion 2025-07-14 01:15:45,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748045_7221 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748045 2025-07-14 01:17:49,399 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748047_7223 src: /192.168.158.6:53634 dest: /192.168.158.4:9866
2025-07-14 01:17:49,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53634, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1687458132_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748047_7223, duration(ns): 17221189
2025-07-14 01:17:49,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748047_7223, type=LAST_IN_PIPELINE terminating
2025-07-14 01:17:51,587 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748047_7223 replica FinalizedReplica, blk_1073748047_7223, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748047 for deletion
2025-07-14 01:17:51,588 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748047_7223 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748047
2025-07-14 01:22:54,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748052_7228 src: /192.168.158.1:60236 dest: /192.168.158.4:9866
2025-07-14 01:22:54,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60236, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1565805464_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748052_7228, duration(ns): 25853621
2025-07-14 01:22:54,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748052_7228, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-14 01:22:57,599 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748052_7228 replica FinalizedReplica, blk_1073748052_7228, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748052 for deletion
2025-07-14 01:22:57,599 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748052_7228 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748052
2025-07-14 01:23:59,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748053_7229 src: /192.168.158.9:41226 dest: /192.168.158.4:9866
2025-07-14 01:23:59,428 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41226, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_402906440_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748053_7229, duration(ns): 15620399
2025-07-14 01:23:59,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748053_7229, type=LAST_IN_PIPELINE terminating
2025-07-14 01:24:03,603 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748053_7229 replica FinalizedReplica, blk_1073748053_7229, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748053 for deletion
2025-07-14 01:24:03,604 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748053_7229 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748053
2025-07-14 01:27:09,408 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748056_7232 src: /192.168.158.1:59296 dest: /192.168.158.4:9866
2025-07-14 01:27:09,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59296, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_305985520_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748056_7232, duration(ns): 26286608
2025-07-14 01:27:09,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748056_7232, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-14 01:27:12,610 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748056_7232 replica FinalizedReplica, blk_1073748056_7232, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748056 for deletion
2025-07-14 01:27:12,611 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748056_7232 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748056
2025-07-14 01:28:09,421 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748057_7233 src: /192.168.158.5:56930 dest: /192.168.158.4:9866
2025-07-14 01:28:09,440 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56930, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1534709559_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748057_7233, duration(ns): 16319433
2025-07-14 01:28:09,440 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748057_7233, type=LAST_IN_PIPELINE terminating
2025-07-14 01:28:12,610 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748057_7233 replica FinalizedReplica, blk_1073748057_7233, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748057 for deletion
2025-07-14 01:28:12,612 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748057_7233 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748057
2025-07-14 01:30:09,423 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748059_7235 src: /192.168.158.8:58836 dest: /192.168.158.4:9866
2025-07-14 01:30:09,447 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58836, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-435388823_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748059_7235, duration(ns): 18529776
2025-07-14 01:30:09,447 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748059_7235, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-14 01:30:12,615 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748059_7235 replica FinalizedReplica, blk_1073748059_7235, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748059 for deletion
2025-07-14 01:30:12,616 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748059_7235 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748059
2025-07-14 01:31:09,420 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748060_7236 src: /192.168.158.7:44282 dest: /192.168.158.4:9866
2025-07-14 01:31:09,448 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44282, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_472559196_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748060_7236, duration(ns): 21266414
2025-07-14 01:31:09,448 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748060_7236, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-14 01:31:12,617 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748060_7236 replica FinalizedReplica, blk_1073748060_7236, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748060 for deletion
2025-07-14 01:31:12,620 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748060_7236 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748060
2025-07-14 01:33:09,426 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748062_7238 src: /192.168.158.1:45290 dest: /192.168.158.4:9866
2025-07-14 01:33:09,458 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45290, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1568526489_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748062_7238, duration(ns): 24127242
2025-07-14 01:33:09,459 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748062_7238, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-14 01:33:15,622 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748062_7238 replica FinalizedReplica, blk_1073748062_7238, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748062 for deletion
2025-07-14 01:33:15,623 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748062_7238 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748062
2025-07-14 01:38:14,440 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748067_7243 src: /192.168.158.1:35320 dest: /192.168.158.4:9866
2025-07-14 01:38:14,469 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35320, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2089169064_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748067_7243, duration(ns): 20567392
2025-07-14 01:38:14,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748067_7243, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-14 01:38:15,635 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748067_7243 replica FinalizedReplica, blk_1073748067_7243, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748067 for deletion
2025-07-14 01:38:15,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748067_7243 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748067
2025-07-14 01:39:14,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748068_7244 src: /192.168.158.8:59064 dest: /192.168.158.4:9866
2025-07-14 01:39:14,461 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59064, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1144982197_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748068_7244, duration(ns): 15820545
2025-07-14 01:39:14,461 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748068_7244, type=LAST_IN_PIPELINE terminating
2025-07-14 01:39:15,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748068_7244 replica FinalizedReplica, blk_1073748068_7244, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748068 for deletion
2025-07-14 01:39:15,638 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748068_7244 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748068
2025-07-14 01:40:19,431 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748069_7245 src: /192.168.158.6:34682 dest: /192.168.158.4:9866
2025-07-14 01:40:19,456 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34682, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2033321900_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748069_7245, duration(ns): 18831658
2025-07-14 01:40:19,456 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748069_7245, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 01:40:21,639 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748069_7245 replica FinalizedReplica, blk_1073748069_7245, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748069 for deletion
2025-07-14 01:40:21,640 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748069_7245 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748069
2025-07-14 01:41:19,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748070_7246 src: /192.168.158.9:40730 dest: /192.168.158.4:9866
2025-07-14 01:41:19,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40730, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_661120402_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748070_7246, duration(ns): 19715408
2025-07-14 01:41:19,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748070_7246, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-14 01:41:21,640 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748070_7246 replica FinalizedReplica, blk_1073748070_7246, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748070 for deletion
2025-07-14 01:41:21,642 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748070_7246 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748070
2025-07-14 01:42:19,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748071_7247 src: /192.168.158.1:56632 dest: /192.168.158.4:9866
2025-07-14 01:42:19,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56632, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1803957558_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748071_7247, duration(ns): 22968466
2025-07-14 01:42:19,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748071_7247, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-14 01:42:21,643 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748071_7247 replica FinalizedReplica, blk_1073748071_7247, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748071 for deletion
2025-07-14 01:42:21,644 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748071_7247 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748071
2025-07-14 01:43:24,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748072_7248 src: /192.168.158.8:50796 dest: /192.168.158.4:9866
2025-07-14 01:43:24,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50796, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2027996367_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748072_7248, duration(ns): 21079230
2025-07-14 01:43:24,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748072_7248, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 01:43:30,646 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748072_7248 replica FinalizedReplica, blk_1073748072_7248, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748072 for deletion
2025-07-14 01:43:30,648 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748072_7248 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748072
2025-07-14 01:45:24,440 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748074_7250 src: /192.168.158.6:60146 dest: /192.168.158.4:9866
2025-07-14 01:45:24,463 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60146, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2130027994_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748074_7250, duration(ns): 17670519
2025-07-14 01:45:24,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748074_7250, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-14 01:45:30,650 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748074_7250 replica FinalizedReplica, blk_1073748074_7250, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748074 for deletion
2025-07-14 01:45:30,651 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748074_7250 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748074
2025-07-14 01:49:24,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748078_7254 src: /192.168.158.6:52256 dest: /192.168.158.4:9866
2025-07-14 01:49:24,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52256, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2021445481_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748078_7254, duration(ns): 13780119
2025-07-14 01:49:24,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748078_7254, type=LAST_IN_PIPELINE terminating
2025-07-14 01:49:30,661 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748078_7254 replica FinalizedReplica, blk_1073748078_7254, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748078 for deletion
2025-07-14 01:49:30,662 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748078_7254 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748078
2025-07-14 01:52:29,459 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748081_7257 src: /192.168.158.7:60642 dest: /192.168.158.4:9866
2025-07-14 01:52:29,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60642, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1263955625_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748081_7257, duration(ns): 18546533
2025-07-14 01:52:29,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748081_7257, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-14 01:52:33,667 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748081_7257 replica FinalizedReplica, blk_1073748081_7257, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748081 for deletion
2025-07-14 01:52:33,668 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748081_7257 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748081
2025-07-14 01:53:29,456 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748082_7258 src: /192.168.158.9:47816 dest: /192.168.158.4:9866
2025-07-14 01:53:29,475 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47816, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-441158142_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748082_7258, duration(ns): 16379146
2025-07-14 01:53:29,475 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748082_7258, type=LAST_IN_PIPELINE terminating
2025-07-14 01:53:30,670 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748082_7258 replica FinalizedReplica, blk_1073748082_7258, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748082 for deletion
2025-07-14 01:53:30,671 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748082_7258 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748082
2025-07-14 01:56:34,453 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748085_7261 src: /192.168.158.1:51972 dest: /192.168.158.4:9866
2025-07-14 01:56:34,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51972, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1337562916_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748085_7261, duration(ns): 21334863
2025-07-14 01:56:34,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748085_7261, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-14 01:56:36,679 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748085_7261 replica FinalizedReplica, blk_1073748085_7261, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748085 for deletion
2025-07-14 01:56:36,680 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748085_7261 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748085
2025-07-14 01:58:39,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748087_7263 src: /192.168.158.8:35446 dest: /192.168.158.4:9866
2025-07-14 01:58:39,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35446, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-510083584_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748087_7263, duration(ns): 17467714
2025-07-14 01:58:39,487 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748087_7263, type=LAST_IN_PIPELINE terminating
2025-07-14 01:58:45,682 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748087_7263 replica FinalizedReplica, blk_1073748087_7263, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748087 for deletion
2025-07-14 01:58:45,683 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748087_7263 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748087
2025-07-14 02:00:39,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748089_7265 src: /192.168.158.9:34088 dest: /192.168.158.4:9866
2025-07-14 02:00:39,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34088, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_273770328_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748089_7265, duration(ns): 15613316
2025-07-14 02:00:39,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748089_7265, type=LAST_IN_PIPELINE terminating
2025-07-14 02:00:45,684 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748089_7265 replica FinalizedReplica, blk_1073748089_7265, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748089 for deletion
2025-07-14 02:00:45,686 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748089_7265 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748089
2025-07-14 02:01:39,468 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748090_7266 src: /192.168.158.1:51928 dest: /192.168.158.4:9866
2025-07-14 02:01:39,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51928, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1043930963_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748090_7266, duration(ns): 25558622
2025-07-14 02:01:39,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748090_7266, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-14 02:01:42,691 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748090_7266 replica FinalizedReplica, blk_1073748090_7266, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748090 for deletion
2025-07-14 02:01:42,692 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748090_7266 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748090
2025-07-14 02:05:44,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748094_7270 src: /192.168.158.6:45130 dest: /192.168.158.4:9866
2025-07-14 02:05:44,501 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45130, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-943227345_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748094_7270, duration(ns): 15933548
2025-07-14 02:05:44,501 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748094_7270, type=LAST_IN_PIPELINE terminating
2025-07-14 02:05:45,700 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748094_7270 replica FinalizedReplica, blk_1073748094_7270, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748094 for deletion
2025-07-14 02:05:45,701 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748094_7270 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748094
2025-07-14 02:08:44,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748097_7273 src: /192.168.158.6:47904 dest: /192.168.158.4:9866
2025-07-14 02:08:44,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47904, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_415896355_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748097_7273, duration(ns): 19046531
2025-07-14 02:08:44,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748097_7273, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 02:08:45,707 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748097_7273 replica FinalizedReplica, blk_1073748097_7273, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748097 for deletion
2025-07-14 02:08:45,708 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748097_7273 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748097
2025-07-14 02:09:49,481 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748098_7274 src: /192.168.158.1:52348 dest: /192.168.158.4:9866
2025-07-14 02:09:49,513 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52348, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1396839606_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748098_7274, duration(ns): 22802203
2025-07-14 02:09:49,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748098_7274, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-14 02:09:51,709 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748098_7274 replica FinalizedReplica, blk_1073748098_7274, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748098 for deletion
2025-07-14 02:09:51,710 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748098_7274 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748098
2025-07-14 02:10:49,490 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748099_7275 src: /192.168.158.6:49174 dest: /192.168.158.4:9866
2025-07-14 02:10:49,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49174, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1742219290_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748099_7275, duration(ns): 18552821
2025-07-14 02:10:49,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748099_7275, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 02:10:51,712 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748099_7275 replica FinalizedReplica, blk_1073748099_7275, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748099 for deletion
2025-07-14 02:10:51,713 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748099_7275 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748099
2025-07-14 02:14:59,490 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748103_7279 src: /192.168.158.1:35430 dest: /192.168.158.4:9866
2025-07-14 02:14:59,522 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35430, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1214690894_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748103_7279, duration(ns): 23010076
2025-07-14 02:14:59,523 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748103_7279, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-14 02:15:03,718 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748103_7279 replica FinalizedReplica, blk_1073748103_7279, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748103 for deletion
2025-07-14
02:15:03,719 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748103_7279 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748103 2025-07-14 02:17:04,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748105_7281 src: /192.168.158.9:54262 dest: /192.168.158.4:9866 2025-07-14 02:17:04,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54262, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1504139508_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748105_7281, duration(ns): 19470405 2025-07-14 02:17:04,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748105_7281, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-14 02:17:09,721 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748105_7281 replica FinalizedReplica, blk_1073748105_7281, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748105 for deletion 2025-07-14 02:17:09,722 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748105_7281 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748105 2025-07-14 02:18:04,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748106_7282 
src: /192.168.158.8:34236 dest: /192.168.158.4:9866 2025-07-14 02:18:04,523 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34236, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1749378036_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748106_7282, duration(ns): 20268621 2025-07-14 02:18:04,523 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748106_7282, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-14 02:18:06,722 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748106_7282 replica FinalizedReplica, blk_1073748106_7282, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748106 for deletion 2025-07-14 02:18:06,724 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748106_7282 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748106 2025-07-14 02:19:04,499 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748107_7283 src: /192.168.158.5:59498 dest: /192.168.158.4:9866 2025-07-14 02:19:04,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59498, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2087966354_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748107_7283, duration(ns): 20821163 2025-07-14 02:19:04,526 
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748107_7283, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-14 02:19:06,725 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748107_7283 replica FinalizedReplica, blk_1073748107_7283, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748107 for deletion 2025-07-14 02:19:06,727 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748107_7283 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748107 2025-07-14 02:20:04,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748108_7284 src: /192.168.158.1:54472 dest: /192.168.158.4:9866 2025-07-14 02:20:04,531 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1684322578_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748108_7284, duration(ns): 25560093 2025-07-14 02:20:04,531 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748108_7284, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-14 02:20:09,728 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748108_7284 replica FinalizedReplica, blk_1073748108_7284, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748108 for deletion 2025-07-14 02:20:09,729 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748108_7284 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748108 2025-07-14 02:22:04,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748110_7286 src: /192.168.158.7:44952 dest: /192.168.158.4:9866 2025-07-14 02:22:04,534 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44952, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1976160755_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748110_7286, duration(ns): 16020657 2025-07-14 02:22:04,534 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748110_7286, type=LAST_IN_PIPELINE terminating 2025-07-14 02:22:06,730 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748110_7286 replica FinalizedReplica, blk_1073748110_7286, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748110 for deletion 2025-07-14 02:22:06,731 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748110_7286 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748110 2025-07-14 02:23:04,513 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748111_7287 src: /192.168.158.9:37184 dest: /192.168.158.4:9866 2025-07-14 02:23:04,530 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37184, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_720917792_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748111_7287, duration(ns): 14583415 2025-07-14 02:23:04,530 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748111_7287, type=LAST_IN_PIPELINE terminating 2025-07-14 02:23:06,730 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748111_7287 replica FinalizedReplica, blk_1073748111_7287, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748111 for deletion 2025-07-14 02:23:06,731 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748111_7287 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748111 2025-07-14 02:25:04,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748113_7289 src: /192.168.158.5:40478 dest: /192.168.158.4:9866 2025-07-14 02:25:04,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40478, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1402518312_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748113_7289, duration(ns): 15784954 2025-07-14 02:25:04,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748113_7289, type=LAST_IN_PIPELINE terminating 2025-07-14 02:25:06,731 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748113_7289 replica FinalizedReplica, blk_1073748113_7289, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748113 for deletion 2025-07-14 02:25:06,732 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748113_7289 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748113 2025-07-14 02:26:09,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748114_7290 src: /192.168.158.6:46764 dest: /192.168.158.4:9866 2025-07-14 02:26:09,530 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46764, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1889902539_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748114_7290, duration(ns): 12723611 2025-07-14 02:26:09,531 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748114_7290, type=LAST_IN_PIPELINE terminating 2025-07-14 02:26:15,731 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748114_7290 replica FinalizedReplica, blk_1073748114_7290, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748114 for deletion 2025-07-14 02:26:15,732 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748114_7290 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748114 2025-07-14 02:27:09,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748115_7291 src: /192.168.158.1:47342 dest: /192.168.158.4:9866 2025-07-14 02:27:09,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47342, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_263014676_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748115_7291, duration(ns): 21067715 2025-07-14 02:27:09,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748115_7291, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-14 02:27:15,735 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748115_7291 replica FinalizedReplica, blk_1073748115_7291, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748115 for deletion 2025-07-14 
02:27:15,736 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748115_7291 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748115 2025-07-14 02:28:09,522 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748116_7292 src: /192.168.158.8:43484 dest: /192.168.158.4:9866 2025-07-14 02:28:09,540 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43484, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-705030653_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748116_7292, duration(ns): 15729232 2025-07-14 02:28:09,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748116_7292, type=LAST_IN_PIPELINE terminating 2025-07-14 02:28:12,734 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748116_7292 replica FinalizedReplica, blk_1073748116_7292, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748116 for deletion 2025-07-14 02:28:12,735 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748116_7292 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748116 2025-07-14 02:32:14,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748120_7296 src: /192.168.158.9:47776 dest: 
/192.168.158.4:9866 2025-07-14 02:32:14,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47776, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1397328061_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748120_7296, duration(ns): 19610948 2025-07-14 02:32:14,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748120_7296, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-14 02:32:15,740 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748120_7296 replica FinalizedReplica, blk_1073748120_7296, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748120 for deletion 2025-07-14 02:32:15,741 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748120_7296 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748120 2025-07-14 02:34:19,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748122_7298 src: /192.168.158.1:45308 dest: /192.168.158.4:9866 2025-07-14 02:34:19,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45308, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_972058967_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748122_7298, duration(ns): 21386881 2025-07-14 02:34:19,550 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748122_7298, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-14 02:34:21,745 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748122_7298 replica FinalizedReplica, blk_1073748122_7298, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748122 for deletion 2025-07-14 02:34:21,746 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748122_7298 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748122 2025-07-14 02:35:19,530 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748123_7299 src: /192.168.158.1:46106 dest: /192.168.158.4:9866 2025-07-14 02:35:19,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46106, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1222465760_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748123_7299, duration(ns): 23432767 2025-07-14 02:35:19,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748123_7299, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-14 02:35:21,748 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748123_7299 replica FinalizedReplica, blk_1073748123_7299, 
FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748123 for deletion 2025-07-14 02:35:21,749 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748123_7299 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748123 2025-07-14 02:36:24,530 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748124_7300 src: /192.168.158.1:56930 dest: /192.168.158.4:9866 2025-07-14 02:36:24,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56930, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-792340857_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748124_7300, duration(ns): 22706971 2025-07-14 02:36:24,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748124_7300, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-14 02:36:27,751 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748124_7300 replica FinalizedReplica, blk_1073748124_7300, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748124 for deletion 2025-07-14 02:36:27,752 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 
blk_1073748124_7300 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748124 2025-07-14 02:37:29,546 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748125_7301 src: /192.168.158.1:34998 dest: /192.168.158.4:9866 2025-07-14 02:37:29,576 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34998, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-611675476_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748125_7301, duration(ns): 21910506 2025-07-14 02:37:29,577 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748125_7301, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-14 02:37:33,752 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748125_7301 replica FinalizedReplica, blk_1073748125_7301, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748125 for deletion 2025-07-14 02:37:33,753 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748125_7301 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748125 2025-07-14 02:38:34,554 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748126_7302 src: /192.168.158.9:46172 dest: /192.168.158.4:9866 2025-07-14 02:38:34,574 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46172, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1903525644_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748126_7302, duration(ns): 17115932 2025-07-14 02:38:34,574 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748126_7302, type=LAST_IN_PIPELINE terminating 2025-07-14 02:38:39,757 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748126_7302 replica FinalizedReplica, blk_1073748126_7302, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748126 for deletion 2025-07-14 02:38:39,758 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748126_7302 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748126 2025-07-14 02:50:54,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748138_7314 src: /192.168.158.1:56412 dest: /192.168.158.4:9866 2025-07-14 02:50:54,594 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56412, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1644325739_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748138_7314, duration(ns): 21719653 2025-07-14 02:50:54,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073748138_7314, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-14 02:50:57,785 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748138_7314 replica FinalizedReplica, blk_1073748138_7314, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748138 for deletion
2025-07-14 02:50:57,786 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748138_7314 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748138
2025-07-14 02:51:54,558 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748139_7315 src: /192.168.158.5:55024 dest: /192.168.158.4:9866
2025-07-14 02:51:54,586 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55024, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_738891455_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748139_7315, duration(ns): 23160615
2025-07-14 02:51:54,587 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748139_7315, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-14 02:51:57,790 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748139_7315 replica FinalizedReplica, blk_1073748139_7315, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748139 for deletion
2025-07-14 02:51:57,791 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748139_7315 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748139
2025-07-14 02:52:54,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748140_7316 src: /192.168.158.8:47460 dest: /192.168.158.4:9866
2025-07-14 02:52:54,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47460, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1641039521_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748140_7316, duration(ns): 17453035
2025-07-14 02:52:54,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748140_7316, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-14 02:52:57,789 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748140_7316 replica FinalizedReplica, blk_1073748140_7316, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748140 for deletion
2025-07-14 02:52:57,790 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748140_7316 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748140
2025-07-14 02:53:54,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748141_7317 src: /192.168.158.1:54068 dest: /192.168.158.4:9866
2025-07-14 02:53:54,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54068, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_122253055_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748141_7317, duration(ns): 23798697
2025-07-14 02:53:54,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748141_7317, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-14 02:54:00,790 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748141_7317 replica FinalizedReplica, blk_1073748141_7317, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748141 for deletion
2025-07-14 02:54:00,791 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748141_7317 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748141
2025-07-14 02:55:59,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748143_7319 src: /192.168.158.1:51304 dest: /192.168.158.4:9866
2025-07-14 02:55:59,594 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51304, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-809136273_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748143_7319, duration(ns): 22782896
2025-07-14 02:55:59,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748143_7319, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-14 02:56:00,791 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748143_7319 replica FinalizedReplica, blk_1073748143_7319, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748143 for deletion
2025-07-14 02:56:00,792 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748143_7319 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748143
2025-07-14 02:56:59,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748144_7320 src: /192.168.158.5:41578 dest: /192.168.158.4:9866
2025-07-14 02:56:59,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41578, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1919017906_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748144_7320, duration(ns): 17554129
2025-07-14 02:56:59,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748144_7320, type=LAST_IN_PIPELINE terminating
2025-07-14 02:57:03,793 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748144_7320 replica FinalizedReplica, blk_1073748144_7320, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748144 for deletion
2025-07-14 02:57:03,794 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748144_7320 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748144
2025-07-14 02:57:59,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748145_7321 src: /192.168.158.9:47786 dest: /192.168.158.4:9866
2025-07-14 02:57:59,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47786, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1575188268_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748145_7321, duration(ns): 14825133
2025-07-14 02:57:59,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748145_7321, type=LAST_IN_PIPELINE terminating
2025-07-14 02:58:03,795 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748145_7321 replica FinalizedReplica, blk_1073748145_7321, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748145 for deletion
2025-07-14 02:58:03,796 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748145_7321 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748145
2025-07-14 02:59:59,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748147_7323 src: /192.168.158.9:54182 dest: /192.168.158.4:9866
2025-07-14 02:59:59,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54182, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_115269543_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748147_7323, duration(ns): 19005116
2025-07-14 02:59:59,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748147_7323, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 03:00:03,798 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748147_7323 replica FinalizedReplica, blk_1073748147_7323, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748147 for deletion
2025-07-14 03:00:03,799 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748147_7323 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748147
2025-07-14 03:00:59,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748148_7324 src: /192.168.158.1:35946 dest: /192.168.158.4:9866
2025-07-14 03:00:59,638 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35946, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1915945870_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748148_7324, duration(ns): 47843963
2025-07-14 03:00:59,638 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748148_7324, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-14 03:01:00,798 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748148_7324 replica FinalizedReplica, blk_1073748148_7324, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748148 for deletion
2025-07-14 03:01:00,799 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748148_7324 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748148
2025-07-14 03:03:04,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748150_7326 src: /192.168.158.5:39536 dest: /192.168.158.4:9866
2025-07-14 03:03:04,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39536, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1891031494_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748150_7326, duration(ns): 22494372
2025-07-14 03:03:04,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748150_7326, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-14 03:03:09,801 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748150_7326 replica FinalizedReplica, blk_1073748150_7326, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748150 for deletion
2025-07-14 03:03:09,802 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748150_7326 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748150
2025-07-14 03:04:04,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748151_7327 src: /192.168.158.9:49178 dest: /192.168.158.4:9866
2025-07-14 03:04:04,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49178, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_15451157_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748151_7327, duration(ns): 15763024
2025-07-14 03:04:04,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748151_7327, type=LAST_IN_PIPELINE terminating
2025-07-14 03:04:06,801 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748151_7327 replica FinalizedReplica, blk_1073748151_7327, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748151 for deletion
2025-07-14 03:04:06,802 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748151_7327 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748151
2025-07-14 03:05:04,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748152_7328 src: /192.168.158.1:35358 dest: /192.168.158.4:9866
2025-07-14 03:05:04,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35358, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1781866598_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748152_7328, duration(ns): 23383375
2025-07-14 03:05:04,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748152_7328, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-14 03:05:09,802 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748152_7328 replica FinalizedReplica, blk_1073748152_7328, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748152 for deletion
2025-07-14 03:05:09,803 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748152_7328 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748152
2025-07-14 03:08:14,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748155_7331 src: /192.168.158.8:39588 dest: /192.168.158.4:9866
2025-07-14 03:08:14,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39588, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1860123304_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748155_7331, duration(ns): 16379169
2025-07-14 03:08:14,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748155_7331, type=LAST_IN_PIPELINE terminating
2025-07-14 03:08:15,807 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748155_7331 replica FinalizedReplica, blk_1073748155_7331, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748155 for deletion
2025-07-14 03:08:15,808 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748155_7331 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748155
2025-07-14 03:09:14,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748156_7332 src: /192.168.158.5:48270 dest: /192.168.158.4:9866
2025-07-14 03:09:14,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48270, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2080788673_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748156_7332, duration(ns): 16103100
2025-07-14 03:09:14,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748156_7332, type=LAST_IN_PIPELINE terminating
2025-07-14 03:09:18,808 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748156_7332 replica FinalizedReplica, blk_1073748156_7332, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748156 for deletion
2025-07-14 03:09:18,809 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748156_7332 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748156
2025-07-14 03:11:14,628 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748158_7334 src: /192.168.158.6:34680 dest: /192.168.158.4:9866
2025-07-14 03:11:14,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34680, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_884763866_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748158_7334, duration(ns): 14199620
2025-07-14 03:11:14,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748158_7334, type=LAST_IN_PIPELINE terminating
2025-07-14 03:11:15,817 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748158_7334 replica FinalizedReplica, blk_1073748158_7334, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748158 for deletion
2025-07-14 03:11:15,818 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748158_7334 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748158
2025-07-14 03:13:19,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748160_7336 src: /192.168.158.9:49824 dest: /192.168.158.4:9866
2025-07-14 03:13:19,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49824, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1769510970_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748160_7336, duration(ns): 16711453
2025-07-14 03:13:19,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748160_7336, type=LAST_IN_PIPELINE terminating
2025-07-14 03:13:21,819 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748160_7336 replica FinalizedReplica, blk_1073748160_7336, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748160 for deletion
2025-07-14 03:13:21,820 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748160_7336 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748160
2025-07-14 03:14:24,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748161_7337 src: /192.168.158.8:43922 dest: /192.168.158.4:9866
2025-07-14 03:14:24,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43922, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1148735777_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748161_7337, duration(ns): 18617457
2025-07-14 03:14:24,646 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748161_7337, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 03:14:27,822 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748161_7337 replica FinalizedReplica, blk_1073748161_7337, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748161 for deletion
2025-07-14 03:14:27,823 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748161_7337 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748161
2025-07-14 03:15:24,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748162_7338 src: /192.168.158.1:42370 dest: /192.168.158.4:9866
2025-07-14 03:15:24,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42370, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-77567192_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748162_7338, duration(ns): 22317197
2025-07-14 03:15:24,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748162_7338, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-14 03:15:27,827 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748162_7338 replica FinalizedReplica, blk_1073748162_7338, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748162 for deletion
2025-07-14 03:15:27,828 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748162_7338 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748162
2025-07-14 03:17:29,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748164_7340 src: /192.168.158.1:41220 dest: /192.168.158.4:9866
2025-07-14 03:17:29,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41220, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_238827440_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748164_7340, duration(ns): 26730071
2025-07-14 03:17:29,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748164_7340, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-14 03:17:33,834 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748164_7340 replica FinalizedReplica, blk_1073748164_7340, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748164 for deletion
2025-07-14 03:17:33,835 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748164_7340 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748164
2025-07-14 03:18:29,628 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748165_7341 src: /192.168.158.1:47454 dest: /192.168.158.4:9866
2025-07-14 03:18:29,658 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47454, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_295488018_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748165_7341, duration(ns): 21543898
2025-07-14 03:18:29,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748165_7341, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-14 03:18:30,836 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748165_7341 replica FinalizedReplica, blk_1073748165_7341, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748165 for deletion
2025-07-14 03:18:30,837 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748165_7341 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748165
2025-07-14 03:19:29,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748166_7342 src: /192.168.158.6:39322 dest: /192.168.158.4:9866
2025-07-14 03:19:29,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39322, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2121708819_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748166_7342, duration(ns): 16816249
2025-07-14 03:19:29,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748166_7342, type=LAST_IN_PIPELINE terminating
2025-07-14 03:19:30,840 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748166_7342 replica FinalizedReplica, blk_1073748166_7342, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748166 for deletion
2025-07-14 03:19:30,841 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748166_7342 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748166
2025-07-14 03:20:29,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748167_7343 src: /192.168.158.8:51864 dest: /192.168.158.4:9866
2025-07-14 03:20:29,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51864, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1895189973_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748167_7343, duration(ns): 15451379
2025-07-14 03:20:29,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748167_7343, type=LAST_IN_PIPELINE terminating
2025-07-14 03:20:30,843 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748167_7343 replica FinalizedReplica, blk_1073748167_7343, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748167 for deletion
2025-07-14 03:20:30,844 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748167_7343 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748167
2025-07-14 03:25:34,609 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748172_7348 src: /192.168.158.9:39394 dest: /192.168.158.4:9866
2025-07-14 03:25:34,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39394, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_957590661_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748172_7348, duration(ns): 15746363
2025-07-14 03:25:34,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748172_7348, type=LAST_IN_PIPELINE terminating
2025-07-14 03:25:39,858 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748172_7348 replica FinalizedReplica, blk_1073748172_7348, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748172 for deletion
2025-07-14 03:25:39,859 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748172_7348 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748172
2025-07-14 03:26:34,603 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748173_7349 src: /192.168.158.1:57150 dest: /192.168.158.4:9866
2025-07-14 03:26:34,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57150, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_280970194_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748173_7349, duration(ns): 24258000
2025-07-14 03:26:34,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748173_7349, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-14 03:26:36,864 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748173_7349 replica FinalizedReplica, blk_1073748173_7349, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748173 for deletion
2025-07-14 03:26:36,865 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748173_7349 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748173
2025-07-14 03:27:34,610 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748174_7350 src: /192.168.158.1:58672 dest: /192.168.158.4:9866
2025-07-14 03:27:34,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_111281275_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748174_7350, duration(ns): 25484279
2025-07-14 03:27:34,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748174_7350, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-14 03:27:36,863 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748174_7350 replica FinalizedReplica, blk_1073748174_7350, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748174 for deletion
2025-07-14 03:27:36,863 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748174_7350 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748174
2025-07-14 03:30:39,619 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748177_7353 src: /192.168.158.9:33484 dest: /192.168.158.4:9866
2025-07-14 03:30:39,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33484, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-465699270_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748177_7353, duration(ns): 20110492
2025-07-14 03:30:39,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748177_7353, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-14 03:30:45,870 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748177_7353 replica FinalizedReplica, blk_1073748177_7353, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748177 for deletion
2025-07-14 03:30:45,871 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748177_7353 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748177
2025-07-14 03:32:49,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748179_7355 src: /192.168.158.7:33330 dest: /192.168.158.4:9866
2025-07-14 03:32:49,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33330, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2086250622_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748179_7355, duration(ns): 16457136
2025-07-14 03:32:49,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748179_7355, type=LAST_IN_PIPELINE terminating
2025-07-14 03:32:54,871 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748179_7355 replica FinalizedReplica, blk_1073748179_7355, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748179 for deletion
2025-07-14 03:32:54,872 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748179_7355 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748179
2025-07-14 03:34:49,623 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748181_7357 src: /192.168.158.6:50246 dest: /192.168.158.4:9866
2025-07-14 03:34:49,647 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50246, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1950787039_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748181_7357, duration(ns): 18763711 2025-07-14 03:34:49,647 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748181_7357, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-14 03:34:51,873 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748181_7357 replica FinalizedReplica, blk_1073748181_7357, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748181 for deletion 2025-07-14 03:34:51,874 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748181_7357 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748181 2025-07-14 03:35:49,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748182_7358 src: /192.168.158.9:54294 dest: /192.168.158.4:9866 2025-07-14 03:35:49,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54294, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-349342523_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748182_7358, duration(ns): 18242633 2025-07-14 03:35:49,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748182_7358, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] 
terminating 2025-07-14 03:35:51,873 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748182_7358 replica FinalizedReplica, blk_1073748182_7358, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748182 for deletion 2025-07-14 03:35:51,874 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748182_7358 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748182 2025-07-14 03:37:54,672 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748184_7360 src: /192.168.158.5:56160 dest: /192.168.158.4:9866 2025-07-14 03:37:54,697 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56160, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1032930961_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748184_7360, duration(ns): 19162781 2025-07-14 03:37:54,698 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748184_7360, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-14 03:38:00,877 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748184_7360 replica FinalizedReplica, blk_1073748184_7360, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748184 for 
deletion 2025-07-14 03:38:00,878 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748184_7360 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748184 2025-07-14 03:38:54,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748185_7361 src: /192.168.158.1:53164 dest: /192.168.158.4:9866 2025-07-14 03:38:54,672 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53164, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1404619767_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748185_7361, duration(ns): 26794146 2025-07-14 03:38:54,673 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748185_7361, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-14 03:38:57,876 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748185_7361 replica FinalizedReplica, blk_1073748185_7361, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748185 for deletion 2025-07-14 03:38:57,877 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748185_7361 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748185 2025-07-14 03:39:59,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073748186_7362 src: /192.168.158.5:43148 dest: /192.168.158.4:9866 2025-07-14 03:39:59,685 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:43148, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1765460445_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748186_7362, duration(ns): 15038941 2025-07-14 03:39:59,685 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748186_7362, type=LAST_IN_PIPELINE terminating 2025-07-14 03:40:00,879 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748186_7362 replica FinalizedReplica, blk_1073748186_7362, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748186 for deletion 2025-07-14 03:40:00,880 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748186_7362 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748186 2025-07-14 03:40:59,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748187_7363 src: /192.168.158.6:51036 dest: /192.168.158.4:9866 2025-07-14 03:40:59,688 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51036, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1122678725_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748187_7363, duration(ns): 15561806 
2025-07-14 03:40:59,688 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748187_7363, type=LAST_IN_PIPELINE terminating 2025-07-14 03:41:00,880 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748187_7363 replica FinalizedReplica, blk_1073748187_7363, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748187 for deletion 2025-07-14 03:41:00,881 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748187_7363 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748187 2025-07-14 03:41:59,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748188_7364 src: /192.168.158.1:46992 dest: /192.168.158.4:9866 2025-07-14 03:41:59,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46992, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_970011836_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748188_7364, duration(ns): 24346056 2025-07-14 03:41:59,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748188_7364, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-14 03:42:00,882 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748188_7364 replica FinalizedReplica, blk_1073748188_7364, FINALIZED getNumBytes() = 56 getBytesOnDisk() 
= 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748188 for deletion 2025-07-14 03:42:00,883 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748188_7364 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748188 2025-07-14 03:45:04,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748191_7367 src: /192.168.158.1:42258 dest: /192.168.158.4:9866 2025-07-14 03:45:04,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42258, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_335147485_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748191_7367, duration(ns): 23150680 2025-07-14 03:45:04,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748191_7367, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-14 03:45:09,887 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748191_7367 replica FinalizedReplica, blk_1073748191_7367, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748191 for deletion 2025-07-14 03:45:09,888 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748191_7367 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748191 2025-07-14 03:47:04,665 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748193_7369 src: /192.168.158.6:42208 dest: /192.168.158.4:9866 2025-07-14 03:47:04,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42208, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_994333339_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748193_7369, duration(ns): 16721766 2025-07-14 03:47:04,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748193_7369, type=LAST_IN_PIPELINE terminating 2025-07-14 03:47:06,887 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748193_7369 replica FinalizedReplica, blk_1073748193_7369, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748193 for deletion 2025-07-14 03:47:06,888 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748193_7369 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748193 2025-07-14 03:48:04,708 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748194_7370 src: /192.168.158.6:38892 dest: /192.168.158.4:9866 2025-07-14 03:48:04,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38892, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-242270577_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748194_7370, duration(ns): 15544635 2025-07-14 03:48:04,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748194_7370, type=LAST_IN_PIPELINE terminating 2025-07-14 03:48:09,889 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748194_7370 replica FinalizedReplica, blk_1073748194_7370, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748194 for deletion 2025-07-14 03:48:09,890 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748194_7370 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748194 2025-07-14 03:49:04,688 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748195_7371 src: /192.168.158.7:47858 dest: /192.168.158.4:9866 2025-07-14 03:49:04,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47858, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1282012676_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748195_7371, duration(ns): 15977701 2025-07-14 03:49:04,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748195_7371, type=LAST_IN_PIPELINE terminating 2025-07-14 03:49:09,892 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748195_7371 replica FinalizedReplica, blk_1073748195_7371, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748195 for deletion 2025-07-14 03:49:09,893 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748195_7371 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748195 2025-07-14 03:50:04,671 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748196_7372 src: /192.168.158.1:38092 dest: /192.168.158.4:9866 2025-07-14 03:50:04,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38092, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_697291814_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748196_7372, duration(ns): 21760266 2025-07-14 03:50:04,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748196_7372, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-14 03:50:06,892 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748196_7372 replica FinalizedReplica, blk_1073748196_7372, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748196 for deletion 2025-07-14 
03:50:06,893 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748196_7372 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748196 2025-07-14 03:51:09,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748197_7373 src: /192.168.158.8:43072 dest: /192.168.158.4:9866 2025-07-14 03:51:09,705 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43072, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_998869949_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748197_7373, duration(ns): 15252709 2025-07-14 03:51:09,705 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748197_7373, type=LAST_IN_PIPELINE terminating 2025-07-14 03:51:09,895 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748197_7373 replica FinalizedReplica, blk_1073748197_7373, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748197 for deletion 2025-07-14 03:51:09,896 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748197_7373 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748197 2025-07-14 03:55:14,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748201_7377 src: /192.168.158.7:50342 dest: 
/192.168.158.4:9866 2025-07-14 03:55:14,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50342, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-378832044_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748201_7377, duration(ns): 16590572 2025-07-14 03:55:14,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748201_7377, type=LAST_IN_PIPELINE terminating 2025-07-14 03:55:18,902 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748201_7377 replica FinalizedReplica, blk_1073748201_7377, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748201 for deletion 2025-07-14 03:55:18,904 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748201_7377 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748201 2025-07-14 03:58:24,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748204_7380 src: /192.168.158.1:43100 dest: /192.168.158.4:9866 2025-07-14 03:58:24,718 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43100, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2115962449_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748204_7380, duration(ns): 25731263 2025-07-14 03:58:24,719 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073748204_7380, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-14 03:58:24,911 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748204_7380 replica FinalizedReplica, blk_1073748204_7380, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748204 for deletion 2025-07-14 03:58:24,912 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748204_7380 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748204 2025-07-14 03:59:24,673 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748205_7381 src: /192.168.158.6:58998 dest: /192.168.158.4:9866 2025-07-14 03:59:24,697 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58998, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_950357583_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748205_7381, duration(ns): 18216305 2025-07-14 03:59:24,697 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748205_7381, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-14 03:59:27,911 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748205_7381 replica FinalizedReplica, blk_1073748205_7381, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748205 for deletion 2025-07-14 03:59:27,913 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748205_7381 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748205 2025-07-14 04:02:24,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748208_7384 src: /192.168.158.6:58912 dest: /192.168.158.4:9866 2025-07-14 04:02:24,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58912, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_115504137_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748208_7384, duration(ns): 14960988 2025-07-14 04:02:24,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748208_7384, type=LAST_IN_PIPELINE terminating 2025-07-14 04:02:27,916 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748208_7384 replica FinalizedReplica, blk_1073748208_7384, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748208 for deletion 2025-07-14 04:02:27,918 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748208_7384 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748208 2025-07-14 
04:04:29,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748210_7386 src: /192.168.158.1:35244 dest: /192.168.158.4:9866
2025-07-14 04:04:29,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35244, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1278135185_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748210_7386, duration(ns): 23965662
2025-07-14 04:04:29,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748210_7386, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-14 04:04:30,918 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748210_7386 replica FinalizedReplica, blk_1073748210_7386, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748210 for deletion
2025-07-14 04:04:30,919 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748210_7386 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748210
2025-07-14 04:05:29,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748211_7387 src: /192.168.158.5:33874 dest: /192.168.158.4:9866
2025-07-14 04:05:29,719 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33874, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_182892655_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748211_7387, duration(ns): 20630584
2025-07-14 04:05:29,719 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748211_7387, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-14 04:05:30,919 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748211_7387 replica FinalizedReplica, blk_1073748211_7387, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748211 for deletion
2025-07-14 04:05:30,921 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748211_7387 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748211
2025-07-14 04:06:34,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748212_7388 src: /192.168.158.8:50336 dest: /192.168.158.4:9866
2025-07-14 04:06:34,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1524762199_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748212_7388, duration(ns): 16743693
2025-07-14 04:06:34,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748212_7388, type=LAST_IN_PIPELINE terminating
2025-07-14 04:06:39,921 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748212_7388 replica FinalizedReplica, blk_1073748212_7388, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748212 for deletion
2025-07-14 04:06:39,922 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748212_7388 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748212
2025-07-14 04:07:34,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748213_7389 src: /192.168.158.1:49326 dest: /192.168.158.4:9866
2025-07-14 04:07:34,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49326, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-199542750_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748213_7389, duration(ns): 23140645
2025-07-14 04:07:34,733 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748213_7389, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-14 04:07:36,920 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748213_7389 replica FinalizedReplica, blk_1073748213_7389, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748213 for deletion
2025-07-14 04:07:36,922 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748213_7389 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748213
2025-07-14 04:08:34,697 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748214_7390 src: /192.168.158.6:50668 dest: /192.168.158.4:9866
2025-07-14 04:08:34,716 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50668, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_885227873_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748214_7390, duration(ns): 17491633
2025-07-14 04:08:34,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748214_7390, type=LAST_IN_PIPELINE terminating
2025-07-14 04:08:39,920 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748214_7390 replica FinalizedReplica, blk_1073748214_7390, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748214 for deletion
2025-07-14 04:08:39,922 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748214_7390 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748214
2025-07-14 04:13:34,708 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748219_7395 src: /192.168.158.1:37748 dest: /192.168.158.4:9866
2025-07-14 04:13:34,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37748, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_933943550_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748219_7395, duration(ns): 23432543
2025-07-14 04:13:34,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748219_7395, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-14 04:13:39,926 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748219_7395 replica FinalizedReplica, blk_1073748219_7395, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748219 for deletion
2025-07-14 04:13:39,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748219_7395 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748219
2025-07-14 04:17:44,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748223_7399 src: /192.168.158.5:41968 dest: /192.168.158.4:9866
2025-07-14 04:17:44,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41968, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_84318749_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748223_7399, duration(ns): 15541664
2025-07-14 04:17:44,733 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748223_7399, type=LAST_IN_PIPELINE terminating
2025-07-14 04:17:45,935 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748223_7399 replica FinalizedReplica, blk_1073748223_7399, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748223 for deletion
2025-07-14 04:17:45,937 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748223_7399 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073748223
2025-07-14 04:18:44,713 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748224_7400 src: /192.168.158.9:52824 dest: /192.168.158.4:9866
2025-07-14 04:18:44,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52824, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-433740318_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748224_7400, duration(ns): 18920124
2025-07-14 04:18:44,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748224_7400, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-14 04:18:48,939 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748224_7400 replica FinalizedReplica, blk_1073748224_7400, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748224 for deletion
2025-07-14 04:18:48,941 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748224_7400 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748224
2025-07-14 04:19:49,712 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748225_7401 src: /192.168.158.7:58950 dest: /192.168.158.4:9866
2025-07-14 04:19:49,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58950, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-464817183_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748225_7401, duration(ns): 16368985
2025-07-14 04:19:49,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748225_7401, type=LAST_IN_PIPELINE terminating
2025-07-14 04:19:54,944 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748225_7401 replica FinalizedReplica, blk_1073748225_7401, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748225 for deletion
2025-07-14 04:19:54,945 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748225_7401 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748225
2025-07-14 04:29:09,728 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748234_7410 src: /192.168.158.5:52884 dest: /192.168.158.4:9866
2025-07-14 04:29:09,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52884, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-690711602_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748234_7410, duration(ns): 20552624
2025-07-14 04:29:09,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748234_7410, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-14 04:29:09,953 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748234_7410 replica FinalizedReplica, blk_1073748234_7410, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748234 for deletion
2025-07-14 04:29:09,954 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748234_7410 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748234
2025-07-14 04:31:09,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748236_7412 src: /192.168.158.5:39450 dest: /192.168.158.4:9866
2025-07-14 04:31:09,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39450, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-118524445_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748236_7412, duration(ns): 19853085
2025-07-14 04:31:09,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748236_7412, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-14 04:31:09,955 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748236_7412 replica FinalizedReplica, blk_1073748236_7412, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748236 for deletion
2025-07-14 04:31:09,956 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748236_7412 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748236
2025-07-14 04:32:09,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748237_7413 src: /192.168.158.5:59756 dest: /192.168.158.4:9866
2025-07-14 04:32:09,773 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59756, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-565975831_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748237_7413, duration(ns): 17937212
2025-07-14 04:32:09,774 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748237_7413, type=LAST_IN_PIPELINE terminating
2025-07-14 04:32:09,957 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748237_7413 replica FinalizedReplica, blk_1073748237_7413, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748237 for deletion
2025-07-14 04:32:09,960 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748237_7413 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748237
2025-07-14 04:33:09,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748238_7414 src: /192.168.158.9:54612 dest: /192.168.158.4:9866
2025-07-14 04:33:09,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54612, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1372780234_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748238_7414, duration(ns): 17675050
2025-07-14 04:33:09,778 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748238_7414, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-14 04:33:09,962 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748238_7414 replica FinalizedReplica, blk_1073748238_7414, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748238 for deletion
2025-07-14 04:33:09,963 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748238_7414 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748238
2025-07-14 04:34:09,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748239_7415 src: /192.168.158.1:46558 dest: /192.168.158.4:9866
2025-07-14 04:34:09,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46558, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_372676157_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748239_7415, duration(ns): 21308197
2025-07-14 04:34:09,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748239_7415, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-14 04:34:09,963 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748239_7415 replica FinalizedReplica, blk_1073748239_7415, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748239 for deletion
2025-07-14 04:34:09,965 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748239_7415 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748239
2025-07-14 04:36:09,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748241_7417 src: /192.168.158.8:50806 dest: /192.168.158.4:9866
2025-07-14 04:36:09,759 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50806, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-214274795_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748241_7417, duration(ns): 15678869
2025-07-14 04:36:09,759 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748241_7417, type=LAST_IN_PIPELINE terminating
2025-07-14 04:36:09,972 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748241_7417 replica FinalizedReplica, blk_1073748241_7417, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748241 for deletion
2025-07-14 04:36:09,973 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748241_7417 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748241
2025-07-14 04:38:14,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748243_7419 src: /192.168.158.7:37160 dest: /192.168.158.4:9866
2025-07-14 04:38:14,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37160, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1514377776_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748243_7419, duration(ns): 14248783
2025-07-14 04:38:14,764 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748243_7419, type=LAST_IN_PIPELINE terminating
2025-07-14 04:38:18,979 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748243_7419 replica FinalizedReplica, blk_1073748243_7419, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748243 for deletion
2025-07-14 04:38:18,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748243_7419 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748243
2025-07-14 04:39:14,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748244_7420 src: /192.168.158.9:43218 dest: /192.168.158.4:9866
2025-07-14 04:39:14,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43218, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1726181672_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748244_7420, duration(ns): 16262987
2025-07-14 04:39:14,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748244_7420, type=LAST_IN_PIPELINE terminating
2025-07-14 04:39:18,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748244_7420 replica FinalizedReplica, blk_1073748244_7420, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748244 for deletion
2025-07-14 04:39:18,982 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748244_7420 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748244
2025-07-14 04:43:29,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748248_7424 src: /192.168.158.9:46890 dest: /192.168.158.4:9866
2025-07-14 04:43:29,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46890, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1681006136_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748248_7424, duration(ns): 15918206
2025-07-14 04:43:29,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748248_7424, type=LAST_IN_PIPELINE terminating
2025-07-14 04:43:30,989 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748248_7424 replica FinalizedReplica, blk_1073748248_7424, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748248 for deletion
2025-07-14 04:43:30,991 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748248_7424 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748248
2025-07-14 04:45:29,772 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748250_7426 src: /192.168.158.8:38068 dest: /192.168.158.4:9866
2025-07-14 04:45:29,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38068, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-745425347_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748250_7426, duration(ns): 17181007
2025-07-14 04:45:29,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748250_7426, type=LAST_IN_PIPELINE terminating
2025-07-14 04:45:33,993 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748250_7426 replica FinalizedReplica, blk_1073748250_7426, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748250 for deletion
2025-07-14 04:45:33,994 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748250_7426 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748250
2025-07-14 04:46:29,765 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748251_7427 src: /192.168.158.6:45102 dest: /192.168.158.4:9866
2025-07-14 04:46:29,784 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45102, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_377164069_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748251_7427, duration(ns): 15981589
2025-07-14 04:46:29,784 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748251_7427, type=LAST_IN_PIPELINE terminating
2025-07-14 04:46:33,993 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748251_7427 replica FinalizedReplica, blk_1073748251_7427, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748251 for deletion
2025-07-14 04:46:33,994 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748251_7427 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748251
2025-07-14 04:49:39,764 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748254_7430 src: /192.168.158.6:46840 dest: /192.168.158.4:9866
2025-07-14 04:49:39,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46840, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-172863697_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748254_7430, duration(ns): 18379620
2025-07-14 04:49:39,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748254_7430, type=LAST_IN_PIPELINE terminating
2025-07-14 04:49:43,000 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748254_7430 replica FinalizedReplica, blk_1073748254_7430, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748254 for deletion
2025-07-14 04:49:43,002 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748254_7430 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748254
2025-07-14 04:54:44,772 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748259_7435 src: /192.168.158.8:44684 dest: /192.168.158.4:9866
2025-07-14 04:54:44,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44684, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-282350483_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748259_7435, duration(ns): 17383848
2025-07-14 04:54:44,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748259_7435, type=LAST_IN_PIPELINE terminating
2025-07-14 04:54:52,010 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748259_7435 replica FinalizedReplica, blk_1073748259_7435, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748259 for deletion
2025-07-14 04:54:52,011 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748259_7435 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748259
2025-07-14 04:56:44,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748261_7437 src: /192.168.158.6:38776 dest: /192.168.158.4:9866
2025-07-14 04:56:44,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38776, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1994607897_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748261_7437, duration(ns): 18786319
2025-07-14 04:56:44,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748261_7437, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-14 04:56:49,012 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748261_7437 replica FinalizedReplica, blk_1073748261_7437, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748261 for deletion
2025-07-14 04:56:49,013 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748261_7437 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748261
2025-07-14 04:58:44,768 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748263_7439 src: /192.168.158.1:39180 dest: /192.168.158.4:9866
2025-07-14 04:58:44,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39180, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1623565245_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748263_7439, duration(ns): 22563739
2025-07-14 04:58:44,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748263_7439, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-14 04:58:52,018 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748263_7439 replica FinalizedReplica, blk_1073748263_7439, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748263 for deletion
2025-07-14 04:58:52,019 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748263_7439 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748263
2025-07-14 04:59:49,781 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748264_7440 src: /192.168.158.6:58734 dest: /192.168.158.4:9866
2025-07-14 04:59:49,807 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58734, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_640373905_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748264_7440, duration(ns): 20892602
2025-07-14 04:59:49,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748264_7440, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 04:59:58,020 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748264_7440 replica FinalizedReplica, blk_1073748264_7440, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748264 for deletion
2025-07-14 04:59:58,022 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748264_7440 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748264
2025-07-14 05:00:49,778 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748265_7441 src: /192.168.158.1:37374 dest: /192.168.158.4:9866
2025-07-14 05:00:49,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37374, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-867002811_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748265_7441, duration(ns): 22575555
2025-07-14 05:00:49,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748265_7441, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-14 05:00:55,026 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748265_7441 replica FinalizedReplica, blk_1073748265_7441, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748265 for deletion
2025-07-14 05:00:55,027 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748265_7441 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748265
2025-07-14 05:02:49,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748267_7443 src: /192.168.158.6:51932 dest: /192.168.158.4:9866
2025-07-14 05:02:49,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51932, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-515201714_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748267_7443, duration(ns): 18194762
2025-07-14 05:02:49,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748267_7443, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 05:02:55,030 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748267_7443 replica FinalizedReplica, blk_1073748267_7443, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748267 for deletion
2025-07-14 05:02:55,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748267_7443 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748267
2025-07-14 05:03:49,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748268_7444 src: /192.168.158.8:60746 dest: /192.168.158.4:9866 2025-07-14 05:03:49,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60746, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_736978107_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748268_7444, duration(ns): 15657446 2025-07-14 05:03:49,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748268_7444, type=LAST_IN_PIPELINE terminating 2025-07-14 05:03:55,030 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748268_7444 replica FinalizedReplica, blk_1073748268_7444, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748268 for deletion 2025-07-14 05:03:55,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748268_7444 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748268 2025-07-14 05:11:09,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748275_7451 src: /192.168.158.1:46798 dest: /192.168.158.4:9866 2025-07-14 05:11:09,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46798, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-829290592_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073748275_7451, duration(ns): 21045373 2025-07-14 05:11:09,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748275_7451, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-14 05:11:13,044 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748275_7451 replica FinalizedReplica, blk_1073748275_7451, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748275 for deletion 2025-07-14 05:11:13,045 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748275_7451 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748275 2025-07-14 05:14:14,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748278_7454 src: /192.168.158.9:54536 dest: /192.168.158.4:9866 2025-07-14 05:14:14,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54536, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-778989238_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748278_7454, duration(ns): 20586401 2025-07-14 05:14:14,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748278_7454, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-14 05:14:22,053 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748278_7454 replica FinalizedReplica, blk_1073748278_7454, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748278 for deletion 2025-07-14 05:14:22,054 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748278_7454 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748278 2025-07-14 05:15:14,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748279_7455 src: /192.168.158.9:48994 dest: /192.168.158.4:9866 2025-07-14 05:15:14,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48994, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1227155463_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748279_7455, duration(ns): 20127409 2025-07-14 05:15:14,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748279_7455, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-14 05:15:19,057 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748279_7455 replica FinalizedReplica, blk_1073748279_7455, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748279 for deletion 2025-07-14 05:15:19,059 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748279_7455 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748279 2025-07-14 05:17:14,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748281_7457 src: /192.168.158.9:36010 dest: /192.168.158.4:9866 2025-07-14 05:17:14,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36010, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-885077205_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748281_7457, duration(ns): 21069005 2025-07-14 05:17:14,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748281_7457, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-14 05:17:22,063 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748281_7457 replica FinalizedReplica, blk_1073748281_7457, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748281 for deletion 2025-07-14 05:17:22,065 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748281_7457 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748281 2025-07-14 05:18:14,835 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748282_7458 src: 
/192.168.158.7:34190 dest: /192.168.158.4:9866 2025-07-14 05:18:14,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34190, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2056897172_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748282_7458, duration(ns): 20349967 2025-07-14 05:18:14,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748282_7458, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-14 05:18:19,067 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748282_7458 replica FinalizedReplica, blk_1073748282_7458, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748282 for deletion 2025-07-14 05:18:19,068 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748282_7458 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748282 2025-07-14 05:24:19,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748288_7464 src: /192.168.158.5:57604 dest: /192.168.158.4:9866 2025-07-14 05:24:19,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57604, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1799379391_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748288_7464, duration(ns): 19102119 2025-07-14 05:24:19,874 
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748288_7464, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-14 05:24:28,078 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748288_7464 replica FinalizedReplica, blk_1073748288_7464, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748288 for deletion 2025-07-14 05:24:28,080 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748288_7464 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748288 2025-07-14 05:27:24,841 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748291_7467 src: /192.168.158.9:39012 dest: /192.168.158.4:9866 2025-07-14 05:27:24,859 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39012, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_342551297_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748291_7467, duration(ns): 15307726 2025-07-14 05:27:24,859 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748291_7467, type=LAST_IN_PIPELINE terminating 2025-07-14 05:27:31,086 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748291_7467 replica FinalizedReplica, blk_1073748291_7467, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748291 for deletion 2025-07-14 05:27:31,088 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748291_7467 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748291 2025-07-14 05:29:29,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748293_7469 src: /192.168.158.7:52830 dest: /192.168.158.4:9866 2025-07-14 05:29:29,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52830, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1920485748_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748293_7469, duration(ns): 16247130 2025-07-14 05:29:29,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748293_7469, type=LAST_IN_PIPELINE terminating 2025-07-14 05:29:34,090 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748293_7469 replica FinalizedReplica, blk_1073748293_7469, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748293 for deletion 2025-07-14 05:29:34,091 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748293_7469 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748293 2025-07-14 
05:35:39,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748299_7475 src: /192.168.158.8:58484 dest: /192.168.158.4:9866 2025-07-14 05:35:39,872 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58484, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_939419815_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748299_7475, duration(ns): 18127476 2025-07-14 05:35:39,872 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748299_7475, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-14 05:35:46,105 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748299_7475 replica FinalizedReplica, blk_1073748299_7475, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748299 for deletion 2025-07-14 05:35:46,106 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748299_7475 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748299 2025-07-14 05:36:13,268 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0 2025-07-14 05:36:44,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748300_7476 src: /192.168.158.8:47274 dest: /192.168.158.4:9866 
2025-07-14 05:36:44,883 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47274, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-368527963_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748300_7476, duration(ns): 16075934 2025-07-14 05:36:44,883 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748300_7476, type=LAST_IN_PIPELINE terminating 2025-07-14 05:36:49,108 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748300_7476 replica FinalizedReplica, blk_1073748300_7476, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748300 for deletion 2025-07-14 05:36:49,109 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748300_7476 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748300 2025-07-14 05:37:19,112 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f37, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5. 
2025-07-14 05:37:19,112 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360 2025-07-14 05:37:44,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748301_7477 src: /192.168.158.8:56468 dest: /192.168.158.4:9866 2025-07-14 05:37:44,879 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56468, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1973483479_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748301_7477, duration(ns): 16892837 2025-07-14 05:37:44,880 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748301_7477, type=LAST_IN_PIPELINE terminating 2025-07-14 05:37:49,109 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748301_7477 replica FinalizedReplica, blk_1073748301_7477, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748301 for deletion 2025-07-14 05:37:49,111 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748301_7477 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748301 2025-07-14 05:38:44,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748302_7478 src: /192.168.158.1:36448 dest: /192.168.158.4:9866 2025-07-14 05:38:44,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36448, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-499797631_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748302_7478, duration(ns): 24885696 2025-07-14 05:38:44,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748302_7478, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-14 05:38:49,110 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748302_7478 replica FinalizedReplica, blk_1073748302_7478, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748302 for deletion 2025-07-14 05:38:49,111 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748302_7478 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748302 2025-07-14 05:39:44,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748303_7479 src: /192.168.158.8:43326 dest: /192.168.158.4:9866 2025-07-14 05:39:44,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43326, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1292938839_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748303_7479, duration(ns): 19991740 2025-07-14 05:39:44,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748303_7479, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-14 05:39:52,111 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748303_7479 replica FinalizedReplica, blk_1073748303_7479, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748303 for deletion 2025-07-14 05:39:52,112 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748303_7479 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748303 2025-07-14 05:41:49,894 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748305_7481 src: /192.168.158.7:57412 dest: /192.168.158.4:9866 2025-07-14 05:41:49,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57412, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_65935396_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748305_7481, duration(ns): 16467065 2025-07-14 05:41:49,913 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748305_7481, type=LAST_IN_PIPELINE terminating 2025-07-14 05:41:55,114 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748305_7481 replica FinalizedReplica, blk_1073748305_7481, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748305 for deletion 2025-07-14 05:41:55,116 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748305_7481 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748305 2025-07-14 05:43:49,859 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748307_7483 src: /192.168.158.1:51960 dest: /192.168.158.4:9866 2025-07-14 05:43:49,891 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51960, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1936123807_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748307_7483, duration(ns): 22961369 2025-07-14 05:43:49,891 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748307_7483, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-14 05:43:55,121 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748307_7483 replica FinalizedReplica, blk_1073748307_7483, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748307 for deletion 2025-07-14 05:43:55,122 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748307_7483 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748307
2025-07-14 05:44:49,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748308_7484 src: /192.168.158.1:41118 dest: /192.168.158.4:9866
2025-07-14 05:44:49,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41118, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1042842810_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748308_7484, duration(ns): 21959328
2025-07-14 05:44:49,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748308_7484, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-14 05:44:55,126 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748308_7484 replica FinalizedReplica, blk_1073748308_7484, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748308 for deletion
2025-07-14 05:44:55,127 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748308_7484 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748308
2025-07-14 05:46:54,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748310_7486 src: /192.168.158.1:38942 dest: /192.168.158.4:9866
2025-07-14 05:46:54,894 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38942, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2074533759_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748310_7486, duration(ns): 23637227
2025-07-14 05:46:54,894 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748310_7486, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-14 05:47:01,131 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748310_7486 replica FinalizedReplica, blk_1073748310_7486, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748310 for deletion
2025-07-14 05:47:01,132 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748310_7486 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748310
2025-07-14 05:47:54,867 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748311_7487 src: /192.168.158.1:45784 dest: /192.168.158.4:9866
2025-07-14 05:47:54,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45784, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_632368683_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748311_7487, duration(ns): 25805224
2025-07-14 05:47:54,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748311_7487, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-14 05:48:01,134 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748311_7487 replica FinalizedReplica, blk_1073748311_7487, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748311 for deletion
2025-07-14 05:48:01,135 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748311_7487 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748311
2025-07-14 05:48:54,870 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748312_7488 src: /192.168.158.1:55138 dest: /192.168.158.4:9866
2025-07-14 05:48:54,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55138, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1048690854_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748312_7488, duration(ns): 22508836
2025-07-14 05:48:54,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748312_7488, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-14 05:49:01,133 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748312_7488 replica FinalizedReplica, blk_1073748312_7488, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748312 for deletion
2025-07-14 05:49:01,135 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748312_7488 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748312
2025-07-14 05:49:54,867 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748313_7489 src: /192.168.158.1:35868 dest: /192.168.158.4:9866
2025-07-14 05:49:54,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35868, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-209341421_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748313_7489, duration(ns): 21407202
2025-07-14 05:49:54,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748313_7489, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-14 05:49:58,135 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748313_7489 replica FinalizedReplica, blk_1073748313_7489, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748313 for deletion
2025-07-14 05:49:58,137 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748313_7489 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748313
2025-07-14 05:50:54,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748314_7490 src: /192.168.158.8:48078 dest: /192.168.158.4:9866
2025-07-14 05:50:54,897 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48078, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1813563937_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748314_7490, duration(ns): 17160495
2025-07-14 05:50:54,897 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748314_7490, type=LAST_IN_PIPELINE terminating
2025-07-14 05:50:58,138 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748314_7490 replica FinalizedReplica, blk_1073748314_7490, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748314 for deletion
2025-07-14 05:50:58,139 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748314_7490 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748314
2025-07-14 05:52:59,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748316_7492 src: /192.168.158.1:34152 dest: /192.168.158.4:9866
2025-07-14 05:52:59,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34152, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2197317_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748316_7492, duration(ns): 24637550
2025-07-14 05:52:59,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748316_7492, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-14 05:53:07,139 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748316_7492 replica FinalizedReplica, blk_1073748316_7492, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748316 for deletion
2025-07-14 05:53:07,140 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748316_7492 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748316
2025-07-14 05:55:04,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748318_7494 src: /192.168.158.8:51048 dest: /192.168.158.4:9866
2025-07-14 05:55:04,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51048, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_318645171_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748318_7494, duration(ns): 19590662
2025-07-14 05:55:04,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748318_7494, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-14 05:55:10,144 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748318_7494 replica FinalizedReplica, blk_1073748318_7494, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748318 for deletion
2025-07-14 05:55:10,145 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748318_7494 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748318
2025-07-14 05:56:04,875 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748319_7495 src: /192.168.158.1:49520 dest: /192.168.158.4:9866
2025-07-14 05:56:04,907 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49520, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1856870792_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748319_7495, duration(ns): 23028936
2025-07-14 05:56:04,907 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748319_7495, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-14 05:56:10,146 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748319_7495 replica FinalizedReplica, blk_1073748319_7495, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748319 for deletion
2025-07-14 05:56:10,147 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748319_7495 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748319
2025-07-14 05:57:04,887 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748320_7496 src: /192.168.158.5:52114 dest: /192.168.158.4:9866
2025-07-14 05:57:04,906 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52114, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-340621112_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748320_7496, duration(ns): 16409742
2025-07-14 05:57:04,906 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748320_7496, type=LAST_IN_PIPELINE terminating
2025-07-14 05:57:10,150 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748320_7496 replica FinalizedReplica, blk_1073748320_7496, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748320 for deletion
2025-07-14 05:57:10,151 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748320_7496 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748320
2025-07-14 06:03:09,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748326_7502 src: /192.168.158.8:55264 dest: /192.168.158.4:9866
2025-07-14 06:03:09,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55264, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1370961958_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748326_7502, duration(ns): 18786248
2025-07-14 06:03:09,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748326_7502, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-14 06:03:13,162 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748326_7502 replica FinalizedReplica, blk_1073748326_7502, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748326 for deletion
2025-07-14 06:03:13,163 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748326_7502 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748326
2025-07-14 06:08:19,889 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748331_7507 src: /192.168.158.1:33272 dest: /192.168.158.4:9866
2025-07-14 06:08:19,920 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33272, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-8115349_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748331_7507, duration(ns): 22316017
2025-07-14 06:08:19,920 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748331_7507, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-14 06:08:28,176 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748331_7507 replica FinalizedReplica, blk_1073748331_7507, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748331 for deletion
2025-07-14 06:08:28,177 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748331_7507 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748331
2025-07-14 06:11:19,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748334_7510 src: /192.168.158.8:58068 dest: /192.168.158.4:9866
2025-07-14 06:11:19,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58068, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-761716254_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748334_7510, duration(ns): 23379205
2025-07-14 06:11:19,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748334_7510, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-14 06:11:25,180 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748334_7510 replica FinalizedReplica, blk_1073748334_7510, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748334 for deletion
2025-07-14 06:11:25,181 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748334_7510 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748334
2025-07-14 06:14:24,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748337_7513 src: /192.168.158.7:55306 dest: /192.168.158.4:9866
2025-07-14 06:14:24,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55306, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1909401059_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748337_7513, duration(ns): 14159389
2025-07-14 06:14:24,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748337_7513, type=LAST_IN_PIPELINE terminating
2025-07-14 06:14:28,184 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748337_7513 replica FinalizedReplica, blk_1073748337_7513, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748337 for deletion
2025-07-14 06:14:28,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748337_7513 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748337
2025-07-14 06:15:24,923 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748338_7514 src: /192.168.158.9:50560 dest: /192.168.158.4:9866
2025-07-14 06:15:24,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50560, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_3309180_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748338_7514, duration(ns): 17836678
2025-07-14 06:15:24,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748338_7514, type=LAST_IN_PIPELINE terminating
2025-07-14 06:15:28,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748338_7514 replica FinalizedReplica, blk_1073748338_7514, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748338 for deletion
2025-07-14 06:15:28,188 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748338_7514 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748338
2025-07-14 06:19:29,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748342_7518 src: /192.168.158.9:32782 dest: /192.168.158.4:9866
2025-07-14 06:19:29,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:32782, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-440814915_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748342_7518, duration(ns): 16010803
2025-07-14 06:19:29,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748342_7518, type=LAST_IN_PIPELINE terminating
2025-07-14 06:19:37,190 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748342_7518 replica FinalizedReplica, blk_1073748342_7518, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748342 for deletion
2025-07-14 06:19:37,191 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748342_7518 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748342
2025-07-14 06:20:29,920 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748343_7519 src: /192.168.158.9:42010 dest: /192.168.158.4:9866
2025-07-14 06:20:29,950 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42010, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1602746116_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748343_7519, duration(ns): 23101775
2025-07-14 06:20:29,950 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748343_7519, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-14 06:20:34,190 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748343_7519 replica FinalizedReplica, blk_1073748343_7519, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748343 for deletion
2025-07-14 06:20:34,191 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748343_7519 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748343
2025-07-14 06:22:29,960 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748345_7521 src: /192.168.158.9:56554 dest: /192.168.158.4:9866
2025-07-14 06:22:29,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56554, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_985565761_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748345_7521, duration(ns): 20002379
2025-07-14 06:22:29,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748345_7521, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-14 06:22:34,195 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748345_7521 replica FinalizedReplica, blk_1073748345_7521, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748345 for deletion
2025-07-14 06:22:34,196 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748345_7521 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748345
2025-07-14 06:24:34,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748347_7523 src: /192.168.158.8:46276 dest: /192.168.158.4:9866
2025-07-14 06:24:34,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46276, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_541392469_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748347_7523, duration(ns): 19830417
2025-07-14 06:24:34,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748347_7523, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-14 06:24:40,197 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748347_7523 replica FinalizedReplica, blk_1073748347_7523, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748347 for deletion
2025-07-14 06:24:40,198 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748347_7523 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748347
2025-07-14 06:26:34,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748349_7525 src: /192.168.158.1:52008 dest: /192.168.158.4:9866
2025-07-14 06:26:34,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52008, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2044161047_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748349_7525, duration(ns): 21784677
2025-07-14 06:26:34,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748349_7525, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-14 06:26:40,202 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748349_7525 replica FinalizedReplica, blk_1073748349_7525, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748349 for deletion
2025-07-14 06:26:40,203 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748349_7525 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748349
2025-07-14 06:29:39,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748352_7528 src: /192.168.158.6:50932 dest: /192.168.158.4:9866
2025-07-14 06:29:39,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50932, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2073583935_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748352_7528, duration(ns): 18187018
2025-07-14 06:29:39,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748352_7528, type=LAST_IN_PIPELINE terminating
2025-07-14 06:29:43,206 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748352_7528 replica FinalizedReplica, blk_1073748352_7528, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748352 for deletion
2025-07-14 06:29:43,207 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748352_7528 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748352
2025-07-14 06:31:39,940 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748354_7530 src: /192.168.158.8:38240 dest: /192.168.158.4:9866
2025-07-14 06:31:39,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38240, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_209880434_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748354_7530, duration(ns): 22850738
2025-07-14 06:31:39,969 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748354_7530, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-14 06:31:43,212 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748354_7530 replica FinalizedReplica, blk_1073748354_7530, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748354 for deletion
2025-07-14 06:31:43,213 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748354_7530 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748354
2025-07-14 06:32:44,975 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748355_7531 src: /192.168.158.9:42124 dest: /192.168.158.4:9866
2025-07-14 06:32:44,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42124, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1586602980_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748355_7531, duration(ns): 16524507
2025-07-14 06:32:44,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748355_7531, type=LAST_IN_PIPELINE terminating
2025-07-14 06:32:49,217 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748355_7531 replica FinalizedReplica, blk_1073748355_7531, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748355 for deletion
2025-07-14 06:32:49,218 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748355_7531 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748355
2025-07-14 06:34:44,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748357_7533 src: /192.168.158.9:55780 dest: /192.168.158.4:9866
2025-07-14 06:34:44,969 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_261154336_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748357_7533, duration(ns): 15192202
2025-07-14 06:34:44,969 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748357_7533, type=LAST_IN_PIPELINE terminating
2025-07-14 06:34:52,222 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748357_7533 replica FinalizedReplica, blk_1073748357_7533, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748357 for deletion
2025-07-14 06:34:52,223 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748357_7533 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748357
2025-07-14 06:36:49,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748359_7535 src: /192.168.158.1:45582 dest: /192.168.158.4:9866
2025-07-14 06:36:49,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45582, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_488684965_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748359_7535, duration(ns): 22441526
2025-07-14 06:36:49,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748359_7535, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-14 06:36:55,225 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748359_7535 replica FinalizedReplica, blk_1073748359_7535, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748359 for deletion
2025-07-14 06:36:55,226 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748359_7535 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748359
2025-07-14 06:37:49,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748360_7536 src: /192.168.158.6:34354 dest: /192.168.158.4:9866
2025-07-14 06:37:49,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34354, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1377359897_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748360_7536, duration(ns): 20115871
2025-07-14 06:37:49,981 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748360_7536, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-14 06:37:58,229 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748360_7536 replica FinalizedReplica, blk_1073748360_7536, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748360 for deletion
2025-07-14 06:37:58,230 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748360_7536 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748360
2025-07-14 06:38:54,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748361_7537 src: /192.168.158.6:51190 dest: /192.168.158.4:9866
2025-07-14 06:38:54,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51190, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-180448131_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748361_7537, duration(ns): 15347311
2025-07-14 06:38:54,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748361_7537, type=LAST_IN_PIPELINE terminating
2025-07-14 06:38:58,232 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748361_7537 replica FinalizedReplica, blk_1073748361_7537, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() =
/hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748361 for deletion 2025-07-14 06:38:58,233 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748361_7537 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748361 2025-07-14 06:40:54,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748363_7539 src: /192.168.158.1:55104 dest: /192.168.158.4:9866 2025-07-14 06:40:54,988 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55104, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_800611931_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748363_7539, duration(ns): 24115105 2025-07-14 06:40:54,988 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748363_7539, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-14 06:40:58,233 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748363_7539 replica FinalizedReplica, blk_1073748363_7539, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748363 for deletion 2025-07-14 06:40:58,234 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748363_7539 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748363 2025-07-14 06:41:54,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748364_7540 src: /192.168.158.1:35668 dest: /192.168.158.4:9866 2025-07-14 06:41:54,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35668, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_886192190_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748364_7540, duration(ns): 24824838 2025-07-14 06:41:54,990 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748364_7540, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-14 06:41:58,236 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748364_7540 replica FinalizedReplica, blk_1073748364_7540, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748364 for deletion 2025-07-14 06:41:58,237 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748364_7540 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748364 2025-07-14 06:42:59,949 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748365_7541 src: /192.168.158.1:36228 dest: /192.168.158.4:9866 2025-07-14 06:42:59,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:36228, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_696288726_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748365_7541, duration(ns): 20379624 2025-07-14 06:42:59,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748365_7541, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-14 06:43:04,237 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748365_7541 replica FinalizedReplica, blk_1073748365_7541, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748365 for deletion 2025-07-14 06:43:04,238 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748365_7541 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748365 2025-07-14 06:45:59,998 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748368_7544 src: /192.168.158.7:45422 dest: /192.168.158.4:9866 2025-07-14 06:46:00,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45422, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-461237682_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748368_7544, duration(ns): 15597003 2025-07-14 06:46:00,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073748368_7544, type=LAST_IN_PIPELINE terminating 2025-07-14 06:46:04,241 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748368_7544 replica FinalizedReplica, blk_1073748368_7544, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748368 for deletion 2025-07-14 06:46:04,242 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748368_7544 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748368 2025-07-14 06:47:59,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748370_7546 src: /192.168.158.9:39990 dest: /192.168.158.4:9866 2025-07-14 06:47:59,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39990, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_339833892_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748370_7546, duration(ns): 17911701 2025-07-14 06:47:59,998 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748370_7546, type=LAST_IN_PIPELINE terminating 2025-07-14 06:48:04,244 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748370_7546 replica FinalizedReplica, blk_1073748370_7546, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748370 for deletion 2025-07-14 06:48:04,245 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748370_7546 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748370 2025-07-14 06:50:59,970 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748373_7549 src: /192.168.158.1:39638 dest: /192.168.158.4:9866 2025-07-14 06:51:00,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39638, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_548638699_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748373_7549, duration(ns): 23158149 2025-07-14 06:51:00,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748373_7549, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-14 06:51:04,252 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748373_7549 replica FinalizedReplica, blk_1073748373_7549, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748373 for deletion 2025-07-14 06:51:04,253 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748373_7549 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748373 2025-07-14 06:51:59,979 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748374_7550 src: /192.168.158.7:55514 dest: /192.168.158.4:9866 2025-07-14 06:52:00,004 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55514, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_177194835_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748374_7550, duration(ns): 19549450 2025-07-14 06:52:00,005 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748374_7550, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-14 06:52:04,252 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748374_7550 replica FinalizedReplica, blk_1073748374_7550, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748374 for deletion 2025-07-14 06:52:04,253 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748374_7550 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748374 2025-07-14 06:52:59,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748375_7551 src: /192.168.158.9:34440 dest: /192.168.158.4:9866 2025-07-14 06:52:59,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34440, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-588480331_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748375_7551, duration(ns): 15725302 2025-07-14 06:52:59,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748375_7551, type=LAST_IN_PIPELINE terminating 2025-07-14 06:53:07,252 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748375_7551 replica FinalizedReplica, blk_1073748375_7551, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748375 for deletion 2025-07-14 06:53:07,253 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748375_7551 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748375 2025-07-14 06:53:59,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748376_7552 src: /192.168.158.1:50062 dest: /192.168.158.4:9866 2025-07-14 06:54:00,005 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50062, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1697680763_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748376_7552, duration(ns): 21239999 2025-07-14 06:54:00,005 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748376_7552, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] 
terminating 2025-07-14 06:54:04,254 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748376_7552 replica FinalizedReplica, blk_1073748376_7552, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748376 for deletion 2025-07-14 06:54:04,256 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748376_7552 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748376 2025-07-14 07:00:09,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748382_7558 src: /192.168.158.9:45944 dest: /192.168.158.4:9866 2025-07-14 07:00:10,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45944, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-644704139_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748382_7558, duration(ns): 14766954 2025-07-14 07:00:10,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748382_7558, type=LAST_IN_PIPELINE terminating 2025-07-14 07:00:13,267 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748382_7558 replica FinalizedReplica, blk_1073748382_7558, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748382 for deletion 2025-07-14 07:00:13,268 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748382_7558 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748382 2025-07-14 07:02:09,996 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748384_7560 src: /192.168.158.5:39908 dest: /192.168.158.4:9866 2025-07-14 07:02:10,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39908, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1943922330_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748384_7560, duration(ns): 18090956 2025-07-14 07:02:10,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748384_7560, type=LAST_IN_PIPELINE terminating 2025-07-14 07:02:16,271 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748384_7560 replica FinalizedReplica, blk_1073748384_7560, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748384 for deletion 2025-07-14 07:02:16,272 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748384_7560 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748384 2025-07-14 07:03:09,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748385_7561 src: /192.168.158.9:58568 dest: /192.168.158.4:9866 2025-07-14 
07:03:10,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58568, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1573778829_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748385_7561, duration(ns): 19054430 2025-07-14 07:03:10,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748385_7561, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-14 07:03:13,273 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748385_7561 replica FinalizedReplica, blk_1073748385_7561, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748385 for deletion 2025-07-14 07:03:13,274 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748385_7561 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748385 2025-07-14 07:04:09,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748386_7562 src: /192.168.158.7:52586 dest: /192.168.158.4:9866 2025-07-14 07:04:10,009 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52586, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1903309977_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748386_7562, duration(ns): 17229841 2025-07-14 07:04:10,009 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748386_7562, type=LAST_IN_PIPELINE terminating 2025-07-14 07:04:13,274 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748386_7562 replica FinalizedReplica, blk_1073748386_7562, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748386 for deletion 2025-07-14 07:04:13,276 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748386_7562 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748386 2025-07-14 07:05:09,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748387_7563 src: /192.168.158.1:45350 dest: /192.168.158.4:9866 2025-07-14 07:05:10,022 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45350, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1064762323_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748387_7563, duration(ns): 22241030 2025-07-14 07:05:10,022 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748387_7563, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-14 07:05:13,276 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748387_7563 replica FinalizedReplica, blk_1073748387_7563, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748387 for deletion 2025-07-14 07:05:13,277 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748387_7563 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748387 2025-07-14 07:08:10,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748390_7566 src: /192.168.158.6:46606 dest: /192.168.158.4:9866 2025-07-14 07:08:10,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46606, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1273164429_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748390_7566, duration(ns): 18615377 2025-07-14 07:08:10,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748390_7566, type=LAST_IN_PIPELINE terminating 2025-07-14 07:08:16,281 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748390_7566 replica FinalizedReplica, blk_1073748390_7566, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748390 for deletion 2025-07-14 07:08:16,282 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748390_7566 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748390 2025-07-14 07:09:09,999 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748391_7567 src: /192.168.158.6:49582 dest: /192.168.158.4:9866 2025-07-14 07:09:10,018 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49582, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2039652178_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748391_7567, duration(ns): 17253285 2025-07-14 07:09:10,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748391_7567, type=LAST_IN_PIPELINE terminating 2025-07-14 07:09:13,282 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748391_7567 replica FinalizedReplica, blk_1073748391_7567, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748391 for deletion 2025-07-14 07:09:13,283 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748391_7567 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748391 2025-07-14 07:11:20,008 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748393_7569 src: /192.168.158.6:43910 dest: /192.168.158.4:9866 2025-07-14 07:11:20,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43910, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_867411084_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073748393_7569, duration(ns): 17479056
2025-07-14 07:11:20,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748393_7569, type=LAST_IN_PIPELINE terminating
2025-07-14 07:11:28,286 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748393_7569 replica FinalizedReplica, blk_1073748393_7569, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748393 for deletion
2025-07-14 07:11:28,287 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748393_7569 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748393
2025-07-14 07:12:20,005 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748394_7570 src: /192.168.158.1:34536 dest: /192.168.158.4:9866
2025-07-14 07:12:20,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34536, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1415420966_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748394_7570, duration(ns): 19958135
2025-07-14 07:12:20,034 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748394_7570, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-14 07:12:28,290 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748394_7570 replica FinalizedReplica, blk_1073748394_7570, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748394 for deletion
2025-07-14 07:12:28,291 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748394_7570 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748394
2025-07-14 07:14:20,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748396_7572 src: /192.168.158.9:33444 dest: /192.168.158.4:9866
2025-07-14 07:14:20,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33444, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_714893251_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748396_7572, duration(ns): 18239883
2025-07-14 07:14:20,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748396_7572, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-14 07:14:25,294 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748396_7572 replica FinalizedReplica, blk_1073748396_7572, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748396 for deletion
2025-07-14 07:14:25,295 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748396_7572 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748396
2025-07-14 07:17:25,014 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748399_7575 src: /192.168.158.1:51432 dest: /192.168.158.4:9866
2025-07-14 07:17:25,043 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51432, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1807701460_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748399_7575, duration(ns): 20869383
2025-07-14 07:17:25,044 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748399_7575, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-14 07:17:28,299 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748399_7575 replica FinalizedReplica, blk_1073748399_7575, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748399 for deletion
2025-07-14 07:17:28,300 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748399_7575 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748399
2025-07-14 07:18:25,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748400_7576 src: /192.168.158.7:51686 dest: /192.168.158.4:9866
2025-07-14 07:18:25,055 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51686, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1502875515_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748400_7576, duration(ns): 21411844
2025-07-14 07:18:25,055 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748400_7576, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 07:18:28,301 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748400_7576 replica FinalizedReplica, blk_1073748400_7576, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748400 for deletion
2025-07-14 07:18:28,303 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748400_7576 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748400
2025-07-14 07:19:25,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748401_7577 src: /192.168.158.6:50338 dest: /192.168.158.4:9866
2025-07-14 07:19:25,045 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50338, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1509474322_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748401_7577, duration(ns): 19666933
2025-07-14 07:19:25,045 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748401_7577, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 07:19:31,301 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748401_7577 replica FinalizedReplica, blk_1073748401_7577, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748401 for deletion
2025-07-14 07:19:31,303 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748401_7577 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748401
2025-07-14 07:21:25,026 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748403_7579 src: /192.168.158.8:49956 dest: /192.168.158.4:9866
2025-07-14 07:21:25,044 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49956, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-401842478_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748403_7579, duration(ns): 15239860
2025-07-14 07:21:25,044 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748403_7579, type=LAST_IN_PIPELINE terminating
2025-07-14 07:21:31,304 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748403_7579 replica FinalizedReplica, blk_1073748403_7579, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748403 for deletion
2025-07-14 07:21:31,305 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748403_7579 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748403
2025-07-14 07:23:30,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748405_7581 src: /192.168.158.6:34200 dest: /192.168.158.4:9866
2025-07-14 07:23:30,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34200, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-53063367_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748405_7581, duration(ns): 19002477
2025-07-14 07:23:30,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748405_7581, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 07:23:34,307 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748405_7581 replica FinalizedReplica, blk_1073748405_7581, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748405 for deletion
2025-07-14 07:23:34,308 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748405_7581 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748405
2025-07-14 07:27:35,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748409_7585 src: /192.168.158.7:56334 dest: /192.168.158.4:9866
2025-07-14 07:27:35,058 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56334, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-183929840_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748409_7585, duration(ns): 19239026
2025-07-14 07:27:35,058 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748409_7585, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 07:27:43,321 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748409_7585 replica FinalizedReplica, blk_1073748409_7585, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748409 for deletion
2025-07-14 07:27:43,322 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748409_7585 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748409
2025-07-14 07:28:35,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748410_7586 src: /192.168.158.9:52422 dest: /192.168.158.4:9866
2025-07-14 07:28:35,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52422, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_422628049_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748410_7586, duration(ns): 21688210
2025-07-14 07:28:35,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748410_7586, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-14 07:28:43,322 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748410_7586 replica FinalizedReplica, blk_1073748410_7586, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748410 for deletion
2025-07-14 07:28:43,324 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748410_7586 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748410
2025-07-14 07:29:35,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748411_7587 src: /192.168.158.6:40684 dest: /192.168.158.4:9866
2025-07-14 07:29:35,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40684, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1365056017_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748411_7587, duration(ns): 17498817
2025-07-14 07:29:35,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748411_7587, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-14 07:29:43,325 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748411_7587 replica FinalizedReplica, blk_1073748411_7587, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748411 for deletion
2025-07-14 07:29:43,326 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748411_7587 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748411
2025-07-14 07:30:35,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748412_7588 src: /192.168.158.5:50610 dest: /192.168.158.4:9866
2025-07-14 07:30:35,078 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50610, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_34848989_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748412_7588, duration(ns): 22288415
2025-07-14 07:30:35,078 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748412_7588, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-14 07:30:40,327 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748412_7588 replica FinalizedReplica, blk_1073748412_7588, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748412 for deletion
2025-07-14 07:30:40,328 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748412_7588 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748412
2025-07-14 07:33:40,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748415_7591 src: /192.168.158.5:48506 dest: /192.168.158.4:9866
2025-07-14 07:33:40,076 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48506, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_341831784_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748415_7591, duration(ns): 19405009
2025-07-14 07:33:40,076 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748415_7591, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-14 07:33:43,334 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748415_7591 replica FinalizedReplica, blk_1073748415_7591, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748415 for deletion
2025-07-14 07:33:43,335 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748415_7591 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748415
2025-07-14 07:35:45,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748417_7593 src: /192.168.158.6:37122 dest: /192.168.158.4:9866
2025-07-14 07:35:45,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37122, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1877816812_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748417_7593, duration(ns): 21063881
2025-07-14 07:35:45,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748417_7593, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-14 07:35:52,338 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748417_7593 replica FinalizedReplica, blk_1073748417_7593, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748417 for deletion
2025-07-14 07:35:52,339 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748417_7593 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748417
2025-07-14 07:39:45,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748421_7597 src: /192.168.158.1:54420 dest: /192.168.158.4:9866
2025-07-14 07:39:45,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54420, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-732249323_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748421_7597, duration(ns): 20878027
2025-07-14 07:39:45,065 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748421_7597, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-14 07:39:49,341 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748421_7597 replica FinalizedReplica, blk_1073748421_7597, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748421 for deletion
2025-07-14 07:39:49,342 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748421_7597 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748421
2025-07-14 07:40:45,053 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748422_7598 src: /192.168.158.1:58408 dest: /192.168.158.4:9866
2025-07-14 07:40:45,087 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58408, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-941971734_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748422_7598, duration(ns): 25908487
2025-07-14 07:40:45,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748422_7598, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-14 07:40:49,341 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748422_7598 replica FinalizedReplica, blk_1073748422_7598, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748422 for deletion
2025-07-14 07:40:49,342 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748422_7598 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748422
2025-07-14 07:41:45,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748423_7599 src: /192.168.158.9:50292 dest: /192.168.158.4:9866
2025-07-14 07:41:45,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50292, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1250894977_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748423_7599, duration(ns): 19588001
2025-07-14 07:41:45,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748423_7599, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-14 07:41:52,343 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748423_7599 replica FinalizedReplica, blk_1073748423_7599, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748423 for deletion
2025-07-14 07:41:52,344 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748423_7599 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748423
2025-07-14 07:42:50,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748424_7600 src: /192.168.158.1:32988 dest: /192.168.158.4:9866
2025-07-14 07:42:50,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:32988, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-692201907_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748424_7600, duration(ns): 22011367
2025-07-14 07:42:50,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748424_7600, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-14 07:42:58,344 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748424_7600 replica FinalizedReplica, blk_1073748424_7600, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748424 for deletion
2025-07-14 07:42:58,346 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748424_7600 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748424
2025-07-14 07:43:50,048 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748425_7601 src: /192.168.158.1:51050 dest: /192.168.158.4:9866
2025-07-14 07:43:50,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51050, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1128846518_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748425_7601, duration(ns): 21506458
2025-07-14 07:43:50,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748425_7601, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-14 07:43:55,347 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748425_7601 replica FinalizedReplica, blk_1073748425_7601, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748425 for deletion
2025-07-14 07:43:55,348 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748425_7601 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748425
2025-07-14 07:45:55,058 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748427_7603 src: /192.168.158.5:58742 dest: /192.168.158.4:9866
2025-07-14 07:45:55,076 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58742, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_223391334_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748427_7603, duration(ns): 16154211
2025-07-14 07:45:55,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748427_7603, type=LAST_IN_PIPELINE terminating
2025-07-14 07:45:58,352 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748427_7603 replica FinalizedReplica, blk_1073748427_7603, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748427 for deletion
2025-07-14 07:45:58,354 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748427_7603 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748427
2025-07-14 07:48:05,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748429_7605 src: /192.168.158.8:49682 dest: /192.168.158.4:9866
2025-07-14 07:48:05,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49682, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1533308164_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748429_7605, duration(ns): 14901734
2025-07-14 07:48:05,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748429_7605, type=LAST_IN_PIPELINE terminating
2025-07-14 07:48:10,353 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748429_7605 replica FinalizedReplica, blk_1073748429_7605, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748429 for deletion
2025-07-14 07:48:10,354 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748429_7605 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748429
2025-07-14 07:50:10,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748431_7607 src: /192.168.158.5:45914 dest: /192.168.158.4:9866
2025-07-14 07:50:10,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45914, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-522339221_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748431_7607, duration(ns): 18900963
2025-07-14 07:50:10,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748431_7607, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-14 07:50:16,361 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748431_7607 replica FinalizedReplica, blk_1073748431_7607, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748431 for deletion
2025-07-14 07:50:16,363 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748431_7607 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748431
2025-07-14 07:51:15,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748432_7608 src: /192.168.158.9:58084 dest: /192.168.158.4:9866
2025-07-14 07:51:15,096 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58084, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-815669489_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748432_7608, duration(ns): 19649231
2025-07-14 07:51:15,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748432_7608, type=LAST_IN_PIPELINE terminating
2025-07-14 07:51:22,365 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748432_7608 replica FinalizedReplica, blk_1073748432_7608, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748432 for deletion
2025-07-14 07:51:22,366 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748432_7608 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748432
2025-07-14 07:52:15,073 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748433_7609 src: /192.168.158.1:44118 dest: /192.168.158.4:9866
2025-07-14 07:52:15,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44118, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-807577853_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748433_7609, duration(ns): 24329914
2025-07-14 07:52:15,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748433_7609, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-14 07:52:19,367 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748433_7609 replica FinalizedReplica, blk_1073748433_7609, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748433 for deletion
2025-07-14 07:52:19,368 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748433_7609 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748433
2025-07-14 07:53:15,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748434_7610 src: /192.168.158.8:59662 dest: /192.168.158.4:9866
2025-07-14 07:53:15,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59662, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2045950570_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748434_7610, duration(ns): 18305470
2025-07-14 07:53:15,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748434_7610, type=LAST_IN_PIPELINE terminating
2025-07-14 07:53:19,368 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748434_7610 replica FinalizedReplica, blk_1073748434_7610, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748434 for deletion
2025-07-14 07:53:19,369 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748434_7610 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748434
2025-07-14 07:55:20,074 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748436_7612 src: /192.168.158.8:47664 dest: /192.168.158.4:9866
2025-07-14 07:55:20,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47664, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_886940790_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748436_7612, duration(ns): 19722979
2025-07-14 07:55:20,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748436_7612, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 07:55:25,370 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748436_7612 replica FinalizedReplica, blk_1073748436_7612, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748436 for deletion
2025-07-14 07:55:25,372 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748436_7612 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748436
2025-07-14 07:56:20,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748437_7613 src: /192.168.158.8:54584 dest: /192.168.158.4:9866
2025-07-14 07:56:20,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54584, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1652798690_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748437_7613, duration(ns): 14512058
2025-07-14 07:56:20,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748437_7613, type=LAST_IN_PIPELINE terminating
2025-07-14 07:56:28,373 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748437_7613 replica FinalizedReplica, blk_1073748437_7613, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748437 for deletion
2025-07-14 07:56:28,374 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748437_7613 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748437
2025-07-14 07:57:25,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748438_7614 src: /192.168.158.1:48504 dest: /192.168.158.4:9866
2025-07-14 07:57:25,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src:
/192.168.158.1:48504, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1604652725_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748438_7614, duration(ns): 23721556 2025-07-14 07:57:25,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748438_7614, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-14 07:57:31,373 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748438_7614 replica FinalizedReplica, blk_1073748438_7614, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748438 for deletion 2025-07-14 07:57:31,374 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748438_7614 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748438 2025-07-14 07:58:25,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748439_7615 src: /192.168.158.1:47286 dest: /192.168.158.4:9866 2025-07-14 07:58:25,108 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47286, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1879693635_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748439_7615, duration(ns): 22745449 2025-07-14 07:58:25,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073748439_7615, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-14 07:58:28,373 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748439_7615 replica FinalizedReplica, blk_1073748439_7615, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748439 for deletion 2025-07-14 07:58:28,375 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748439_7615 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748439 2025-07-14 07:59:25,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748440_7616 src: /192.168.158.7:52506 dest: /192.168.158.4:9866 2025-07-14 07:59:25,098 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52506, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_460115873_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748440_7616, duration(ns): 17610394 2025-07-14 07:59:25,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748440_7616, type=LAST_IN_PIPELINE terminating 2025-07-14 07:59:28,374 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748440_7616 replica FinalizedReplica, blk_1073748440_7616, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748440 for deletion 2025-07-14 07:59:28,376 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748440_7616 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748440 2025-07-14 08:02:25,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748443_7619 src: /192.168.158.9:36564 dest: /192.168.158.4:9866 2025-07-14 08:02:25,096 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36564, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1805286143_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748443_7619, duration(ns): 15549851 2025-07-14 08:02:25,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748443_7619, type=LAST_IN_PIPELINE terminating 2025-07-14 08:02:31,385 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748443_7619 replica FinalizedReplica, blk_1073748443_7619, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748443 for deletion 2025-07-14 08:02:31,387 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748443_7619 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748443 2025-07-14 08:04:25,078 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748445_7621 src: /192.168.158.6:52934 dest: /192.168.158.4:9866 2025-07-14 08:04:25,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52934, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1434702688_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748445_7621, duration(ns): 18792460 2025-07-14 08:04:25,100 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748445_7621, type=LAST_IN_PIPELINE terminating 2025-07-14 08:04:28,388 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748445_7621 replica FinalizedReplica, blk_1073748445_7621, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748445 for deletion 2025-07-14 08:04:28,390 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748445_7621 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748445 2025-07-14 08:05:25,081 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748446_7622 src: /192.168.158.8:36392 dest: /192.168.158.4:9866 2025-07-14 08:05:25,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36392, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2077591245_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073748446_7622, duration(ns): 15700468 2025-07-14 08:05:25,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748446_7622, type=LAST_IN_PIPELINE terminating 2025-07-14 08:05:31,390 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748446_7622 replica FinalizedReplica, blk_1073748446_7622, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748446 for deletion 2025-07-14 08:05:31,391 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748446_7622 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748446 2025-07-14 08:06:25,078 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748447_7623 src: /192.168.158.8:34272 dest: /192.168.158.4:9866 2025-07-14 08:06:25,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34272, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_764823278_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748447_7623, duration(ns): 22034846 2025-07-14 08:06:25,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748447_7623, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-14 08:06:28,392 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748447_7623 replica FinalizedReplica, 
blk_1073748447_7623, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748447 for deletion 2025-07-14 08:06:28,393 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748447_7623 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748447 2025-07-14 08:07:30,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748448_7624 src: /192.168.158.8:34366 dest: /192.168.158.4:9866 2025-07-14 08:07:30,123 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34366, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1731203312_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748448_7624, duration(ns): 15998393 2025-07-14 08:07:30,123 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748448_7624, type=LAST_IN_PIPELINE terminating 2025-07-14 08:07:34,395 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748448_7624 replica FinalizedReplica, blk_1073748448_7624, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748448 for deletion 2025-07-14 08:07:34,396 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748448_7624 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748448 2025-07-14 08:08:30,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748449_7625 src: /192.168.158.7:53838 dest: /192.168.158.4:9866 2025-07-14 08:08:30,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53838, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-60965726_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748449_7625, duration(ns): 16291761 2025-07-14 08:08:30,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748449_7625, type=LAST_IN_PIPELINE terminating 2025-07-14 08:08:34,397 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748449_7625 replica FinalizedReplica, blk_1073748449_7625, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748449 for deletion 2025-07-14 08:08:34,398 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748449_7625 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748449 2025-07-14 08:09:30,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748450_7626 src: /192.168.158.1:57534 dest: /192.168.158.4:9866 2025-07-14 08:09:30,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57534, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1326034272_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748450_7626, duration(ns): 23939678 2025-07-14 08:09:30,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748450_7626, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-14 08:09:34,398 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748450_7626 replica FinalizedReplica, blk_1073748450_7626, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748450 for deletion 2025-07-14 08:09:34,400 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748450_7626 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748450 2025-07-14 08:10:30,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748451_7627 src: /192.168.158.1:47734 dest: /192.168.158.4:9866 2025-07-14 08:10:30,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47734, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1554546933_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748451_7627, duration(ns): 21069850 2025-07-14 08:10:30,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748451_7627, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-14 08:10:37,400 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748451_7627 replica FinalizedReplica, blk_1073748451_7627, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748451 for deletion 2025-07-14 08:10:37,401 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748451_7627 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748451 2025-07-14 08:11:35,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748452_7628 src: /192.168.158.1:57816 dest: /192.168.158.4:9866 2025-07-14 08:11:35,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57816, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1436867139_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748452_7628, duration(ns): 24119074 2025-07-14 08:11:35,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748452_7628, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-14 08:11:40,403 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748452_7628 replica FinalizedReplica, blk_1073748452_7628, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748452 for deletion 2025-07-14 08:11:40,404 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748452_7628 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748452 2025-07-14 08:12:35,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748453_7629 src: /192.168.158.7:47682 dest: /192.168.158.4:9866 2025-07-14 08:12:35,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47682, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-968665036_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748453_7629, duration(ns): 15044662 2025-07-14 08:12:35,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748453_7629, type=LAST_IN_PIPELINE terminating 2025-07-14 08:12:43,407 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748453_7629 replica FinalizedReplica, blk_1073748453_7629, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748453 for deletion 2025-07-14 08:12:43,408 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748453_7629 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748453 2025-07-14 08:13:35,098 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748454_7630 src: /192.168.158.5:59422 dest: /192.168.158.4:9866 2025-07-14 08:13:35,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59422, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_670251152_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748454_7630, duration(ns): 15844584 2025-07-14 08:13:35,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748454_7630, type=LAST_IN_PIPELINE terminating 2025-07-14 08:13:40,407 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748454_7630 replica FinalizedReplica, blk_1073748454_7630, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748454 for deletion 2025-07-14 08:13:40,408 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748454_7630 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748454 2025-07-14 08:14:35,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748455_7631 src: /192.168.158.8:41978 dest: /192.168.158.4:9866 2025-07-14 08:14:35,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41978, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-778555117_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073748455_7631, duration(ns): 14736051 2025-07-14 08:14:35,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748455_7631, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-14 08:14:40,409 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748455_7631 replica FinalizedReplica, blk_1073748455_7631, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748455 for deletion 2025-07-14 08:14:40,410 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748455_7631 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748455 2025-07-14 08:16:35,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748457_7633 src: /192.168.158.8:53944 dest: /192.168.158.4:9866 2025-07-14 08:16:35,132 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53944, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2079921402_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748457_7633, duration(ns): 17749066 2025-07-14 08:16:35,132 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748457_7633, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-14 08:16:43,412 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073748457_7633 replica FinalizedReplica, blk_1073748457_7633, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748457 for deletion 2025-07-14 08:16:43,413 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748457_7633 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748457 2025-07-14 08:17:40,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748458_7634 src: /192.168.158.8:44190 dest: /192.168.158.4:9866 2025-07-14 08:17:40,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44190, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_223004458_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748458_7634, duration(ns): 15739481 2025-07-14 08:17:40,121 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748458_7634, type=LAST_IN_PIPELINE terminating 2025-07-14 08:17:46,414 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748458_7634 replica FinalizedReplica, blk_1073748458_7634, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748458 for deletion 2025-07-14 08:17:46,415 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 
blk_1073748458_7634 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748458 2025-07-14 08:19:45,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748460_7636 src: /192.168.158.6:51524 dest: /192.168.158.4:9866 2025-07-14 08:19:45,129 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51524, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_770831624_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748460_7636, duration(ns): 17788345 2025-07-14 08:19:45,129 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748460_7636, type=LAST_IN_PIPELINE terminating 2025-07-14 08:19:52,418 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748460_7636 replica FinalizedReplica, blk_1073748460_7636, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748460 for deletion 2025-07-14 08:19:52,419 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748460_7636 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748460 2025-07-14 08:22:55,149 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748463_7639 src: /192.168.158.1:54356 dest: /192.168.158.4:9866 2025-07-14 08:22:55,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54356, dest: /192.168.158.4:9866, 
bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_808406439_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748463_7639, duration(ns): 21995715
2025-07-14 08:22:55,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748463_7639, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-14 08:22:58,422 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748463_7639 replica FinalizedReplica, blk_1073748463_7639, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748463 for deletion
2025-07-14 08:22:58,424 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748463_7639 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748463
2025-07-14 08:25:00,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748465_7641 src: /192.168.158.8:60140 dest: /192.168.158.4:9866
2025-07-14 08:25:00,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60140, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_270780054_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748465_7641, duration(ns): 17558517
2025-07-14 08:25:00,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748465_7641, type=LAST_IN_PIPELINE terminating
2025-07-14 08:25:07,425 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748465_7641 replica FinalizedReplica, blk_1073748465_7641, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748465 for deletion
2025-07-14 08:25:07,426 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748465_7641 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748465
2025-07-14 08:28:05,119 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748468_7644 src: /192.168.158.1:48454 dest: /192.168.158.4:9866
2025-07-14 08:28:05,149 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48454, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-484771073_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748468_7644, duration(ns): 21207184
2025-07-14 08:28:05,149 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748468_7644, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-14 08:28:10,434 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748468_7644 replica FinalizedReplica, blk_1073748468_7644, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748468 for deletion
2025-07-14 08:28:10,435 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748468_7644 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748468
2025-07-14 08:29:10,141 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748469_7645 src: /192.168.158.1:50516 dest: /192.168.158.4:9866
2025-07-14 08:29:10,170 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50516, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1496696148_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748469_7645, duration(ns): 20589700
2025-07-14 08:29:10,170 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748469_7645, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-14 08:29:16,434 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748469_7645 replica FinalizedReplica, blk_1073748469_7645, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748469 for deletion
2025-07-14 08:29:16,436 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748469_7645 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748469
2025-07-14 08:31:10,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748471_7647 src: /192.168.158.6:37754 dest: /192.168.158.4:9866
2025-07-14 08:31:10,170 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37754, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2068733845_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748471_7647, duration(ns): 16757696
2025-07-14 08:31:10,170 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748471_7647, type=LAST_IN_PIPELINE terminating
2025-07-14 08:31:16,440 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748471_7647 replica FinalizedReplica, blk_1073748471_7647, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748471 for deletion
2025-07-14 08:31:16,441 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748471_7647 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748471
2025-07-14 08:32:10,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748472_7648 src: /192.168.158.6:47050 dest: /192.168.158.4:9866
2025-07-14 08:32:10,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47050, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2039987315_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748472_7648, duration(ns): 17753928
2025-07-14 08:32:10,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748472_7648, type=LAST_IN_PIPELINE terminating
2025-07-14 08:32:16,440 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748472_7648 replica FinalizedReplica, blk_1073748472_7648, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748472 for deletion
2025-07-14 08:32:16,441 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748472_7648 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748472
2025-07-14 08:34:10,143 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748474_7650 src: /192.168.158.5:38698 dest: /192.168.158.4:9866
2025-07-14 08:34:10,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38698, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_736961356_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748474_7650, duration(ns): 17986191
2025-07-14 08:34:10,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748474_7650, type=LAST_IN_PIPELINE terminating
2025-07-14 08:34:16,440 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748474_7650 replica FinalizedReplica, blk_1073748474_7650, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748474 for deletion
2025-07-14 08:34:16,442 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748474_7650 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748474
2025-07-14 08:36:10,134 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748476_7652 src: /192.168.158.1:56638 dest: /192.168.158.4:9866
2025-07-14 08:36:10,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56638, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1837270720_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748476_7652, duration(ns): 21844249
2025-07-14 08:36:10,165 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748476_7652, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-14 08:36:13,443 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748476_7652 replica FinalizedReplica, blk_1073748476_7652, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748476 for deletion
2025-07-14 08:36:13,444 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748476_7652 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748476
2025-07-14 08:38:15,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748478_7654 src: /192.168.158.6:57434 dest: /192.168.158.4:9866
2025-07-14 08:38:15,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57434, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1710064476_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748478_7654, duration(ns): 21225955
2025-07-14 08:38:15,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748478_7654, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-14 08:38:19,449 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748478_7654 replica FinalizedReplica, blk_1073748478_7654, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748478 for deletion
2025-07-14 08:38:19,451 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748478_7654 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073748478
2025-07-14 08:41:15,139 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748481_7657 src: /192.168.158.1:51938 dest: /192.168.158.4:9866
2025-07-14 08:41:15,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51938, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-565510855_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748481_7657, duration(ns): 24567591
2025-07-14 08:41:15,172 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748481_7657, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-14 08:41:22,456 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748481_7657 replica FinalizedReplica, blk_1073748481_7657, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748481 for deletion
2025-07-14 08:41:22,457 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748481_7657 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748481
2025-07-14 08:42:15,148 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748482_7658 src: /192.168.158.1:54846 dest: /192.168.158.4:9866
2025-07-14 08:42:15,179 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54846, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1209411705_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748482_7658, duration(ns): 22122732
2025-07-14 08:42:15,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748482_7658, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-14 08:42:19,456 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748482_7658 replica FinalizedReplica, blk_1073748482_7658, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748482 for deletion
2025-07-14 08:42:19,457 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748482_7658 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748482
2025-07-14 08:43:15,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748483_7659 src: /192.168.158.5:57244 dest: /192.168.158.4:9866
2025-07-14 08:43:15,162 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57244, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_552109799_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748483_7659, duration(ns): 15371394
2025-07-14 08:43:15,162 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748483_7659, type=LAST_IN_PIPELINE terminating
2025-07-14 08:43:22,456 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748483_7659 replica FinalizedReplica, blk_1073748483_7659, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748483 for deletion
2025-07-14 08:43:22,458 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748483_7659 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748483
2025-07-14 08:44:15,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748484_7660 src: /192.168.158.5:42812 dest: /192.168.158.4:9866
2025-07-14 08:44:15,166 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42812, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_439085157_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748484_7660, duration(ns): 16345736
2025-07-14 08:44:15,166 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748484_7660, type=LAST_IN_PIPELINE terminating
2025-07-14 08:44:19,458 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748484_7660 replica FinalizedReplica, blk_1073748484_7660, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748484 for deletion
2025-07-14 08:44:19,459 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748484_7660 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748484
2025-07-14 08:48:15,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748488_7664 src: /192.168.158.8:58688 dest: /192.168.158.4:9866
2025-07-14 08:48:15,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58688, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-731010392_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748488_7664, duration(ns): 14388549
2025-07-14 08:48:15,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748488_7664, type=LAST_IN_PIPELINE terminating
2025-07-14 08:48:19,467 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748488_7664 replica FinalizedReplica, blk_1073748488_7664, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748488 for deletion
2025-07-14 08:48:19,468 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748488_7664 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748488
2025-07-14 08:50:15,159 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748490_7666 src: /192.168.158.8:41528 dest: /192.168.158.4:9866
2025-07-14 08:50:15,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41528, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1072616248_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748490_7666, duration(ns): 17157998
2025-07-14 08:50:15,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748490_7666, type=LAST_IN_PIPELINE terminating
2025-07-14 08:50:22,468 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748490_7666 replica FinalizedReplica, blk_1073748490_7666, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748490 for deletion
2025-07-14 08:50:22,470 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748490_7666 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748490
2025-07-14 08:52:25,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748492_7668 src: /192.168.158.8:53266 dest: /192.168.158.4:9866
2025-07-14 08:52:25,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53266, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_564615301_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748492_7668, duration(ns): 16681907
2025-07-14 08:52:25,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748492_7668, type=LAST_IN_PIPELINE terminating
2025-07-14 08:52:28,470 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748492_7668 replica FinalizedReplica, blk_1073748492_7668, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748492 for deletion
2025-07-14 08:52:28,471 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748492_7668 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748492
2025-07-14 08:53:25,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748493_7669 src: /192.168.158.1:34120 dest: /192.168.158.4:9866
2025-07-14 08:53:25,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34120, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-145050140_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748493_7669, duration(ns): 21324527
2025-07-14 08:53:25,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748493_7669, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-14 08:53:31,471 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748493_7669 replica FinalizedReplica, blk_1073748493_7669, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748493 for deletion
2025-07-14 08:53:31,473 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748493_7669 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748493
2025-07-14 08:55:30,168 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748495_7671 src: /192.168.158.1:58754 dest: /192.168.158.4:9866
2025-07-14 08:55:30,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58754, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1249323602_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748495_7671, duration(ns): 22889222
2025-07-14 08:55:30,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748495_7671, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-14 08:55:37,475 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748495_7671 replica FinalizedReplica, blk_1073748495_7671, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748495 for deletion
2025-07-14 08:55:37,476 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748495_7671 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748495
2025-07-14 08:59:40,172 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748499_7675 src: /192.168.158.5:55076 dest: /192.168.158.4:9866
2025-07-14 08:59:40,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55076, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-386999115_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748499_7675, duration(ns): 18472731
2025-07-14 08:59:40,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748499_7675, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-14 08:59:43,482 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748499_7675 replica FinalizedReplica, blk_1073748499_7675, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748499 for deletion
2025-07-14 08:59:43,483 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748499_7675 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748499
2025-07-14 09:02:40,175 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748502_7678 src: /192.168.158.1:35852 dest: /192.168.158.4:9866
2025-07-14 09:02:40,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35852, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-217201858_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748502_7678, duration(ns): 26316564
2025-07-14 09:02:40,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748502_7678, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-14 09:02:43,487 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748502_7678 replica FinalizedReplica, blk_1073748502_7678, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748502 for deletion
2025-07-14 09:02:43,488 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748502_7678 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748502
2025-07-14 09:04:45,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748504_7680 src: /192.168.158.9:52736 dest: /192.168.158.4:9866
2025-07-14 09:04:45,204 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52736, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1035574755_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748504_7680, duration(ns): 15570538
2025-07-14 09:04:45,205 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748504_7680, type=LAST_IN_PIPELINE terminating
2025-07-14 09:04:49,490 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748504_7680 replica FinalizedReplica, blk_1073748504_7680, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748504 for deletion
2025-07-14 09:04:49,492 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748504_7680 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748504
2025-07-14 09:08:50,202 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748508_7684 src: /192.168.158.8:42546 dest: /192.168.158.4:9866
2025-07-14 09:08:50,222 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42546, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-43969253_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748508_7684, duration(ns): 17918354
2025-07-14 09:08:50,222 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748508_7684, type=LAST_IN_PIPELINE terminating
2025-07-14 09:08:58,496 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748508_7684 replica FinalizedReplica, blk_1073748508_7684, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748508 for deletion
2025-07-14 09:08:58,498 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748508_7684 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748508
2025-07-14 09:09:50,208 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748509_7685 src: /192.168.158.1:50482 dest: /192.168.158.4:9866
2025-07-14 09:09:50,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50482, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2050828748_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748509_7685, duration(ns): 21764551
2025-07-14 09:09:50,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748509_7685, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-14 09:09:55,499 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748509_7685 replica FinalizedReplica, blk_1073748509_7685, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748509 for deletion
2025-07-14 09:09:55,500 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748509_7685 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748509
2025-07-14 09:11:55,203 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748511_7687 src: /192.168.158.1:52442 dest: /192.168.158.4:9866
2025-07-14 09:11:55,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52442, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_47391712_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748511_7687, duration(ns): 24551403
2025-07-14 09:11:55,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748511_7687, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-14 09:11:58,503 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748511_7687 replica FinalizedReplica, blk_1073748511_7687, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748511 for deletion
2025-07-14 09:11:58,504 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748511_7687 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748511
2025-07-14 09:12:55,202 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748512_7688 src: /192.168.158.5:43306 dest: /192.168.158.4:9866
2025-07-14 09:12:55,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:43306, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-99042174_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748512_7688, duration(ns): 15472170
2025-07-14 09:12:55,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748512_7688, type=LAST_IN_PIPELINE terminating
2025-07-14 09:12:58,505 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748512_7688 replica FinalizedReplica, blk_1073748512_7688, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748512 for deletion
2025-07-14 09:12:58,508 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748512_7688 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748512
2025-07-14 09:13:55,213 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748513_7689 src: /192.168.158.1:57402 dest: /192.168.158.4:9866
2025-07-14 09:13:55,246 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57402, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1403476581_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748513_7689, duration(ns): 23570246
2025-07-14 09:13:55,247 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748513_7689, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-14 09:14:01,506 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748513_7689 replica FinalizedReplica, blk_1073748513_7689, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748513 for deletion
2025-07-14
09:14:01,507 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748513_7689 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748513 2025-07-14 09:15:55,222 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748515_7691 src: /192.168.158.5:58458 dest: /192.168.158.4:9866 2025-07-14 09:15:55,240 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58458, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1038684369_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748515_7691, duration(ns): 16476456 2025-07-14 09:15:55,241 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748515_7691, type=LAST_IN_PIPELINE terminating 2025-07-14 09:16:01,509 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748515_7691 replica FinalizedReplica, blk_1073748515_7691, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748515 for deletion 2025-07-14 09:16:01,510 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748515_7691 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748515 2025-07-14 09:18:00,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748517_7693 src: /192.168.158.7:57968 dest: 
/192.168.158.4:9866 2025-07-14 09:18:00,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57968, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1129590468_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748517_7693, duration(ns): 16111075 2025-07-14 09:18:00,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748517_7693, type=LAST_IN_PIPELINE terminating 2025-07-14 09:18:04,513 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748517_7693 replica FinalizedReplica, blk_1073748517_7693, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748517 for deletion 2025-07-14 09:18:04,514 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748517_7693 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748517 2025-07-14 09:19:00,230 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748518_7694 src: /192.168.158.1:54290 dest: /192.168.158.4:9866 2025-07-14 09:19:00,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54290, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1947630829_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748518_7694, duration(ns): 25835326 2025-07-14 09:19:00,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073748518_7694, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-14 09:19:07,514 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748518_7694 replica FinalizedReplica, blk_1073748518_7694, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748518 for deletion 2025-07-14 09:19:07,515 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748518_7694 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748518 2025-07-14 09:20:00,216 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748519_7695 src: /192.168.158.9:32882 dest: /192.168.158.4:9866 2025-07-14 09:20:00,241 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:32882, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_652539127_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748519_7695, duration(ns): 20178137 2025-07-14 09:20:00,242 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748519_7695, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-14 09:20:04,515 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748519_7695 replica FinalizedReplica, blk_1073748519_7695, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748519 for deletion 2025-07-14 09:20:04,516 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748519_7695 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748519 2025-07-14 09:25:10,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748524_7700 src: /192.168.158.1:48840 dest: /192.168.158.4:9866 2025-07-14 09:25:10,249 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48840, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2055198598_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748524_7700, duration(ns): 21086708 2025-07-14 09:25:10,249 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748524_7700, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-14 09:25:16,527 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748524_7700 replica FinalizedReplica, blk_1073748524_7700, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748524 for deletion 2025-07-14 09:25:16,529 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748524_7700 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748524 2025-07-14 09:27:15,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748526_7702 src: /192.168.158.5:38640 dest: /192.168.158.4:9866 2025-07-14 09:27:15,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38640, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_686053192_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748526_7702, duration(ns): 16960572 2025-07-14 09:27:15,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748526_7702, type=LAST_IN_PIPELINE terminating 2025-07-14 09:27:19,531 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748526_7702 replica FinalizedReplica, blk_1073748526_7702, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748526 for deletion 2025-07-14 09:27:19,533 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748526_7702 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748526 2025-07-14 09:28:15,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748527_7703 src: /192.168.158.8:51974 dest: /192.168.158.4:9866 2025-07-14 09:28:15,259 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51974, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2063612435_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748527_7703, duration(ns): 18003850 2025-07-14 09:28:15,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748527_7703, type=LAST_IN_PIPELINE terminating 2025-07-14 09:28:19,534 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748527_7703 replica FinalizedReplica, blk_1073748527_7703, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748527 for deletion 2025-07-14 09:28:19,535 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748527_7703 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748527 2025-07-14 09:31:20,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748530_7706 src: /192.168.158.6:37348 dest: /192.168.158.4:9866 2025-07-14 09:31:20,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37348, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1000464681_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748530_7706, duration(ns): 20067734 2025-07-14 09:31:20,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748530_7706, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-14 09:31:25,543 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748530_7706 replica FinalizedReplica, blk_1073748530_7706, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748530 for deletion 2025-07-14 09:31:25,544 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748530_7706 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748530 2025-07-14 09:32:20,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748531_7707 src: /192.168.158.9:38914 dest: /192.168.158.4:9866 2025-07-14 09:32:21,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38914, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1904519820_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748531_7707, duration(ns): 20211356 2025-07-14 09:32:21,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748531_7707, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-14 09:32:25,544 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748531_7707 replica FinalizedReplica, blk_1073748531_7707, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748531 for deletion 2025-07-14 09:32:25,546 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748531_7707 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748531 2025-07-14 09:33:25,268 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748532_7708 src: /192.168.158.6:35844 dest: /192.168.158.4:9866 2025-07-14 09:33:25,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35844, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-825615152_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748532_7708, duration(ns): 20341529 2025-07-14 09:33:25,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748532_7708, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-14 09:33:28,545 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748532_7708 replica FinalizedReplica, blk_1073748532_7708, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748532 for deletion 2025-07-14 09:33:28,546 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748532_7708 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748532 2025-07-14 09:35:30,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748534_7710 src: 
/192.168.158.1:40300 dest: /192.168.158.4:9866 2025-07-14 09:35:30,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40300, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_805556746_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748534_7710, duration(ns): 24154224 2025-07-14 09:35:30,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748534_7710, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-14 09:35:34,552 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748534_7710 replica FinalizedReplica, blk_1073748534_7710, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748534 for deletion 2025-07-14 09:35:34,554 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748534_7710 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748534 2025-07-14 09:39:35,247 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748538_7714 src: /192.168.158.5:41950 dest: /192.168.158.4:9866 2025-07-14 09:39:35,270 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41950, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2076294823_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748538_7714, duration(ns): 17434099 
2025-07-14 09:39:35,270 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748538_7714, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-14 09:39:43,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748538_7714 replica FinalizedReplica, blk_1073748538_7714, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748538 for deletion 2025-07-14 09:39:43,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748538_7714 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748538 2025-07-14 09:40:35,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748539_7715 src: /192.168.158.8:33420 dest: /192.168.158.4:9866 2025-07-14 09:40:35,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33420, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_369775388_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748539_7715, duration(ns): 19833240 2025-07-14 09:40:35,268 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748539_7715, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-14 09:40:40,565 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748539_7715 replica FinalizedReplica, blk_1073748539_7715, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748539 for deletion 2025-07-14 09:40:40,566 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748539_7715 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748539 2025-07-14 09:41:40,240 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748540_7716 src: /192.168.158.1:45262 dest: /192.168.158.4:9866 2025-07-14 09:41:40,274 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45262, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2114162474_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748540_7716, duration(ns): 25630148 2025-07-14 09:41:40,275 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748540_7716, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-14 09:41:43,567 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748540_7716 replica FinalizedReplica, blk_1073748540_7716, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748540 for deletion 2025-07-14 09:41:43,568 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748540_7716 
URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748540 2025-07-14 09:42:45,232 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748541_7717 src: /192.168.158.7:37564 dest: /192.168.158.4:9866 2025-07-14 09:42:45,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37564, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_490666524_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748541_7717, duration(ns): 18207133 2025-07-14 09:42:45,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748541_7717, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-14 09:42:52,570 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748541_7717 replica FinalizedReplica, blk_1073748541_7717, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748541 for deletion 2025-07-14 09:42:52,571 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748541_7717 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748541 2025-07-14 09:43:45,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748542_7718 src: /192.168.158.1:47428 dest: /192.168.158.4:9866 2025-07-14 09:43:45,268 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47428, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1976253159_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748542_7718, duration(ns): 23018316 2025-07-14 09:43:45,268 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748542_7718, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-14 09:43:52,570 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748542_7718 replica FinalizedReplica, blk_1073748542_7718, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748542 for deletion 2025-07-14 09:43:52,571 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748542_7718 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748542 2025-07-14 09:46:50,246 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748545_7721 src: /192.168.158.6:46082 dest: /192.168.158.4:9866 2025-07-14 09:46:50,263 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46082, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_254486130_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748545_7721, duration(ns): 15549760 2025-07-14 09:46:50,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748545_7721, 
type=LAST_IN_PIPELINE terminating
2025-07-14 09:46:55,577 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748545_7721 replica FinalizedReplica, blk_1073748545_7721, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748545 for deletion
2025-07-14 09:46:55,578 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748545_7721 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748545
2025-07-14 09:47:50,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748546_7722 src: /192.168.158.9:49422 dest: /192.168.158.4:9866
2025-07-14 09:47:50,263 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49422, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1999640364_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748546_7722, duration(ns): 15713991
2025-07-14 09:47:50,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748546_7722, type=LAST_IN_PIPELINE terminating
2025-07-14 09:47:52,578 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748546_7722 replica FinalizedReplica, blk_1073748546_7722, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748546 for deletion
2025-07-14 09:47:52,580 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748546_7722 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748546
2025-07-14 09:55:10,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748553_7729 src: /192.168.158.9:52238 dest: /192.168.158.4:9866
2025-07-14 09:55:10,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52238, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-126110817_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748553_7729, duration(ns): 17528733
2025-07-14 09:55:10,278 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748553_7729, type=LAST_IN_PIPELINE terminating
2025-07-14 09:55:16,594 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748553_7729 replica FinalizedReplica, blk_1073748553_7729, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748553 for deletion
2025-07-14 09:55:16,595 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748553_7729 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748553
2025-07-14 09:56:10,253 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748554_7730 src: /192.168.158.1:36608 dest: /192.168.158.4:9866
2025-07-14 09:56:10,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36608, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1807071640_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748554_7730, duration(ns): 25437949
2025-07-14 09:56:10,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748554_7730, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-14 09:56:13,595 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748554_7730 replica FinalizedReplica, blk_1073748554_7730, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748554 for deletion
2025-07-14 09:56:13,597 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748554_7730 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748554
2025-07-14 09:57:10,253 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748555_7731 src: /192.168.158.1:41014 dest: /192.168.158.4:9866
2025-07-14 09:57:10,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41014, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-145928688_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748555_7731, duration(ns): 23438123
2025-07-14 09:57:10,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748555_7731, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-14 09:57:16,599 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748555_7731 replica FinalizedReplica, blk_1073748555_7731, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748555 for deletion
2025-07-14 09:57:16,601 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748555_7731 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748555
2025-07-14 09:59:15,259 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748557_7733 src: /192.168.158.1:35052 dest: /192.168.158.4:9866
2025-07-14 09:59:15,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35052, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1631125235_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748557_7733, duration(ns): 22488389
2025-07-14 09:59:15,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748557_7733, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-14 09:59:22,607 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748557_7733 replica FinalizedReplica, blk_1073748557_7733, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748557 for deletion
2025-07-14 09:59:22,608 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748557_7733 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748557
2025-07-14 10:02:25,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748560_7736 src: /192.168.158.8:41616 dest: /192.168.158.4:9866
2025-07-14 10:02:25,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41616, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-412721550_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748560_7736, duration(ns): 15685196
2025-07-14 10:02:25,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748560_7736, type=LAST_IN_PIPELINE terminating
2025-07-14 10:02:31,615 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748560_7736 replica FinalizedReplica, blk_1073748560_7736, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748560 for deletion
2025-07-14 10:02:31,616 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748560_7736 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748560
2025-07-14 10:04:25,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748562_7738 src: /192.168.158.8:44494 dest: /192.168.158.4:9866
2025-07-14 10:04:25,300 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44494, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_115922450_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748562_7738, duration(ns): 18607638
2025-07-14 10:04:25,301 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748562_7738, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-14 10:04:28,624 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748562_7738 replica FinalizedReplica, blk_1073748562_7738, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748562 for deletion
2025-07-14 10:04:28,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748562_7738 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748562
2025-07-14 10:05:30,274 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748563_7739 src: /192.168.158.8:58548 dest: /192.168.158.4:9866
2025-07-14 10:05:30,298 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58548, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1822366420_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748563_7739, duration(ns): 18746295
2025-07-14 10:05:30,298 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748563_7739, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-14 10:05:37,626 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748563_7739 replica FinalizedReplica, blk_1073748563_7739, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748563 for deletion
2025-07-14 10:05:37,627 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748563_7739 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748563
2025-07-14 10:06:35,279 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748564_7740 src: /192.168.158.9:33634 dest: /192.168.158.4:9866
2025-07-14 10:06:35,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33634, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-133521982_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748564_7740, duration(ns): 13540836
2025-07-14 10:06:35,295 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748564_7740, type=LAST_IN_PIPELINE terminating
2025-07-14 10:06:37,627 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748564_7740 replica FinalizedReplica, blk_1073748564_7740, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748564 for deletion
2025-07-14 10:06:37,629 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748564_7740 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748564
2025-07-14 10:08:40,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748566_7742 src: /192.168.158.1:40920 dest: /192.168.158.4:9866
2025-07-14 10:08:40,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40920, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_552802774_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748566_7742, duration(ns): 23027135
2025-07-14 10:08:40,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748566_7742, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-14 10:08:43,638 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748566_7742 replica FinalizedReplica, blk_1073748566_7742, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748566 for deletion
2025-07-14 10:08:43,639 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748566_7742 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748566
2025-07-14 10:09:40,279 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748567_7743 src: /192.168.158.7:41446 dest: /192.168.158.4:9866
2025-07-14 10:09:40,298 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41446, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1127471386_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748567_7743, duration(ns): 17080624
2025-07-14 10:09:40,299 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748567_7743, type=LAST_IN_PIPELINE terminating
2025-07-14 10:09:43,639 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748567_7743 replica FinalizedReplica, blk_1073748567_7743, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748567 for deletion
2025-07-14 10:09:43,641 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748567_7743 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748567
2025-07-14 10:11:45,276 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748569_7745 src: /192.168.158.1:43432 dest: /192.168.158.4:9866
2025-07-14 10:11:45,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43432, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-818306296_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748569_7745, duration(ns): 23618234
2025-07-14 10:11:45,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748569_7745, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-14 10:11:52,643 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748569_7745 replica FinalizedReplica, blk_1073748569_7745, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748569 for deletion
2025-07-14 10:11:52,644 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748569_7745 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748569
2025-07-14 10:12:45,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748570_7746 src: /192.168.158.1:34652 dest: /192.168.158.4:9866
2025-07-14 10:12:45,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34652, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_736989590_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748570_7746, duration(ns): 25967896
2025-07-14 10:12:45,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748570_7746, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-14 10:12:52,646 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748570_7746 replica FinalizedReplica, blk_1073748570_7746, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748570 for deletion
2025-07-14 10:12:52,647 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748570_7746 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748570
2025-07-14 10:14:45,280 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748572_7748 src: /192.168.158.6:33432 dest: /192.168.158.4:9866
2025-07-14 10:14:45,307 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33432, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1286781401_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748572_7748, duration(ns): 21619051
2025-07-14 10:14:45,307 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748572_7748, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-14 10:14:49,648 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748572_7748 replica FinalizedReplica, blk_1073748572_7748, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748572 for deletion
2025-07-14 10:14:49,650 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748572_7748 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748572
2025-07-14 10:15:45,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748573_7749 src: /192.168.158.7:45416 dest: /192.168.158.4:9866
2025-07-14 10:15:45,307 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45416, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-444612540_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748573_7749, duration(ns): 19827889
2025-07-14 10:15:45,307 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748573_7749, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-14 10:15:49,649 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748573_7749 replica FinalizedReplica, blk_1073748573_7749, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748573 for deletion
2025-07-14 10:15:49,651 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748573_7749 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748573
2025-07-14 10:16:45,317 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748574_7750 src: /192.168.158.6:45490 dest: /192.168.158.4:9866
2025-07-14 10:16:45,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45490, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_819143526_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748574_7750, duration(ns): 16467072
2025-07-14 10:16:45,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748574_7750, type=LAST_IN_PIPELINE terminating
2025-07-14 10:16:52,655 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748574_7750 replica FinalizedReplica, blk_1073748574_7750, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748574 for deletion
2025-07-14 10:16:52,656 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748574_7750 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748574
2025-07-14 10:19:50,289 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748577_7753 src: /192.168.158.7:37818 dest: /192.168.158.4:9866
2025-07-14 10:19:50,314 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37818, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1995211732_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748577_7753, duration(ns): 19968389
2025-07-14 10:19:50,314 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748577_7753, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-14 10:19:52,660 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748577_7753 replica FinalizedReplica, blk_1073748577_7753, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748577 for deletion
2025-07-14 10:19:52,661 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748577_7753 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748577
2025-07-14 10:22:55,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748580_7756 src: /192.168.158.7:54466 dest: /192.168.158.4:9866
2025-07-14 10:22:55,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54466, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-426693016_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748580_7756, duration(ns): 15657723
2025-07-14 10:22:55,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748580_7756, type=LAST_IN_PIPELINE terminating
2025-07-14 10:22:58,664 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748580_7756 replica FinalizedReplica, blk_1073748580_7756, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748580 for deletion
2025-07-14 10:22:58,665 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748580_7756 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748580
2025-07-14 10:25:55,307 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748583_7759 src: /192.168.158.6:38194 dest: /192.168.158.4:9866
2025-07-14 10:25:55,325 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38194, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1514074896_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748583_7759, duration(ns): 16221150
2025-07-14 10:25:55,326 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748583_7759, type=LAST_IN_PIPELINE terminating
2025-07-14 10:25:58,673 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748583_7759 replica FinalizedReplica, blk_1073748583_7759, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748583 for deletion
2025-07-14 10:25:58,674 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748583_7759 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748583
2025-07-14 10:28:00,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748585_7761 src: /192.168.158.6:44624 dest: /192.168.158.4:9866
2025-07-14 10:28:00,330 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44624, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_836135345_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748585_7761, duration(ns): 20039189
2025-07-14 10:28:00,331 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748585_7761, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 10:28:04,677 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748585_7761 replica FinalizedReplica, blk_1073748585_7761, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748585 for deletion
2025-07-14 10:28:04,678 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748585_7761 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748585
2025-07-14 10:29:00,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748586_7762 src: /192.168.158.5:55364 dest: /192.168.158.4:9866
2025-07-14 10:29:00,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55364, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1000190154_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748586_7762, duration(ns): 21021496
2025-07-14 10:29:00,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748586_7762, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-14 10:29:04,678 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748586_7762 replica FinalizedReplica, blk_1073748586_7762, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748586 for deletion
2025-07-14 10:29:04,679 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748586_7762 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748586
2025-07-14 10:31:05,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748588_7764 src: /192.168.158.5:49144 dest: /192.168.158.4:9866
2025-07-14 10:31:05,335 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49144, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1327628706_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748588_7764, duration(ns): 19428828
2025-07-14 10:31:05,335 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748588_7764, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-14 10:31:07,682 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748588_7764 replica FinalizedReplica, blk_1073748588_7764, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748588 for deletion
2025-07-14 10:31:07,683 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748588_7764 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748588
2025-07-14 10:33:10,326 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748590_7766 src: /192.168.158.7:55200 dest: /192.168.158.4:9866
2025-07-14 10:33:10,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55200, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1031268765_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748590_7766, duration(ns): 17336386
2025-07-14 10:33:10,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748590_7766, type=LAST_IN_PIPELINE terminating
2025-07-14 10:33:13,687 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748590_7766 replica FinalizedReplica, blk_1073748590_7766, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748590 for deletion
2025-07-14 10:33:13,689 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748590_7766 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748590
2025-07-14 10:35:10,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748592_7768 src: /192.168.158.1:38296 dest: /192.168.158.4:9866
2025-07-14 10:35:10,380 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38296, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_563239203_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748592_7768, duration(ns): 25603502
2025-07-14 10:35:10,380 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748592_7768, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-14 10:35:16,692 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748592_7768 replica FinalizedReplica, blk_1073748592_7768, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748592 for deletion
2025-07-14 10:35:16,693 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748592_7768 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748592
2025-07-14 10:36:10,323 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748593_7769 src: /192.168.158.7:57124 dest: /192.168.158.4:9866
2025-07-14 10:36:10,347 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57124, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1113070707_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748593_7769, duration(ns): 18134849
2025-07-14 10:36:10,347 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748593_7769, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 10:36:13,695 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748593_7769 replica FinalizedReplica, blk_1073748593_7769, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748593 for deletion
2025-07-14 10:36:13,697 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748593_7769 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748593
2025-07-14 10:38:10,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748595_7771 src: /192.168.158.1:35180 dest: /192.168.158.4:9866
2025-07-14 10:38:10,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35180, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1464337796_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748595_7771, duration(ns): 24522275
2025-07-14 10:38:10,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748595_7771, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-14 10:38:13,701 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748595_7771 replica FinalizedReplica, blk_1073748595_7771, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748595 for deletion
2025-07-14 10:38:13,702 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748595_7771 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748595
2025-07-14 10:39:10,323 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748596_7772 src: /192.168.158.8:43188 dest: /192.168.158.4:9866
2025-07-14 10:39:10,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43188, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1499207916_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748596_7772, duration(ns): 25592127
2025-07-14 10:39:10,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748596_7772, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-14 10:39:16,703 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748596_7772 replica FinalizedReplica, blk_1073748596_7772, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748596 for deletion 2025-07-14 10:39:16,705 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748596_7772 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748596 2025-07-14 10:42:10,328 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748599_7775 src: /192.168.158.8:50832 dest: /192.168.158.4:9866 2025-07-14 10:42:10,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50832, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_741064263_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748599_7775, duration(ns): 18188853 2025-07-14 10:42:10,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748599_7775, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-14 10:42:13,708 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748599_7775 replica FinalizedReplica, blk_1073748599_7775, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748599 for deletion 2025-07-14 10:42:13,709 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748599_7775 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748599 2025-07-14 10:45:10,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748602_7778 src: /192.168.158.8:32860 dest: /192.168.158.4:9866 2025-07-14 10:45:10,343 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:32860, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1369594680_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748602_7778, duration(ns): 16888194 2025-07-14 10:45:10,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748602_7778, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-14 10:45:16,714 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748602_7778 replica FinalizedReplica, blk_1073748602_7778, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748602 for deletion 2025-07-14 10:45:16,715 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748602_7778 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748602 2025-07-14 10:46:10,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748603_7779 src: /192.168.158.1:60086 dest: /192.168.158.4:9866 2025-07-14 10:46:10,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60086, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_866699057_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748603_7779, duration(ns): 21553402 2025-07-14 10:46:10,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748603_7779, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-14 10:46:16,716 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748603_7779 replica FinalizedReplica, blk_1073748603_7779, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748603 for deletion 2025-07-14 10:46:16,718 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748603_7779 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748603 2025-07-14 10:47:10,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748604_7780 src: /192.168.158.6:34752 dest: /192.168.158.4:9866 2025-07-14 10:47:10,350 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.6:34752, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1205427931_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748604_7780, duration(ns): 18650206 2025-07-14 10:47:10,350 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748604_7780, type=LAST_IN_PIPELINE terminating 2025-07-14 10:47:13,718 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748604_7780 replica FinalizedReplica, blk_1073748604_7780, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748604 for deletion 2025-07-14 10:47:13,719 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748604_7780 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748604 2025-07-14 10:48:10,320 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748605_7781 src: /192.168.158.1:38384 dest: /192.168.158.4:9866 2025-07-14 10:48:10,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38384, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1522764792_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748605_7781, duration(ns): 22985451 2025-07-14 10:48:10,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748605_7781, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-14 10:48:13,719 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748605_7781 replica FinalizedReplica, blk_1073748605_7781, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748605 for deletion 2025-07-14 10:48:13,721 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748605_7781 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748605 2025-07-14 10:49:10,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748606_7782 src: /192.168.158.6:56202 dest: /192.168.158.4:9866 2025-07-14 10:49:10,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56202, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1081718327_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748606_7782, duration(ns): 18810441 2025-07-14 10:49:10,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748606_7782, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-14 10:49:13,720 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748606_7782 replica FinalizedReplica, blk_1073748606_7782, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748606 for deletion 2025-07-14 10:49:13,722 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748606_7782 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748606 2025-07-14 10:50:10,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748607_7783 src: /192.168.158.1:34484 dest: /192.168.158.4:9866 2025-07-14 10:50:10,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34484, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-696530935_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748607_7783, duration(ns): 20781848 2025-07-14 10:50:10,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748607_7783, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-14 10:50:16,723 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748607_7783 replica FinalizedReplica, blk_1073748607_7783, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748607 for deletion 2025-07-14 10:50:16,724 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748607_7783 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748607 2025-07-14 10:51:10,332 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748608_7784 src: /192.168.158.5:37554 dest: /192.168.158.4:9866 2025-07-14 10:51:10,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37554, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-827053091_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748608_7784, duration(ns): 19421566 2025-07-14 10:51:10,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748608_7784, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-14 10:51:16,724 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748608_7784 replica FinalizedReplica, blk_1073748608_7784, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748608 for deletion 2025-07-14 10:51:16,726 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748608_7784 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748608 2025-07-14 10:52:10,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748609_7785 src: /192.168.158.5:47988 dest: /192.168.158.4:9866 2025-07-14 10:52:10,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47988, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-371241400_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748609_7785, duration(ns): 18903923 2025-07-14 10:52:10,361 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748609_7785, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-14 10:52:16,726 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748609_7785 replica FinalizedReplica, blk_1073748609_7785, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748609 for deletion 2025-07-14 10:52:16,728 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748609_7785 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748609 2025-07-14 10:53:10,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748610_7786 src: /192.168.158.5:45708 dest: /192.168.158.4:9866 2025-07-14 10:53:10,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45708, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1923239116_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748610_7786, duration(ns): 17863719 2025-07-14 10:53:10,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748610_7786, type=LAST_IN_PIPELINE terminating 
2025-07-14 10:53:16,727 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748610_7786 replica FinalizedReplica, blk_1073748610_7786, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748610 for deletion 2025-07-14 10:53:16,728 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748610_7786 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748610 2025-07-14 11:02:20,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748619_7795 src: /192.168.158.5:45088 dest: /192.168.158.4:9866 2025-07-14 11:02:20,369 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45088, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1894394102_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748619_7795, duration(ns): 15820049 2025-07-14 11:02:20,369 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748619_7795, type=LAST_IN_PIPELINE terminating 2025-07-14 11:02:22,745 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748619_7795 replica FinalizedReplica, blk_1073748619_7795, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748619 for deletion 2025-07-14 11:02:22,746 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748619_7795 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748619 2025-07-14 11:03:20,347 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748620_7796 src: /192.168.158.1:60144 dest: /192.168.158.4:9866 2025-07-14 11:03:20,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60144, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1332615653_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748620_7796, duration(ns): 21771674 2025-07-14 11:03:20,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748620_7796, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-14 11:03:22,747 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748620_7796 replica FinalizedReplica, blk_1073748620_7796, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748620 for deletion 2025-07-14 11:03:22,748 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748620_7796 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748620 2025-07-14 11:05:25,347 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073748622_7798 src: /192.168.158.1:58938 dest: /192.168.158.4:9866 2025-07-14 11:05:25,377 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58938, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1060557791_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748622_7798, duration(ns): 21359638 2025-07-14 11:05:25,377 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748622_7798, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-14 11:05:28,751 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748622_7798 replica FinalizedReplica, blk_1073748622_7798, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748622 for deletion 2025-07-14 11:05:28,752 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748622_7798 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748622 2025-07-14 11:06:25,357 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748623_7799 src: /192.168.158.7:51836 dest: /192.168.158.4:9866 2025-07-14 11:06:25,377 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51836, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_570613585_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073748623_7799, duration(ns): 17143349 2025-07-14 11:06:25,377 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748623_7799, type=LAST_IN_PIPELINE terminating 2025-07-14 11:06:28,751 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748623_7799 replica FinalizedReplica, blk_1073748623_7799, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748623 for deletion 2025-07-14 11:06:28,753 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748623_7799 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748623 2025-07-14 11:08:25,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748625_7801 src: /192.168.158.6:50202 dest: /192.168.158.4:9866 2025-07-14 11:08:25,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50202, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1386983545_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748625_7801, duration(ns): 18575372 2025-07-14 11:08:25,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748625_7801, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-14 11:08:28,756 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748625_7801 replica FinalizedReplica, 
blk_1073748625_7801, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748625 for deletion 2025-07-14 11:08:28,757 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748625_7801 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748625 2025-07-14 11:09:25,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748626_7802 src: /192.168.158.7:42898 dest: /192.168.158.4:9866 2025-07-14 11:09:25,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_36156415_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748626_7802, duration(ns): 15955635 2025-07-14 11:09:25,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748626_7802, type=LAST_IN_PIPELINE terminating 2025-07-14 11:09:31,758 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748626_7802 replica FinalizedReplica, blk_1073748626_7802, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748626 for deletion 2025-07-14 11:09:31,759 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748626_7802 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748626
2025-07-14 11:10:30,357 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748627_7803 src: /192.168.158.1:38716 dest: /192.168.158.4:9866
2025-07-14 11:10:30,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38716, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1235221472_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748627_7803, duration(ns): 23096945
2025-07-14 11:10:30,389 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748627_7803, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-14 11:10:34,761 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748627_7803 replica FinalizedReplica, blk_1073748627_7803, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748627 for deletion
2025-07-14 11:10:34,762 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748627_7803 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748627
2025-07-14 11:11:30,361 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748628_7804 src: /192.168.158.9:35950 dest: /192.168.158.4:9866
2025-07-14 11:11:30,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35950, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-321033299_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748628_7804, duration(ns): 20716117
2025-07-14 11:11:30,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748628_7804, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 11:11:34,763 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748628_7804 replica FinalizedReplica, blk_1073748628_7804, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748628 for deletion
2025-07-14 11:11:34,764 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748628_7804 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748628
2025-07-14 11:14:30,367 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748631_7807 src: /192.168.158.1:58516 dest: /192.168.158.4:9866
2025-07-14 11:14:30,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58516, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-89888201_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748631_7807, duration(ns): 24352739
2025-07-14 11:14:30,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748631_7807, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-14 11:14:37,768 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748631_7807 replica FinalizedReplica, blk_1073748631_7807, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748631 for deletion
2025-07-14 11:14:37,770 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748631_7807 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748631
2025-07-14 11:15:30,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748632_7808 src: /192.168.158.8:52998 dest: /192.168.158.4:9866
2025-07-14 11:15:30,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52998, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1823176990_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748632_7808, duration(ns): 21357631
2025-07-14 11:15:30,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748632_7808, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 11:15:37,768 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748632_7808 replica FinalizedReplica, blk_1073748632_7808, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748632 for deletion
2025-07-14 11:15:37,769 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748632_7808 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748632
2025-07-14 11:17:35,369 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748634_7810 src: /192.168.158.6:33548 dest: /192.168.158.4:9866
2025-07-14 11:17:35,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33548, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-13501579_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748634_7810, duration(ns): 20738726
2025-07-14 11:17:35,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748634_7810, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-14 11:17:40,772 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748634_7810 replica FinalizedReplica, blk_1073748634_7810, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748634 for deletion
2025-07-14 11:17:40,773 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748634_7810 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748634
2025-07-14 11:19:35,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748636_7812 src: /192.168.158.8:54878 dest: /192.168.158.4:9866
2025-07-14 11:19:35,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54878, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_474172119_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748636_7812, duration(ns): 17327134
2025-07-14 11:19:35,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748636_7812, type=LAST_IN_PIPELINE terminating
2025-07-14 11:19:37,777 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748636_7812 replica FinalizedReplica, blk_1073748636_7812, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748636 for deletion
2025-07-14 11:19:37,778 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748636_7812 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748636
2025-07-14 11:23:40,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748640_7816 src: /192.168.158.6:35514 dest: /192.168.158.4:9866
2025-07-14 11:23:40,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35514, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1360749852_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748640_7816, duration(ns): 19159162
2025-07-14 11:23:40,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748640_7816, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-14 11:23:43,785 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748640_7816 replica FinalizedReplica, blk_1073748640_7816, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748640 for deletion
2025-07-14 11:23:43,786 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748640_7816 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748640
2025-07-14 11:25:45,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748642_7818 src: /192.168.158.6:56138 dest: /192.168.158.4:9866
2025-07-14 11:25:45,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56138, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2053422240_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748642_7818, duration(ns): 18140299
2025-07-14 11:25:45,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748642_7818, type=LAST_IN_PIPELINE terminating
2025-07-14 11:25:49,789 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748642_7818 replica FinalizedReplica, blk_1073748642_7818, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748642 for deletion
2025-07-14 11:25:49,790 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748642_7818 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748642
2025-07-14 11:30:45,389 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748647_7823 src: /192.168.158.8:56114 dest: /192.168.158.4:9866
2025-07-14 11:30:45,407 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56114, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-169416073_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748647_7823, duration(ns): 15936990
2025-07-14 11:30:45,407 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748647_7823, type=LAST_IN_PIPELINE terminating
2025-07-14 11:30:49,800 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748647_7823 replica FinalizedReplica, blk_1073748647_7823, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748647 for deletion
2025-07-14 11:30:49,801 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748647_7823 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748647
2025-07-14 11:31:45,392 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748648_7824 src: /192.168.158.7:40306 dest: /192.168.158.4:9866
2025-07-14 11:31:45,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40306, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1753398918_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748648_7824, duration(ns): 15854523
2025-07-14 11:31:45,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748648_7824, type=LAST_IN_PIPELINE terminating
2025-07-14 11:31:52,802 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748648_7824 replica FinalizedReplica, blk_1073748648_7824, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748648 for deletion
2025-07-14 11:31:52,803 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748648_7824 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748648
2025-07-14 11:34:45,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748651_7827 src: /192.168.158.1:42540 dest: /192.168.158.4:9866
2025-07-14 11:34:45,421 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42540, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2141506872_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748651_7827, duration(ns): 24108252
2025-07-14 11:34:45,421 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748651_7827, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-14 11:34:49,809 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748651_7827 replica FinalizedReplica, blk_1073748651_7827, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748651 for deletion
2025-07-14 11:34:49,810 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748651_7827 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748651
2025-07-14 11:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-14 11:36:45,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748653_7829 src: /192.168.158.1:42972 dest: /192.168.158.4:9866
2025-07-14 11:36:45,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42972, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-877632299_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748653_7829, duration(ns): 23245396
2025-07-14 11:36:45,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748653_7829, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-14 11:36:49,810 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748653_7829 replica FinalizedReplica, blk_1073748653_7829, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748653 for deletion
2025-07-14 11:36:49,812 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748653_7829 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748653
2025-07-14 11:37:19,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f38, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-14 11:37:19,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-14 11:40:50,419 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748657_7833 src: /192.168.158.8:53946 dest: /192.168.158.4:9866
2025-07-14 11:40:50,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53946, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1088794917_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748657_7833, duration(ns): 17470083
2025-07-14 11:40:50,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748657_7833, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-14 11:40:52,819 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748657_7833 replica FinalizedReplica, blk_1073748657_7833, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748657 for deletion
2025-07-14 11:40:52,820 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748657_7833 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748657
2025-07-14 11:41:55,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748658_7834 src: /192.168.158.1:58580 dest: /192.168.158.4:9866
2025-07-14 11:41:55,446 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58580, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_60277241_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748658_7834, duration(ns): 24244341
2025-07-14 11:41:55,447 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748658_7834, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-14 11:41:58,821 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748658_7834 replica FinalizedReplica, blk_1073748658_7834, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748658 for deletion
2025-07-14 11:41:58,822 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748658_7834 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748658
2025-07-14 11:42:55,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748659_7835 src: /192.168.158.1:45820 dest: /192.168.158.4:9866
2025-07-14 11:42:55,447 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45820, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_414810226_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748659_7835, duration(ns): 25296309
2025-07-14 11:42:55,447 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748659_7835, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-14 11:42:58,820 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748659_7835 replica FinalizedReplica, blk_1073748659_7835, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748659 for deletion
2025-07-14 11:42:58,821 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748659_7835 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748659
2025-07-14 11:43:55,417 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748660_7836 src: /192.168.158.1:37338 dest: /192.168.158.4:9866
2025-07-14 11:43:55,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37338, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1402222426_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748660_7836, duration(ns): 22591534
2025-07-14 11:43:55,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748660_7836, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-14 11:43:58,823 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748660_7836 replica FinalizedReplica, blk_1073748660_7836, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748660 for deletion
2025-07-14 11:43:58,824 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748660_7836 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748660
2025-07-14 11:45:55,425 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748662_7838 src: /192.168.158.7:51960 dest: /192.168.158.4:9866
2025-07-14 11:45:55,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51960, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-452617798_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748662_7838, duration(ns): 19654821
2025-07-14 11:45:55,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748662_7838, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 11:46:01,828 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748662_7838 replica FinalizedReplica, blk_1073748662_7838, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748662 for deletion
2025-07-14 11:46:01,829 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748662_7838 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748662
2025-07-14 11:47:00,424 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748663_7839 src: /192.168.158.7:53386 dest: /192.168.158.4:9866
2025-07-14 11:47:00,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53386, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1111627792_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748663_7839, duration(ns): 16869258
2025-07-14 11:47:00,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748663_7839, type=LAST_IN_PIPELINE terminating
2025-07-14 11:47:04,830 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748663_7839 replica FinalizedReplica, blk_1073748663_7839, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748663 for deletion
2025-07-14 11:47:04,832 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748663_7839 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748663
2025-07-14 11:48:00,431 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748664_7840 src: /192.168.158.7:46852 dest: /192.168.158.4:9866
2025-07-14 11:48:00,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46852, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1582565817_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748664_7840, duration(ns): 16650306
2025-07-14 11:48:00,450 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748664_7840, type=LAST_IN_PIPELINE terminating
2025-07-14 11:48:04,832 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748664_7840 replica FinalizedReplica, blk_1073748664_7840, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748664 for deletion
2025-07-14 11:48:04,834 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748664_7840 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748664
2025-07-14 11:50:00,440 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748666_7842 src: /192.168.158.8:52148 dest: /192.168.158.4:9866
2025-07-14 11:50:00,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52148, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-266201799_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748666_7842, duration(ns): 21056028
2025-07-14 11:50:00,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748666_7842, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-14 11:50:07,837 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748666_7842 replica FinalizedReplica, blk_1073748666_7842, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748666 for deletion
2025-07-14 11:50:07,838 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748666_7842 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748666
2025-07-14 11:52:05,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748668_7844 src: /192.168.158.5:55100 dest: /192.168.158.4:9866
2025-07-14 11:52:05,471 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55100, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_130315025_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748668_7844, duration(ns): 20299035
2025-07-14 11:52:05,471 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748668_7844, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-14 11:52:07,844 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748668_7844 replica FinalizedReplica, blk_1073748668_7844, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748668 for deletion
2025-07-14 11:52:07,845 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748668_7844 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748668
2025-07-14 11:53:05,430 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748669_7845 src: /192.168.158.8:42398 dest: /192.168.158.4:9866
2025-07-14 11:53:05,455 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42398, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1954816606_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748669_7845, duration(ns): 19404073
2025-07-14 11:53:05,456 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748669_7845, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 11:53:10,847 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748669_7845 replica FinalizedReplica, blk_1073748669_7845, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748669 for deletion
2025-07-14 11:53:10,849 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748669_7845 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748669
2025-07-14 11:59:15,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748675_7851 src: /192.168.158.8:35524 dest: /192.168.158.4:9866
2025-07-14 11:59:15,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35524, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1719146762_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748675_7851, duration(ns): 15346216
2025-07-14 11:59:15,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748675_7851, type=LAST_IN_PIPELINE terminating
2025-07-14 11:59:22,859 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748675_7851 replica FinalizedReplica, blk_1073748675_7851, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748675 for deletion
2025-07-14 11:59:22,860 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748675_7851 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748675
2025-07-14 12:03:15,446 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748679_7855 src: /192.168.158.5:42600 dest: /192.168.158.4:9866
2025-07-14 12:03:15,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42600, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2057376364_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748679_7855, duration(ns): 18738930
2025-07-14 12:03:15,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748679_7855, type=LAST_IN_PIPELINE terminating
2025-07-14 12:03:19,867 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748679_7855 replica FinalizedReplica, blk_1073748679_7855, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748679 for deletion
2025-07-14 12:03:19,868 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748679_7855 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748679
2025-07-14 12:04:15,474 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748680_7856 src: /192.168.158.7:44354 dest: /192.168.158.4:9866
2025-07-14 12:04:15,492 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44354, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1956979457_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748680_7856, duration(ns): 16657234
2025-07-14 12:04:15,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748680_7856, type=LAST_IN_PIPELINE terminating
2025-07-14 12:04:19,870 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748680_7856 replica FinalizedReplica, blk_1073748680_7856, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748680 for deletion
2025-07-14 12:04:19,871 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748680_7856 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748680
2025-07-14 12:09:20,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748685_7861 src: /192.168.158.9:33630 dest: /192.168.158.4:9866
2025-07-14 12:09:20,487 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33630, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1564541091_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748685_7861, duration(ns): 18836690
2025-07-14 12:09:20,487 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748685_7861, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 12:09:22,879 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748685_7861 replica FinalizedReplica, blk_1073748685_7861, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748685 for deletion
2025-07-14 12:09:22,880 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748685_7861 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748685
2025-07-14 12:15:30,459 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748691_7867 src: /192.168.158.9:41510 dest: /192.168.158.4:9866
2025-07-14 12:15:30,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41510, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-721775821_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748691_7867, duration(ns): 18110405
2025-07-14 12:15:30,484 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748691_7867, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 12:15:37,886 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748691_7867 replica FinalizedReplica, blk_1073748691_7867, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748691 for deletion
2025-07-14 12:15:37,888 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748691_7867 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748691
2025-07-14 12:17:35,471 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748693_7869 src: /192.168.158.9:59046 dest: /192.168.158.4:9866
2025-07-14 12:17:35,490 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59046, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1640322277_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748693_7869, duration(ns): 17123447
2025-07-14 12:17:35,490 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748693_7869, type=LAST_IN_PIPELINE terminating
2025-07-14 12:17:37,894 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748693_7869 replica FinalizedReplica, blk_1073748693_7869, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748693 for deletion
2025-07-14 12:17:37,895 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748693_7869 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748693
2025-07-14 12:18:35,457 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748694_7870 src: /192.168.158.1:47882 dest: /192.168.158.4:9866
2025-07-14 12:18:35,490 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47882, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2065444743_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748694_7870, duration(ns): 24403407
2025-07-14 12:18:35,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748694_7870, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-14 12:18:40,896 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748694_7870 replica FinalizedReplica, blk_1073748694_7870, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748694 for deletion
2025-07-14 12:18:40,897 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748694_7870 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748694
2025-07-14 12:19:35,459 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748695_7871 src: /192.168.158.1:49950 dest: /192.168.158.4:9866
2025-07-14 12:19:35,489 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49950, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-319461717_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748695_7871, duration(ns): 22065783
2025-07-14 12:19:35,489 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748695_7871, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-14 12:19:37,899 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748695_7871 replica FinalizedReplica, blk_1073748695_7871, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748695 for deletion
2025-07-14 12:19:37,901 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748695_7871 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748695
2025-07-14 12:22:40,475 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748698_7874 src: /192.168.158.5:50158 dest: /192.168.158.4:9866
2025-07-14 12:22:40,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50158, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1823318478_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748698_7874, duration(ns): 19254075
2025-07-14 12:22:40,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748698_7874, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-14 12:22:43,908 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748698_7874 replica FinalizedReplica, blk_1073748698_7874, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748698 for deletion
2025-07-14 12:22:43,909 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748698_7874 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748698
2025-07-14 12:24:40,475 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748700_7876 src: /192.168.158.1:59038 dest: /192.168.158.4:9866
2025-07-14 12:24:40,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59038, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_645265320_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748700_7876, duration(ns): 21608422
2025-07-14 12:24:40,506 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748700_7876, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-14 12:24:46,914 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748700_7876 replica FinalizedReplica, blk_1073748700_7876, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748700 for deletion
2025-07-14 12:24:46,915 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748700_7876 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748700
2025-07-14 12:25:45,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748701_7877 src: /192.168.158.7:54264 dest: /192.168.158.4:9866
2025-07-14 12:25:45,488 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54264, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1158122384_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748701_7877, duration(ns): 16290897
2025-07-14 12:25:45,489 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748701_7877, type=LAST_IN_PIPELINE terminating
2025-07-14 12:25:52,916 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748701_7877 replica FinalizedReplica, blk_1073748701_7877, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748701 for deletion
2025-07-14 12:25:52,917 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748701_7877 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748701
2025-07-14 12:26:45,477 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748702_7878 src: /192.168.158.9:41944 dest: /192.168.158.4:9866
2025-07-14 12:26:45,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41944, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-578579790_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748702_7878, duration(ns): 15312837
2025-07-14 12:26:45,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748702_7878, type=LAST_IN_PIPELINE terminating
2025-07-14 12:26:49,916 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748702_7878 replica FinalizedReplica, blk_1073748702_7878, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748702 for deletion
2025-07-14 12:26:49,917 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748702_7878 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748702
2025-07-14 12:30:55,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748706_7882 src: /192.168.158.5:42072 dest: /192.168.158.4:9866
2025-07-14 12:30:55,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42072, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-470191119_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748706_7882, duration(ns): 15324896
2025-07-14 12:30:55,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748706_7882, type=LAST_IN_PIPELINE terminating
2025-07-14 12:30:58,925 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748706_7882 replica FinalizedReplica, blk_1073748706_7882, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748706 for deletion
2025-07-14 12:30:58,926 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748706_7882 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748706
2025-07-14 12:31:55,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748707_7883 src: /192.168.158.1:36138 dest: /192.168.158.4:9866
2025-07-14 12:31:55,528 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36138, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1803961114_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748707_7883, duration(ns): 21644705
2025-07-14 12:31:55,528 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748707_7883, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-14 12:31:58,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748707_7883 replica FinalizedReplica, blk_1073748707_7883, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748707 for deletion
2025-07-14 12:31:58,929 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748707_7883 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748707
2025-07-14 12:33:00,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748708_7884 src: /192.168.158.1:51554 dest: /192.168.158.4:9866
2025-07-14 12:33:00,513 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51554, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1127611738_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748708_7884, duration(ns): 21465715
2025-07-14 12:33:00,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748708_7884, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-14 12:33:04,932 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748708_7884 replica FinalizedReplica, blk_1073748708_7884, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748708 for deletion
2025-07-14 12:33:04,933 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748708_7884 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748708
2025-07-14 12:34:00,511 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748709_7885 src: /192.168.158.1:33100 dest: /192.168.158.4:9866
2025-07-14 12:34:00,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33100, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_54591587_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748709_7885, duration(ns): 25486134
2025-07-14 12:34:00,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748709_7885, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-14 12:34:04,932 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748709_7885 replica FinalizedReplica, blk_1073748709_7885, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748709 for deletion
2025-07-14 12:34:04,933 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748709_7885 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748709
2025-07-14 12:35:00,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748710_7886 src: /192.168.158.8:46470 dest: /192.168.158.4:9866
2025-07-14 12:35:00,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46470, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1271657839_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748710_7886, duration(ns): 17525469
2025-07-14 12:35:00,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748710_7886, type=LAST_IN_PIPELINE terminating
2025-07-14 12:35:07,932 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748710_7886 replica FinalizedReplica, blk_1073748710_7886, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748710 for deletion
2025-07-14 12:35:07,933 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748710_7886 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748710
2025-07-14 12:36:00,499 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748711_7887 src: /192.168.158.1:50642 dest: /192.168.158.4:9866
2025-07-14 12:36:00,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50642, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1162065667_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748711_7887, duration(ns): 24288045
2025-07-14 12:36:00,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748711_7887, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-14 12:36:07,931 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748711_7887 replica FinalizedReplica, blk_1073748711_7887, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748711 for deletion
2025-07-14 12:36:07,932 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748711_7887 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748711
2025-07-14 12:37:05,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748712_7888 src: /192.168.158.6:49366 dest: /192.168.158.4:9866
2025-07-14 12:37:05,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49366, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_123702464_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748712_7888, duration(ns): 21350779
2025-07-14 12:37:05,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748712_7888, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-14 12:37:07,934 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748712_7888 replica FinalizedReplica, blk_1073748712_7888, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748712 for deletion
2025-07-14 12:37:07,935 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748712_7888 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748712
2025-07-14 12:39:05,536 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748714_7890 src: /192.168.158.7:46084 dest: /192.168.158.4:9866
2025-07-14 12:39:05,554 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46084, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1229429491_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748714_7890, duration(ns): 15661670
2025-07-14 12:39:05,554 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748714_7890, type=LAST_IN_PIPELINE terminating
2025-07-14 12:39:10,936 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748714_7890 replica FinalizedReplica, blk_1073748714_7890, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748714 for deletion
2025-07-14 12:39:10,937 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748714_7890 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748714
2025-07-14 12:40:05,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748715_7891 src: /192.168.158.9:60308 dest: /192.168.158.4:9866
2025-07-14 12:40:05,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60308, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2042229004_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748715_7891, duration(ns): 17711916
2025-07-14 12:40:05,538 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748715_7891, type=LAST_IN_PIPELINE terminating
2025-07-14 12:40:10,935 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748715_7891 replica FinalizedReplica, blk_1073748715_7891, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748715 for deletion
2025-07-14 12:40:10,937 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748715_7891 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748715
2025-07-14 12:42:05,539 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748717_7893 src: /192.168.158.1:37408 dest: /192.168.158.4:9866
2025-07-14 12:42:05,570 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37408, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1925509684_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748717_7893, duration(ns): 22167141
2025-07-14 12:42:05,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748717_7893, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-14 12:42:10,937 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748717_7893 replica FinalizedReplica, blk_1073748717_7893, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748717 for deletion
2025-07-14 12:42:10,938 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748717_7893 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748717
2025-07-14 12:44:05,515 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748719_7895 src: /192.168.158.5:33126 dest: /192.168.158.4:9866
2025-07-14 12:44:05,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33126, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-183600271_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748719_7895, duration(ns): 21466823
2025-07-14 12:44:05,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748719_7895, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-14 12:44:07,939 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748719_7895 replica FinalizedReplica, blk_1073748719_7895, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748719 for deletion
2025-07-14 12:44:07,940 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748719_7895 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748719
2025-07-14 12:45:10,511 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748720_7896 src: /192.168.158.1:35370 dest: /192.168.158.4:9866
2025-07-14 12:45:10,540 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35370, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1201990269_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748720_7896, duration(ns): 21723142
2025-07-14 12:45:10,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748720_7896, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-14 12:45:16,942 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748720_7896 replica FinalizedReplica, blk_1073748720_7896, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748720 for deletion
2025-07-14 12:45:16,943 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748720_7896 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748720
2025-07-14 12:48:10,509 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748723_7899 src: /192.168.158.9:46028 dest: /192.168.158.4:9866
2025-07-14 12:48:10,531 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46028, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1869060222_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748723_7899, duration(ns): 17399326
2025-07-14 12:48:10,531 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748723_7899, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-14 12:48:13,948 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748723_7899 replica FinalizedReplica, blk_1073748723_7899, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748723 for deletion
2025-07-14 12:48:13,949 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748723_7899 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748723
2025-07-14 12:51:15,527 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748726_7902 src: /192.168.158.1:37208 dest: /192.168.158.4:9866
2025-07-14 12:51:15,558 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37208, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1529240559_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748726_7902, duration(ns): 23170787
2025-07-14 12:51:15,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748726_7902, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-14 12:51:19,955 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748726_7902 replica FinalizedReplica, blk_1073748726_7902, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748726 for deletion 2025-07-14 12:51:19,956 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748726_7902 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748726 2025-07-14 12:53:15,534 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748728_7904 src: /192.168.158.1:60134 dest: /192.168.158.4:9866 2025-07-14 12:53:15,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60134, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-405187287_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748728_7904, duration(ns): 21319186 2025-07-14 12:53:15,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748728_7904, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-14 12:53:22,958 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748728_7904 replica FinalizedReplica, blk_1073748728_7904, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748728 for deletion 2025-07-14 12:53:22,959 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748728_7904 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748728 2025-07-14 12:55:20,536 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748730_7906 src: /192.168.158.7:56708 dest: /192.168.158.4:9866 2025-07-14 12:55:20,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56708, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-191022136_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748730_7906, duration(ns): 21350481 2025-07-14 12:55:20,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748730_7906, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-14 12:55:22,964 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748730_7906 replica FinalizedReplica, blk_1073748730_7906, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748730 for deletion 2025-07-14 12:55:22,965 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748730_7906 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748730 2025-07-14 12:56:20,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748731_7907 src: /192.168.158.1:59112 dest: /192.168.158.4:9866 2025-07-14 12:56:20,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59112, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1404572311_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748731_7907, duration(ns): 21431644 2025-07-14 12:56:20,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748731_7907, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-14 12:56:25,964 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748731_7907 replica FinalizedReplica, blk_1073748731_7907, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748731 for deletion 2025-07-14 12:56:25,966 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748731_7907 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073748731 2025-07-14 13:01:30,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748736_7912 src: /192.168.158.8:52252 dest: /192.168.158.4:9866 2025-07-14 13:01:30,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52252, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1853285321_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748736_7912, duration(ns): 16043296 2025-07-14 13:01:30,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748736_7912, 
type=LAST_IN_PIPELINE terminating 2025-07-14 13:01:37,974 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748736_7912 replica FinalizedReplica, blk_1073748736_7912, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748736 for deletion 2025-07-14 13:01:37,975 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748736_7912 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748736 2025-07-14 13:02:30,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748737_7913 src: /192.168.158.5:53182 dest: /192.168.158.4:9866 2025-07-14 13:02:30,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53182, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_909632462_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748737_7913, duration(ns): 15664446 2025-07-14 13:02:30,584 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748737_7913, type=LAST_IN_PIPELINE terminating 2025-07-14 13:02:34,975 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748737_7913 replica FinalizedReplica, blk_1073748737_7913, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748737 for deletion 2025-07-14 
13:02:34,976 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748737_7913 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748737 2025-07-14 13:03:30,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748738_7914 src: /192.168.158.6:59166 dest: /192.168.158.4:9866 2025-07-14 13:03:30,587 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59166, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1064564160_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748738_7914, duration(ns): 19509697 2025-07-14 13:03:30,587 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748738_7914, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-14 13:03:37,980 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748738_7914 replica FinalizedReplica, blk_1073748738_7914, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748738 for deletion 2025-07-14 13:03:37,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748738_7914 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748738 2025-07-14 13:06:35,557 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748741_7917 
src: /192.168.158.1:47084 dest: /192.168.158.4:9866 2025-07-14 13:06:35,586 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47084, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1796316483_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748741_7917, duration(ns): 20487903 2025-07-14 13:06:35,586 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748741_7917, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-14 13:06:40,989 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748741_7917 replica FinalizedReplica, blk_1073748741_7917, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748741 for deletion 2025-07-14 13:06:40,990 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748741_7917 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748741 2025-07-14 13:07:35,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748742_7918 src: /192.168.158.8:37304 dest: /192.168.158.4:9866 2025-07-14 13:07:35,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37304, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1714764665_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748742_7918, duration(ns): 15610266 
2025-07-14 13:07:35,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748742_7918, type=LAST_IN_PIPELINE terminating 2025-07-14 13:07:40,992 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748742_7918 replica FinalizedReplica, blk_1073748742_7918, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748742 for deletion 2025-07-14 13:07:40,993 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748742_7918 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748742 2025-07-14 13:09:35,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748744_7920 src: /192.168.158.5:60548 dest: /192.168.158.4:9866 2025-07-14 13:09:35,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60548, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-673616497_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748744_7920, duration(ns): 17063270 2025-07-14 13:09:35,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748744_7920, type=LAST_IN_PIPELINE terminating 2025-07-14 13:09:37,997 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748744_7920 replica FinalizedReplica, blk_1073748744_7920, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn 
getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748744 for deletion 2025-07-14 13:09:37,998 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748744_7920 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748744 2025-07-14 13:12:40,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748747_7923 src: /192.168.158.9:51974 dest: /192.168.158.4:9866 2025-07-14 13:12:40,603 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51974, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-691274795_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748747_7923, duration(ns): 19509766 2025-07-14 13:12:40,604 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748747_7923, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-14 13:12:47,000 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748747_7923 replica FinalizedReplica, blk_1073748747_7923, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748747 for deletion 2025-07-14 13:12:47,001 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748747_7923 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748747 2025-07-14 13:15:50,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748750_7926 src: /192.168.158.8:56232 dest: /192.168.158.4:9866 2025-07-14 13:15:50,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56232, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-43043723_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748750_7926, duration(ns): 16257647 2025-07-14 13:15:50,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748750_7926, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-14 13:15:53,006 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748750_7926 replica FinalizedReplica, blk_1073748750_7926, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748750 for deletion 2025-07-14 13:15:53,007 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748750_7926 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748750 2025-07-14 13:16:50,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748751_7927 src: /192.168.158.8:44760 dest: /192.168.158.4:9866 2025-07-14 13:16:50,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44760, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1571223819_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748751_7927, duration(ns): 15801574 2025-07-14 13:16:50,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748751_7927, type=LAST_IN_PIPELINE terminating 2025-07-14 13:16:53,006 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748751_7927 replica FinalizedReplica, blk_1073748751_7927, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748751 for deletion 2025-07-14 13:16:53,007 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748751_7927 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748751 2025-07-14 13:18:55,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748753_7929 src: /192.168.158.9:39994 dest: /192.168.158.4:9866 2025-07-14 13:18:55,599 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39994, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1593188316_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748753_7929, duration(ns): 16275478 2025-07-14 13:18:55,599 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748753_7929, type=LAST_IN_PIPELINE terminating 2025-07-14 13:18:59,009 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748753_7929 replica FinalizedReplica, blk_1073748753_7929, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748753 for deletion 2025-07-14 13:18:59,011 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748753_7929 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748753 2025-07-14 13:19:55,586 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748754_7930 src: /192.168.158.8:34054 dest: /192.168.158.4:9866 2025-07-14 13:19:55,620 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34054, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2092991793_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748754_7930, duration(ns): 31503895 2025-07-14 13:19:55,620 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748754_7930, type=LAST_IN_PIPELINE terminating 2025-07-14 13:19:59,012 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748754_7930 replica FinalizedReplica, blk_1073748754_7930, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748754 for deletion 2025-07-14 13:19:59,013 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748754_7930 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748754 2025-07-14 13:21:55,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748756_7932 src: /192.168.158.9:53698 dest: /192.168.158.4:9866 2025-07-14 13:21:55,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53698, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_405616211_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748756_7932, duration(ns): 15144658 2025-07-14 13:21:55,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748756_7932, type=LAST_IN_PIPELINE terminating 2025-07-14 13:21:59,018 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748756_7932 replica FinalizedReplica, blk_1073748756_7932, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748756 for deletion 2025-07-14 13:21:59,019 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748756_7932 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748756 2025-07-14 13:26:00,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748760_7936 src: /192.168.158.5:42896 dest: /192.168.158.4:9866 2025-07-14 
13:26:00,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42896, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_674918299_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748760_7936, duration(ns): 17183900 2025-07-14 13:26:00,609 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748760_7936, type=LAST_IN_PIPELINE terminating 2025-07-14 13:26:05,030 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748760_7936 replica FinalizedReplica, blk_1073748760_7936, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748760 for deletion 2025-07-14 13:26:05,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748760_7936 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748760 2025-07-14 13:27:00,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748761_7937 src: /192.168.158.1:50214 dest: /192.168.158.4:9866 2025-07-14 13:27:00,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50214, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2117324140_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748761_7937, duration(ns): 23291857 2025-07-14 13:27:00,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073748761_7937, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-14 13:27:08,032 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748761_7937 replica FinalizedReplica, blk_1073748761_7937, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748761 for deletion
2025-07-14 13:27:08,033 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748761_7937 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748761
2025-07-14 13:28:05,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748762_7938 src: /192.168.158.9:32816 dest: /192.168.158.4:9866
2025-07-14 13:28:05,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:32816, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1532960095_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748762_7938, duration(ns): 17109427
2025-07-14 13:28:05,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748762_7938, type=LAST_IN_PIPELINE terminating
2025-07-14 13:28:11,033 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748762_7938 replica FinalizedReplica, blk_1073748762_7938, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748762 for deletion
2025-07-14 13:28:11,034 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748762_7938 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748762
2025-07-14 13:29:10,576 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748763_7939 src: /192.168.158.1:52914 dest: /192.168.158.4:9866
2025-07-14 13:29:10,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52914, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-886683203_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748763_7939, duration(ns): 22999230
2025-07-14 13:29:10,609 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748763_7939, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-14 13:29:14,035 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748763_7939 replica FinalizedReplica, blk_1073748763_7939, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748763 for deletion
2025-07-14 13:29:14,036 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748763_7939 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748763
2025-07-14 13:31:10,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748765_7941 src: /192.168.158.1:37566 dest: /192.168.158.4:9866
2025-07-14 13:31:10,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37566, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-300441159_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748765_7941, duration(ns): 23909722
2025-07-14 13:31:10,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748765_7941, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-14 13:31:14,040 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748765_7941 replica FinalizedReplica, blk_1073748765_7941, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748765 for deletion
2025-07-14 13:31:14,041 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748765_7941 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748765
2025-07-14 13:33:10,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748767_7943 src: /192.168.158.6:49016 dest: /192.168.158.4:9866
2025-07-14 13:33:10,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49016, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2029490047_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748767_7943, duration(ns): 17217316
2025-07-14 13:33:10,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748767_7943, type=LAST_IN_PIPELINE terminating
2025-07-14 13:33:14,044 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748767_7943 replica FinalizedReplica, blk_1073748767_7943, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748767 for deletion
2025-07-14 13:33:14,045 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748767_7943 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748767
2025-07-14 13:35:15,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748769_7945 src: /192.168.158.6:56214 dest: /192.168.158.4:9866
2025-07-14 13:35:15,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56214, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1903774984_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748769_7945, duration(ns): 19189705
2025-07-14 13:35:15,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748769_7945, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-14 13:35:20,051 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748769_7945 replica FinalizedReplica, blk_1073748769_7945, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748769 for deletion
2025-07-14 13:35:20,052 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748769_7945 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748769
2025-07-14 13:36:15,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748770_7946 src: /192.168.158.5:52898 dest: /192.168.158.4:9866
2025-07-14 13:36:15,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-625364938_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748770_7946, duration(ns): 18504018
2025-07-14 13:36:15,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748770_7946, type=LAST_IN_PIPELINE terminating
2025-07-14 13:36:20,055 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748770_7946 replica FinalizedReplica, blk_1073748770_7946, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748770 for deletion
2025-07-14 13:36:20,056 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748770_7946 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748770
2025-07-14 13:40:20,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748774_7950 src: /192.168.158.6:55132 dest: /192.168.158.4:9866
2025-07-14 13:40:20,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55132, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_921089054_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748774_7950, duration(ns): 20896400
2025-07-14 13:40:20,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748774_7950, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-14 13:40:23,064 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748774_7950 replica FinalizedReplica, blk_1073748774_7950, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748774 for deletion
2025-07-14 13:40:23,065 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748774_7950 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748774
2025-07-14 13:43:25,619 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748777_7953 src: /192.168.158.5:34240 dest: /192.168.158.4:9866
2025-07-14 13:43:25,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34240, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1555837759_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748777_7953, duration(ns): 14709445
2025-07-14 13:43:25,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748777_7953, type=LAST_IN_PIPELINE terminating
2025-07-14 13:43:32,073 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748777_7953 replica FinalizedReplica, blk_1073748777_7953, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748777 for deletion
2025-07-14 13:43:32,074 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748777_7953 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748777
2025-07-14 13:46:25,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748780_7956 src: /192.168.158.8:41588 dest: /192.168.158.4:9866
2025-07-14 13:46:25,640 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41588, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1277999631_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748780_7956, duration(ns): 21456339
2025-07-14 13:46:25,641 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748780_7956, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-14 13:46:29,078 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748780_7956 replica FinalizedReplica, blk_1073748780_7956, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748780 for deletion
2025-07-14 13:46:29,079 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748780_7956 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748780
2025-07-14 13:47:25,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748781_7957 src: /192.168.158.5:49798 dest: /192.168.158.4:9866
2025-07-14 13:47:25,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49798, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1550872992_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748781_7957, duration(ns): 19841905
2025-07-14 13:47:25,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748781_7957, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-14 13:47:29,079 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748781_7957 replica FinalizedReplica, blk_1073748781_7957, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748781 for deletion
2025-07-14 13:47:29,083 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748781_7957 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748781
2025-07-14 13:49:35,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748783_7959 src: /192.168.158.7:37604 dest: /192.168.158.4:9866
2025-07-14 13:49:35,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37604, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1723376656_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748783_7959, duration(ns): 18514415
2025-07-14 13:49:35,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748783_7959, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-14 13:49:38,085 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748783_7959 replica FinalizedReplica, blk_1073748783_7959, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748783 for deletion
2025-07-14 13:49:38,086 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748783_7959 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748783
2025-07-14 13:50:40,604 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748784_7960 src: /192.168.158.1:50148 dest: /192.168.158.4:9866
2025-07-14 13:50:40,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50148, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2052499781_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748784_7960, duration(ns): 21538976
2025-07-14 13:50:40,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748784_7960, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-14 13:50:44,089 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748784_7960 replica FinalizedReplica, blk_1073748784_7960, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748784 for deletion
2025-07-14 13:50:44,090 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748784_7960 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748784
2025-07-14 13:51:40,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748785_7961 src: /192.168.158.1:38712 dest: /192.168.158.4:9866
2025-07-14 13:51:40,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1194067883_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748785_7961, duration(ns): 22510921
2025-07-14 13:51:40,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748785_7961, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-14 13:51:44,091 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748785_7961 replica FinalizedReplica, blk_1073748785_7961, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748785 for deletion
2025-07-14 13:51:44,092 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748785_7961 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748785
2025-07-14 13:54:45,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748788_7964 src: /192.168.158.6:37494 dest: /192.168.158.4:9866
2025-07-14 13:54:45,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37494, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1296947171_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748788_7964, duration(ns): 19014654
2025-07-14 13:54:45,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748788_7964, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 13:54:50,097 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748788_7964 replica FinalizedReplica, blk_1073748788_7964, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748788 for deletion
2025-07-14 13:54:50,098 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748788_7964 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748788
2025-07-14 13:56:50,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748790_7966 src: /192.168.158.1:39168 dest: /192.168.158.4:9866
2025-07-14 13:56:50,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39168, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_722290801_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748790_7966, duration(ns): 25956289
2025-07-14 13:56:50,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748790_7966, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-14 13:56:53,101 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748790_7966 replica FinalizedReplica, blk_1073748790_7966, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748790 for deletion
2025-07-14 13:56:53,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748790_7966 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748790
2025-07-14 13:58:50,623 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748792_7968 src: /192.168.158.9:42504 dest: /192.168.158.4:9866
2025-07-14 13:58:50,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42504, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1903229289_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748792_7968, duration(ns): 19384297
2025-07-14 13:58:50,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748792_7968, type=LAST_IN_PIPELINE terminating
2025-07-14 13:58:53,104 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748792_7968 replica FinalizedReplica, blk_1073748792_7968, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748792 for deletion
2025-07-14 13:58:53,106 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748792_7968 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748792
2025-07-14 13:59:55,624 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748793_7969 src: /192.168.158.1:47656 dest: /192.168.158.4:9866
2025-07-14 13:59:55,656 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47656, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_43649018_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748793_7969, duration(ns): 22622181
2025-07-14 13:59:55,656 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748793_7969, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-14 13:59:59,108 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748793_7969 replica FinalizedReplica, blk_1073748793_7969, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748793 for deletion
2025-07-14 13:59:59,109 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748793_7969 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748793
2025-07-14 14:00:55,617 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748794_7970 src: /192.168.158.9:34288 dest: /192.168.158.4:9866
2025-07-14 14:00:55,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34288, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-634235864_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748794_7970, duration(ns): 20575569
2025-07-14 14:00:55,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748794_7970, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 14:00:59,110 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748794_7970 replica FinalizedReplica, blk_1073748794_7970, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748794 for deletion
2025-07-14 14:00:59,111 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748794_7970 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748794
2025-07-14 14:01:55,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748795_7971 src: /192.168.158.8:45432 dest: /192.168.158.4:9866
2025-07-14 14:01:55,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45432, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1646347144_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748795_7971, duration(ns): 16073516
2025-07-14 14:01:55,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748795_7971, type=LAST_IN_PIPELINE terminating
2025-07-14 14:02:02,115 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748795_7971 replica FinalizedReplica, blk_1073748795_7971, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748795 for deletion
2025-07-14 14:02:02,116 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748795_7971 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748795
2025-07-14 14:04:05,620 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748797_7973 src: /192.168.158.7:45478 dest: /192.168.158.4:9866
2025-07-14 14:04:05,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45478, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-755454289_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748797_7973, duration(ns): 15652427
2025-07-14 14:04:05,638 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748797_7973, type=LAST_IN_PIPELINE terminating
2025-07-14 14:04:11,123 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748797_7973 replica FinalizedReplica, blk_1073748797_7973, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748797 for deletion
2025-07-14 14:04:11,125 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748797_7973 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748797
2025-07-14 14:05:05,617 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748798_7974 src: /192.168.158.1:42538 dest: /192.168.158.4:9866
2025-07-14 14:05:05,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42538, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1235595449_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748798_7974, duration(ns): 23611334
2025-07-14 14:05:05,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748798_7974, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-14 14:05:11,124 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748798_7974 replica FinalizedReplica, blk_1073748798_7974, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748798 for deletion
2025-07-14 14:05:11,126 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748798_7974 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748798
2025-07-14 14:06:05,628 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748799_7975 src: /192.168.158.7:34962 dest: /192.168.158.4:9866
2025-07-14 14:06:05,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34962, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-341226826_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748799_7975, duration(ns): 17705537
2025-07-14 14:06:05,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748799_7975, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-14 14:06:08,128 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748799_7975 replica FinalizedReplica, blk_1073748799_7975, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748799 for deletion
2025-07-14 14:06:08,129 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748799_7975 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748799
2025-07-14 14:12:20,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748805_7981 src: /192.168.158.1:47752 dest: /192.168.158.4:9866
2025-07-14 14:12:20,656 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47752, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1764132800_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748805_7981, duration(ns): 21803383
2025-07-14 14:12:20,657 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748805_7981, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-14 14:12:23,145 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748805_7981 replica FinalizedReplica, blk_1073748805_7981, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748805 for deletion
2025-07-14 14:12:23,146 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748805_7981 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748805
2025-07-14 14:14:20,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748807_7983 src: /192.168.158.6:34610 dest: /192.168.158.4:9866
2025-07-14 14:14:20,624 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34610, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-233602259_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748807_7983, duration(ns): 14401344
2025-07-14 14:14:20,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748807_7983, type=LAST_IN_PIPELINE terminating
2025-07-14 14:14:23,150 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748807_7983 replica FinalizedReplica, blk_1073748807_7983, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748807 for deletion
2025-07-14 14:14:23,151 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748807_7983 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748807
2025-07-14 14:15:20,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748808_7984 src: /192.168.158.1:33840 dest: /192.168.158.4:9866
2025-07-14 14:15:20,663 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33840, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1418994458_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748808_7984, duration(ns): 22327035
2025-07-14 14:15:20,663 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748808_7984, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-14 14:15:23,152 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748808_7984 replica FinalizedReplica, blk_1073748808_7984, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748808 for deletion
2025-07-14 14:15:23,153 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748808_7984 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748808
2025-07-14 14:17:20,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748810_7986 src: /192.168.158.1:59090 dest: /192.168.158.4:9866
2025-07-14 14:17:20,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59090, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1425450758_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748810_7986, duration(ns): 23575801
2025-07-14 14:17:20,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748810_7986, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-14 14:17:23,159 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748810_7986 replica FinalizedReplica, blk_1073748810_7986, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748810 for deletion
2025-07-14 14:17:23,160 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748810_7986 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748810
2025-07-14 14:20:20,641 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748813_7989 src: /192.168.158.1:52178 dest: /192.168.158.4:9866
2025-07-14 14:20:20,675 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:52178, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1565079102_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748813_7989, duration(ns): 25099876 2025-07-14 14:20:20,675 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748813_7989, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-14 14:20:26,167 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748813_7989 replica FinalizedReplica, blk_1073748813_7989, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748813 for deletion 2025-07-14 14:20:26,169 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748813_7989 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748813 2025-07-14 14:21:25,642 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748814_7990 src: /192.168.158.9:56018 dest: /192.168.158.4:9866 2025-07-14 14:21:25,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56018, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1100966010_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748814_7990, duration(ns): 18107603 2025-07-14 14:21:25,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073748814_7990, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-14 14:21:29,173 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748814_7990 replica FinalizedReplica, blk_1073748814_7990, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748814 for deletion 2025-07-14 14:21:29,174 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748814_7990 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748814 2025-07-14 14:24:35,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748817_7993 src: /192.168.158.9:56944 dest: /192.168.158.4:9866 2025-07-14 14:24:35,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56944, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-13516968_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748817_7993, duration(ns): 19815408 2025-07-14 14:24:35,678 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748817_7993, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-14 14:24:38,177 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748817_7993 replica FinalizedReplica, blk_1073748817_7993, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn 
getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748817 for deletion 2025-07-14 14:24:38,179 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748817_7993 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748817 2025-07-14 14:28:40,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748821_7997 src: /192.168.158.6:38062 dest: /192.168.158.4:9866 2025-07-14 14:28:40,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38062, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_501440501_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748821_7997, duration(ns): 18539824 2025-07-14 14:28:40,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748821_7997, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-14 14:28:47,189 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748821_7997 replica FinalizedReplica, blk_1073748821_7997, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748821 for deletion 2025-07-14 14:28:47,190 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748821_7997 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748821 2025-07-14 14:29:45,654 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748822_7998 src: /192.168.158.1:48440 dest: /192.168.158.4:9866 2025-07-14 14:29:45,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48440, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2030198978_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748822_7998, duration(ns): 21990544 2025-07-14 14:29:45,685 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748822_7998, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-14 14:29:50,191 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748822_7998 replica FinalizedReplica, blk_1073748822_7998, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748822 for deletion 2025-07-14 14:29:50,192 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748822_7998 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748822 2025-07-14 14:30:45,657 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748823_7999 src: /192.168.158.7:42780 dest: /192.168.158.4:9866 2025-07-14 14:30:45,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.7:42780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1259699547_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748823_7999, duration(ns): 15431195 2025-07-14 14:30:45,675 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748823_7999, type=LAST_IN_PIPELINE terminating 2025-07-14 14:30:50,192 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748823_7999 replica FinalizedReplica, blk_1073748823_7999, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748823 for deletion 2025-07-14 14:30:50,194 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748823_7999 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748823 2025-07-14 14:34:45,656 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748827_8003 src: /192.168.158.1:59022 dest: /192.168.158.4:9866 2025-07-14 14:34:45,688 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59022, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1341805408_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748827_8003, duration(ns): 22814836 2025-07-14 14:34:45,688 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748827_8003, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-14 14:34:50,204 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748827_8003 replica FinalizedReplica, blk_1073748827_8003, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748827 for deletion 2025-07-14 14:34:50,206 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748827_8003 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748827 2025-07-14 14:35:45,663 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748828_8004 src: /192.168.158.1:49518 dest: /192.168.158.4:9866 2025-07-14 14:35:45,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49518, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-786373603_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748828_8004, duration(ns): 23176250 2025-07-14 14:35:45,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748828_8004, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-14 14:35:53,206 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748828_8004 replica FinalizedReplica, blk_1073748828_8004, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748828 for deletion 2025-07-14 14:35:53,208 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748828_8004 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748828 2025-07-14 14:38:50,671 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748831_8007 src: /192.168.158.8:55856 dest: /192.168.158.4:9866 2025-07-14 14:38:50,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55856, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1592726806_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748831_8007, duration(ns): 19854770 2025-07-14 14:38:50,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748831_8007, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-14 14:38:53,214 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748831_8007 replica FinalizedReplica, blk_1073748831_8007, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748831 for deletion 2025-07-14 14:38:53,215 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748831_8007 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748831 
2025-07-14 14:39:50,676 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748832_8008 src: /192.168.158.1:53298 dest: /192.168.158.4:9866 2025-07-14 14:39:50,710 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53298, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-387549672_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748832_8008, duration(ns): 25321947 2025-07-14 14:39:50,710 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748832_8008, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-14 14:39:53,219 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748832_8008 replica FinalizedReplica, blk_1073748832_8008, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748832 for deletion 2025-07-14 14:39:53,220 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748832_8008 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748832 2025-07-14 14:41:50,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748834_8010 src: /192.168.158.6:32804 dest: /192.168.158.4:9866 2025-07-14 14:41:50,739 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:32804, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1677231098_106, 
offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748834_8010, duration(ns): 17947088 2025-07-14 14:41:50,739 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748834_8010, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-14 14:41:53,222 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748834_8010 replica FinalizedReplica, blk_1073748834_8010, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748834 for deletion 2025-07-14 14:41:53,223 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748834_8010 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748834 2025-07-14 14:44:55,685 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748837_8013 src: /192.168.158.9:57876 dest: /192.168.158.4:9866 2025-07-14 14:44:55,711 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57876, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1675408438_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748837_8013, duration(ns): 20202100 2025-07-14 14:44:55,711 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748837_8013, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-14 14:44:59,229 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748837_8013 replica FinalizedReplica, blk_1073748837_8013, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748837 for deletion 2025-07-14 14:44:59,231 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748837_8013 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748837 2025-07-14 14:46:55,685 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748839_8015 src: /192.168.158.8:41116 dest: /192.168.158.4:9866 2025-07-14 14:46:55,703 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41116, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1981147111_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748839_8015, duration(ns): 16453043 2025-07-14 14:46:55,704 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748839_8015, type=LAST_IN_PIPELINE terminating 2025-07-14 14:46:59,231 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748839_8015 replica FinalizedReplica, blk_1073748839_8015, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748839 for deletion 2025-07-14 14:46:59,232 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748839_8015 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748839 2025-07-14 14:48:55,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748841_8017 src: /192.168.158.1:38250 dest: /192.168.158.4:9866 2025-07-14 14:48:55,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38250, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-653801235_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748841_8017, duration(ns): 21762945 2025-07-14 14:48:55,710 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748841_8017, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-14 14:48:59,239 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748841_8017 replica FinalizedReplica, blk_1073748841_8017, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748841 for deletion 2025-07-14 14:48:59,240 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748841_8017 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748841 2025-07-14 14:51:55,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073748844_8020 src: /192.168.158.9:39188 dest: /192.168.158.4:9866 2025-07-14 14:51:55,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39188, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1237623837_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748844_8020, duration(ns): 21859062 2025-07-14 14:51:55,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748844_8020, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-14 14:52:02,244 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748844_8020 replica FinalizedReplica, blk_1073748844_8020, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748844 for deletion 2025-07-14 14:52:02,245 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748844_8020 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748844 2025-07-14 14:53:00,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748845_8021 src: /192.168.158.1:59528 dest: /192.168.158.4:9866 2025-07-14 14:53:00,730 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59528, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1747010397_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073748845_8021, duration(ns): 24405347 2025-07-14 14:53:00,730 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748845_8021, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-14 14:53:08,244 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748845_8021 replica FinalizedReplica, blk_1073748845_8021, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748845 for deletion 2025-07-14 14:53:08,245 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748845_8021 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748845 2025-07-14 14:59:05,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748851_8027 src: /192.168.158.7:38182 dest: /192.168.158.4:9866 2025-07-14 14:59:05,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38182, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-942583830_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748851_8027, duration(ns): 15174662 2025-07-14 14:59:05,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748851_8027, type=LAST_IN_PIPELINE terminating 2025-07-14 14:59:08,249 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748851_8027 
replica FinalizedReplica, blk_1073748851_8027, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748851 for deletion
2025-07-14 14:59:08,250 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748851_8027 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748851
2025-07-14 15:03:05,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748855_8031 src: /192.168.158.5:42630 dest: /192.168.158.4:9866
2025-07-14 15:03:05,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42630, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-689965413_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748855_8031, duration(ns): 19119601
2025-07-14 15:03:05,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748855_8031, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-14 15:03:11,253 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748855_8031 replica FinalizedReplica, blk_1073748855_8031, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748855 for deletion
2025-07-14 15:03:11,255 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748855_8031 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748855
2025-07-14 15:06:10,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748858_8034 src: /192.168.158.9:58408 dest: /192.168.158.4:9866
2025-07-14 15:06:10,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58408, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1178725540_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748858_8034, duration(ns): 19372492
2025-07-14 15:06:10,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748858_8034, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-14 15:06:17,256 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748858_8034 replica FinalizedReplica, blk_1073748858_8034, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748858 for deletion
2025-07-14 15:06:17,257 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748858_8034 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748858
2025-07-14 15:07:15,755 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748859_8035 src: /192.168.158.1:40970 dest: /192.168.158.4:9866
2025-07-14 15:07:15,786 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40970, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1572904672_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748859_8035, duration(ns): 22259784
2025-07-14 15:07:15,786 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748859_8035, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-14 15:07:23,257 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748859_8035 replica FinalizedReplica, blk_1073748859_8035, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748859 for deletion
2025-07-14 15:07:23,258 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748859_8035 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748859
2025-07-14 15:08:15,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748860_8036 src: /192.168.158.6:55620 dest: /192.168.158.4:9866
2025-07-14 15:08:15,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55620, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1640632551_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748860_8036, duration(ns): 17864918
2025-07-14 15:08:15,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748860_8036, type=LAST_IN_PIPELINE terminating
2025-07-14 15:08:23,258 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748860_8036 replica FinalizedReplica, blk_1073748860_8036, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748860 for deletion
2025-07-14 15:08:23,260 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748860_8036 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748860
2025-07-14 15:09:15,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748861_8037 src: /192.168.158.5:40468 dest: /192.168.158.4:9866
2025-07-14 15:09:15,746 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40468, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1422817045_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748861_8037, duration(ns): 19271899
2025-07-14 15:09:15,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748861_8037, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-14 15:09:20,258 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748861_8037 replica FinalizedReplica, blk_1073748861_8037, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748861 for deletion
2025-07-14 15:09:20,259 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748861_8037 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748861
2025-07-14 15:10:20,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748862_8038 src: /192.168.158.1:50044 dest: /192.168.158.4:9866
2025-07-14 15:10:20,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50044, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1321002536_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748862_8038, duration(ns): 25270607
2025-07-14 15:10:20,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748862_8038, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-14 15:10:23,260 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748862_8038 replica FinalizedReplica, blk_1073748862_8038, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748862 for deletion
2025-07-14 15:10:23,261 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748862_8038 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748862
2025-07-14 15:11:25,736 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748863_8039 src: /192.168.158.9:51056 dest: /192.168.158.4:9866
2025-07-14 15:11:25,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51056, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_599319898_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748863_8039, duration(ns): 15766026
2025-07-14 15:11:25,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748863_8039, type=LAST_IN_PIPELINE terminating
2025-07-14 15:11:29,259 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748863_8039 replica FinalizedReplica, blk_1073748863_8039, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748863 for deletion
2025-07-14 15:11:29,260 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748863_8039 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748863
2025-07-14 15:12:25,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748864_8040 src: /192.168.158.6:37916 dest: /192.168.158.4:9866
2025-07-14 15:12:25,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37916, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1772040358_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748864_8040, duration(ns): 20342841
2025-07-14 15:12:25,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748864_8040, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-14 15:12:32,260 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748864_8040 replica FinalizedReplica, blk_1073748864_8040, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748864 for deletion
2025-07-14 15:12:32,261 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748864_8040 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748864
2025-07-14 15:13:25,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748865_8041 src: /192.168.158.9:37778 dest: /192.168.158.4:9866
2025-07-14 15:13:25,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37778, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1630417646_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748865_8041, duration(ns): 20427548
2025-07-14 15:13:25,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748865_8041, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-14 15:13:32,259 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748865_8041 replica FinalizedReplica, blk_1073748865_8041, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748865 for deletion
2025-07-14 15:13:32,260 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748865_8041 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748865
2025-07-14 15:18:30,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748870_8046 src: /192.168.158.7:47164 dest: /192.168.158.4:9866
2025-07-14 15:18:30,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47164, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_489888189_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748870_8046, duration(ns): 20947072
2025-07-14 15:18:30,778 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748870_8046, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-14 15:18:35,263 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748870_8046 replica FinalizedReplica, blk_1073748870_8046, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748870 for deletion
2025-07-14 15:18:35,264 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748870_8046 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748870
2025-07-14 15:20:40,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748872_8048 src: /192.168.158.1:54004 dest: /192.168.158.4:9866
2025-07-14 15:20:40,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54004, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_733342842_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748872_8048, duration(ns): 24181108
2025-07-14 15:20:40,778 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748872_8048, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-14 15:20:47,267 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748872_8048 replica FinalizedReplica, blk_1073748872_8048, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748872 for deletion
2025-07-14 15:20:47,268 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748872_8048 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748872
2025-07-14 15:21:40,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748873_8049 src: /192.168.158.7:53212 dest: /192.168.158.4:9866
2025-07-14 15:21:40,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53212, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1433633758_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748873_8049, duration(ns): 16827840
2025-07-14 15:21:40,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748873_8049, type=LAST_IN_PIPELINE terminating
2025-07-14 15:21:47,269 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748873_8049 replica FinalizedReplica, blk_1073748873_8049, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748873 for deletion
2025-07-14 15:21:47,270 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748873_8049 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748873
2025-07-14 15:24:45,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748876_8052 src: /192.168.158.1:50062 dest: /192.168.158.4:9866
2025-07-14 15:24:45,775 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50062, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_484065341_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748876_8052, duration(ns): 22759493
2025-07-14 15:24:45,775 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748876_8052, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-14 15:24:50,271 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748876_8052 replica FinalizedReplica, blk_1073748876_8052, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748876 for deletion
2025-07-14 15:24:50,273 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748876_8052 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748876
2025-07-14 15:25:45,746 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748877_8053 src: /192.168.158.1:48420 dest: /192.168.158.4:9866
2025-07-14 15:25:45,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48420, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1614809987_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748877_8053, duration(ns): 22103021
2025-07-14 15:25:45,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748877_8053, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-14 15:25:50,273 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748877_8053 replica FinalizedReplica, blk_1073748877_8053, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748877 for deletion
2025-07-14 15:25:50,276 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748877_8053 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748877
2025-07-14 15:26:50,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748878_8054 src: /192.168.158.1:42954 dest: /192.168.158.4:9866
2025-07-14 15:26:50,789 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42954, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2061381074_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748878_8054, duration(ns): 24226416
2025-07-14 15:26:50,789 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748878_8054, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-14 15:26:53,273 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748878_8054 replica FinalizedReplica, blk_1073748878_8054, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748878 for deletion
2025-07-14 15:26:53,274 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748878_8054 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748878
2025-07-14 15:29:55,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748881_8057 src: /192.168.158.1:35388 dest: /192.168.158.4:9866
2025-07-14 15:29:55,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35388, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1187754685_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748881_8057, duration(ns): 24056624
2025-07-14 15:29:55,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748881_8057, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-14 15:29:59,278 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748881_8057 replica FinalizedReplica, blk_1073748881_8057, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748881 for deletion
2025-07-14 15:29:59,279 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748881_8057 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748881
2025-07-14 15:31:55,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748883_8059 src: /192.168.158.1:55746 dest: /192.168.158.4:9866
2025-07-14 15:31:55,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55746, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_686090418_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748883_8059, duration(ns): 23182102
2025-07-14 15:31:55,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748883_8059, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-14 15:32:02,281 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748883_8059 replica FinalizedReplica, blk_1073748883_8059, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748883 for deletion
2025-07-14 15:32:02,282 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748883_8059 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748883
2025-07-14 15:34:55,760 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748886_8062 src: /192.168.158.9:33232 dest: /192.168.158.4:9866
2025-07-14 15:34:55,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33232, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1290644965_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748886_8062, duration(ns): 16746620
2025-07-14 15:34:55,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748886_8062, type=LAST_IN_PIPELINE terminating
2025-07-14 15:34:59,288 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748886_8062 replica FinalizedReplica, blk_1073748886_8062, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748886 for deletion
2025-07-14 15:34:59,289 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748886_8062 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748886
2025-07-14 15:35:55,756 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748887_8063 src: /192.168.158.1:52272 dest: /192.168.158.4:9866
2025-07-14 15:35:55,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52272, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1271209889_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748887_8063, duration(ns): 21591576
2025-07-14 15:35:55,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748887_8063, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-14 15:35:59,288 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748887_8063 replica FinalizedReplica, blk_1073748887_8063, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748887 for deletion
2025-07-14 15:35:59,290 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748887_8063 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748887
2025-07-14 15:36:55,755 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748888_8064 src: /192.168.158.1:56828 dest: /192.168.158.4:9866
2025-07-14 15:36:55,789 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56828, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-163771275_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748888_8064, duration(ns): 24555068
2025-07-14 15:36:55,789 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748888_8064, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-14 15:36:59,292 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748888_8064 replica FinalizedReplica, blk_1073748888_8064, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748888 for deletion
2025-07-14 15:36:59,293 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748888_8064 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748888
2025-07-14 15:38:55,759 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748890_8066 src: /192.168.158.1:59044 dest: /192.168.158.4:9866
2025-07-14 15:38:55,790 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59044, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1685398744_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748890_8066, duration(ns): 21769271
2025-07-14 15:38:55,790 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748890_8066, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-14 15:39:02,295 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748890_8066 replica FinalizedReplica, blk_1073748890_8066, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748890 for deletion
2025-07-14 15:39:02,297 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748890_8066 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748890
2025-07-14 15:39:55,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748891_8067 src: /192.168.158.1:51430 dest: /192.168.158.4:9866
2025-07-14 15:39:55,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51430, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2106387568_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748891_8067, duration(ns): 21274439
2025-07-14 15:39:55,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748891_8067, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-14 15:39:59,299 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748891_8067 replica FinalizedReplica, blk_1073748891_8067, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748891 for deletion
2025-07-14 15:39:59,300 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748891_8067 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748891
2025-07-14 15:41:55,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748893_8069 src: /192.168.158.9:48024 dest: /192.168.158.4:9866
2025-07-14 15:41:55,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48024, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-339457167_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748893_8069, duration(ns): 20108136
2025-07-14 15:41:55,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748893_8069, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-14 15:41:59,302 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748893_8069 replica FinalizedReplica, blk_1073748893_8069, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748893 for deletion
2025-07-14 15:41:59,303 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748893_8069 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748893
2025-07-14 15:45:00,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748896_8072 src: /192.168.158.6:50704 dest: /192.168.158.4:9866
2025-07-14 15:45:00,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50704, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-557329574_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748896_8072, duration(ns): 15878844
2025-07-14 15:45:00,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748896_8072, type=LAST_IN_PIPELINE terminating
2025-07-14 15:45:05,306 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748896_8072 replica FinalizedReplica, blk_1073748896_8072, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748896 for deletion
2025-07-14 15:45:05,308 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748896_8072 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748896
2025-07-14 15:46:00,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748897_8073 src: /192.168.158.1:59588 dest: /192.168.158.4:9866
2025-07-14 15:46:00,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59588, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_352572276_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748897_8073, duration(ns): 22004948
2025-07-14 15:46:00,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748897_8073, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-14 15:46:05,311 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748897_8073 replica FinalizedReplica, blk_1073748897_8073, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748897 for deletion
2025-07-14 15:46:05,312 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748897_8073 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748897
2025-07-14 15:48:05,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748899_8075 src: /192.168.158.1:60308 dest: /192.168.158.4:9866
2025-07-14 15:48:05,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60308, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1937845126_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748899_8075, duration(ns): 23429272
2025-07-14 15:48:05,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748899_8075, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-14 15:48:08,314 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748899_8075 replica FinalizedReplica, blk_1073748899_8075, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748899 for deletion
2025-07-14 15:48:08,315 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748899_8075 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748899
2025-07-14 15:49:10,774 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748900_8076 src: /192.168.158.8:37528 dest: /192.168.158.4:9866
2025-07-14 15:49:10,793 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.8:37528, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_899747448_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748900_8076, duration(ns): 16902533 2025-07-14 15:49:10,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748900_8076, type=LAST_IN_PIPELINE terminating 2025-07-14 15:49:17,314 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748900_8076 replica FinalizedReplica, blk_1073748900_8076, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748900 for deletion 2025-07-14 15:49:17,315 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748900_8076 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748900 2025-07-14 15:50:15,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748901_8077 src: /192.168.158.7:35996 dest: /192.168.158.4:9866 2025-07-14 15:50:15,793 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35996, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1425606727_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748901_8077, duration(ns): 15528074 2025-07-14 15:50:15,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748901_8077, type=LAST_IN_PIPELINE terminating 2025-07-14 15:50:23,318 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748901_8077 replica FinalizedReplica, blk_1073748901_8077, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748901 for deletion 2025-07-14 15:50:23,319 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748901_8077 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748901 2025-07-14 15:51:20,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748902_8078 src: /192.168.158.5:38212 dest: /192.168.158.4:9866 2025-07-14 15:51:20,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38212, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1378113755_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748902_8078, duration(ns): 19112205 2025-07-14 15:51:20,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748902_8078, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-14 15:51:23,322 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748902_8078 replica FinalizedReplica, blk_1073748902_8078, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748902 for deletion 2025-07-14 15:51:23,323 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748902_8078 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748902 2025-07-14 15:56:25,786 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748907_8083 src: /192.168.158.7:51868 dest: /192.168.158.4:9866 2025-07-14 15:56:25,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51868, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1236256919_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748907_8083, duration(ns): 15303793 2025-07-14 15:56:25,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748907_8083, type=LAST_IN_PIPELINE terminating 2025-07-14 15:56:32,338 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748907_8083 replica FinalizedReplica, blk_1073748907_8083, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748907 for deletion 2025-07-14 15:56:32,339 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748907_8083 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748907 2025-07-14 15:58:25,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748909_8085 src: /192.168.158.6:53386 dest: /192.168.158.4:9866 
2025-07-14 15:58:25,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53386, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1217073894_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748909_8085, duration(ns): 21535294 2025-07-14 15:58:25,817 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748909_8085, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-14 15:58:29,340 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748909_8085 replica FinalizedReplica, blk_1073748909_8085, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748909 for deletion 2025-07-14 15:58:29,341 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748909_8085 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748909 2025-07-14 16:02:35,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748913_8089 src: /192.168.158.8:39730 dest: /192.168.158.4:9866 2025-07-14 16:02:35,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39730, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-655197486_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748913_8089, duration(ns): 15395117 2025-07-14 16:02:35,816 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748913_8089, type=LAST_IN_PIPELINE terminating 2025-07-14 16:02:41,347 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748913_8089 replica FinalizedReplica, blk_1073748913_8089, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748913 for deletion 2025-07-14 16:02:41,348 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748913_8089 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748913 2025-07-14 16:11:45,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748922_8098 src: /192.168.158.8:38764 dest: /192.168.158.4:9866 2025-07-14 16:11:45,837 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38764, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_326345190_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748922_8098, duration(ns): 16632589 2025-07-14 16:11:45,837 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748922_8098, type=LAST_IN_PIPELINE terminating 2025-07-14 16:11:50,362 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748922_8098 replica FinalizedReplica, blk_1073748922_8098, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748922 for deletion 2025-07-14 16:11:50,364 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748922_8098 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748922 2025-07-14 16:12:50,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748923_8099 src: /192.168.158.5:53726 dest: /192.168.158.4:9866 2025-07-14 16:12:50,847 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53726, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-685007292_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748923_8099, duration(ns): 15640285 2025-07-14 16:12:50,847 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748923_8099, type=LAST_IN_PIPELINE terminating 2025-07-14 16:12:56,365 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748923_8099 replica FinalizedReplica, blk_1073748923_8099, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748923 for deletion 2025-07-14 16:12:56,366 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748923_8099 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748923 2025-07-14 16:14:50,817 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748925_8101 src: /192.168.158.6:33482 dest: /192.168.158.4:9866 2025-07-14 16:14:50,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33482, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1011281790_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748925_8101, duration(ns): 21176617 2025-07-14 16:14:50,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748925_8101, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-14 16:14:53,369 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748925_8101 replica FinalizedReplica, blk_1073748925_8101, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748925 for deletion 2025-07-14 16:14:53,370 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748925_8101 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748925 2025-07-14 16:17:50,863 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748928_8104 src: /192.168.158.9:41832 dest: /192.168.158.4:9866 2025-07-14 16:17:50,889 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41832, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_996978547_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748928_8104, duration(ns): 20222553 2025-07-14 16:17:50,889 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748928_8104, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-14 16:17:53,373 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748928_8104 replica FinalizedReplica, blk_1073748928_8104, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748928 for deletion 2025-07-14 16:17:53,374 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748928_8104 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748928 2025-07-14 16:22:05,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748932_8108 src: /192.168.158.1:38592 dest: /192.168.158.4:9866 2025-07-14 16:22:05,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38592, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1074316592_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748932_8108, duration(ns): 22535291 2025-07-14 16:22:05,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748932_8108, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-14 16:22:11,384 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748932_8108 replica FinalizedReplica, blk_1073748932_8108, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748932 for deletion 2025-07-14 16:22:11,385 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748932_8108 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748932 2025-07-14 16:23:05,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748933_8109 src: /192.168.158.8:36092 dest: /192.168.158.4:9866 2025-07-14 16:23:05,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36092, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_642094685_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748933_8109, duration(ns): 15036828 2025-07-14 16:23:05,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748933_8109, type=LAST_IN_PIPELINE terminating 2025-07-14 16:23:08,385 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748933_8109 replica FinalizedReplica, blk_1073748933_8109, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748933 for deletion 2025-07-14 16:23:08,386 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748933_8109 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748933 2025-07-14 16:24:05,835 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748934_8110 src: /192.168.158.6:54220 dest: /192.168.158.4:9866 2025-07-14 16:24:05,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54220, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1871235217_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748934_8110, duration(ns): 15723942 2025-07-14 16:24:05,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748934_8110, type=LAST_IN_PIPELINE terminating 2025-07-14 16:24:08,387 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748934_8110 replica FinalizedReplica, blk_1073748934_8110, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748934 for deletion 2025-07-14 16:24:08,388 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748934_8110 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748934 2025-07-14 16:25:05,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748935_8111 src: /192.168.158.1:55928 dest: /192.168.158.4:9866 2025-07-14 
16:25:05,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55928, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1468687328_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748935_8111, duration(ns): 22505993 2025-07-14 16:25:05,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748935_8111, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-14 16:25:08,387 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748935_8111 replica FinalizedReplica, blk_1073748935_8111, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748935 for deletion 2025-07-14 16:25:08,388 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748935_8111 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748935 2025-07-14 16:26:05,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748936_8112 src: /192.168.158.5:57254 dest: /192.168.158.4:9866 2025-07-14 16:26:05,867 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57254, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1741923304_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748936_8112, duration(ns): 20226868 2025-07-14 16:26:05,868 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748936_8112, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-14 16:26:11,389 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748936_8112 replica FinalizedReplica, blk_1073748936_8112, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748936 for deletion 2025-07-14 16:26:11,390 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748936_8112 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748936 2025-07-14 16:28:05,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748938_8114 src: /192.168.158.5:60712 dest: /192.168.158.4:9866 2025-07-14 16:28:05,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1450181644_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748938_8114, duration(ns): 15658069 2025-07-14 16:28:05,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748938_8114, type=LAST_IN_PIPELINE terminating 2025-07-14 16:28:11,394 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748938_8114 replica FinalizedReplica, blk_1073748938_8114, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748938 for deletion 2025-07-14 16:28:11,395 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748938_8114 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748938 2025-07-14 16:29:05,840 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748939_8115 src: /192.168.158.1:37430 dest: /192.168.158.4:9866 2025-07-14 16:29:05,872 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37430, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-27879920_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748939_8115, duration(ns): 22577242 2025-07-14 16:29:05,872 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748939_8115, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-14 16:29:08,396 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748939_8115 replica FinalizedReplica, blk_1073748939_8115, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748939 for deletion 2025-07-14 16:29:08,397 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748939_8115 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748939
2025-07-14 16:30:05,846 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748940_8116 src: /192.168.158.1:38274 dest: /192.168.158.4:9866
2025-07-14 16:30:05,875 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38274, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1142989943_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748940_8116, duration(ns): 21069397
2025-07-14 16:30:05,875 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748940_8116, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-14 16:30:08,398 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748940_8116 replica FinalizedReplica, blk_1073748940_8116, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748940 for deletion
2025-07-14 16:30:08,399 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748940_8116 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748940
2025-07-14 16:31:05,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748941_8117 src: /192.168.158.7:40512 dest: /192.168.158.4:9866
2025-07-14 16:31:05,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40512, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_44595040_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748941_8117, duration(ns): 14699721
2025-07-14 16:31:05,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748941_8117, type=LAST_IN_PIPELINE terminating
2025-07-14 16:31:11,399 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748941_8117 replica FinalizedReplica, blk_1073748941_8117, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748941 for deletion
2025-07-14 16:31:11,400 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748941_8117 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748941
2025-07-14 16:32:05,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748942_8118 src: /192.168.158.9:52050 dest: /192.168.158.4:9866
2025-07-14 16:32:05,879 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52050, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-23809220_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748942_8118, duration(ns): 23414498
2025-07-14 16:32:05,880 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748942_8118, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-14 16:32:08,399 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748942_8118 replica FinalizedReplica, blk_1073748942_8118, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748942 for deletion
2025-07-14 16:32:08,401 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748942_8118 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748942
2025-07-14 16:33:10,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748943_8119 src: /192.168.158.7:55536 dest: /192.168.158.4:9866
2025-07-14 16:33:10,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55536, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1207120808_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748943_8119, duration(ns): 15174207
2025-07-14 16:33:10,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748943_8119, type=LAST_IN_PIPELINE terminating
2025-07-14 16:33:14,402 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748943_8119 replica FinalizedReplica, blk_1073748943_8119, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748943 for deletion
2025-07-14 16:33:14,403 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748943_8119 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748943
2025-07-14 16:36:15,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748946_8122 src: /192.168.158.9:48090 dest: /192.168.158.4:9866
2025-07-14 16:36:15,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48090, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_936535269_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748946_8122, duration(ns): 16057529
2025-07-14 16:36:15,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748946_8122, type=LAST_IN_PIPELINE terminating
2025-07-14 16:36:20,412 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748946_8122 replica FinalizedReplica, blk_1073748946_8122, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748946 for deletion
2025-07-14 16:36:20,413 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748946_8122 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748946
2025-07-14 16:41:25,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748951_8127 src: /192.168.158.1:44124 dest: /192.168.158.4:9866
2025-07-14 16:41:25,875 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44124, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_898348992_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748951_8127, duration(ns): 20922617
2025-07-14 16:41:25,875 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748951_8127, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-14 16:41:32,425 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748951_8127 replica FinalizedReplica, blk_1073748951_8127, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748951 for deletion
2025-07-14 16:41:32,426 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748951_8127 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748951
2025-07-14 16:42:30,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748952_8128 src: /192.168.158.9:42456 dest: /192.168.158.4:9866
2025-07-14 16:42:30,891 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42456, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1207529018_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748952_8128, duration(ns): 19665533
2025-07-14 16:42:30,891 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748952_8128, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 16:42:38,425 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748952_8128 replica FinalizedReplica, blk_1073748952_8128, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748952 for deletion
2025-07-14 16:42:38,426 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748952_8128 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748952
2025-07-14 16:43:30,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748953_8129 src: /192.168.158.7:37608 dest: /192.168.158.4:9866
2025-07-14 16:43:30,879 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37608, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1893053731_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748953_8129, duration(ns): 19286621
2025-07-14 16:43:30,879 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748953_8129, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-14 16:43:35,426 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748953_8129 replica FinalizedReplica, blk_1073748953_8129, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748953 for deletion
2025-07-14 16:43:35,427 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748953_8129 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748953
2025-07-14 16:44:35,872 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748954_8130 src: /192.168.158.1:58220 dest: /192.168.158.4:9866
2025-07-14 16:44:35,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58220, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1362284902_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748954_8130, duration(ns): 21943258
2025-07-14 16:44:35,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748954_8130, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-14 16:44:38,427 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748954_8130 replica FinalizedReplica, blk_1073748954_8130, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748954 for deletion
2025-07-14 16:44:38,429 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748954_8130 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748954
2025-07-14 16:46:35,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748956_8132 src: /192.168.158.9:48544 dest: /192.168.158.4:9866
2025-07-14 16:46:35,891 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48544, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1944473114_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748956_8132, duration(ns): 17377872
2025-07-14 16:46:35,891 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748956_8132, type=LAST_IN_PIPELINE terminating
2025-07-14 16:46:41,431 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748956_8132 replica FinalizedReplica, blk_1073748956_8132, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748956 for deletion
2025-07-14 16:46:41,432 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748956_8132 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748956
2025-07-14 16:47:35,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748957_8133 src: /192.168.158.8:49100 dest: /192.168.158.4:9866
2025-07-14 16:47:35,896 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49100, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1082475968_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748957_8133, duration(ns): 17090580
2025-07-14 16:47:35,896 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748957_8133, type=LAST_IN_PIPELINE terminating
2025-07-14 16:47:38,433 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748957_8133 replica FinalizedReplica, blk_1073748957_8133, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748957 for deletion
2025-07-14 16:47:38,434 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748957_8133 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748957
2025-07-14 16:48:35,865 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748958_8134 src: /192.168.158.1:54064 dest: /192.168.158.4:9866
2025-07-14 16:48:35,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54064, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2111453509_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748958_8134, duration(ns): 24561924
2025-07-14 16:48:35,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748958_8134, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-14 16:48:41,434 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748958_8134 replica FinalizedReplica, blk_1073748958_8134, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748958 for deletion
2025-07-14 16:48:41,435 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748958_8134 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748958
2025-07-14 16:50:40,867 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748960_8136 src: /192.168.158.1:33602 dest: /192.168.158.4:9866
2025-07-14 16:50:40,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33602, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-510948770_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748960_8136, duration(ns): 24706688
2025-07-14 16:50:40,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748960_8136, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-14 16:50:47,439 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748960_8136 replica FinalizedReplica, blk_1073748960_8136, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748960 for deletion
2025-07-14 16:50:47,440 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748960_8136 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748960
2025-07-14 16:54:50,883 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748964_8140 src: /192.168.158.6:35984 dest: /192.168.158.4:9866
2025-07-14 16:54:50,908 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35984, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1795432400_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748964_8140, duration(ns): 19285071
2025-07-14 16:54:50,908 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748964_8140, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-14 16:54:53,446 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748964_8140 replica FinalizedReplica, blk_1073748964_8140, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748964 for deletion
2025-07-14 16:54:53,447 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748964_8140 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748964
2025-07-14 16:55:50,887 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748965_8141 src: /192.168.158.9:56928 dest: /192.168.158.4:9866
2025-07-14 16:55:50,905 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56928, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_580090129_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748965_8141, duration(ns): 15439520
2025-07-14 16:55:50,905 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748965_8141, type=LAST_IN_PIPELINE terminating
2025-07-14 16:55:53,449 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748965_8141 replica FinalizedReplica, blk_1073748965_8141, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748965 for deletion
2025-07-14 16:55:53,450 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748965_8141 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748965
2025-07-14 16:56:50,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748966_8142 src: /192.168.158.6:58518 dest: /192.168.158.4:9866
2025-07-14 16:56:50,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58518, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_472646460_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748966_8142, duration(ns): 17586981
2025-07-14 16:56:50,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748966_8142, type=LAST_IN_PIPELINE terminating
2025-07-14 16:56:53,449 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748966_8142 replica FinalizedReplica, blk_1073748966_8142, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748966 for deletion
2025-07-14 16:56:53,450 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748966_8142 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748966
2025-07-14 16:57:50,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748967_8143 src: /192.168.158.1:59166 dest: /192.168.158.4:9866
2025-07-14 16:57:50,906 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59166, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1367853844_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748967_8143, duration(ns): 20721446
2025-07-14 16:57:50,906 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748967_8143, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-14 16:57:53,452 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748967_8143 replica FinalizedReplica, blk_1073748967_8143, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748967 for deletion
2025-07-14 16:57:53,454 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748967_8143 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748967
2025-07-14 16:58:50,885 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748968_8144 src: /192.168.158.9:54490 dest: /192.168.158.4:9866
2025-07-14 16:58:50,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54490, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1135504464_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748968_8144, duration(ns): 15065120
2025-07-14 16:58:50,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748968_8144, type=LAST_IN_PIPELINE terminating
2025-07-14 16:58:53,452 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748968_8144 replica FinalizedReplica, blk_1073748968_8144, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748968 for deletion
2025-07-14 16:58:53,453 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748968_8144 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748968
2025-07-14 16:59:50,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748969_8145 src: /192.168.158.7:43188 dest: /192.168.158.4:9866
2025-07-14 16:59:50,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43188, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1573154264_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748969_8145, duration(ns): 17505233
2025-07-14 16:59:50,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748969_8145, type=LAST_IN_PIPELINE terminating
2025-07-14 16:59:53,454 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748969_8145 replica FinalizedReplica, blk_1073748969_8145, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748969 for deletion
2025-07-14 16:59:53,456 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748969_8145 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748969
2025-07-14 17:00:50,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748970_8146 src: /192.168.158.1:43200 dest: /192.168.158.4:9866
2025-07-14 17:00:50,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43200, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1543947880_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748970_8146, duration(ns): 23637844
2025-07-14 17:00:50,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748970_8146, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-14 17:00:56,455 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748970_8146 replica FinalizedReplica, blk_1073748970_8146, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748970 for deletion
2025-07-14 17:00:56,457 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748970_8146 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748970
2025-07-14 17:01:50,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748971_8147 src: /192.168.158.6:47254 dest: /192.168.158.4:9866
2025-07-14 17:01:50,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47254, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1883996765_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748971_8147, duration(ns): 15698406
2025-07-14 17:01:50,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748971_8147, type=LAST_IN_PIPELINE terminating
2025-07-14 17:01:53,457 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748971_8147 replica FinalizedReplica, blk_1073748971_8147, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748971 for deletion
2025-07-14 17:01:53,459 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748971_8147 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748971
2025-07-14 17:02:55,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748972_8148 src: /192.168.158.6:33246 dest: /192.168.158.4:9866
2025-07-14 17:02:55,928 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33246, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1855258385_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748972_8148, duration(ns): 18522594
2025-07-14 17:02:55,928 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748972_8148, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-14 17:02:59,459 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748972_8148 replica FinalizedReplica, blk_1073748972_8148, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748972 for deletion
2025-07-14 17:02:59,460 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748972_8148 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748972
2025-07-14 17:03:55,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748973_8149 src: /192.168.158.5:47030 dest: /192.168.158.4:9866
2025-07-14 17:03:55,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47030, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1651212573_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748973_8149, duration(ns): 22038271
2025-07-14 17:03:55,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748973_8149, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-14 17:03:59,460 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748973_8149 replica FinalizedReplica, blk_1073748973_8149, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748973 for deletion
2025-07-14 17:03:59,462 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748973_8149 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748973
2025-07-14 17:14:20,909 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748983_8159 src: /192.168.158.1:34874 dest: /192.168.158.4:9866
2025-07-14 17:14:20,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34874, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1344626418_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748983_8159, duration(ns): 25522344
2025-07-14 17:14:20,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748983_8159, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-14 17:14:23,481 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748983_8159 replica FinalizedReplica, blk_1073748983_8159, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748983 for deletion
2025-07-14 17:14:23,482 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748983_8159 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748983
2025-07-14 17:15:20,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748984_8160 src: /192.168.158.9:43170 dest: /192.168.158.4:9866
2025-07-14 17:15:20,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43170, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_760276362_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748984_8160, duration(ns): 15825249
2025-07-14 17:15:20,952 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748984_8160, type=LAST_IN_PIPELINE terminating
2025-07-14 17:15:23,482 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748984_8160 replica FinalizedReplica, blk_1073748984_8160, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748984 for deletion
2025-07-14 17:15:23,484 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748984_8160 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748984
2025-07-14 17:17:25,935 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748986_8162 src: /192.168.158.5:50906 dest: /192.168.158.4:9866
2025-07-14 17:17:25,953 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50906, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1473904309_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748986_8162, duration(ns): 16088070
2025-07-14 17:17:25,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748986_8162, type=LAST_IN_PIPELINE terminating
2025-07-14 17:17:29,485 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748986_8162 replica FinalizedReplica, blk_1073748986_8162, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748986 for deletion
2025-07-14 17:17:29,486 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748986_8162 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748986
2025-07-14 17:21:25,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748990_8166 src: /192.168.158.1:47142 dest: /192.168.158.4:9866
2025-07-14 17:21:25,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47142, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-766653221_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748990_8166, duration(ns): 25107785
2025-07-14 17:21:25,975 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748990_8166, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-14 17:21:29,488 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748990_8166 replica FinalizedReplica, blk_1073748990_8166, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748990 for deletion
2025-07-14 17:21:29,489 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748990_8166 URI
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748990 2025-07-14 17:22:25,940 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748991_8167 src: /192.168.158.8:44328 dest: /192.168.158.4:9866 2025-07-14 17:22:25,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44328, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2118012820_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748991_8167, duration(ns): 16603470 2025-07-14 17:22:25,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748991_8167, type=LAST_IN_PIPELINE terminating 2025-07-14 17:22:29,490 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748991_8167 replica FinalizedReplica, blk_1073748991_8167, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748991 for deletion 2025-07-14 17:22:29,492 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748991_8167 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073748991 2025-07-14 17:25:25,953 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748994_8170 src: /192.168.158.1:46016 dest: /192.168.158.4:9866 2025-07-14 17:25:25,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46016, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1923307500_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748994_8170, duration(ns): 21176182 2025-07-14 17:25:25,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748994_8170, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-14 17:25:29,495 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748994_8170 replica FinalizedReplica, blk_1073748994_8170, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073748994 for deletion 2025-07-14 17:25:29,496 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748994_8170 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073748994 2025-07-14 17:26:25,940 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748995_8171 src: /192.168.158.5:33224 dest: /192.168.158.4:9866 2025-07-14 17:26:25,965 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33224, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1698220602_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748995_8171, duration(ns): 19196325 2025-07-14 17:26:25,965 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748995_8171, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=1:[192.168.158.7:9866] terminating 2025-07-14 17:26:29,499 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748995_8171 replica FinalizedReplica, blk_1073748995_8171, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073748995 for deletion 2025-07-14 17:26:29,500 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748995_8171 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073748995 2025-07-14 17:29:30,940 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748998_8174 src: /192.168.158.6:41952 dest: /192.168.158.4:9866 2025-07-14 17:29:30,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41952, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_536658183_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748998_8174, duration(ns): 15087168 2025-07-14 17:29:30,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748998_8174, type=LAST_IN_PIPELINE terminating 2025-07-14 17:29:32,506 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748998_8174 replica FinalizedReplica, blk_1073748998_8174, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073748998 for deletion 
2025-07-14 17:29:32,508 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748998_8174 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073748998 2025-07-14 17:30:35,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073748999_8175 src: /192.168.158.5:59338 dest: /192.168.158.4:9866 2025-07-14 17:30:35,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59338, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_934848891_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073748999_8175, duration(ns): 21915826 2025-07-14 17:30:35,990 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073748999_8175, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-14 17:30:41,511 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073748999_8175 replica FinalizedReplica, blk_1073748999_8175, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073748999 for deletion 2025-07-14 17:30:41,512 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073748999_8175 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073748999 2025-07-14 17:31:40,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073749000_8176 src: /192.168.158.9:47086 dest: /192.168.158.4:9866 2025-07-14 17:31:40,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47086, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_956525307_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749000_8176, duration(ns): 21198110 2025-07-14 17:31:40,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749000_8176, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-14 17:31:44,513 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749000_8176 replica FinalizedReplica, blk_1073749000_8176, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749000 for deletion 2025-07-14 17:31:44,514 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749000_8176 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749000 2025-07-14 17:33:40,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749002_8178 src: /192.168.158.8:44240 dest: /192.168.158.4:9866 2025-07-14 17:33:40,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44240, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-659619233_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073749002_8178, duration(ns): 18515487 2025-07-14 17:33:40,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749002_8178, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-14 17:33:44,516 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749002_8178 replica FinalizedReplica, blk_1073749002_8178, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749002 for deletion 2025-07-14 17:33:44,517 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749002_8178 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749002 2025-07-14 17:34:40,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749003_8179 src: /192.168.158.1:59658 dest: /192.168.158.4:9866 2025-07-14 17:34:40,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59658, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1968101323_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749003_8179, duration(ns): 25750534 2025-07-14 17:34:40,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749003_8179, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-14 17:34:44,518 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749003_8179 replica FinalizedReplica, blk_1073749003_8179, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749003 for deletion 2025-07-14 17:34:44,519 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749003_8179 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749003 2025-07-14 17:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0 2025-07-14 17:37:20,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f39, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5. 
2025-07-14 17:37:20,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360 2025-07-14 17:39:45,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749008_8184 src: /192.168.158.8:53740 dest: /192.168.158.4:9866 2025-07-14 17:39:45,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53740, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_421949128_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749008_8184, duration(ns): 15058426 2025-07-14 17:39:45,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749008_8184, type=LAST_IN_PIPELINE terminating 2025-07-14 17:39:47,532 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749008_8184 replica FinalizedReplica, blk_1073749008_8184, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749008 for deletion 2025-07-14 17:39:47,533 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749008_8184 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749008 2025-07-14 17:41:45,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749010_8186 src: /192.168.158.1:35988 dest: /192.168.158.4:9866 2025-07-14 17:41:45,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35988, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-272532304_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749010_8186, duration(ns): 22869664 2025-07-14 17:41:45,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749010_8186, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-14 17:41:47,538 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749010_8186 replica FinalizedReplica, blk_1073749010_8186, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749010 for deletion 2025-07-14 17:41:47,539 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749010_8186 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749010 2025-07-14 17:45:45,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749014_8190 src: /192.168.158.6:34178 dest: /192.168.158.4:9866 2025-07-14 17:45:45,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34178, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_50879538_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749014_8190, duration(ns): 17015507 2025-07-14 17:45:45,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749014_8190, type=LAST_IN_PIPELINE 
terminating 2025-07-14 17:45:50,548 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749014_8190 replica FinalizedReplica, blk_1073749014_8190, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749014 for deletion 2025-07-14 17:45:50,549 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749014_8190 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749014 2025-07-14 17:52:55,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749021_8197 src: /192.168.158.5:40364 dest: /192.168.158.4:9866 2025-07-14 17:52:56,004 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40364, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1240422436_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749021_8197, duration(ns): 15390533 2025-07-14 17:52:56,004 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749021_8197, type=LAST_IN_PIPELINE terminating 2025-07-14 17:53:02,561 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749021_8197 replica FinalizedReplica, blk_1073749021_8197, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749021 for deletion 2025-07-14 17:53:02,562 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749021_8197 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749021 2025-07-14 17:56:00,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749024_8200 src: /192.168.158.6:35878 dest: /192.168.158.4:9866 2025-07-14 17:56:01,008 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35878, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1323209004_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749024_8200, duration(ns): 20704029 2025-07-14 17:56:01,008 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749024_8200, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-14 17:56:02,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749024_8200 replica FinalizedReplica, blk_1073749024_8200, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749024 for deletion 2025-07-14 17:56:02,564 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749024_8200 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749024 2025-07-14 17:59:05,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749027_8203 src: 
/192.168.158.1:47598 dest: /192.168.158.4:9866 2025-07-14 17:59:06,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47598, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1922148564_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749027_8203, duration(ns): 22507575 2025-07-14 17:59:06,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749027_8203, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-14 17:59:11,570 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749027_8203 replica FinalizedReplica, blk_1073749027_8203, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749027 for deletion 2025-07-14 17:59:11,571 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749027_8203 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749027 2025-07-14 18:01:16,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749029_8205 src: /192.168.158.1:38772 dest: /192.168.158.4:9866 2025-07-14 18:01:16,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38772, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-191320381_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749029_8205, duration(ns): 23529850 
2025-07-14 18:01:16,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749029_8205, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-14 18:01:17,571 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749029_8205 replica FinalizedReplica, blk_1073749029_8205, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749029 for deletion 2025-07-14 18:01:17,572 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749029_8205 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749029 2025-07-14 18:03:21,009 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749031_8207 src: /192.168.158.6:40760 dest: /192.168.158.4:9866 2025-07-14 18:03:21,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40760, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1442966948_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749031_8207, duration(ns): 21062000 2025-07-14 18:03:21,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749031_8207, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-14 18:03:23,574 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749031_8207 replica FinalizedReplica, 
blk_1073749031_8207, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749031 for deletion
2025-07-14 18:03:23,575 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749031_8207 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749031
2025-07-14 18:05:31,017 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749033_8209 src: /192.168.158.5:52074 dest: /192.168.158.4:9866
2025-07-14 18:05:31,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52074, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1718875348_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749033_8209, duration(ns): 15359938
2025-07-14 18:05:31,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749033_8209, type=LAST_IN_PIPELINE terminating
2025-07-14 18:05:35,577 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749033_8209 replica FinalizedReplica, blk_1073749033_8209, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749033 for deletion
2025-07-14 18:05:35,578 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749033_8209 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749033
2025-07-14 18:07:36,013 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749035_8211 src: /192.168.158.5:48874 dest: /192.168.158.4:9866
2025-07-14 18:07:36,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48874, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2071647404_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749035_8211, duration(ns): 15777101
2025-07-14 18:07:36,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749035_8211, type=LAST_IN_PIPELINE terminating
2025-07-14 18:07:38,580 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749035_8211 replica FinalizedReplica, blk_1073749035_8211, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749035 for deletion
2025-07-14 18:07:38,581 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749035_8211 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749035
2025-07-14 18:08:36,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749036_8212 src: /192.168.158.5:47990 dest: /192.168.158.4:9866
2025-07-14 18:08:36,040 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47990, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_251226846_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749036_8212, duration(ns): 19656518
2025-07-14 18:08:36,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749036_8212, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-14 18:08:41,582 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749036_8212 replica FinalizedReplica, blk_1073749036_8212, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749036 for deletion
2025-07-14 18:08:41,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749036_8212 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749036
2025-07-14 18:12:41,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749040_8216 src: /192.168.158.1:36124 dest: /192.168.158.4:9866
2025-07-14 18:12:41,058 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36124, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2070161353_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749040_8216, duration(ns): 21282316
2025-07-14 18:12:41,058 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749040_8216, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-14 18:12:47,587 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749040_8216 replica FinalizedReplica, blk_1073749040_8216, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749040 for deletion
2025-07-14 18:12:47,588 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749040_8216 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749040
2025-07-14 18:15:41,027 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749043_8219 src: /192.168.158.5:43810 dest: /192.168.158.4:9866
2025-07-14 18:15:41,051 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:43810, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1507701044_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749043_8219, duration(ns): 19311181
2025-07-14 18:15:41,051 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749043_8219, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-14 18:15:47,594 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749043_8219 replica FinalizedReplica, blk_1073749043_8219, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749043 for deletion
2025-07-14 18:15:47,595 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749043_8219 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749043
2025-07-14 18:17:46,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749045_8221 src: /192.168.158.6:48316 dest: /192.168.158.4:9866
2025-07-14 18:17:46,054 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48316, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1383188069_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749045_8221, duration(ns): 19068349
2025-07-14 18:17:46,057 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749045_8221, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-14 18:17:50,596 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749045_8221 replica FinalizedReplica, blk_1073749045_8221, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749045 for deletion
2025-07-14 18:17:50,597 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749045_8221 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749045
2025-07-14 18:19:51,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749047_8223 src: /192.168.158.1:38876 dest: /192.168.158.4:9866
2025-07-14 18:19:51,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38876, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_929991268_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749047_8223, duration(ns): 23150741
2025-07-14 18:19:51,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749047_8223, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-14 18:19:56,599 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749047_8223 replica FinalizedReplica, blk_1073749047_8223, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749047 for deletion
2025-07-14 18:19:56,600 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749047_8223 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749047
2025-07-14 18:20:51,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749048_8224 src: /192.168.158.5:51786 dest: /192.168.158.4:9866
2025-07-14 18:20:51,056 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51786, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-230431808_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749048_8224, duration(ns): 19095135
2025-07-14 18:20:51,057 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749048_8224, type=LAST_IN_PIPELINE terminating
2025-07-14 18:20:53,598 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749048_8224 replica FinalizedReplica, blk_1073749048_8224, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749048 for deletion
2025-07-14 18:20:53,600 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749048_8224 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749048
2025-07-14 18:23:56,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749051_8227 src: /192.168.158.5:34246 dest: /192.168.158.4:9866
2025-07-14 18:23:56,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34246, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1502258215_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749051_8227, duration(ns): 15042825
2025-07-14 18:23:56,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749051_8227, type=LAST_IN_PIPELINE terminating
2025-07-14 18:24:02,607 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749051_8227 replica FinalizedReplica, blk_1073749051_8227, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749051 for deletion
2025-07-14 18:24:02,608 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749051_8227 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749051
2025-07-14 18:24:56,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749052_8228 src: /192.168.158.8:50292 dest: /192.168.158.4:9866
2025-07-14 18:24:56,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50292, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_773946699_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749052_8228, duration(ns): 20115926
2025-07-14 18:24:56,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749052_8228, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-14 18:24:59,609 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749052_8228 replica FinalizedReplica, blk_1073749052_8228, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749052 for deletion
2025-07-14 18:24:59,610 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749052_8228 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749052
2025-07-14 18:27:01,047 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749054_8230 src: /192.168.158.8:50846 dest: /192.168.158.4:9866
2025-07-14 18:27:01,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50846, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1985728547_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749054_8230, duration(ns): 18429660
2025-07-14 18:27:01,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749054_8230, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-14 18:27:02,613 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749054_8230 replica FinalizedReplica, blk_1073749054_8230, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749054 for deletion
2025-07-14 18:27:02,614 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749054_8230 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749054
2025-07-14 18:28:01,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749055_8231 src: /192.168.158.1:39556 dest: /192.168.158.4:9866
2025-07-14 18:28:01,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39556, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_763537042_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749055_8231, duration(ns): 22135825
2025-07-14 18:28:01,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749055_8231, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-14 18:28:05,617 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749055_8231 replica FinalizedReplica, blk_1073749055_8231, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749055 for deletion
2025-07-14 18:28:05,618 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749055_8231 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749055
2025-07-14 18:29:01,044 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749056_8232 src: /192.168.158.6:34956 dest: /192.168.158.4:9866
2025-07-14 18:29:01,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34956, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-625906514_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749056_8232, duration(ns): 18132744
2025-07-14 18:29:01,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749056_8232, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-14 18:29:05,620 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749056_8232 replica FinalizedReplica, blk_1073749056_8232, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749056 for deletion
2025-07-14 18:29:05,621 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749056_8232 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749056
2025-07-14 18:30:01,029 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749057_8233 src: /192.168.158.6:45070 dest: /192.168.158.4:9866
2025-07-14 18:30:01,047 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45070, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-752782291_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749057_8233, duration(ns): 16339103
2025-07-14 18:30:01,048 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749057_8233, type=LAST_IN_PIPELINE terminating
2025-07-14 18:30:02,620 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749057_8233 replica FinalizedReplica, blk_1073749057_8233, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749057 for deletion
2025-07-14 18:30:02,621 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749057_8233 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749057
2025-07-14 18:32:01,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749059_8235 src: /192.168.158.8:35848 dest: /192.168.158.4:9866
2025-07-14 18:32:01,095 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35848, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_240428854_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749059_8235, duration(ns): 17660233
2025-07-14 18:32:01,095 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749059_8235, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-14 18:32:02,622 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749059_8235 replica FinalizedReplica, blk_1073749059_8235, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749059 for deletion
2025-07-14 18:32:02,623 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749059_8235 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749059
2025-07-14 18:33:01,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749060_8236 src: /192.168.158.8:57010 dest: /192.168.158.4:9866
2025-07-14 18:33:01,055 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57010, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1491925087_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749060_8236, duration(ns): 17198822
2025-07-14 18:33:01,055 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749060_8236, type=LAST_IN_PIPELINE terminating
2025-07-14 18:33:02,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749060_8236 replica FinalizedReplica, blk_1073749060_8236, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749060 for deletion
2025-07-14 18:33:02,626 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749060_8236 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749060
2025-07-14 18:34:01,038 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749061_8237 src: /192.168.158.8:60506 dest: /192.168.158.4:9866
2025-07-14 18:34:01,056 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60506, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_825549339_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749061_8237, duration(ns): 16440034
2025-07-14 18:34:01,057 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749061_8237, type=LAST_IN_PIPELINE terminating
2025-07-14 18:34:02,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749061_8237 replica FinalizedReplica, blk_1073749061_8237, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749061 for deletion
2025-07-14 18:34:02,628 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749061_8237 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749061
2025-07-14 18:38:16,059 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749065_8241 src: /192.168.158.8:44338 dest: /192.168.158.4:9866
2025-07-14 18:38:16,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44338, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_703242096_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749065_8241, duration(ns): 17934635
2025-07-14 18:38:16,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749065_8241, type=LAST_IN_PIPELINE terminating
2025-07-14 18:38:20,633 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749065_8241 replica FinalizedReplica, blk_1073749065_8241, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749065 for deletion
2025-07-14 18:38:20,634 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749065_8241 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749065
2025-07-14 18:39:16,045 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749066_8242 src: /192.168.158.1:55892 dest: /192.168.158.4:9866
2025-07-14 18:39:16,076 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55892, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1853016807_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749066_8242, duration(ns): 21960728
2025-07-14 18:39:16,076 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749066_8242, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-14 18:39:17,635 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749066_8242 replica FinalizedReplica, blk_1073749066_8242, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749066 for deletion
2025-07-14 18:39:17,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749066_8242 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749066
2025-07-14 18:40:16,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749067_8243 src: /192.168.158.7:59706 dest: /192.168.158.4:9866
2025-07-14 18:40:16,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59706, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2057335122_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749067_8243, duration(ns): 16300750
2025-07-14 18:40:16,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749067_8243, type=LAST_IN_PIPELINE terminating
2025-07-14 18:40:20,638 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749067_8243 replica FinalizedReplica, blk_1073749067_8243, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749067 for deletion
2025-07-14 18:40:20,640 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749067_8243 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749067
2025-07-14 18:42:16,059 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749069_8245 src: /192.168.158.1:51306 dest: /192.168.158.4:9866
2025-07-14 18:42:16,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51306, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-68917313_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749069_8245, duration(ns): 21587747
2025-07-14 18:42:16,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749069_8245, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-14 18:42:20,642 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749069_8245 replica FinalizedReplica, blk_1073749069_8245, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749069 for deletion
2025-07-14 18:42:20,643 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749069_8245 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749069
2025-07-14 18:44:16,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749071_8247 src: /192.168.158.5:49588 dest: /192.168.158.4:9866
2025-07-14 18:44:16,082 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49588, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-105035033_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749071_8247, duration(ns): 15401580
2025-07-14 18:44:16,082 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749071_8247, type=LAST_IN_PIPELINE terminating
2025-07-14 18:44:17,647 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749071_8247 replica FinalizedReplica, blk_1073749071_8247, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749071 for deletion
2025-07-14 18:44:17,648 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749071_8247 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749071
2025-07-14 18:48:16,047 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749075_8251 src: /192.168.158.9:42568 dest: /192.168.158.4:9866
2025-07-14 18:48:16,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42568, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-239903606_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749075_8251, duration(ns): 18960912
2025-07-14 18:48:16,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749075_8251, type=LAST_IN_PIPELINE terminating
2025-07-14 18:48:20,655 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749075_8251 replica FinalizedReplica, blk_1073749075_8251, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749075 for deletion
2025-07-14 18:48:20,656 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749075_8251 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749075
2025-07-14 18:49:21,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749076_8252 src: /192.168.158.1:45688 dest: /192.168.158.4:9866
2025-07-14 18:49:21,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45688, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-321388337_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749076_8252, duration(ns): 21575464
2025-07-14 18:49:21,104 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749076_8252, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-14 18:49:23,656 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749076_8252 replica FinalizedReplica, blk_1073749076_8252, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749076 for deletion
2025-07-14 18:49:23,657 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749076_8252 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749076
2025-07-14 18:50:21,078 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749077_8253 src: /192.168.158.9:53950 dest: /192.168.158.4:9866
2025-07-14 18:50:21,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53950, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1241214905_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749077_8253, duration(ns): 17057716
2025-07-14 18:50:21,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749077_8253, type=LAST_IN_PIPELINE terminating
2025-07-14 18:50:23,658 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749077_8253 replica FinalizedReplica, blk_1073749077_8253, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749077 for deletion
2025-07-14 18:50:23,660 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749077_8253 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749077
2025-07-14 18:51:21,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749078_8254 src: /192.168.158.7:56846 dest: /192.168.158.4:9866
2025-07-14 18:51:21,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56846, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1546004503_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749078_8254, duration(ns): 16140376
2025-07-14 18:51:21,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749078_8254, type=LAST_IN_PIPELINE terminating
2025-07-14 18:51:23,660 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749078_8254 replica FinalizedReplica, blk_1073749078_8254, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749078 for deletion
2025-07-14 18:51:23,661 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749078_8254 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749078
2025-07-14 18:54:26,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749081_8257 src: /192.168.158.1:50610 dest: /192.168.158.4:9866
2025-07-14 18:54:26,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50610, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1346620384_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749081_8257, duration(ns): 25457787
2025-07-14 18:54:26,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749081_8257, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-14 18:54:29,667 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749081_8257 replica FinalizedReplica, blk_1073749081_8257, FINALIZED getNumBytes() = 56
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749081 for deletion 2025-07-14 18:54:29,668 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749081_8257 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749081 2025-07-14 18:55:26,082 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749082_8258 src: /192.168.158.8:38076 dest: /192.168.158.4:9866 2025-07-14 18:55:26,100 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38076, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1054121508_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749082_8258, duration(ns): 16648541 2025-07-14 18:55:26,101 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749082_8258, type=LAST_IN_PIPELINE terminating 2025-07-14 18:55:32,668 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749082_8258 replica FinalizedReplica, blk_1073749082_8258, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749082 for deletion 2025-07-14 18:55:32,669 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749082_8258 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749082 2025-07-14 18:58:26,057 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749085_8261 src: /192.168.158.1:46318 dest: /192.168.158.4:9866 2025-07-14 18:58:26,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46318, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1833214847_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749085_8261, duration(ns): 24253871 2025-07-14 18:58:26,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749085_8261, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-14 18:58:32,674 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749085_8261 replica FinalizedReplica, blk_1073749085_8261, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749085 for deletion 2025-07-14 18:58:32,675 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749085_8261 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749085 2025-07-14 18:59:26,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749086_8262 src: /192.168.158.9:56052 dest: /192.168.158.4:9866 2025-07-14 18:59:26,119 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.9:56052, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-187174899_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749086_8262, duration(ns): 17775061 2025-07-14 18:59:26,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749086_8262, type=LAST_IN_PIPELINE terminating 2025-07-14 18:59:29,674 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749086_8262 replica FinalizedReplica, blk_1073749086_8262, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749086 for deletion 2025-07-14 18:59:29,676 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749086_8262 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749086 2025-07-14 19:00:26,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749087_8263 src: /192.168.158.8:47682 dest: /192.168.158.4:9866 2025-07-14 19:00:26,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47682, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-295567890_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749087_8263, duration(ns): 18385086 2025-07-14 19:00:26,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749087_8263, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=1:[192.168.158.6:9866] terminating 2025-07-14 19:00:29,676 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749087_8263 replica FinalizedReplica, blk_1073749087_8263, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749087 for deletion 2025-07-14 19:00:29,677 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749087_8263 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749087 2025-07-14 19:05:36,078 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749092_8268 src: /192.168.158.9:58736 dest: /192.168.158.4:9866 2025-07-14 19:05:36,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58736, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1410274197_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749092_8268, duration(ns): 21373281 2025-07-14 19:05:36,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749092_8268, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-14 19:05:38,690 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749092_8268 replica FinalizedReplica, blk_1073749092_8268, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749092 for deletion 2025-07-14 19:05:38,691 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749092_8268 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749092 2025-07-14 19:07:36,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749094_8270 src: /192.168.158.6:55722 dest: /192.168.158.4:9866 2025-07-14 19:07:36,108 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55722, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_775735205_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749094_8270, duration(ns): 15901832 2025-07-14 19:07:36,108 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749094_8270, type=LAST_IN_PIPELINE terminating 2025-07-14 19:07:38,695 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749094_8270 replica FinalizedReplica, blk_1073749094_8270, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749094 for deletion 2025-07-14 19:07:38,696 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749094_8270 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749094 2025-07-14 19:15:51,105 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749102_8278 src: /192.168.158.8:37832 dest: /192.168.158.4:9866 2025-07-14 19:15:51,123 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37832, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2045862084_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749102_8278, duration(ns): 15459024 2025-07-14 19:15:51,123 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749102_8278, type=LAST_IN_PIPELINE terminating 2025-07-14 19:15:53,718 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749102_8278 replica FinalizedReplica, blk_1073749102_8278, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749102 for deletion 2025-07-14 19:15:53,719 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749102_8278 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749102 2025-07-14 19:17:56,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749104_8280 src: /192.168.158.5:36808 dest: /192.168.158.4:9866 2025-07-14 19:17:56,141 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36808, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-330201290_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073749104_8280, duration(ns): 19403686 2025-07-14 19:17:56,141 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749104_8280, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-14 19:18:02,719 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749104_8280 replica FinalizedReplica, blk_1073749104_8280, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749104 for deletion 2025-07-14 19:18:02,720 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749104_8280 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749104 2025-07-14 19:21:01,114 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749107_8283 src: /192.168.158.8:45974 dest: /192.168.158.4:9866 2025-07-14 19:21:01,139 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45974, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1643377853_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749107_8283, duration(ns): 19971124 2025-07-14 19:21:01,139 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749107_8283, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-14 19:21:05,721 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073749107_8283 replica FinalizedReplica, blk_1073749107_8283, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749107 for deletion 2025-07-14 19:21:05,722 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749107_8283 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749107 2025-07-14 19:24:01,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749110_8286 src: /192.168.158.1:54916 dest: /192.168.158.4:9866 2025-07-14 19:24:01,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54916, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1888883691_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749110_8286, duration(ns): 24429141 2025-07-14 19:24:01,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749110_8286, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-14 19:24:05,728 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749110_8286 replica FinalizedReplica, blk_1073749110_8286, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749110 for deletion 2025-07-14 19:24:05,729 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749110_8286 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749110 2025-07-14 19:28:11,134 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749114_8290 src: /192.168.158.5:45894 dest: /192.168.158.4:9866 2025-07-14 19:28:11,158 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45894, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_556189534_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749114_8290, duration(ns): 18208810 2025-07-14 19:28:11,158 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749114_8290, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-14 19:28:14,741 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749114_8290 replica FinalizedReplica, blk_1073749114_8290, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749114 for deletion 2025-07-14 19:28:14,742 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749114_8290 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749114 2025-07-14 19:30:11,121 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749116_8292 src: 
/192.168.158.9:59592 dest: /192.168.158.4:9866 2025-07-14 19:30:11,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59592, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1645767959_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749116_8292, duration(ns): 19862064 2025-07-14 19:30:11,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749116_8292, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-14 19:30:17,745 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749116_8292 replica FinalizedReplica, blk_1073749116_8292, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749116 for deletion 2025-07-14 19:30:17,746 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749116_8292 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749116 2025-07-14 19:33:11,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749119_8295 src: /192.168.158.9:49206 dest: /192.168.158.4:9866 2025-07-14 19:33:11,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49206, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-297135369_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749119_8295, duration(ns): 19647339 2025-07-14 19:33:11,156 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749119_8295, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-14 19:33:14,751 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749119_8295 replica FinalizedReplica, blk_1073749119_8295, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749119 for deletion 2025-07-14 19:33:14,752 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749119_8295 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749119 2025-07-14 19:34:11,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749120_8296 src: /192.168.158.1:60408 dest: /192.168.158.4:9866 2025-07-14 19:34:11,176 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60408, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1637955853_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749120_8296, duration(ns): 23152481 2025-07-14 19:34:11,176 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749120_8296, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-14 19:34:14,752 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749120_8296 replica FinalizedReplica, blk_1073749120_8296, FINALIZED getNumBytes() = 
56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749120 for deletion 2025-07-14 19:34:14,755 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749120_8296 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749120 2025-07-14 19:35:11,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749121_8297 src: /192.168.158.6:55682 dest: /192.168.158.4:9866 2025-07-14 19:35:11,163 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55682, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2059313653_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749121_8297, duration(ns): 15635018 2025-07-14 19:35:11,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749121_8297, type=LAST_IN_PIPELINE terminating 2025-07-14 19:35:17,757 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749121_8297 replica FinalizedReplica, blk_1073749121_8297, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749121 for deletion 2025-07-14 19:35:17,758 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749121_8297 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749121 2025-07-14 19:36:16,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749122_8298 src: /192.168.158.9:37180 dest: /192.168.158.4:9866 2025-07-14 19:36:16,165 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37180, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1708215423_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749122_8298, duration(ns): 15883098 2025-07-14 19:36:16,165 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749122_8298, type=LAST_IN_PIPELINE terminating 2025-07-14 19:36:17,759 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749122_8298 replica FinalizedReplica, blk_1073749122_8298, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749122 for deletion 2025-07-14 19:36:17,761 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749122_8298 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749122 2025-07-14 19:37:16,140 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749123_8299 src: /192.168.158.9:34066 dest: /192.168.158.4:9866 2025-07-14 19:37:16,158 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34066, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1842325660_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749123_8299, duration(ns): 16129175
2025-07-14 19:37:16,159 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749123_8299, type=LAST_IN_PIPELINE terminating
2025-07-14 19:37:17,763 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749123_8299 replica FinalizedReplica, blk_1073749123_8299, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749123 for deletion
2025-07-14 19:37:17,764 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749123_8299 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749123
2025-07-14 19:39:16,135 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749125_8301 src: /192.168.158.1:38926 dest: /192.168.158.4:9866
2025-07-14 19:39:16,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38926, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1152922253_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749125_8301, duration(ns): 22913680
2025-07-14 19:39:16,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749125_8301, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-14 19:39:20,766 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749125_8301 replica FinalizedReplica, blk_1073749125_8301, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749125 for deletion
2025-07-14 19:39:20,767 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749125_8301 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749125
2025-07-14 19:41:16,140 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749127_8303 src: /192.168.158.1:35318 dest: /192.168.158.4:9866
2025-07-14 19:41:16,172 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35318, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1164035779_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749127_8303, duration(ns): 21538925
2025-07-14 19:41:16,172 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749127_8303, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-14 19:41:17,770 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749127_8303 replica FinalizedReplica, blk_1073749127_8303, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749127 for deletion
2025-07-14 19:41:17,771 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749127_8303 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749127
2025-07-14 19:42:16,148 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749128_8304 src: /192.168.158.1:34252 dest: /192.168.158.4:9866
2025-07-14 19:42:16,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34252, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1139936383_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749128_8304, duration(ns): 24270723
2025-07-14 19:42:16,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749128_8304, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-14 19:42:17,773 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749128_8304 replica FinalizedReplica, blk_1073749128_8304, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749128 for deletion
2025-07-14 19:42:17,774 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749128_8304 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749128
2025-07-14 19:43:21,159 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749129_8305 src: /192.168.158.9:54810 dest: /192.168.158.4:9866
2025-07-14 19:43:21,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54810, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1839637590_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749129_8305, duration(ns): 20388313
2025-07-14 19:43:21,185 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749129_8305, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 19:43:26,776 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749129_8305 replica FinalizedReplica, blk_1073749129_8305, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749129 for deletion
2025-07-14 19:43:26,777 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749129_8305 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749129
2025-07-14 19:44:21,139 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749130_8306 src: /192.168.158.1:46224 dest: /192.168.158.4:9866
2025-07-14 19:44:21,175 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46224, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-189416923_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749130_8306, duration(ns): 26347523
2025-07-14 19:44:21,175 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749130_8306, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-14 19:44:23,778 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749130_8306 replica FinalizedReplica, blk_1073749130_8306, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749130 for deletion
2025-07-14 19:44:23,779 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749130_8306 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749130
2025-07-14 19:45:21,140 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749131_8307 src: /192.168.158.1:41168 dest: /192.168.158.4:9866
2025-07-14 19:45:21,172 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41168, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-840485697_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749131_8307, duration(ns): 23316233
2025-07-14 19:45:21,173 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749131_8307, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-14 19:45:23,779 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749131_8307 replica FinalizedReplica, blk_1073749131_8307, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749131 for deletion
2025-07-14 19:45:23,781 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749131_8307 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749131
2025-07-14 19:46:21,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749132_8308 src: /192.168.158.5:47392 dest: /192.168.158.4:9866
2025-07-14 19:46:21,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47392, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1934746047_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749132_8308, duration(ns): 21560449
2025-07-14 19:46:21,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749132_8308, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-14 19:46:26,781 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749132_8308 replica FinalizedReplica, blk_1073749132_8308, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749132 for deletion
2025-07-14 19:46:26,782 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749132_8308 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749132
2025-07-14 19:47:21,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749133_8309 src: /192.168.158.5:51674 dest: /192.168.158.4:9866
2025-07-14 19:47:21,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51674, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1797677849_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749133_8309, duration(ns): 18710704
2025-07-14 19:47:21,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749133_8309, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-14 19:47:23,782 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749133_8309 replica FinalizedReplica, blk_1073749133_8309, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749133 for deletion
2025-07-14 19:47:23,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749133_8309 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749133
2025-07-14 19:48:21,153 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749134_8310 src: /192.168.158.6:55234 dest: /192.168.158.4:9866
2025-07-14 19:48:21,172 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55234, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_375042536_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749134_8310, duration(ns): 16873843
2025-07-14 19:48:21,172 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749134_8310, type=LAST_IN_PIPELINE terminating
2025-07-14 19:48:26,782 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749134_8310 replica FinalizedReplica, blk_1073749134_8310, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749134 for deletion
2025-07-14 19:48:26,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749134_8310 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749134
2025-07-14 19:51:31,153 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749137_8313 src: /192.168.158.1:34868 dest: /192.168.158.4:9866
2025-07-14 19:51:31,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34868, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-216485811_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749137_8313, duration(ns): 22227718
2025-07-14 19:51:31,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749137_8313, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-14 19:51:32,788 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749137_8313 replica FinalizedReplica, blk_1073749137_8313, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749137 for deletion
2025-07-14 19:51:32,789 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749137_8313 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749137
2025-07-14 19:52:31,152 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749138_8314 src: /192.168.158.1:56724 dest: /192.168.158.4:9866
2025-07-14 19:52:31,186 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56724, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_856599925_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749138_8314, duration(ns): 25126420
2025-07-14 19:52:31,186 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749138_8314, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-14 19:52:32,792 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749138_8314 replica FinalizedReplica, blk_1073749138_8314, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749138 for deletion
2025-07-14 19:52:32,793 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749138_8314 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749138
2025-07-14 19:56:31,159 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749142_8318 src: /192.168.158.1:55448 dest: /192.168.158.4:9866
2025-07-14 19:56:31,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55448, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-734513525_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749142_8318, duration(ns): 21806272
2025-07-14 19:56:31,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749142_8318, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-14 19:56:32,801 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749142_8318 replica FinalizedReplica, blk_1073749142_8318, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749142 for deletion
2025-07-14 19:56:32,802 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749142_8318 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749142
2025-07-14 19:57:31,166 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749143_8319 src: /192.168.158.8:55560 dest: /192.168.158.4:9866
2025-07-14 19:57:31,191 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55560, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1884069203_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749143_8319, duration(ns): 19651108
2025-07-14 19:57:31,191 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749143_8319, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-14 19:57:32,802 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749143_8319 replica FinalizedReplica, blk_1073749143_8319, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749143 for deletion
2025-07-14 19:57:32,804 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749143_8319 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749143
2025-07-14 19:58:31,162 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749144_8320 src: /192.168.158.1:40256 dest: /192.168.158.4:9866
2025-07-14 19:58:31,195 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40256, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-600770802_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749144_8320, duration(ns): 23594893
2025-07-14 19:58:31,195 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749144_8320, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-14 19:58:32,804 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749144_8320 replica FinalizedReplica, blk_1073749144_8320, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749144 for deletion
2025-07-14 19:58:32,806 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749144_8320 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749144
2025-07-14 20:00:36,165 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749146_8322 src: /192.168.158.9:57376 dest: /192.168.158.4:9866
2025-07-14 20:00:36,192 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57376, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1553261638_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749146_8322, duration(ns): 21438972
2025-07-14 20:00:36,192 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749146_8322, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-14 20:00:41,808 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749146_8322 replica FinalizedReplica, blk_1073749146_8322, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749146 for deletion
2025-07-14 20:00:41,809 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749146_8322 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749146
2025-07-14 20:03:41,175 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749149_8325 src: /192.168.158.8:56456 dest: /192.168.158.4:9866
2025-07-14 20:03:41,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56456, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1055126478_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749149_8325, duration(ns): 20510943
2025-07-14 20:03:41,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749149_8325, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-14 20:03:44,813 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749149_8325 replica FinalizedReplica, blk_1073749149_8325, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749149 for deletion
2025-07-14 20:03:44,814 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749149_8325 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749149
2025-07-14 20:08:41,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749154_8330 src: /192.168.158.1:49408 dest: /192.168.158.4:9866
2025-07-14 20:08:41,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49408, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1366648208_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749154_8330, duration(ns): 22374157
2025-07-14 20:08:41,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749154_8330, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-14 20:08:47,823 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749154_8330 replica FinalizedReplica, blk_1073749154_8330, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749154 for deletion
2025-07-14 20:08:47,824 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749154_8330 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749154
2025-07-14 20:10:41,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749156_8332 src: /192.168.158.6:35990 dest: /192.168.158.4:9866
2025-07-14 20:10:41,215 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35990, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1119633350_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749156_8332, duration(ns): 15505800
2025-07-14 20:10:41,215 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749156_8332, type=LAST_IN_PIPELINE terminating
2025-07-14 20:10:47,828 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749156_8332 replica FinalizedReplica, blk_1073749156_8332, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749156 for deletion
2025-07-14 20:10:47,830 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749156_8332 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749156
2025-07-14 20:14:46,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749160_8336 src: /192.168.158.1:36056 dest: /192.168.158.4:9866
2025-07-14 20:14:46,225 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36056, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2091914027_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749160_8336, duration(ns): 23109048
2025-07-14 20:14:46,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749160_8336, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-14 20:14:50,838 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749160_8336 replica FinalizedReplica, blk_1073749160_8336, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749160 for deletion
2025-07-14 20:14:50,839 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749160_8336 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749160
2025-07-14 20:15:46,192 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749161_8337 src: /192.168.158.1:41168 dest: /192.168.158.4:9866
2025-07-14 20:15:46,224 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41168, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_732748199_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749161_8337, duration(ns): 21971713
2025-07-14 20:15:46,224 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749161_8337, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-14 20:15:50,841 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749161_8337 replica FinalizedReplica, blk_1073749161_8337, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749161 for deletion
2025-07-14 20:15:50,842 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749161_8337 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749161
2025-07-14 20:18:51,196 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749164_8340 src: /192.168.158.1:55802 dest: /192.168.158.4:9866
2025-07-14 20:18:51,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55802, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_323850024_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749164_8340, duration(ns): 21274094
2025-07-14 20:18:51,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749164_8340, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-14 20:18:53,849 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749164_8340 replica FinalizedReplica, blk_1073749164_8340, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749164 for deletion
2025-07-14 20:18:53,850 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749164_8340 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749164
2025-07-14 20:19:51,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749165_8341 src: /192.168.158.6:40434 dest: /192.168.158.4:9866
2025-07-14 20:19:51,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40434, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1651821465_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749165_8341, duration(ns): 20689087
2025-07-14 20:19:51,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749165_8341, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-14 20:19:53,850 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749165_8341 replica FinalizedReplica, blk_1073749165_8341, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749165 for deletion
2025-07-14 20:19:53,852 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749165_8341 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749165
2025-07-14 20:20:51,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749166_8342 src: /192.168.158.1:43736 dest: /192.168.158.4:9866
2025-07-14 20:20:51,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43736, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-22141672_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749166_8342, duration(ns): 25388664
2025-07-14 20:20:51,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749166_8342, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-14 20:20:53,853 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749166_8342 replica FinalizedReplica, blk_1073749166_8342, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749166 for deletion
2025-07-14 20:20:53,854 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749166_8342 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749166
2025-07-14 20:22:51,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749168_8344 src: /192.168.158.1:36526 dest: /192.168.158.4:9866
2025-07-14 20:22:51,261 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36526, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_37218382_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749168_8344, duration(ns): 41559060
2025-07-14 20:22:51,261 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749168_8344, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-14 20:22:53,856 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749168_8344 replica FinalizedReplica, blk_1073749168_8344, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749168 for deletion
2025-07-14 20:22:53,857 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749168_8344 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749168
2025-07-14 20:29:01,217 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749174_8350 src: /192.168.158.9:56900 dest: /192.168.158.4:9866
2025-07-14 20:29:01,244 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56900, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1469628927_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749174_8350, duration(ns): 20909175
2025-07-14 20:29:01,244 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749174_8350, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-14 20:29:02,866 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749174_8350 replica FinalizedReplica, blk_1073749174_8350, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749174 for deletion
2025-07-14 20:29:02,867 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749174_8350 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749174
2025-07-14 20:31:01,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749176_8352 src: /192.168.158.5:59496 dest: /192.168.158.4:9866
2025-07-14 20:31:01,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59496, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1274724312_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749176_8352, duration(ns): 16262643
2025-07-14 20:31:01,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749176_8352, type=LAST_IN_PIPELINE terminating
2025-07-14 20:31:02,873 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749176_8352 replica FinalizedReplica, blk_1073749176_8352, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749176 for deletion
2025-07-14 20:31:02,874 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749176_8352 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749176 2025-07-14 20:32:06,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749177_8353 src: /192.168.158.6:52116 dest: /192.168.158.4:9866 2025-07-14 20:32:06,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52116, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-201394854_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749177_8353, duration(ns): 15627928 2025-07-14 20:32:06,244 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749177_8353, type=LAST_IN_PIPELINE terminating 2025-07-14 20:32:11,875 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749177_8353 replica FinalizedReplica, blk_1073749177_8353, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749177 for deletion 2025-07-14 20:32:11,876 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749177_8353 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749177 2025-07-14 20:36:06,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749181_8357 src: /192.168.158.1:54948 dest: /192.168.158.4:9866 2025-07-14 
20:36:06,289 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54948, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-90417168_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749181_8357, duration(ns): 23418644 2025-07-14 20:36:06,290 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749181_8357, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-14 20:36:08,879 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749181_8357 replica FinalizedReplica, blk_1073749181_8357, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749181 for deletion 2025-07-14 20:36:08,881 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749181_8357 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749181 2025-07-14 20:38:11,242 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749183_8359 src: /192.168.158.1:54440 dest: /192.168.158.4:9866 2025-07-14 20:38:11,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54440, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-834614163_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749183_8359, duration(ns): 22756296 2025-07-14 20:38:11,274 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749183_8359, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-14 20:38:14,885 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749183_8359 replica FinalizedReplica, blk_1073749183_8359, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749183 for deletion 2025-07-14 20:38:14,886 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749183_8359 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749183 2025-07-14 20:39:11,240 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749184_8360 src: /192.168.158.8:35000 dest: /192.168.158.4:9866 2025-07-14 20:39:11,265 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35000, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1096725914_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749184_8360, duration(ns): 19910796 2025-07-14 20:39:11,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749184_8360, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-14 20:39:14,888 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749184_8360 replica FinalizedReplica, blk_1073749184_8360, FINALIZED getNumBytes() = 
56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749184 for deletion 2025-07-14 20:39:14,888 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749184_8360 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749184 2025-07-14 20:40:16,232 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749185_8361 src: /192.168.158.1:48754 dest: /192.168.158.4:9866 2025-07-14 20:40:16,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48754, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1998237594_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749185_8361, duration(ns): 25285470 2025-07-14 20:40:16,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749185_8361, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-14 20:40:17,890 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749185_8361 replica FinalizedReplica, blk_1073749185_8361, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749185 for deletion 2025-07-14 20:40:17,892 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749185_8361 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749185 2025-07-14 20:41:16,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749186_8362 src: /192.168.158.1:45522 dest: /192.168.158.4:9866 2025-07-14 20:41:16,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45522, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2063488635_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749186_8362, duration(ns): 21320124 2025-07-14 20:41:16,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749186_8362, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-14 20:41:20,893 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749186_8362 replica FinalizedReplica, blk_1073749186_8362, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749186 for deletion 2025-07-14 20:41:20,894 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749186_8362 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749186 2025-07-14 20:43:21,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749188_8364 src: /192.168.158.8:49688 dest: /192.168.158.4:9866 2025-07-14 20:43:21,276 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.8:49688, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1732514060_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749188_8364, duration(ns): 16629405 2025-07-14 20:43:21,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749188_8364, type=LAST_IN_PIPELINE terminating 2025-07-14 20:43:26,898 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749188_8364 replica FinalizedReplica, blk_1073749188_8364, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749188 for deletion 2025-07-14 20:43:26,899 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749188_8364 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749188 2025-07-14 20:46:31,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749191_8367 src: /192.168.158.1:51652 dest: /192.168.158.4:9866 2025-07-14 20:46:31,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51652, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1309025041_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749191_8367, duration(ns): 22684705 2025-07-14 20:46:31,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749191_8367, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-14 20:46:32,902 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749191_8367 replica FinalizedReplica, blk_1073749191_8367, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749191 for deletion 2025-07-14 20:46:32,903 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749191_8367 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749191 2025-07-14 20:47:31,259 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749192_8368 src: /192.168.158.1:56072 dest: /192.168.158.4:9866 2025-07-14 20:47:31,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56072, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1180749549_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749192_8368, duration(ns): 20057868 2025-07-14 20:47:31,288 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749192_8368, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-14 20:47:32,906 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749192_8368 replica FinalizedReplica, blk_1073749192_8368, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749192 for deletion 2025-07-14 20:47:32,907 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749192_8368 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749192 2025-07-14 20:48:31,265 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749193_8369 src: /192.168.158.7:42784 dest: /192.168.158.4:9866 2025-07-14 20:48:31,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42784, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-406970865_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749193_8369, duration(ns): 21604191 2025-07-14 20:48:31,293 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749193_8369, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-14 20:48:32,907 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749193_8369 replica FinalizedReplica, blk_1073749193_8369, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749193 for deletion 2025-07-14 20:48:32,909 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749193_8369 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749193 
2025-07-14 20:49:31,261 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749194_8370 src: /192.168.158.8:51246 dest: /192.168.158.4:9866 2025-07-14 20:49:31,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51246, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1199573779_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749194_8370, duration(ns): 19192790 2025-07-14 20:49:31,286 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749194_8370, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-14 20:49:32,910 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749194_8370 replica FinalizedReplica, blk_1073749194_8370, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749194 for deletion 2025-07-14 20:49:32,911 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749194_8370 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749194 2025-07-14 20:51:31,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749196_8372 src: /192.168.158.7:52434 dest: /192.168.158.4:9866 2025-07-14 20:51:31,295 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52434, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1186082413_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749196_8372, duration(ns): 18467222 2025-07-14 20:51:31,295 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749196_8372, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-14 20:51:32,912 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749196_8372 replica FinalizedReplica, blk_1073749196_8372, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749196 for deletion 2025-07-14 20:51:32,913 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749196_8372 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749196 2025-07-14 20:54:36,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749199_8375 src: /192.168.158.1:49534 dest: /192.168.158.4:9866 2025-07-14 20:54:36,301 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49534, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-94522721_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749199_8375, duration(ns): 23128045 2025-07-14 20:54:36,301 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749199_8375, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-14 20:54:41,921 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749199_8375 replica FinalizedReplica, blk_1073749199_8375, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749199 for deletion 2025-07-14 20:54:41,922 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749199_8375 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749199 2025-07-14 20:58:41,279 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749203_8379 src: /192.168.158.7:34370 dest: /192.168.158.4:9866 2025-07-14 20:58:41,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34370, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1936805604_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749203_8379, duration(ns): 20648835 2025-07-14 20:58:41,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749203_8379, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-14 20:58:47,927 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749203_8379 replica FinalizedReplica, blk_1073749203_8379, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749203 for deletion 2025-07-14 20:58:47,928 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749203_8379 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749203 2025-07-14 20:59:41,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749204_8380 src: /192.168.158.5:42760 dest: /192.168.158.4:9866 2025-07-14 20:59:41,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42760, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_581892424_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749204_8380, duration(ns): 19852457 2025-07-14 20:59:41,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749204_8380, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-14 20:59:47,927 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749204_8380 replica FinalizedReplica, blk_1073749204_8380, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749204 for deletion 2025-07-14 20:59:47,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749204_8380 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749204 2025-07-14 21:01:41,289 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749206_8382 src: 
/192.168.158.9:56656 dest: /192.168.158.4:9866 2025-07-14 21:01:41,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56656, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-462217427_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749206_8382, duration(ns): 18573846 2025-07-14 21:01:41,314 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749206_8382, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-14 21:01:47,932 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749206_8382 replica FinalizedReplica, blk_1073749206_8382, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749206 for deletion 2025-07-14 21:01:47,933 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749206_8382 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749206 2025-07-14 21:02:41,315 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749207_8383 src: /192.168.158.5:40928 dest: /192.168.158.4:9866 2025-07-14 21:02:41,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40928, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1309643776_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749207_8383, duration(ns): 18616533 2025-07-14 21:02:41,336 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749207_8383, type=LAST_IN_PIPELINE terminating
2025-07-14 21:02:44,934 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749207_8383 replica FinalizedReplica, blk_1073749207_8383, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749207 for deletion
2025-07-14 21:02:44,935 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749207_8383 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749207
2025-07-14 21:04:51,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749209_8385 src: /192.168.158.6:56474 dest: /192.168.158.4:9866
2025-07-14 21:04:51,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56474, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1791864950_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749209_8385, duration(ns): 15396856
2025-07-14 21:04:51,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749209_8385, type=LAST_IN_PIPELINE terminating
2025-07-14 21:04:56,937 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749209_8385 replica FinalizedReplica, blk_1073749209_8385, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749209 for deletion
2025-07-14 21:04:56,938 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749209_8385 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749209
2025-07-14 21:06:51,293 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749211_8387 src: /192.168.158.6:54530 dest: /192.168.158.4:9866
2025-07-14 21:06:51,320 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54530, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_112847992_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749211_8387, duration(ns): 20052995
2025-07-14 21:06:51,320 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749211_8387, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-14 21:06:53,940 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749211_8387 replica FinalizedReplica, blk_1073749211_8387, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749211 for deletion
2025-07-14 21:06:53,941 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749211_8387 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749211
2025-07-14 21:07:56,298 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749212_8388 src: /192.168.158.6:35476 dest: /192.168.158.4:9866
2025-07-14 21:07:56,324 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35476, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1950111549_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749212_8388, duration(ns): 20621358
2025-07-14 21:07:56,324 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749212_8388, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-14 21:08:02,939 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749212_8388 replica FinalizedReplica, blk_1073749212_8388, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749212 for deletion
2025-07-14 21:08:02,941 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749212_8388 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749212
2025-07-14 21:10:06,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749214_8390 src: /192.168.158.8:34854 dest: /192.168.158.4:9866
2025-07-14 21:10:06,318 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34854, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_118610680_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749214_8390, duration(ns): 13696513
2025-07-14 21:10:06,318 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749214_8390, type=LAST_IN_PIPELINE terminating
2025-07-14 21:10:08,945 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749214_8390 replica FinalizedReplica, blk_1073749214_8390, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749214 for deletion
2025-07-14 21:10:08,946 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749214_8390 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749214
2025-07-14 21:16:16,307 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749220_8396 src: /192.168.158.8:40920 dest: /192.168.158.4:9866
2025-07-14 21:16:16,332 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40920, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1961861384_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749220_8396, duration(ns): 19309064
2025-07-14 21:16:16,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749220_8396, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 21:16:17,971 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749220_8396 replica FinalizedReplica, blk_1073749220_8396, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749220 for deletion
2025-07-14 21:16:17,972 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749220_8396 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749220
2025-07-14 21:21:26,319 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749225_8401 src: /192.168.158.9:54740 dest: /192.168.158.4:9866
2025-07-14 21:21:26,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54740, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1985975645_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749225_8401, duration(ns): 18290626
2025-07-14 21:21:26,340 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749225_8401, type=LAST_IN_PIPELINE terminating
2025-07-14 21:21:29,978 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749225_8401 replica FinalizedReplica, blk_1073749225_8401, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749225 for deletion
2025-07-14 21:21:29,979 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749225_8401 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749225
2025-07-14 21:24:31,319 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749228_8404 src: /192.168.158.6:42346 dest: /192.168.158.4:9866
2025-07-14 21:24:31,337 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42346, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_117183830_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749228_8404, duration(ns): 15497295
2025-07-14 21:24:31,337 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749228_8404, type=LAST_IN_PIPELINE terminating
2025-07-14 21:24:32,980 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749228_8404 replica FinalizedReplica, blk_1073749228_8404, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749228 for deletion
2025-07-14 21:24:32,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749228_8404 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749228
2025-07-14 21:28:41,326 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749232_8408 src: /192.168.158.7:56408 dest: /192.168.158.4:9866
2025-07-14 21:28:41,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56408, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_633588039_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749232_8408, duration(ns): 17483163
2025-07-14 21:28:41,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749232_8408, type=LAST_IN_PIPELINE terminating
2025-07-14 21:28:47,992 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749232_8408 replica FinalizedReplica, blk_1073749232_8408, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749232 for deletion
2025-07-14 21:28:47,993 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749232_8408 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749232
2025-07-14 21:31:51,324 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749235_8411 src: /192.168.158.8:50086 dest: /192.168.158.4:9866
2025-07-14 21:31:51,343 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50086, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-196900743_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749235_8411, duration(ns): 16345375
2025-07-14 21:31:51,343 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749235_8411, type=LAST_IN_PIPELINE terminating
2025-07-14 21:31:56,999 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749235_8411 replica FinalizedReplica, blk_1073749235_8411, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749235 for deletion
2025-07-14 21:31:57,000 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749235_8411 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749235
2025-07-14 21:39:56,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749243_8419 src: /192.168.158.8:44386 dest: /192.168.158.4:9866
2025-07-14 21:39:56,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44386, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-514495208_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749243_8419, duration(ns): 14838832
2025-07-14 21:39:56,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749243_8419, type=LAST_IN_PIPELINE terminating
2025-07-14 21:40:00,008 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749243_8419 replica FinalizedReplica, blk_1073749243_8419, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749243 for deletion
2025-07-14 21:40:00,009 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749243_8419 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749243
2025-07-14 21:42:56,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749246_8422 src: /192.168.158.9:45706 dest: /192.168.158.4:9866
2025-07-14 21:42:56,369 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45706, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1080582545_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749246_8422, duration(ns): 19446063
2025-07-14 21:42:56,369 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749246_8422, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-14 21:43:03,013 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749246_8422 replica FinalizedReplica, blk_1073749246_8422, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749246 for deletion
2025-07-14 21:43:03,015 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749246_8422 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749246
2025-07-14 21:44:01,343 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749247_8423 src: /192.168.158.5:58874 dest: /192.168.158.4:9866
2025-07-14 21:44:01,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58874, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1226586927_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749247_8423, duration(ns): 17502220
2025-07-14 21:44:01,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749247_8423, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-14 21:44:03,015 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749247_8423 replica FinalizedReplica, blk_1073749247_8423, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749247 for deletion
2025-07-14 21:44:03,017 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749247_8423 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073749247
2025-07-14 21:46:01,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749249_8425 src: /192.168.158.8:37412 dest: /192.168.158.4:9866
2025-07-14 21:46:01,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37412, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-270687728_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749249_8425, duration(ns): 17824072
2025-07-14 21:46:01,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749249_8425, type=LAST_IN_PIPELINE terminating
2025-07-14 21:46:03,020 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749249_8425 replica FinalizedReplica, blk_1073749249_8425, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749249 for deletion
2025-07-14 21:46:03,021 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749249_8425 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749249
2025-07-14 21:47:01,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749250_8426 src: /192.168.158.1:46714 dest: /192.168.158.4:9866
2025-07-14 21:47:01,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46714, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1356784956_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749250_8426, duration(ns): 25026850
2025-07-14 21:47:01,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749250_8426, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-14 21:47:06,022 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749250_8426 replica FinalizedReplica, blk_1073749250_8426, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749250 for deletion
2025-07-14 21:47:06,023 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749250_8426 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749250
2025-07-14 21:49:06,350 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749252_8428 src: /192.168.158.1:34996 dest: /192.168.158.4:9866
2025-07-14 21:49:06,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34996, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1349094204_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749252_8428, duration(ns): 24147657
2025-07-14 21:49:06,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749252_8428, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-14 21:49:09,023 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749252_8428 replica FinalizedReplica, blk_1073749252_8428, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749252 for deletion
2025-07-14 21:49:09,025 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749252_8428 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749252
2025-07-14 21:51:06,353 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749254_8430 src: /192.168.158.8:35800 dest: /192.168.158.4:9866
2025-07-14 21:51:06,377 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35800, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1596427856_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749254_8430, duration(ns): 19536158
2025-07-14 21:51:06,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749254_8430, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-14 21:51:09,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749254_8430 replica FinalizedReplica, blk_1073749254_8430, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749254 for deletion
2025-07-14 21:51:09,032 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749254_8430 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749254
2025-07-14 21:52:06,362 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749255_8431 src: /192.168.158.8:50528 dest: /192.168.158.4:9866
2025-07-14 21:52:06,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50528, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-843346533_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749255_8431, duration(ns): 15179893
2025-07-14 21:52:06,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749255_8431, type=LAST_IN_PIPELINE terminating
2025-07-14 21:52:09,035 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749255_8431 replica FinalizedReplica, blk_1073749255_8431, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749255 for deletion
2025-07-14 21:52:09,036 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749255_8431 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749255
2025-07-14 21:53:11,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749256_8432 src: /192.168.158.5:57770 dest: /192.168.158.4:9866
2025-07-14 21:53:11,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57770, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-506762026_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749256_8432, duration(ns): 20692684
2025-07-14 21:53:11,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749256_8432, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-14 21:53:18,038 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749256_8432 replica FinalizedReplica, blk_1073749256_8432, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749256 for deletion
2025-07-14 21:53:18,039 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749256_8432 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749256
2025-07-14 21:54:11,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749257_8433 src: /192.168.158.6:33748 dest: /192.168.158.4:9866
2025-07-14 21:54:11,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33748, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1007055920_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749257_8433, duration(ns): 18903577
2025-07-14 21:54:11,389 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749257_8433, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-14 21:54:15,038 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749257_8433 replica FinalizedReplica, blk_1073749257_8433, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749257 for deletion
2025-07-14 21:54:15,040 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749257_8433 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749257
2025-07-14 21:56:11,369 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749259_8435 src: /192.168.158.6:54472 dest: /192.168.158.4:9866
2025-07-14 21:56:11,387 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1306956107_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749259_8435, duration(ns): 16049893
2025-07-14 21:56:11,387 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749259_8435, type=LAST_IN_PIPELINE terminating
2025-07-14 21:56:15,042 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749259_8435 replica FinalizedReplica, blk_1073749259_8435, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749259 for deletion
2025-07-14 21:56:15,043 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749259_8435 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749259
2025-07-14 21:57:11,377 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749260_8436 src: /192.168.158.7:53286 dest: /192.168.158.4:9866
2025-07-14 21:57:11,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53286, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-710044412_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749260_8436, duration(ns): 17438651
2025-07-14 21:57:11,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749260_8436, type=LAST_IN_PIPELINE terminating
2025-07-14 21:57:18,044 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749260_8436 replica FinalizedReplica, blk_1073749260_8436, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749260 for deletion
2025-07-14 21:57:18,045 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749260_8436 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749260
2025-07-14 21:58:11,369 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749261_8437 src: /192.168.158.6:57104 dest: /192.168.158.4:9866
2025-07-14 21:58:11,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57104, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1958886513_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749261_8437, duration(ns): 21322251
2025-07-14 21:58:11,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749261_8437, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-14 21:58:15,049 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749261_8437 replica FinalizedReplica, blk_1073749261_8437, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749261 for deletion
2025-07-14 21:58:15,050 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749261_8437 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749261
2025-07-14 22:00:11,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749263_8439 src: /192.168.158.1:37896 dest: /192.168.158.4:9866
2025-07-14 22:00:11,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37896, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1095637830_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749263_8439, duration(ns): 23824246
2025-07-14 22:00:11,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749263_8439, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-14 22:00:18,053 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749263_8439 replica FinalizedReplica, blk_1073749263_8439, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749263 for deletion
2025-07-14 22:00:18,054 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749263_8439 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749263
2025-07-14 22:01:16,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749264_8440 src: /192.168.158.9:46536 dest: /192.168.158.4:9866
2025-07-14 22:01:16,434 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46536, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1948627775_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749264_8440, duration(ns): 18404417
2025-07-14 22:01:16,435 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749264_8440, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-14 22:01:21,055 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749264_8440 replica FinalizedReplica, blk_1073749264_8440, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749264 for deletion
2025-07-14 22:01:21,056 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749264_8440 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749264
2025-07-14 22:05:21,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749268_8444 src: /192.168.158.9:55480 dest: /192.168.158.4:9866
2025-07-14 22:05:21,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55480, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1466161380_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749268_8444, duration(ns): 19262592
2025-07-14 22:05:21,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749268_8444, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-14 22:05:24,067 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749268_8444 replica FinalizedReplica, blk_1073749268_8444, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749268 for deletion
2025-07-14 22:05:24,068 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749268_8444 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749268
2025-07-14 22:07:26,390 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749270_8446 src: /192.168.158.5:52526 dest: /192.168.158.4:9866
2025-07-14 22:07:26,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52526, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_250803622_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749270_8446, duration(ns): 20262131
2025-07-14 22:07:26,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749270_8446, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-14 22:07:30,067 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749270_8446 replica FinalizedReplica, blk_1073749270_8446, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749270 for deletion
2025-07-14 22:07:30,068 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749270_8446 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749270
2025-07-14 22:09:26,390 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749272_8448 src: /192.168.158.1:39732 dest: /192.168.158.4:9866
2025-07-14 22:09:26,425 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39732, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-323615555_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749272_8448, duration(ns): 26607918
2025-07-14 22:09:26,425 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749272_8448, type=HAS_DOWNSTREAM_IN_PIPELINE,
downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-14 22:09:30,070 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749272_8448 replica FinalizedReplica, blk_1073749272_8448, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749272 for deletion 2025-07-14 22:09:30,071 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749272_8448 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749272 2025-07-14 22:10:26,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749273_8449 src: /192.168.158.8:35834 dest: /192.168.158.4:9866 2025-07-14 22:10:26,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35834, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1553782326_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749273_8449, duration(ns): 15680378 2025-07-14 22:10:26,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749273_8449, type=LAST_IN_PIPELINE terminating 2025-07-14 22:10:33,072 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749273_8449 replica FinalizedReplica, blk_1073749273_8449, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749273 for deletion 2025-07-14 22:10:33,073 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749273_8449 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749273 2025-07-14 22:11:31,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749274_8450 src: /192.168.158.9:38456 dest: /192.168.158.4:9866 2025-07-14 22:11:31,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38456, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1278641396_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749274_8450, duration(ns): 13785796 2025-07-14 22:11:31,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749274_8450, type=LAST_IN_PIPELINE terminating 2025-07-14 22:11:36,073 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749274_8450 replica FinalizedReplica, blk_1073749274_8450, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749274 for deletion 2025-07-14 22:11:36,074 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749274_8450 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749274 2025-07-14 22:15:31,390 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749278_8454 src: /192.168.158.1:56904 dest: /192.168.158.4:9866 2025-07-14 22:15:31,421 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56904, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1885199906_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749278_8454, duration(ns): 22230264 2025-07-14 22:15:31,421 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749278_8454, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-14 22:15:33,081 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749278_8454 replica FinalizedReplica, blk_1073749278_8454, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749278 for deletion 2025-07-14 22:15:33,082 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749278_8454 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749278 2025-07-14 22:18:36,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749281_8457 src: /192.168.158.1:48042 dest: /192.168.158.4:9866 2025-07-14 22:18:36,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48042, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_905729404_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749281_8457, duration(ns): 22278664 2025-07-14 22:18:36,430 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749281_8457, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-14 22:18:39,088 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749281_8457 replica FinalizedReplica, blk_1073749281_8457, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749281 for deletion 2025-07-14 22:18:39,089 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749281_8457 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749281 2025-07-14 22:19:36,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749282_8458 src: /192.168.158.7:45030 dest: /192.168.158.4:9866 2025-07-14 22:19:36,424 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45030, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-346225154_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749282_8458, duration(ns): 19001003 2025-07-14 22:19:36,424 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749282_8458, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-14 22:19:39,093 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749282_8458 replica FinalizedReplica, blk_1073749282_8458, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749282 for deletion 2025-07-14 22:19:39,094 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749282_8458 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749282 2025-07-14 22:20:36,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749283_8459 src: /192.168.158.1:52482 dest: /192.168.158.4:9866 2025-07-14 22:20:36,434 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52482, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-881131922_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749283_8459, duration(ns): 22758239 2025-07-14 22:20:36,435 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749283_8459, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-14 22:20:39,093 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749283_8459 replica FinalizedReplica, blk_1073749283_8459, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749283 for deletion 2025-07-14 
22:20:39,095 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749283_8459 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749283 2025-07-14 22:23:46,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749286_8462 src: /192.168.158.1:34000 dest: /192.168.158.4:9866 2025-07-14 22:23:46,427 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34000, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_913770470_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749286_8462, duration(ns): 20345783 2025-07-14 22:23:46,427 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749286_8462, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-14 22:23:48,101 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749286_8462 replica FinalizedReplica, blk_1073749286_8462, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749286 for deletion 2025-07-14 22:23:48,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749286_8462 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749286 2025-07-14 22:26:46,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073749289_8465 src: /192.168.158.1:56038 dest: /192.168.158.4:9866 2025-07-14 22:26:46,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56038, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1435107017_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749289_8465, duration(ns): 23784141 2025-07-14 22:26:46,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749289_8465, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-14 22:26:48,106 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749289_8465 replica FinalizedReplica, blk_1073749289_8465, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749289 for deletion 2025-07-14 22:26:48,107 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749289_8465 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749289 2025-07-14 22:27:46,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749290_8466 src: /192.168.158.1:45898 dest: /192.168.158.4:9866 2025-07-14 22:27:46,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-497853494_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073749290_8466, duration(ns): 24311068 2025-07-14 22:27:46,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749290_8466, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-14 22:27:48,109 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749290_8466 replica FinalizedReplica, blk_1073749290_8466, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749290 for deletion 2025-07-14 22:27:48,110 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749290_8466 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749290 2025-07-14 22:29:51,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749292_8468 src: /192.168.158.1:46012 dest: /192.168.158.4:9866 2025-07-14 22:29:51,447 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46012, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_967819516_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749292_8468, duration(ns): 21869007 2025-07-14 22:29:51,447 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749292_8468, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-14 22:29:57,112 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749292_8468 replica FinalizedReplica, blk_1073749292_8468, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749292 for deletion 2025-07-14 22:29:57,114 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749292_8468 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749292 2025-07-14 22:32:56,423 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749295_8471 src: /192.168.158.1:32836 dest: /192.168.158.4:9866 2025-07-14 22:32:56,455 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:32836, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1370658958_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749295_8471, duration(ns): 22984399 2025-07-14 22:32:56,455 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749295_8471, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-14 22:33:03,122 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749295_8471 replica FinalizedReplica, blk_1073749295_8471, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749295 for deletion 2025-07-14 
22:33:03,123 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749295_8471 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749295 2025-07-14 22:35:06,428 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749297_8473 src: /192.168.158.6:52808 dest: /192.168.158.4:9866 2025-07-14 22:35:06,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52808, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1024699911_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749297_8473, duration(ns): 15017802 2025-07-14 22:35:06,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749297_8473, type=LAST_IN_PIPELINE terminating 2025-07-14 22:35:12,125 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749297_8473 replica FinalizedReplica, blk_1073749297_8473, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749297 for deletion 2025-07-14 22:35:12,127 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749297_8473 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749297 2025-07-14 22:38:06,432 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749300_8476 src: /192.168.158.6:47530 dest: 
/192.168.158.4:9866 2025-07-14 22:38:06,450 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47530, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1855158256_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749300_8476, duration(ns): 15951078 2025-07-14 22:38:06,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749300_8476, type=LAST_IN_PIPELINE terminating 2025-07-14 22:38:09,135 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749300_8476 replica FinalizedReplica, blk_1073749300_8476, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749300 for deletion 2025-07-14 22:38:09,137 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749300_8476 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749300 2025-07-14 22:40:11,440 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749302_8478 src: /192.168.158.7:44470 dest: /192.168.158.4:9866 2025-07-14 22:40:11,458 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44470, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-734765757_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749302_8478, duration(ns): 15622968 2025-07-14 22:40:11,458 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073749302_8478, type=LAST_IN_PIPELINE terminating 2025-07-14 22:40:15,143 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749302_8478 replica FinalizedReplica, blk_1073749302_8478, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749302 for deletion 2025-07-14 22:40:15,144 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749302_8478 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749302 2025-07-14 22:41:11,430 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749303_8479 src: /192.168.158.1:52652 dest: /192.168.158.4:9866 2025-07-14 22:41:11,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52652, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1746425216_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749303_8479, duration(ns): 24879553 2025-07-14 22:41:11,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749303_8479, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-14 22:41:15,144 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749303_8479 replica FinalizedReplica, blk_1073749303_8479, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749303 for deletion 2025-07-14 22:41:15,146 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749303_8479 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749303 2025-07-14 22:43:11,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749305_8481 src: /192.168.158.8:57022 dest: /192.168.158.4:9866 2025-07-14 22:43:11,457 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57022, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_654203920_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749305_8481, duration(ns): 15855977 2025-07-14 22:43:11,457 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749305_8481, type=LAST_IN_PIPELINE terminating 2025-07-14 22:43:15,150 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749305_8481 replica FinalizedReplica, blk_1073749305_8481, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749305 for deletion 2025-07-14 22:43:15,151 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749305_8481 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749305 2025-07-14 22:45:11,441 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749307_8483 src: /192.168.158.9:42690 dest: /192.168.158.4:9866
2025-07-14 22:45:11,459 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42690, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1840628655_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749307_8483, duration(ns): 16125687
2025-07-14 22:45:11,460 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749307_8483, type=LAST_IN_PIPELINE terminating
2025-07-14 22:45:18,154 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749307_8483 replica FinalizedReplica, blk_1073749307_8483, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749307 for deletion
2025-07-14 22:45:18,155 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749307_8483 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749307
2025-07-14 22:47:16,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749309_8485 src: /192.168.158.1:38334 dest: /192.168.158.4:9866
2025-07-14 22:47:16,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38334, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1373874709_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749309_8485, duration(ns): 22324850
2025-07-14 22:47:16,474 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749309_8485, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-14 22:47:21,158 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749309_8485 replica FinalizedReplica, blk_1073749309_8485, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749309 for deletion
2025-07-14 22:47:21,159 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749309_8485 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749309
2025-07-14 22:49:21,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749311_8487 src: /192.168.158.6:43328 dest: /192.168.158.4:9866
2025-07-14 22:49:21,460 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43328, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-322656052_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749311_8487, duration(ns): 14411754
2025-07-14 22:49:21,461 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749311_8487, type=LAST_IN_PIPELINE terminating
2025-07-14 22:49:27,166 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749311_8487 replica FinalizedReplica, blk_1073749311_8487, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749311 for deletion
2025-07-14 22:49:27,168 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749311_8487 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749311
2025-07-14 22:50:26,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749312_8488 src: /192.168.158.1:45144 dest: /192.168.158.4:9866
2025-07-14 22:50:26,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45144, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1656059652_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749312_8488, duration(ns): 21744410
2025-07-14 22:50:26,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749312_8488, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-14 22:50:30,169 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749312_8488 replica FinalizedReplica, blk_1073749312_8488, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749312 for deletion
2025-07-14 22:50:30,170 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749312_8488 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749312
2025-07-14 22:53:31,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749315_8491 src: /192.168.158.5:49412 dest: /192.168.158.4:9866
2025-07-14 22:53:31,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49412, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1268426773_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749315_8491, duration(ns): 19567407
2025-07-14 22:53:31,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749315_8491, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-14 22:53:33,178 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749315_8491 replica FinalizedReplica, blk_1073749315_8491, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749315 for deletion
2025-07-14 22:53:33,180 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749315_8491 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749315
2025-07-14 23:06:41,489 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749328_8504 src: /192.168.158.8:39442 dest: /192.168.158.4:9866
2025-07-14 23:06:41,507 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39442, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_483439640_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749328_8504, duration(ns): 15733625
2025-07-14 23:06:41,507 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749328_8504, type=LAST_IN_PIPELINE terminating
2025-07-14 23:06:48,218 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749328_8504 replica FinalizedReplica, blk_1073749328_8504, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749328 for deletion
2025-07-14 23:06:48,219 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749328_8504 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749328
2025-07-14 23:07:41,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749329_8505 src: /192.168.158.5:54696 dest: /192.168.158.4:9866
2025-07-14 23:07:41,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54696, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1922804421_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749329_8505, duration(ns): 14862745
2025-07-14 23:07:41,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749329_8505, type=LAST_IN_PIPELINE terminating
2025-07-14 23:07:48,219 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749329_8505 replica FinalizedReplica, blk_1073749329_8505, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749329 for deletion
2025-07-14 23:07:48,221 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749329_8505 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749329
2025-07-14 23:09:41,492 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749331_8507 src: /192.168.158.1:56586 dest: /192.168.158.4:9866
2025-07-14 23:09:41,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56586, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1693059322_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749331_8507, duration(ns): 20736533
2025-07-14 23:09:41,523 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749331_8507, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-14 23:09:48,224 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749331_8507 replica FinalizedReplica, blk_1073749331_8507, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749331 for deletion
2025-07-14 23:09:48,226 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749331_8507 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749331
2025-07-14 23:12:51,487 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749334_8510 src: /192.168.158.1:55466 dest: /192.168.158.4:9866
2025-07-14 23:12:51,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55466, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1840092243_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749334_8510, duration(ns): 21193643
2025-07-14 23:12:51,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749334_8510, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-14 23:12:57,235 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749334_8510 replica FinalizedReplica, blk_1073749334_8510, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749334 for deletion
2025-07-14 23:12:57,236 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749334_8510 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749334
2025-07-14 23:14:56,487 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749336_8512 src: /192.168.158.6:52584 dest: /192.168.158.4:9866
2025-07-14 23:14:56,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52584, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_615557008_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749336_8512, duration(ns): 20320466
2025-07-14 23:14:56,513 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749336_8512, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-14 23:15:00,240 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749336_8512 replica FinalizedReplica, blk_1073749336_8512, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749336 for deletion
2025-07-14 23:15:00,241 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749336_8512 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749336
2025-07-14 23:16:56,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749338_8514 src: /192.168.158.8:33982 dest: /192.168.158.4:9866
2025-07-14 23:16:56,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33982, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1563971433_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749338_8514, duration(ns): 20817059
2025-07-14 23:16:56,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749338_8514, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 23:17:03,246 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749338_8514 replica FinalizedReplica, blk_1073749338_8514, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749338 for deletion
2025-07-14 23:17:03,248 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749338_8514 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749338
2025-07-14 23:17:56,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749339_8515 src: /192.168.158.6:34582 dest: /192.168.158.4:9866
2025-07-14 23:17:56,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34582, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_225068364_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749339_8515, duration(ns): 14063775
2025-07-14 23:17:56,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749339_8515, type=LAST_IN_PIPELINE terminating
2025-07-14 23:18:00,248 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749339_8515 replica FinalizedReplica, blk_1073749339_8515, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749339 for deletion
2025-07-14 23:18:00,249 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749339_8515 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749339
2025-07-14 23:19:01,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749340_8516 src: /192.168.158.1:45774 dest: /192.168.158.4:9866
2025-07-14 23:19:01,523 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45774, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_219732044_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749340_8516, duration(ns): 22071724
2025-07-14 23:19:01,524 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749340_8516, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-14 23:19:03,249 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749340_8516 replica FinalizedReplica, blk_1073749340_8516, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749340 for deletion
2025-07-14 23:19:03,250 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749340_8516 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749340
2025-07-14 23:21:06,510 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749342_8518 src: /192.168.158.5:59630 dest: /192.168.158.4:9866
2025-07-14 23:21:06,528 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59630, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1348728770_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749342_8518, duration(ns): 16036285
2025-07-14 23:21:06,528 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749342_8518, type=LAST_IN_PIPELINE terminating
2025-07-14 23:21:09,251 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749342_8518 replica FinalizedReplica, blk_1073749342_8518, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749342 for deletion
2025-07-14 23:21:09,252 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749342_8518 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749342
2025-07-14 23:28:21,509 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749349_8525 src: /192.168.158.7:56412 dest: /192.168.158.4:9866
2025-07-14 23:28:21,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56412, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1556405276_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749349_8525, duration(ns): 17878464
2025-07-14 23:28:21,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749349_8525, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-14 23:28:27,269 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749349_8525 replica FinalizedReplica, blk_1073749349_8525, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749349 for deletion
2025-07-14 23:28:27,270 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749349_8525 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749349
2025-07-14 23:29:26,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749350_8526 src: /192.168.158.6:36826 dest: /192.168.158.4:9866
2025-07-14 23:29:26,530 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36826, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-199468779_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749350_8526, duration(ns): 19997800
2025-07-14 23:29:26,530 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749350_8526, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-14 23:29:27,274 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749350_8526 replica FinalizedReplica, blk_1073749350_8526, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749350 for deletion
2025-07-14 23:29:27,275 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749350_8526 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749350
2025-07-14 23:30:31,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749351_8527 src: /192.168.158.1:44008 dest: /192.168.158.4:9866
2025-07-14 23:30:31,534 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44008, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_473713792_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749351_8527, duration(ns): 21973664
2025-07-14 23:30:31,535 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749351_8527, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-14 23:30:36,277 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749351_8527 replica FinalizedReplica, blk_1073749351_8527, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749351 for deletion
2025-07-14 23:30:36,278 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749351_8527 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749351
2025-07-14 23:33:31,510 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749354_8530 src: /192.168.158.1:46024 dest: /192.168.158.4:9866
2025-07-14 23:33:31,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46024, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2133690368_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749354_8530, duration(ns): 21810509
2025-07-14 23:33:31,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749354_8530, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-14 23:33:33,284 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749354_8530 replica FinalizedReplica, blk_1073749354_8530, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749354 for deletion
2025-07-14 23:33:33,285 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749354_8530 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749354
2025-07-14 23:35:31,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749356_8532 src: /192.168.158.6:59960 dest: /192.168.158.4:9866
2025-07-14 23:35:31,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59960, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-910200078_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749356_8532, duration(ns): 21113243
2025-07-14 23:35:31,544 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749356_8532, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 23:35:36,286 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749356_8532 replica FinalizedReplica, blk_1073749356_8532, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749356 for deletion
2025-07-14 23:35:36,287 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749356_8532 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749356
2025-07-14 23:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-14 23:37:21,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f3a, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 1 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-14 23:37:21,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-14 23:37:31,524 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749358_8534 src: /192.168.158.5:42192 dest: /192.168.158.4:9866
2025-07-14 23:37:31,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42192, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_810500422_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749358_8534, duration(ns): 15829176
2025-07-14 23:37:31,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749358_8534, type=LAST_IN_PIPELINE terminating
2025-07-14 23:37:33,290 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749358_8534 replica FinalizedReplica, blk_1073749358_8534, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749358 for deletion
2025-07-14 23:37:33,292 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749358_8534 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749358
2025-07-14 23:39:36,530 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749360_8536 src: /192.168.158.8:41294 dest: /192.168.158.4:9866
2025-07-14 23:39:36,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41294, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2147098911_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749360_8536, duration(ns): 20127425
2025-07-14 23:39:36,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749360_8536, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-14 23:39:39,293 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749360_8536 replica FinalizedReplica, blk_1073749360_8536, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749360 for deletion
2025-07-14 23:39:39,294 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749360_8536 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749360
2025-07-14 23:40:41,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749361_8537 src: /192.168.158.8:39586 dest: /192.168.158.4:9866
2025-07-14 23:40:41,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39586, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1804068398_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749361_8537, duration(ns): 17855259
2025-07-14 23:40:41,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749361_8537, type=LAST_IN_PIPELINE terminating
2025-07-14 23:40:42,294 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749361_8537 replica FinalizedReplica, blk_1073749361_8537, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749361 for deletion
2025-07-14 23:40:42,295 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749361_8537 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749361
2025-07-14 23:43:41,524 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749364_8540 src: /192.168.158.1:55182 dest: /192.168.158.4:9866
2025-07-14 23:43:41,553 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55182, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1302964350_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749364_8540, duration(ns): 20594113
2025-07-14 23:43:41,553 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749364_8540, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-14 23:43:42,298 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749364_8540 replica FinalizedReplica, blk_1073749364_8540, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749364 for deletion
2025-07-14 23:43:42,299 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749364_8540 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749364
2025-07-14 23:44:41,535 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749365_8541 src: /192.168.158.5:50684 dest: /192.168.158.4:9866
2025-07-14 23:44:41,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50684, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_983196510_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749365_8541, duration(ns): 18544733
2025-07-14 23:44:41,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749365_8541, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-14 23:44:42,299 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749365_8541 replica FinalizedReplica, blk_1073749365_8541, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749365 for deletion
2025-07-14 23:44:42,301 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749365_8541 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749365
2025-07-14 23:45:41,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749366_8542 src: /192.168.158.5:35764 dest: /192.168.158.4:9866
2025-07-14 23:45:41,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35764, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1788642013_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749366_8542, duration(ns): 18678695
2025-07-14 23:45:41,557 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749366_8542, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-14 23:45:42,303 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749366_8542 replica FinalizedReplica, blk_1073749366_8542, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749366 for deletion
2025-07-14 23:45:42,303 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749366_8542 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749366
2025-07-14 23:48:46,538 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749369_8545 src: /192.168.158.1:40692 dest: /192.168.158.4:9866
2025-07-14 23:48:46,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40692, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_526368404_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749369_8545, duration(ns): 21265284
2025-07-14 23:48:46,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749369_8545, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-14 23:48:51,306 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749369_8545 replica FinalizedReplica, blk_1073749369_8545, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749369 for deletion
2025-07-14 23:48:51,307 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749369_8545 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749369
2025-07-14 23:49:46,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749370_8546 src: /192.168.158.7:59970 dest: /192.168.158.4:9866
2025-07-14 23:49:46,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59970, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2028232281_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749370_8546, duration(ns): 19271130
2025-07-14 23:49:46,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749370_8546, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-14 23:49:48,308 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749370_8546 replica FinalizedReplica, blk_1073749370_8546, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749370 for deletion
2025-07-14 23:49:48,309 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749370_8546 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749370
2025-07-14 23:50:46,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749371_8547 src: /192.168.158.6:37412 dest: /192.168.158.4:9866
2025-07-14 23:50:46,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37412, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-472834150_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749371_8547, duration(ns): 15184606
2025-07-14 23:50:46,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749371_8547, type=LAST_IN_PIPELINE terminating
2025-07-14 23:50:48,313 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749371_8547 replica FinalizedReplica, blk_1073749371_8547, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749371 for deletion
2025-07-14 23:50:48,314 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749371_8547 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749371
2025-07-14 23:52:46,557 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749373_8549 src: /192.168.158.7:49082 dest: /192.168.158.4:9866
2025-07-14 23:52:46,575 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49082, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1641167825_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749373_8549, duration(ns): 15218587
2025-07-14 23:52:46,575 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749373_8549, type=LAST_IN_PIPELINE terminating
2025-07-14 23:52:51,316 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749373_8549 replica FinalizedReplica, blk_1073749373_8549, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749373 for deletion
2025-07-14 23:52:51,329 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749373_8549 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749373
2025-07-14 23:54:46,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749375_8551 src: /192.168.158.6:48562 dest: /192.168.158.4:9866
2025-07-14 23:54:46,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48562, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1973285028_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749375_8551, duration(ns): 16743236
2025-07-14 23:54:46,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749375_8551, type=LAST_IN_PIPELINE terminating
2025-07-14 23:54:48,320 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749375_8551 replica FinalizedReplica, blk_1073749375_8551, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749375 for deletion
2025-07-14 23:54:48,321 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749375_8551 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749375
2025-07-15 00:00:51,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749381_8557 src: /192.168.158.9:41672 dest: /192.168.158.4:9866
2025-07-15 00:00:51,587 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_161835169_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749381_8557, duration(ns): 19296426
2025-07-15 00:00:51,587 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749381_8557, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 00:00:54,327 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749381_8557 replica FinalizedReplica, blk_1073749381_8557, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749381 for deletion
2025-07-15 00:00:54,328 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749381_8557 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749381
2025-07-15 00:02:56,570 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749383_8559 src: /192.168.158.6:53322 dest: /192.168.158.4:9866
2025-07-15 00:02:56,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53322, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1164513395_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749383_8559, duration(ns): 16110364
2025-07-15 00:02:56,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749383_8559, type=LAST_IN_PIPELINE terminating
2025-07-15 00:02:57,329 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749383_8559 replica FinalizedReplica, blk_1073749383_8559, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749383 for deletion
2025-07-15 00:02:57,330 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749383_8559 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749383
2025-07-15 00:05:01,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749385_8561 src: /192.168.158.6:46274 dest: /192.168.158.4:9866
2025-07-15 00:05:01,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46274, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2022214858_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749385_8561, duration(ns): 15077000
2025-07-15 00:05:01,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749385_8561, type=LAST_IN_PIPELINE terminating
2025-07-15 00:05:03,335 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749385_8561 replica FinalizedReplica, blk_1073749385_8561, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749385 for deletion
2025-07-15 00:05:03,336 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749385_8561 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749385
2025-07-15 00:06:01,599 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749386_8562 src: /192.168.158.8:33094 dest: /192.168.158.4:9866
2025-07-15 00:06:01,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33094, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1353252722_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749386_8562, duration(ns): 19663594
2025-07-15 00:06:01,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749386_8562, type=LAST_IN_PIPELINE terminating
2025-07-15 00:06:06,334 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749386_8562 replica FinalizedReplica, blk_1073749386_8562, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749386 for deletion
2025-07-15 00:06:06,336 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749386_8562 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749386
2025-07-15 00:08:01,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749388_8564 src: /192.168.158.5:54264 dest: /192.168.158.4:9866
2025-07-15 00:08:01,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54264, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-508060340_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749388_8564, duration(ns): 15823827
2025-07-15 00:08:01,619 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749388_8564, type=LAST_IN_PIPELINE terminating
2025-07-15 00:08:03,342 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749388_8564 replica FinalizedReplica, blk_1073749388_8564, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749388 for deletion
2025-07-15 00:08:03,343 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749388_8564 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749388
2025-07-15 00:09:06,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749389_8565 src: /192.168.158.1:33736 dest: /192.168.158.4:9866
2025-07-15 00:09:06,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33736, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1783147357_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749389_8565, duration(ns): 22040043
2025-07-15 00:09:06,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749389_8565, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-15 00:09:12,344 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749389_8565 replica FinalizedReplica, blk_1073749389_8565, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749389 for deletion
2025-07-15 00:09:12,345 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749389_8565 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749389
2025-07-15 00:12:06,617 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749392_8568 src: /192.168.158.5:57856 dest: /192.168.158.4:9866
2025-07-15 00:12:06,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57856, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-363633249_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749392_8568, duration(ns): 23306874
2025-07-15 00:12:06,646 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749392_8568, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 00:12:12,352 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749392_8568 replica FinalizedReplica, blk_1073749392_8568, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749392 for deletion
2025-07-15 00:12:12,353 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749392_8568 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749392
2025-07-15 00:16:11,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749396_8572 src: /192.168.158.9:35016 dest: /192.168.158.4:9866
2025-07-15 00:16:11,632 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35016, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1658618078_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749396_8572, duration(ns): 20202094
2025-07-15 00:16:11,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749396_8572, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 00:16:15,357 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749396_8572 replica FinalizedReplica, blk_1073749396_8572, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749396 for deletion
2025-07-15 00:16:15,358 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749396_8572 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749396
2025-07-15 00:17:11,617 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749397_8573 src: /192.168.158.8:55968 dest: /192.168.158.4:9866
2025-07-15 00:17:11,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55968, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_740416215_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749397_8573, duration(ns): 15249125
2025-07-15 00:17:11,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749397_8573, type=LAST_IN_PIPELINE terminating
2025-07-15 00:17:15,357 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749397_8573 replica FinalizedReplica, blk_1073749397_8573, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749397 for deletion
2025-07-15 00:17:15,359 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749397_8573 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749397
2025-07-15 00:18:11,617 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749398_8574 src: /192.168.158.9:50892 dest: /192.168.158.4:9866
2025-07-15 00:18:11,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50892, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1829670127_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749398_8574, duration(ns): 16511071
2025-07-15 00:18:11,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749398_8574, type=LAST_IN_PIPELINE terminating
2025-07-15 00:18:12,359 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749398_8574 replica FinalizedReplica, blk_1073749398_8574, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749398 for deletion
2025-07-15 00:18:12,360 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749398_8574 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749398
2025-07-15 00:19:16,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749399_8575 src: /192.168.158.5:60764 dest: /192.168.158.4:9866
2025-07-15 00:19:16,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60764, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1431044279_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749399_8575, duration(ns): 19859762
2025-07-15 00:19:16,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749399_8575, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-15 00:19:21,360 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749399_8575 replica FinalizedReplica, blk_1073749399_8575, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749399 for deletion
2025-07-15 00:19:21,361 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749399_8575 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749399
2025-07-15 00:21:21,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749401_8577 src: /192.168.158.1:45584 dest: /192.168.158.4:9866
2025-07-15 00:21:21,640 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45584, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-203619147_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749401_8577, duration(ns): 24267383
2025-07-15 00:21:21,640 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749401_8577, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-15 00:21:24,360 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749401_8577 replica FinalizedReplica, blk_1073749401_8577, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749401 for deletion
2025-07-15 00:21:24,361 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749401_8577 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749401
2025-07-15 00:22:21,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749402_8578 src: /192.168.158.5:58622 dest: /192.168.158.4:9866
2025-07-15 00:22:21,654 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58622, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1445446292_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749402_8578, duration(ns): 19622394
2025-07-15 00:22:21,654 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749402_8578, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-15 00:22:24,360 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749402_8578 replica FinalizedReplica, blk_1073749402_8578, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749402 for deletion
2025-07-15 00:22:24,361 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749402_8578 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749402
2025-07-15 00:23:21,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749403_8579 src: /192.168.158.1:33126 dest: /192.168.158.4:9866
2025-07-15 00:23:21,657 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33126, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_852547486_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749403_8579, duration(ns): 21973799
2025-07-15 00:23:21,657 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749403_8579, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-15 00:23:24,362 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749403_8579 replica FinalizedReplica, blk_1073749403_8579, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749403 for deletion
2025-07-15 00:23:24,364 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749403_8579 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749403
2025-07-15 00:24:21,623 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749404_8580 src: /192.168.158.8:59382 dest: /192.168.158.4:9866
2025-07-15 00:24:21,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59382, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1570779907_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749404_8580, duration(ns): 20463366
2025-07-15 00:24:21,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749404_8580, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-15 00:24:24,361 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749404_8580 replica FinalizedReplica, blk_1073749404_8580, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749404 for deletion
2025-07-15 00:24:24,363 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749404_8580 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749404
2025-07-15 00:30:36,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749410_8586 src: /192.168.158.6:33230 dest: /192.168.158.4:9866
2025-07-15 00:30:36,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33230, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_454684858_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749410_8586, duration(ns): 19459473
2025-07-15 00:30:36,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749410_8586, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-15 00:30:39,374 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749410_8586 replica FinalizedReplica, blk_1073749410_8586, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749410 for deletion
2025-07-15 00:30:39,375 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749410_8586 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749410
2025-07-15 00:31:41,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749411_8587 src: /192.168.158.1:47382 dest: /192.168.158.4:9866
2025-07-15 00:31:41,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47382, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_657953913_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749411_8587, duration(ns): 23010115
2025-07-15 00:31:41,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749411_8587, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-15 00:31:42,377 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749411_8587 replica FinalizedReplica, blk_1073749411_8587, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749411 for deletion
2025-07-15 00:31:42,378 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749411_8587 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749411
2025-07-15 00:33:46,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749413_8589 src: /192.168.158.8:49286 dest: /192.168.158.4:9866
2025-07-15 00:33:46,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49286, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1740106723_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749413_8589, duration(ns): 20848486
2025-07-15 00:33:46,663 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749413_8589, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 00:33:51,383 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749413_8589 replica FinalizedReplica, blk_1073749413_8589, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749413 for deletion
2025-07-15 00:33:51,384 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749413_8589 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749413
2025-07-15 00:34:46,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749414_8590 src: /192.168.158.7:42490 dest: /192.168.158.4:9866 2025-07-15 00:34:46,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42490, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1229902841_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749414_8590, duration(ns): 18162026 2025-07-15 00:34:46,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749414_8590, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-15 00:34:51,388 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749414_8590 replica FinalizedReplica, blk_1073749414_8590, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749414 for deletion 2025-07-15 00:34:51,389 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749414_8590 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749414 2025-07-15 00:35:46,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749415_8591 src: /192.168.158.1:52362 dest: /192.168.158.4:9866 2025-07-15 00:35:46,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52362, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_519759344_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749415_8591, duration(ns): 24696966 2025-07-15 00:35:46,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749415_8591, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-15 00:35:48,387 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749415_8591 replica FinalizedReplica, blk_1073749415_8591, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749415 for deletion 2025-07-15 00:35:48,389 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749415_8591 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749415 2025-07-15 00:40:56,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749420_8596 src: /192.168.158.1:44272 dest: /192.168.158.4:9866 2025-07-15 00:40:56,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44272, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1235757006_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749420_8596, duration(ns): 20761351 2025-07-15 00:40:56,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749420_8596, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-15 00:40:57,393 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749420_8596 replica FinalizedReplica, blk_1073749420_8596, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749420 for deletion 2025-07-15 00:40:57,395 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749420_8596 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749420 2025-07-15 00:43:01,661 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749422_8598 src: /192.168.158.9:53232 dest: /192.168.158.4:9866 2025-07-15 00:43:01,686 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53232, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1261165365_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749422_8598, duration(ns): 19090674 2025-07-15 00:43:01,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749422_8598, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-15 00:43:03,396 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749422_8598 replica FinalizedReplica, blk_1073749422_8598, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749422 for deletion 2025-07-15 00:43:03,397 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749422_8598 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749422 2025-07-15 00:45:06,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749424_8600 src: /192.168.158.6:56968 dest: /192.168.158.4:9866 2025-07-15 00:45:06,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56968, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2050589748_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749424_8600, duration(ns): 20645814 2025-07-15 00:45:06,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749424_8600, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-15 00:45:12,401 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749424_8600 replica FinalizedReplica, blk_1073749424_8600, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749424 for deletion 2025-07-15 00:45:12,402 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749424_8600 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749424 2025-07-15 00:46:11,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749425_8601 src: 
/192.168.158.1:56766 dest: /192.168.158.4:9866 2025-07-15 00:46:11,691 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56766, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-518502781_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749425_8601, duration(ns): 22312105 2025-07-15 00:46:11,691 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749425_8601, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-15 00:46:15,403 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749425_8601 replica FinalizedReplica, blk_1073749425_8601, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749425 for deletion 2025-07-15 00:46:15,404 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749425_8601 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749425 2025-07-15 00:49:11,658 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749428_8604 src: /192.168.158.1:53506 dest: /192.168.158.4:9866 2025-07-15 00:49:11,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53506, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2001835889_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749428_8604, duration(ns): 22361949 
2025-07-15 00:49:11,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749428_8604, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-15 00:49:15,409 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749428_8604 replica FinalizedReplica, blk_1073749428_8604, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749428 for deletion 2025-07-15 00:49:15,410 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749428_8604 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749428 2025-07-15 00:51:16,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749430_8606 src: /192.168.158.6:58404 dest: /192.168.158.4:9866 2025-07-15 00:51:16,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58404, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_137906990_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749430_8606, duration(ns): 15979502 2025-07-15 00:51:16,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749430_8606, type=LAST_IN_PIPELINE terminating 2025-07-15 00:51:18,412 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749430_8606 replica FinalizedReplica, blk_1073749430_8606, FINALIZED getNumBytes() = 56 getBytesOnDisk() 
= 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749430 for deletion 2025-07-15 00:51:18,413 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749430_8606 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749430 2025-07-15 00:52:21,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749431_8607 src: /192.168.158.9:39400 dest: /192.168.158.4:9866 2025-07-15 00:52:21,691 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39400, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1364173349_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749431_8607, duration(ns): 19746756 2025-07-15 00:52:21,691 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749431_8607, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-15 00:52:24,414 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749431_8607 replica FinalizedReplica, blk_1073749431_8607, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749431 for deletion 2025-07-15 00:52:24,415 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749431_8607 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749431 2025-07-15 00:53:21,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749432_8608 src: /192.168.158.5:48402 dest: /192.168.158.4:9866 2025-07-15 00:53:21,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48402, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1153285316_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749432_8608, duration(ns): 19261507 2025-07-15 00:53:21,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749432_8608, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-15 00:53:24,416 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749432_8608 replica FinalizedReplica, blk_1073749432_8608, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749432 for deletion 2025-07-15 00:53:24,417 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749432_8608 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749432 2025-07-15 00:55:26,686 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749434_8610 src: /192.168.158.5:41716 dest: /192.168.158.4:9866 2025-07-15 00:55:26,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41716, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1413039656_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749434_8610, duration(ns): 12830364 2025-07-15 00:55:26,703 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749434_8610, type=LAST_IN_PIPELINE terminating 2025-07-15 00:55:27,419 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749434_8610 replica FinalizedReplica, blk_1073749434_8610, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749434 for deletion 2025-07-15 00:55:27,420 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749434_8610 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749434 2025-07-15 00:56:26,657 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749435_8611 src: /192.168.158.1:55686 dest: /192.168.158.4:9866 2025-07-15 00:56:26,688 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55686, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1709231707_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749435_8611, duration(ns): 21559940 2025-07-15 00:56:26,688 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749435_8611, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 
192.168.158.5:9866] terminating 2025-07-15 00:56:27,421 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749435_8611 replica FinalizedReplica, blk_1073749435_8611, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749435 for deletion 2025-07-15 00:56:27,422 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749435_8611 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749435 2025-07-15 00:57:26,661 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749436_8612 src: /192.168.158.1:58672 dest: /192.168.158.4:9866 2025-07-15 00:57:26,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_546900910_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749436_8612, duration(ns): 23309254 2025-07-15 00:57:26,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749436_8612, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-15 00:57:27,422 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749436_8612 replica FinalizedReplica, blk_1073749436_8612, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749436 for deletion 2025-07-15 00:57:27,425 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749436_8612 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749436 2025-07-15 00:59:36,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749438_8614 src: /192.168.158.6:56900 dest: /192.168.158.4:9866 2025-07-15 00:59:36,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56900, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1220840830_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749438_8614, duration(ns): 17160808 2025-07-15 00:59:36,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749438_8614, type=LAST_IN_PIPELINE terminating 2025-07-15 00:59:42,426 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749438_8614 replica FinalizedReplica, blk_1073749438_8614, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749438 for deletion 2025-07-15 00:59:42,428 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749438_8614 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749438 2025-07-15 01:00:36,663 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749439_8615 src: /192.168.158.1:39394 dest: /192.168.158.4:9866 2025-07-15 01:00:36,697 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39394, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_759637612_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749439_8615, duration(ns): 25161854 2025-07-15 01:00:36,697 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749439_8615, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-15 01:00:42,429 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749439_8615 replica FinalizedReplica, blk_1073749439_8615, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749439 for deletion 2025-07-15 01:00:42,430 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749439_8615 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749439 2025-07-15 01:05:51,675 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749444_8620 src: /192.168.158.6:51804 dest: /192.168.158.4:9866 2025-07-15 01:05:51,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51804, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1286612480_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749444_8620, duration(ns): 19369797 2025-07-15 01:05:51,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749444_8620, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-15 01:05:54,438 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749444_8620 replica FinalizedReplica, blk_1073749444_8620, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749444 for deletion 2025-07-15 01:05:54,439 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749444_8620 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749444 2025-07-15 01:07:56,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749446_8622 src: /192.168.158.1:36032 dest: /192.168.158.4:9866 2025-07-15 01:07:56,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36032, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-284608487_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749446_8622, duration(ns): 22625832 2025-07-15 01:07:56,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749446_8622, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-15 01:08:00,444 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749446_8622 replica FinalizedReplica, blk_1073749446_8622, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749446 for deletion
2025-07-15 01:08:00,445 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749446_8622 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749446
2025-07-15 01:10:01,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749448_8624 src: /192.168.158.5:43364 dest: /192.168.158.4:9866
2025-07-15 01:10:01,711 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:43364, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-195384758_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749448_8624, duration(ns): 18761815
2025-07-15 01:10:01,712 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749448_8624, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-15 01:10:06,447 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749448_8624 replica FinalizedReplica, blk_1073749448_8624, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749448 for deletion
2025-07-15 01:10:06,448 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749448_8624 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749448
2025-07-15 01:12:01,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749450_8626 src: /192.168.158.9:47424 dest: /192.168.158.4:9866
2025-07-15 01:12:01,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47424, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1168233018_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749450_8626, duration(ns): 20938608
2025-07-15 01:12:01,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749450_8626, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 01:12:03,450 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749450_8626 replica FinalizedReplica, blk_1073749450_8626, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749450 for deletion
2025-07-15 01:12:03,451 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749450_8626 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749450
2025-07-15 01:13:01,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749451_8627 src: /192.168.158.6:42712 dest: /192.168.158.4:9866
2025-07-15 01:13:01,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1008616025_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749451_8627, duration(ns): 16674590
2025-07-15 01:13:01,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749451_8627, type=LAST_IN_PIPELINE terminating
2025-07-15 01:13:06,453 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749451_8627 replica FinalizedReplica, blk_1073749451_8627, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749451 for deletion
2025-07-15 01:13:06,455 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749451_8627 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749451
2025-07-15 01:17:06,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749455_8631 src: /192.168.158.7:43232 dest: /192.168.158.4:9866
2025-07-15 01:17:06,730 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43232, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1172253246_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749455_8631, duration(ns): 19746805
2025-07-15 01:17:06,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749455_8631, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 01:17:12,460 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749455_8631 replica FinalizedReplica, blk_1073749455_8631, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749455 for deletion
2025-07-15 01:17:12,462 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749455_8631 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749455
2025-07-15 01:18:06,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749456_8632 src: /192.168.158.1:43334 dest: /192.168.158.4:9866
2025-07-15 01:18:06,728 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43334, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1860673684_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749456_8632, duration(ns): 23912104
2025-07-15 01:18:06,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749456_8632, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-15 01:18:09,461 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749456_8632 replica FinalizedReplica, blk_1073749456_8632, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749456 for deletion
2025-07-15 01:18:09,462 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749456_8632 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749456
2025-07-15 01:19:06,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749457_8633 src: /192.168.158.7:59222 dest: /192.168.158.4:9866
2025-07-15 01:19:06,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59222, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1214910703_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749457_8633, duration(ns): 20015872
2025-07-15 01:19:06,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749457_8633, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 01:19:09,463 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749457_8633 replica FinalizedReplica, blk_1073749457_8633, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749457 for deletion
2025-07-15 01:19:09,464 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749457_8633 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749457
2025-07-15 01:21:06,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749459_8635 src: /192.168.158.7:55332 dest: /192.168.158.4:9866
2025-07-15 01:21:06,730 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55332, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_839946911_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749459_8635, duration(ns): 22854030
2025-07-15 01:21:06,730 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749459_8635, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 01:21:12,464 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749459_8635 replica FinalizedReplica, blk_1073749459_8635, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749459 for deletion
2025-07-15 01:21:12,465 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749459_8635 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749459
2025-07-15 01:22:11,708 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749460_8636 src: /192.168.158.5:46038 dest: /192.168.158.4:9866
2025-07-15 01:22:11,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46038, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1663023302_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749460_8636, duration(ns): 14732863
2025-07-15 01:22:11,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749460_8636, type=LAST_IN_PIPELINE terminating
2025-07-15 01:22:12,465 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749460_8636 replica FinalizedReplica, blk_1073749460_8636, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749460 for deletion
2025-07-15 01:22:12,466 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749460_8636 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749460
2025-07-15 01:25:16,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749463_8639 src: /192.168.158.8:54330 dest: /192.168.158.4:9866
2025-07-15 01:25:16,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54330, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-204871303_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749463_8639, duration(ns): 15147188
2025-07-15 01:25:16,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749463_8639, type=LAST_IN_PIPELINE terminating
2025-07-15 01:25:21,468 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749463_8639 replica FinalizedReplica, blk_1073749463_8639, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749463 for deletion
2025-07-15 01:25:21,469 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749463_8639 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749463
2025-07-15 01:27:16,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749465_8641 src: /192.168.158.1:54104 dest: /192.168.158.4:9866
2025-07-15 01:27:16,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54104, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_833598033_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749465_8641, duration(ns): 23405206
2025-07-15 01:27:16,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749465_8641, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-15 01:27:18,471 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749465_8641 replica FinalizedReplica, blk_1073749465_8641, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749465 for deletion
2025-07-15 01:27:18,472 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749465_8641 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749465
2025-07-15 01:28:21,711 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749466_8642 src: /192.168.158.5:55930 dest: /192.168.158.4:9866
2025-07-15 01:28:21,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55930, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1822263890_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749466_8642, duration(ns): 20308268
2025-07-15 01:28:21,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749466_8642, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-15 01:28:24,472 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749466_8642 replica FinalizedReplica, blk_1073749466_8642, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749466 for deletion
2025-07-15 01:28:24,473 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749466_8642 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749466
2025-07-15 01:31:26,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749469_8645 src: /192.168.158.7:59428 dest: /192.168.158.4:9866
2025-07-15 01:31:26,762 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59428, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_521647481_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749469_8645, duration(ns): 17717070
2025-07-15 01:31:26,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749469_8645, type=LAST_IN_PIPELINE terminating
2025-07-15 01:31:27,475 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749469_8645 replica FinalizedReplica, blk_1073749469_8645, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749469 for deletion
2025-07-15 01:31:27,476 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749469_8645 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749469
2025-07-15 01:34:26,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749472_8648 src: /192.168.158.1:35442 dest: /192.168.158.4:9866
2025-07-15 01:34:26,755 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35442, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-239029448_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749472_8648, duration(ns): 20667604
2025-07-15 01:34:26,756 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749472_8648, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-15 01:34:27,480 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749472_8648 replica FinalizedReplica, blk_1073749472_8648, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749472 for deletion
2025-07-15 01:34:27,481 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749472_8648 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749472
2025-07-15 01:38:31,734 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749476_8652 src: /192.168.158.6:36112 dest: /192.168.158.4:9866
2025-07-15 01:38:31,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36112, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_141053345_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749476_8652, duration(ns): 21387013
2025-07-15 01:38:31,762 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749476_8652, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 01:38:33,486 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749476_8652 replica FinalizedReplica, blk_1073749476_8652, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749476 for deletion
2025-07-15 01:38:33,487 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749476_8652 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749476
2025-07-15 01:42:36,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749480_8656 src: /192.168.158.7:38674 dest: /192.168.158.4:9866
2025-07-15 01:42:36,795 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38674, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_201682298_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749480_8656, duration(ns): 16148314
2025-07-15 01:42:36,795 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749480_8656, type=LAST_IN_PIPELINE terminating
2025-07-15 01:42:39,491 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749480_8656 replica FinalizedReplica, blk_1073749480_8656, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749480 for deletion
2025-07-15 01:42:39,492 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749480_8656 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749480
2025-07-15 01:43:36,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749481_8657 src: /192.168.158.8:49494 dest: /192.168.158.4:9866
2025-07-15 01:43:36,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49494, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2019753258_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749481_8657, duration(ns): 19165460
2025-07-15 01:43:36,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749481_8657, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 01:43:39,491 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749481_8657 replica FinalizedReplica, blk_1073749481_8657, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749481 for deletion
2025-07-15 01:43:39,492 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749481_8657 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749481
2025-07-15 01:45:36,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749483_8659 src: /192.168.158.6:33760 dest: /192.168.158.4:9866
2025-07-15 01:45:36,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33760, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-184642729_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749483_8659, duration(ns): 18731766
2025-07-15 01:45:36,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749483_8659, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 01:45:42,493 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749483_8659 replica FinalizedReplica, blk_1073749483_8659, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749483 for deletion
2025-07-15 01:45:42,495 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749483_8659 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749483
2025-07-15 01:47:41,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749485_8661 src: /192.168.158.8:44798 dest: /192.168.158.4:9866
2025-07-15 01:47:41,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44798, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_466155758_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749485_8661, duration(ns): 15852317
2025-07-15 01:47:41,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749485_8661, type=LAST_IN_PIPELINE terminating
2025-07-15 01:47:42,497 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749485_8661 replica FinalizedReplica, blk_1073749485_8661, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749485 for deletion
2025-07-15 01:47:42,498 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749485_8661 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749485
2025-07-15 01:48:46,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749486_8662 src: /192.168.158.1:45762 dest: /192.168.158.4:9866
2025-07-15 01:48:46,768 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45762, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1468780862_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749486_8662, duration(ns): 21292024
2025-07-15 01:48:46,768 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749486_8662, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-15 01:48:48,499 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749486_8662 replica FinalizedReplica, blk_1073749486_8662, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749486 for deletion
2025-07-15 01:48:48,500 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749486_8662 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749486
2025-07-15 01:49:46,768 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749487_8663 src: /192.168.158.8:54020 dest: /192.168.158.4:9866
2025-07-15 01:49:46,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54020, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1384994584_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749487_8663, duration(ns): 15187124
2025-07-15 01:49:46,786 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749487_8663, type=LAST_IN_PIPELINE terminating
2025-07-15 01:49:51,499 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749487_8663 replica FinalizedReplica, blk_1073749487_8663, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749487 for deletion
2025-07-15 01:49:51,501 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749487_8663 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749487
2025-07-15 01:52:46,746 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749490_8666 src: /192.168.158.7:47062 dest: /192.168.158.4:9866
2025-07-15 01:52:46,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47062, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-714246963_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749490_8666, duration(ns): 19589946
2025-07-15 01:52:46,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749490_8666, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 01:52:48,509 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749490_8666 replica FinalizedReplica, blk_1073749490_8666, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749490 for deletion
2025-07-15 01:52:48,510 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749490_8666 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749490
2025-07-15 01:53:46,746 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749491_8667 src: /192.168.158.1:59500 dest: /192.168.158.4:9866
2025-07-15 01:53:46,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59500, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1063650010_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749491_8667, duration(ns): 23906579
2025-07-15 01:53:46,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749491_8667, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-15 01:53:48,509 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749491_8667 replica FinalizedReplica, blk_1073749491_8667, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749491 for deletion
2025-07-15 01:53:48,510 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749491_8667 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749491
2025-07-15 01:54:46,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749492_8668 src: /192.168.158.9:58764 dest: /192.168.158.4:9866
2025-07-15 01:54:46,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58764, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2034522883_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749492_8668, duration(ns): 20697755
2025-07-15 01:54:46,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749492_8668, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 01:54:48,510 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749492_8668 replica FinalizedReplica, blk_1073749492_8668, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749492 for deletion
2025-07-15 01:54:48,512 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749492_8668 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749492
2025-07-15 01:56:46,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749494_8670 src: /192.168.158.5:43940 dest: /192.168.158.4:9866
2025-07-15 01:56:46,775 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:43940, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1159540993_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749494_8670, duration(ns): 15315780
2025-07-15 01:56:46,775 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749494_8670, type=LAST_IN_PIPELINE terminating
2025-07-15 01:56:48,512 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749494_8670 replica FinalizedReplica, blk_1073749494_8670, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749494 for deletion
2025-07-15 01:56:48,513 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749494_8670 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749494
2025-07-15 01:59:51,760 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749497_8673 src: /192.168.158.1:52958 dest: /192.168.158.4:9866
2025-07-15 01:59:51,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52958, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-774604015_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749497_8673, duration(ns): 23975356
2025-07-15 01:59:51,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749497_8673, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-15 01:59:57,515 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749497_8673 replica FinalizedReplica, blk_1073749497_8673, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749497 for deletion
2025-07-15 01:59:57,516 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749497_8673 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749497
2025-07-15 02:00:51,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749498_8674 src: /192.168.158.1:57634 dest: /192.168.158.4:9866
2025-07-15 02:00:51,795 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57634, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_309188144_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749498_8674, duration(ns): 22061341
2025-07-15 02:00:51,795 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749498_8674, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-15 02:00:57,515 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749498_8674 replica FinalizedReplica, blk_1073749498_8674, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749498 for deletion
2025-07-15 02:00:57,516 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749498_8674 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749498
2025-07-15 02:01:51,768 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749499_8675 src: /192.168.158.1:46984 dest: /192.168.158.4:9866
2025-07-15 02:01:51,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46984, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-613387532_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073749499_8675, duration(ns): 21371921 2025-07-15 02:01:51,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749499_8675, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-15 02:01:54,515 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749499_8675 replica FinalizedReplica, blk_1073749499_8675, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749499 for deletion 2025-07-15 02:01:54,516 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749499_8675 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749499 2025-07-15 02:05:06,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749502_8678 src: /192.168.158.6:44618 dest: /192.168.158.4:9866 2025-07-15 02:05:06,812 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44618, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2122110029_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749502_8678, duration(ns): 18384114 2025-07-15 02:05:06,812 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749502_8678, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-15 02:05:12,519 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749502_8678 replica FinalizedReplica, blk_1073749502_8678, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749502 for deletion 2025-07-15 02:05:12,521 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749502_8678 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073749502 2025-07-15 02:07:11,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749504_8680 src: /192.168.158.9:34948 dest: /192.168.158.4:9866 2025-07-15 02:07:11,801 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34948, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_573078316_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749504_8680, duration(ns): 13291707 2025-07-15 02:07:11,801 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749504_8680, type=LAST_IN_PIPELINE terminating 2025-07-15 02:07:12,521 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749504_8680 replica FinalizedReplica, blk_1073749504_8680, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749504 for deletion 2025-07-15 02:07:12,522 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749504_8680 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749504 2025-07-15 02:10:11,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749507_8683 src: /192.168.158.1:54802 dest: /192.168.158.4:9866 2025-07-15 02:10:11,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54802, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1432266889_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749507_8683, duration(ns): 20929199 2025-07-15 02:10:11,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749507_8683, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-15 02:10:15,526 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749507_8683 replica FinalizedReplica, blk_1073749507_8683, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749507 for deletion 2025-07-15 02:10:15,527 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749507_8683 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749507 2025-07-15 02:13:16,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073749510_8686 src: /192.168.158.9:35768 dest: /192.168.158.4:9866 2025-07-15 02:13:16,920 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35768, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1545162251_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749510_8686, duration(ns): 16832179 2025-07-15 02:13:16,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749510_8686, type=LAST_IN_PIPELINE terminating 2025-07-15 02:13:18,532 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749510_8686 replica FinalizedReplica, blk_1073749510_8686, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749510 for deletion 2025-07-15 02:13:18,533 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749510_8686 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749510 2025-07-15 02:15:16,800 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749512_8688 src: /192.168.158.1:43066 dest: /192.168.158.4:9866 2025-07-15 02:15:16,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43066, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1571859711_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749512_8688, duration(ns): 23543960 
2025-07-15 02:15:16,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749512_8688, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-15 02:15:18,536 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749512_8688 replica FinalizedReplica, blk_1073749512_8688, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749512 for deletion 2025-07-15 02:15:18,537 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749512_8688 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749512 2025-07-15 02:16:21,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749513_8689 src: /192.168.158.9:48530 dest: /192.168.158.4:9866 2025-07-15 02:16:21,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48530, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1667366690_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749513_8689, duration(ns): 15567181 2025-07-15 02:16:21,820 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749513_8689, type=LAST_IN_PIPELINE terminating 2025-07-15 02:16:24,537 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749513_8689 replica FinalizedReplica, blk_1073749513_8689, FINALIZED getNumBytes() = 56 
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749513 for deletion 2025-07-15 02:16:24,538 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749513_8689 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749513 2025-07-15 02:19:26,805 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749516_8692 src: /192.168.158.6:59110 dest: /192.168.158.4:9866 2025-07-15 02:19:26,823 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59110, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_209429700_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749516_8692, duration(ns): 15466922 2025-07-15 02:19:26,824 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749516_8692, type=LAST_IN_PIPELINE terminating 2025-07-15 02:19:27,543 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749516_8692 replica FinalizedReplica, blk_1073749516_8692, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749516 for deletion 2025-07-15 02:19:27,544 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749516_8692 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749516 2025-07-15 02:20:26,814 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749517_8693 src: /192.168.158.9:58926 dest: /192.168.158.4:9866 2025-07-15 02:20:26,840 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58926, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2082031955_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749517_8693, duration(ns): 21320262 2025-07-15 02:20:26,841 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749517_8693, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-15 02:20:27,545 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749517_8693 replica FinalizedReplica, blk_1073749517_8693, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749517 for deletion 2025-07-15 02:20:27,546 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749517_8693 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749517 2025-07-15 02:21:26,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749518_8694 src: /192.168.158.8:50226 dest: /192.168.158.4:9866 2025-07-15 02:21:26,837 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50226, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1224655904_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749518_8694, duration(ns): 15783131 2025-07-15 02:21:26,837 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749518_8694, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-15 02:21:30,548 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749518_8694 replica FinalizedReplica, blk_1073749518_8694, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749518 for deletion 2025-07-15 02:21:30,549 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749518_8694 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749518 2025-07-15 02:23:31,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749520_8696 src: /192.168.158.6:48662 dest: /192.168.158.4:9866 2025-07-15 02:23:31,838 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48662, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2045333335_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749520_8696, duration(ns): 18717956 2025-07-15 02:23:31,838 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749520_8696, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=1:[192.168.158.8:9866] terminating 2025-07-15 02:23:33,555 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749520_8696 replica FinalizedReplica, blk_1073749520_8696, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749520 for deletion 2025-07-15 02:23:33,556 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749520_8696 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749520 2025-07-15 02:24:31,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749521_8697 src: /192.168.158.8:44032 dest: /192.168.158.4:9866 2025-07-15 02:24:31,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44032, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_687478590_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749521_8697, duration(ns): 15144827 2025-07-15 02:24:31,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749521_8697, type=LAST_IN_PIPELINE terminating 2025-07-15 02:24:36,554 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749521_8697 replica FinalizedReplica, blk_1073749521_8697, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749521 for deletion 
2025-07-15 02:24:36,556 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749521_8697 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749521 2025-07-15 02:25:36,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749522_8698 src: /192.168.158.1:52296 dest: /192.168.158.4:9866 2025-07-15 02:25:36,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52296, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-674478795_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749522_8698, duration(ns): 21633262 2025-07-15 02:25:36,840 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749522_8698, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-15 02:25:42,557 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749522_8698 replica FinalizedReplica, blk_1073749522_8698, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749522 for deletion 2025-07-15 02:25:42,558 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749522_8698 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749522 2025-07-15 02:27:41,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073749524_8700 src: /192.168.158.1:47442 dest: /192.168.158.4:9866 2025-07-15 02:27:41,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47442, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1685604999_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749524_8700, duration(ns): 22313654 2025-07-15 02:27:41,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749524_8700, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-15 02:27:45,565 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749524_8700 replica FinalizedReplica, blk_1073749524_8700, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749524 for deletion 2025-07-15 02:27:45,566 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749524_8700 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749524 2025-07-15 02:28:41,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749525_8701 src: /192.168.158.6:45016 dest: /192.168.158.4:9866 2025-07-15 02:28:41,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45016, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1652504630_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073749525_8701, duration(ns): 15902868 2025-07-15 02:28:41,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749525_8701, type=LAST_IN_PIPELINE terminating 2025-07-15 02:28:42,568 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749525_8701 replica FinalizedReplica, blk_1073749525_8701, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749525 for deletion 2025-07-15 02:28:42,569 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749525_8701 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749525 2025-07-15 02:29:41,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749526_8702 src: /192.168.158.9:51328 dest: /192.168.158.4:9866 2025-07-15 02:29:41,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51328, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1901537053_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749526_8702, duration(ns): 19903987 2025-07-15 02:29:41,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749526_8702, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-15 02:29:45,571 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749526_8702 replica FinalizedReplica, 
blk_1073749526_8702, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749526 for deletion 2025-07-15 02:29:45,572 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749526_8702 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749526 2025-07-15 02:30:46,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749527_8703 src: /192.168.158.9:49770 dest: /192.168.158.4:9866 2025-07-15 02:30:46,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49770, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-212564292_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749527_8703, duration(ns): 16671192 2025-07-15 02:30:46,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749527_8703, type=LAST_IN_PIPELINE terminating 2025-07-15 02:30:48,574 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749527_8703 replica FinalizedReplica, blk_1073749527_8703, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749527 for deletion 2025-07-15 02:30:48,575 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749527_8703 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749527 2025-07-15 02:31:46,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749528_8704 src: /192.168.158.5:36584 dest: /192.168.158.4:9866 2025-07-15 02:31:46,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36584, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-775854679_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749528_8704, duration(ns): 15081393 2025-07-15 02:31:46,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749528_8704, type=LAST_IN_PIPELINE terminating 2025-07-15 02:31:48,577 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749528_8704 replica FinalizedReplica, blk_1073749528_8704, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749528 for deletion 2025-07-15 02:31:48,578 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749528_8704 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749528 2025-07-15 02:32:46,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749529_8705 src: /192.168.158.1:34648 dest: /192.168.158.4:9866 2025-07-15 02:32:46,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34648, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-163310666_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749529_8705, duration(ns): 26996971 2025-07-15 02:32:46,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749529_8705, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-15 02:32:51,579 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749529_8705 replica FinalizedReplica, blk_1073749529_8705, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749529 for deletion 2025-07-15 02:32:51,580 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749529_8705 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749529 2025-07-15 02:33:51,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749530_8706 src: /192.168.158.1:42932 dest: /192.168.158.4:9866 2025-07-15 02:33:51,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42932, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1376454657_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749530_8706, duration(ns): 24882183 2025-07-15 02:33:51,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749530_8706, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-15 02:33:57,581 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749530_8706 replica FinalizedReplica, blk_1073749530_8706, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749530 for deletion 2025-07-15 02:33:57,582 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749530_8706 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749530 2025-07-15 02:35:51,820 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749532_8708 src: /192.168.158.5:41774 dest: /192.168.158.4:9866 2025-07-15 02:35:51,840 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41774, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1063420614_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749532_8708, duration(ns): 17206125 2025-07-15 02:35:51,840 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749532_8708, type=LAST_IN_PIPELINE terminating 2025-07-15 02:35:57,584 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749532_8708 replica FinalizedReplica, blk_1073749532_8708, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749532 for deletion 2025-07-15 02:35:57,585 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749532_8708 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749532 2025-07-15 02:36:51,848 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749533_8709 src: /192.168.158.1:48794 dest: /192.168.158.4:9866 2025-07-15 02:36:51,879 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48794, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_29840152_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749533_8709, duration(ns): 23062878 2025-07-15 02:36:51,880 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749533_8709, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-15 02:36:57,584 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749533_8709 replica FinalizedReplica, blk_1073749533_8709, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749533 for deletion 2025-07-15 02:36:57,585 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749533_8709 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749533 2025-07-15 02:37:51,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749534_8710 src: /192.168.158.1:45366 dest: /192.168.158.4:9866 2025-07-15 02:37:51,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45366, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_268839679_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749534_8710, duration(ns): 24069135 2025-07-15 02:37:51,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749534_8710, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-15 02:37:57,589 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749534_8710 replica FinalizedReplica, blk_1073749534_8710, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749534 for deletion 2025-07-15 02:37:57,590 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749534_8710 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749534 2025-07-15 02:38:56,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749535_8711 src: /192.168.158.7:49352 dest: /192.168.158.4:9866 2025-07-15 02:38:56,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.7:49352, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-494533784_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749535_8711, duration(ns): 14336983 2025-07-15 02:38:56,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749535_8711, type=LAST_IN_PIPELINE terminating 2025-07-15 02:39:00,590 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749535_8711 replica FinalizedReplica, blk_1073749535_8711, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749535 for deletion 2025-07-15 02:39:00,591 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749535_8711 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749535 2025-07-15 02:39:56,823 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749536_8712 src: /192.168.158.5:56156 dest: /192.168.158.4:9866 2025-07-15 02:39:56,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56156, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-90414162_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749536_8712, duration(ns): 16900560 2025-07-15 02:39:56,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749536_8712, type=LAST_IN_PIPELINE terminating 2025-07-15 02:39:57,592 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749536_8712 replica FinalizedReplica, blk_1073749536_8712, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749536 for deletion 2025-07-15 02:39:57,594 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749536_8712 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749536 2025-07-15 02:41:01,838 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749537_8713 src: /192.168.158.7:55874 dest: /192.168.158.4:9866 2025-07-15 02:41:01,857 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55874, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_243847138_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749537_8713, duration(ns): 16028438 2025-07-15 02:41:01,857 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749537_8713, type=LAST_IN_PIPELINE terminating 2025-07-15 02:41:03,598 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749537_8713 replica FinalizedReplica, blk_1073749537_8713, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749537 for deletion 2025-07-15 02:41:03,599 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749537_8713 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749537 2025-07-15 02:44:01,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749540_8716 src: /192.168.158.9:44336 dest: /192.168.158.4:9866 2025-07-15 02:44:01,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-475701266_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749540_8716, duration(ns): 18453707 2025-07-15 02:44:01,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749540_8716, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-15 02:44:06,607 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749540_8716 replica FinalizedReplica, blk_1073749540_8716, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749540 for deletion 2025-07-15 02:44:06,608 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749540_8716 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749540 2025-07-15 02:45:06,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749541_8717 src: 
/192.168.158.1:58192 dest: /192.168.158.4:9866 2025-07-15 02:45:06,863 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58192, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1042414474_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749541_8717, duration(ns): 24238989 2025-07-15 02:45:06,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749541_8717, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-15 02:45:09,609 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749541_8717 replica FinalizedReplica, blk_1073749541_8717, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749541 for deletion 2025-07-15 02:45:09,610 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749541_8717 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749541 2025-07-15 02:47:11,835 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749543_8719 src: /192.168.158.8:48740 dest: /192.168.158.4:9866 2025-07-15 02:47:11,863 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48740, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1165412068_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749543_8719, duration(ns): 22040986 
2025-07-15 02:47:11,863 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749543_8719, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-15 02:47:15,610 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749543_8719 replica FinalizedReplica, blk_1073749543_8719, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749543 for deletion 2025-07-15 02:47:15,611 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749543_8719 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749543 2025-07-15 02:48:11,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749544_8720 src: /192.168.158.7:45638 dest: /192.168.158.4:9866 2025-07-15 02:48:11,852 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45638, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1468271802_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749544_8720, duration(ns): 18461761 2025-07-15 02:48:11,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749544_8720, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-15 02:48:12,611 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749544_8720 replica FinalizedReplica, blk_1073749544_8720, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749544 for deletion 2025-07-15 02:48:12,612 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749544_8720 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749544 2025-07-15 02:51:11,851 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749547_8723 src: /192.168.158.6:54146 dest: /192.168.158.4:9866 2025-07-15 02:51:11,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54146, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-894432385_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749547_8723, duration(ns): 15529387 2025-07-15 02:51:11,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749547_8723, type=LAST_IN_PIPELINE terminating 2025-07-15 02:51:15,615 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749547_8723 replica FinalizedReplica, blk_1073749547_8723, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749547 for deletion 2025-07-15 02:51:15,616 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749547_8723 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749547 2025-07-15 02:52:11,846 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749548_8724 src: /192.168.158.5:47440 dest: /192.168.158.4:9866 2025-07-15 02:52:11,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47440, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1273183729_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749548_8724, duration(ns): 19693950 2025-07-15 02:52:11,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749548_8724, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-15 02:52:12,620 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749548_8724 replica FinalizedReplica, blk_1073749548_8724, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749548 for deletion 2025-07-15 02:52:12,621 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749548_8724 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749548 2025-07-15 02:54:16,885 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749550_8726 src: /192.168.158.1:43412 dest: /192.168.158.4:9866 2025-07-15 02:54:16,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43412, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1620438344_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749550_8726, duration(ns): 24301742 2025-07-15 02:54:16,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749550_8726, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-15 02:54:21,621 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749550_8726 replica FinalizedReplica, blk_1073749550_8726, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749550 for deletion 2025-07-15 02:54:21,622 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749550_8726 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749550 2025-07-15 02:55:16,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749551_8727 src: /192.168.158.1:51470 dest: /192.168.158.4:9866 2025-07-15 02:55:16,891 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51470, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2116860766_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749551_8727, duration(ns): 22484988 2025-07-15 02:55:16,891 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749551_8727, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-15 02:55:21,622 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749551_8727 replica FinalizedReplica, blk_1073749551_8727, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749551 for deletion 2025-07-15 02:55:21,624 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749551_8727 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749551 2025-07-15 02:57:16,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749553_8729 src: /192.168.158.1:47622 dest: /192.168.158.4:9866 2025-07-15 02:57:16,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47622, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1235902832_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749553_8729, duration(ns): 23530444 2025-07-15 02:57:16,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749553_8729, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-15 02:57:18,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749553_8729 replica FinalizedReplica, blk_1073749553_8729, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749553 for deletion 2025-07-15 02:57:18,627 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749553_8729 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749553 2025-07-15 02:58:21,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749554_8730 src: /192.168.158.7:33864 dest: /192.168.158.4:9866 2025-07-15 02:58:21,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33864, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1892544869_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749554_8730, duration(ns): 15223717 2025-07-15 02:58:21,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749554_8730, type=LAST_IN_PIPELINE terminating 2025-07-15 02:58:27,627 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749554_8730 replica FinalizedReplica, blk_1073749554_8730, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749554 for deletion 2025-07-15 02:58:27,629 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749554_8730 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749554 2025-07-15 02:59:26,873 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749555_8731 src: /192.168.158.8:41532 dest: /192.168.158.4:9866 2025-07-15 02:59:26,897 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41532, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1756596895_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749555_8731, duration(ns): 18782373 2025-07-15 02:59:26,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749555_8731, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-15 02:59:30,630 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749555_8731 replica FinalizedReplica, blk_1073749555_8731, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749555 for deletion 2025-07-15 02:59:30,631 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749555_8731 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749555 2025-07-15 03:00:31,863 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749556_8732 src: /192.168.158.6:35496 dest: /192.168.158.4:9866 2025-07-15 03:00:31,887 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35496, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1357898901_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749556_8732, duration(ns): 19208610 2025-07-15 03:00:31,888 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749556_8732, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-15 03:00:33,632 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749556_8732 replica FinalizedReplica, blk_1073749556_8732, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749556 for deletion 2025-07-15 03:00:33,633 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749556_8732 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749556 2025-07-15 03:01:31,858 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749557_8733 src: /192.168.158.9:44312 dest: /192.168.158.4:9866 2025-07-15 03:01:31,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44312, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-132890984_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749557_8733, duration(ns): 20880038 2025-07-15 03:01:31,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749557_8733, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-15 03:01:36,635 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749557_8733 replica FinalizedReplica, blk_1073749557_8733, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749557 for deletion 2025-07-15 03:01:36,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749557_8733 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749557 2025-07-15 03:05:31,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749561_8737 src: /192.168.158.6:44390 dest: /192.168.158.4:9866 2025-07-15 03:05:31,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44390, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1254851763_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749561_8737, duration(ns): 16354400 2025-07-15 03:05:31,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749561_8737, type=LAST_IN_PIPELINE terminating 2025-07-15 03:05:33,640 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749561_8737 replica FinalizedReplica, blk_1073749561_8737, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749561 for deletion 2025-07-15 03:05:33,641 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749561_8737 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749561 2025-07-15 03:07:36,880 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749563_8739 src: /192.168.158.1:46840 dest: /192.168.158.4:9866 2025-07-15 03:07:36,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46840, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-792328755_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749563_8739, duration(ns): 23671326 2025-07-15 03:07:36,913 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749563_8739, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-15 03:07:42,645 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749563_8739 replica FinalizedReplica, blk_1073749563_8739, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749563 for deletion 2025-07-15 03:07:42,646 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749563_8739 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749563 2025-07-15 03:09:36,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073749565_8741 src: /192.168.158.1:34396 dest: /192.168.158.4:9866
2025-07-15 03:09:36,923 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34396, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-976359257_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749565_8741, duration(ns): 22235895
2025-07-15 03:09:36,923 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749565_8741, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-15 03:09:42,650 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749565_8741 replica FinalizedReplica, blk_1073749565_8741, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749565 for deletion
2025-07-15 03:09:42,651 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749565_8741 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749565
2025-07-15 03:10:36,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749566_8742 src: /192.168.158.1:38064 dest: /192.168.158.4:9866
2025-07-15 03:10:36,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38064, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1518490139_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid:
BP-1059995147-192.168.158.1-1752101929360:blk_1073749566_8742, duration(ns): 23568192
2025-07-15 03:10:36,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749566_8742, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-15 03:10:42,653 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749566_8742 replica FinalizedReplica, blk_1073749566_8742, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749566 for deletion
2025-07-15 03:10:42,654 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749566_8742 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749566
2025-07-15 03:12:36,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749568_8744 src: /192.168.158.9:54054 dest: /192.168.158.4:9866
2025-07-15 03:12:36,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54054, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2074244256_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749568_8744, duration(ns): 21943369
2025-07-15 03:12:36,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749568_8744, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 03:12:42,656 INFO
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749568_8744 replica FinalizedReplica, blk_1073749568_8744, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749568 for deletion
2025-07-15 03:12:42,657 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749568_8744 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749568
2025-07-15 03:17:41,909 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749573_8749 src: /192.168.158.1:36738 dest: /192.168.158.4:9866
2025-07-15 03:17:41,940 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36738, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-630738440_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749573_8749, duration(ns): 22802363
2025-07-15 03:17:41,940 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749573_8749, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-15 03:17:45,667 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749573_8749 replica FinalizedReplica, blk_1073749573_8749, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749573 for deletion 2025-07-15
03:17:45,668 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749573_8749 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749573
2025-07-15 03:18:41,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749574_8750 src: /192.168.158.1:45396 dest: /192.168.158.4:9866
2025-07-15 03:18:41,952 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45396, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-431795427_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749574_8750, duration(ns): 25669344
2025-07-15 03:18:41,952 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749574_8750, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-15 03:18:42,667 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749574_8750 replica FinalizedReplica, blk_1073749574_8750, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749574 for deletion
2025-07-15 03:18:42,668 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749574_8750 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749574
2025-07-15 03:20:46,922 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving
BP-1059995147-192.168.158.1-1752101929360:blk_1073749576_8752 src: /192.168.158.5:54880 dest: /192.168.158.4:9866
2025-07-15 03:20:46,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54880, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1148321244_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749576_8752, duration(ns): 18669392
2025-07-15 03:20:46,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749576_8752, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-15 03:20:51,671 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749576_8752 replica FinalizedReplica, blk_1073749576_8752, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749576 for deletion
2025-07-15 03:20:51,672 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749576_8752 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749576
2025-07-15 03:22:46,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749578_8754 src: /192.168.158.6:45852 dest: /192.168.158.4:9866
2025-07-15 03:22:46,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45852, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1579684182_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid:
BP-1059995147-192.168.158.1-1752101929360:blk_1073749578_8754, duration(ns): 20092702
2025-07-15 03:22:46,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749578_8754, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 03:22:48,675 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749578_8754 replica FinalizedReplica, blk_1073749578_8754, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749578 for deletion
2025-07-15 03:22:48,676 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749578_8754 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749578
2025-07-15 03:25:46,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749581_8757 src: /192.168.158.7:59752 dest: /192.168.158.4:9866
2025-07-15 03:25:46,940 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59752, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1278421469_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749581_8757, duration(ns): 17604594
2025-07-15 03:25:46,940 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749581_8757, type=LAST_IN_PIPELINE terminating
2025-07-15 03:25:48,681 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749581_8757 replica
FinalizedReplica, blk_1073749581_8757, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749581 for deletion
2025-07-15 03:25:48,682 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749581_8757 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749581
2025-07-15 03:27:56,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749583_8759 src: /192.168.158.6:54426 dest: /192.168.158.4:9866
2025-07-15 03:27:56,942 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54426, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_838762431_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749583_8759, duration(ns): 18014184
2025-07-15 03:27:56,942 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749583_8759, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-15 03:28:00,685 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749583_8759 replica FinalizedReplica, blk_1073749583_8759, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749583 for deletion
2025-07-15 03:28:00,686 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted
BP-1059995147-192.168.158.1-1752101929360 blk_1073749583_8759 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749583
2025-07-15 03:30:56,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749586_8762 src: /192.168.158.5:45530 dest: /192.168.158.4:9866
2025-07-15 03:30:56,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45530, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1149006160_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749586_8762, duration(ns): 16645828
2025-07-15 03:30:56,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749586_8762, type=LAST_IN_PIPELINE terminating
2025-07-15 03:30:57,689 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749586_8762 replica FinalizedReplica, blk_1073749586_8762, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749586 for deletion
2025-07-15 03:30:57,690 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749586_8762 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749586
2025-07-15 03:33:01,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749588_8764 src: /192.168.158.8:60434 dest: /192.168.158.4:9866
2025-07-15 03:33:01,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src:
/192.168.158.8:60434, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-224426425_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749588_8764, duration(ns): 18878215
2025-07-15 03:33:01,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749588_8764, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 03:33:03,691 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749588_8764 replica FinalizedReplica, blk_1073749588_8764, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749588 for deletion
2025-07-15 03:33:03,692 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749588_8764 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749588
2025-07-15 03:35:01,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749590_8766 src: /192.168.158.8:60656 dest: /192.168.158.4:9866
2025-07-15 03:35:01,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60656, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-218317272_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749590_8766, duration(ns): 20523957
2025-07-15 03:35:01,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749590_8766,
type=LAST_IN_PIPELINE terminating
2025-07-15 03:35:06,692 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749590_8766 replica FinalizedReplica, blk_1073749590_8766, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749590 for deletion
2025-07-15 03:35:06,694 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749590_8766 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749590
2025-07-15 03:39:11,949 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749594_8770 src: /192.168.158.1:58726 dest: /192.168.158.4:9866
2025-07-15 03:39:11,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58726, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1347389545_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749594_8770, duration(ns): 21802219
2025-07-15 03:39:11,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749594_8770, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-15 03:39:12,703 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749594_8770 replica FinalizedReplica, blk_1073749594_8770, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() =
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749594 for deletion
2025-07-15 03:39:12,704 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749594_8770 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749594
2025-07-15 03:41:11,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749596_8772 src: /192.168.158.1:44400 dest: /192.168.158.4:9866
2025-07-15 03:41:11,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44400, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_423462812_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749596_8772, duration(ns): 23125830
2025-07-15 03:41:11,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749596_8772, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-15 03:41:15,708 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749596_8772 replica FinalizedReplica, blk_1073749596_8772, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749596 for deletion
2025-07-15 03:41:15,709 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749596_8772 URI
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749596
2025-07-15 03:43:11,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749598_8774 src: /192.168.158.9:34924 dest: /192.168.158.4:9866
2025-07-15 03:43:12,010 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34924, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_217439062_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749598_8774, duration(ns): 17752777
2025-07-15 03:43:12,010 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749598_8774, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 03:43:12,709 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749598_8774 replica FinalizedReplica, blk_1073749598_8774, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749598 for deletion
2025-07-15 03:43:12,710 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749598_8774 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749598
2025-07-15 03:48:11,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749603_8779 src: /192.168.158.9:43176 dest: /192.168.158.4:9866
2025-07-15 03:48:11,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43176, dest:
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1717810298_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749603_8779, duration(ns): 19960597
2025-07-15 03:48:11,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749603_8779, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 03:48:15,716 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749603_8779 replica FinalizedReplica, blk_1073749603_8779, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749603 for deletion
2025-07-15 03:48:15,717 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749603_8779 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749603
2025-07-15 03:50:16,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749605_8781 src: /192.168.158.1:49824 dest: /192.168.158.4:9866
2025-07-15 03:50:16,993 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49824, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-425878907_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749605_8781, duration(ns): 22490774
2025-07-15 03:50:16,993 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749605_8781, type=HAS_DOWNSTREAM_IN_PIPELINE,
downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-15 03:50:21,719 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749605_8781 replica FinalizedReplica, blk_1073749605_8781, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749605 for deletion
2025-07-15 03:50:21,720 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749605_8781 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749605
2025-07-15 03:51:16,966 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749606_8782 src: /192.168.158.6:52442 dest: /192.168.158.4:9866
2025-07-15 03:51:16,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52442, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_734138355_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749606_8782, duration(ns): 22471046
2025-07-15 03:51:16,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749606_8782, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 03:51:24,720 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749606_8782 replica FinalizedReplica, blk_1073749606_8782, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() =
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749606 for deletion
2025-07-15 03:51:24,722 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749606_8782 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749606
2025-07-15 03:52:16,975 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749607_8783 src: /192.168.158.6:44618 dest: /192.168.158.4:9866
2025-07-15 03:52:16,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44618, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-989241593_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749607_8783, duration(ns): 17216063
2025-07-15 03:52:16,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749607_8783, type=LAST_IN_PIPELINE terminating
2025-07-15 03:52:24,723 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749607_8783 replica FinalizedReplica, blk_1073749607_8783, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749607 for deletion
2025-07-15 03:52:24,724 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749607_8783 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749607
2025-07-15 03:53:16,963 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749608_8784 src: /192.168.158.6:52452 dest: /192.168.158.4:9866
2025-07-15 03:53:16,981 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52452, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1980993666_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749608_8784, duration(ns): 15727232
2025-07-15 03:53:16,981 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749608_8784, type=LAST_IN_PIPELINE terminating
2025-07-15 03:53:21,723 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749608_8784 replica FinalizedReplica, blk_1073749608_8784, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749608 for deletion
2025-07-15 03:53:21,725 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749608_8784 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749608
2025-07-15 03:55:16,970 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749610_8786 src: /192.168.158.5:55600 dest: /192.168.158.4:9866
2025-07-15 03:55:16,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55600, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-232151208_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid:
BP-1059995147-192.168.158.1-1752101929360:blk_1073749610_8786, duration(ns): 17018658
2025-07-15 03:55:16,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749610_8786, type=LAST_IN_PIPELINE terminating
2025-07-15 03:55:21,726 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749610_8786 replica FinalizedReplica, blk_1073749610_8786, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749610 for deletion
2025-07-15 03:55:21,727 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749610_8786 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749610
2025-07-15 04:00:16,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749615_8791 src: /192.168.158.1:35036 dest: /192.168.158.4:9866
2025-07-15 04:00:17,009 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35036, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1390023179_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749615_8791, duration(ns): 22068737
2025-07-15 04:00:17,009 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749615_8791, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-15 04:00:21,739 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749615_8791
replica FinalizedReplica, blk_1073749615_8791, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749615 for deletion 2025-07-15 04:00:21,740 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749615_8791 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749615 2025-07-15 04:01:16,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749616_8792 src: /192.168.158.7:53550 dest: /192.168.158.4:9866 2025-07-15 04:01:17,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53550, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-781934852_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749616_8792, duration(ns): 18831883 2025-07-15 04:01:17,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749616_8792, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-15 04:01:24,744 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749616_8792 replica FinalizedReplica, blk_1073749616_8792, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749616 for deletion 2025-07-15 04:01:24,745 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073749616_8792 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749616 2025-07-15 04:02:16,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749617_8793 src: /192.168.158.5:45144 dest: /192.168.158.4:9866 2025-07-15 04:02:17,014 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45144, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-145382912_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749617_8793, duration(ns): 14856447 2025-07-15 04:02:17,014 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749617_8793, type=LAST_IN_PIPELINE terminating 2025-07-15 04:02:21,746 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749617_8793 replica FinalizedReplica, blk_1073749617_8793, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749617 for deletion 2025-07-15 04:02:21,747 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749617_8793 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749617 2025-07-15 04:04:16,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749619_8795 src: /192.168.158.6:51830 dest: /192.168.158.4:9866 2025-07-15 04:04:17,009 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.6:51830, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1248172480_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749619_8795, duration(ns): 16895921 2025-07-15 04:04:17,010 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749619_8795, type=LAST_IN_PIPELINE terminating 2025-07-15 04:04:21,752 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749619_8795 replica FinalizedReplica, blk_1073749619_8795, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749619 for deletion 2025-07-15 04:04:21,753 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749619_8795 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749619 2025-07-15 04:08:16,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749623_8799 src: /192.168.158.5:44886 dest: /192.168.158.4:9866 2025-07-15 04:08:17,022 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44886, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1649432095_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749623_8799, duration(ns): 17995929 2025-07-15 04:08:17,022 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749623_8799, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=1:[192.168.158.6:9866] terminating 2025-07-15 04:08:21,764 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749623_8799 replica FinalizedReplica, blk_1073749623_8799, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749623 for deletion 2025-07-15 04:08:21,765 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749623_8799 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749623 2025-07-15 04:10:17,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749625_8801 src: /192.168.158.5:50736 dest: /192.168.158.4:9866 2025-07-15 04:10:17,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50736, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-623093624_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749625_8801, duration(ns): 21373001 2025-07-15 04:10:17,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749625_8801, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-15 04:10:24,768 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749625_8801 replica FinalizedReplica, blk_1073749625_8801, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749625 for deletion 2025-07-15 04:10:24,769 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749625_8801 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749625 2025-07-15 04:11:17,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749626_8802 src: /192.168.158.6:59072 dest: /192.168.158.4:9866 2025-07-15 04:11:17,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59072, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_309074677_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749626_8802, duration(ns): 18769617 2025-07-15 04:11:17,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749626_8802, type=LAST_IN_PIPELINE terminating 2025-07-15 04:11:21,770 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749626_8802 replica FinalizedReplica, blk_1073749626_8802, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749626 for deletion 2025-07-15 04:11:21,771 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749626_8802 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749626 2025-07-15 04:20:27,020 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749635_8811 src: /192.168.158.7:38462 dest: /192.168.158.4:9866 2025-07-15 04:20:27,037 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38462, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-830573662_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749635_8811, duration(ns): 14544055 2025-07-15 04:20:27,037 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749635_8811, type=LAST_IN_PIPELINE terminating 2025-07-15 04:20:33,794 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749635_8811 replica FinalizedReplica, blk_1073749635_8811, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749635 for deletion 2025-07-15 04:20:33,795 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749635_8811 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749635 2025-07-15 04:21:32,026 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749636_8812 src: /192.168.158.6:40160 dest: /192.168.158.4:9866 2025-07-15 04:21:32,051 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40160, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-610120436_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073749636_8812, duration(ns): 19680126 2025-07-15 04:21:32,051 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749636_8812, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-15 04:21:39,799 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749636_8812 replica FinalizedReplica, blk_1073749636_8812, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749636 for deletion 2025-07-15 04:21:39,800 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749636_8812 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749636 2025-07-15 04:22:37,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749637_8813 src: /192.168.158.1:55766 dest: /192.168.158.4:9866 2025-07-15 04:22:37,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55766, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2085347329_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749637_8813, duration(ns): 21606482 2025-07-15 04:22:37,051 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749637_8813, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-15 04:22:42,801 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749637_8813 replica FinalizedReplica, blk_1073749637_8813, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749637 for deletion 2025-07-15 04:22:42,802 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749637_8813 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749637 2025-07-15 04:23:42,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749638_8814 src: /192.168.158.1:47422 dest: /192.168.158.4:9866 2025-07-15 04:23:42,058 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47422, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2675737_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749638_8814, duration(ns): 25081841 2025-07-15 04:23:42,058 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749638_8814, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-15 04:23:48,804 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749638_8814 replica FinalizedReplica, blk_1073749638_8814, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749638 for deletion 2025-07-15 
04:23:48,805 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749638_8814 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749638 2025-07-15 04:25:52,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749640_8816 src: /192.168.158.1:35098 dest: /192.168.158.4:9866 2025-07-15 04:25:52,098 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35098, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1643562669_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749640_8816, duration(ns): 22848221 2025-07-15 04:25:52,098 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749640_8816, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-15 04:25:57,811 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749640_8816 replica FinalizedReplica, blk_1073749640_8816, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749640 for deletion 2025-07-15 04:25:57,812 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749640_8816 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749640 2025-07-15 04:26:52,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073749641_8817 src: /192.168.158.8:48760 dest: /192.168.158.4:9866 2025-07-15 04:26:52,047 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48760, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1699725544_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749641_8817, duration(ns): 15657369 2025-07-15 04:26:52,048 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749641_8817, type=LAST_IN_PIPELINE terminating 2025-07-15 04:26:57,814 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749641_8817 replica FinalizedReplica, blk_1073749641_8817, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749641 for deletion 2025-07-15 04:26:57,815 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749641_8817 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749641 2025-07-15 04:27:57,021 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749642_8818 src: /192.168.158.1:35144 dest: /192.168.158.4:9866 2025-07-15 04:27:57,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35144, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1491929478_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749642_8818, duration(ns): 22272726 
2025-07-15 04:27:57,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749642_8818, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-15 04:28:00,813 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749642_8818 replica FinalizedReplica, blk_1073749642_8818, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749642 for deletion 2025-07-15 04:28:00,814 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749642_8818 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749642 2025-07-15 04:28:57,026 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749643_8819 src: /192.168.158.1:38908 dest: /192.168.158.4:9866 2025-07-15 04:28:57,057 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38908, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1540048809_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749643_8819, duration(ns): 22151249 2025-07-15 04:28:57,057 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749643_8819, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-15 04:29:00,816 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749643_8819 replica 
FinalizedReplica, blk_1073749643_8819, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749643 for deletion 2025-07-15 04:29:00,817 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749643_8819 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749643 2025-07-15 04:29:57,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749644_8820 src: /192.168.158.1:52416 dest: /192.168.158.4:9866 2025-07-15 04:29:57,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52416, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1412095295_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749644_8820, duration(ns): 22435639 2025-07-15 04:29:57,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749644_8820, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-15 04:30:00,820 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749644_8820 replica FinalizedReplica, blk_1073749644_8820, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749644 for deletion 2025-07-15 04:30:00,821 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073749644_8820 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749644 2025-07-15 04:30:57,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749645_8821 src: /192.168.158.8:44984 dest: /192.168.158.4:9866 2025-07-15 04:30:57,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44984, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-147623955_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749645_8821, duration(ns): 18175264 2025-07-15 04:30:57,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749645_8821, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-15 04:31:00,824 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749645_8821 replica FinalizedReplica, blk_1073749645_8821, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749645 for deletion 2025-07-15 04:31:00,825 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749645_8821 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749645 2025-07-15 04:32:57,032 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749647_8823 src: /192.168.158.6:54114 dest: /192.168.158.4:9866 2025-07-15 04:32:57,057 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54114, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1833590744_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749647_8823, duration(ns): 20149049 2025-07-15 04:32:57,058 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749647_8823, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-15 04:33:00,828 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749647_8823 replica FinalizedReplica, blk_1073749647_8823, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749647 for deletion 2025-07-15 04:33:00,829 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749647_8823 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749647 2025-07-15 04:35:02,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749649_8825 src: /192.168.158.1:46060 dest: /192.168.158.4:9866 2025-07-15 04:35:02,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46060, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1918622440_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749649_8825, duration(ns): 22186810 2025-07-15 04:35:02,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073749649_8825, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-15 04:35:06,832 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749649_8825 replica FinalizedReplica, blk_1073749649_8825, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749649 for deletion 2025-07-15 04:35:06,833 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749649_8825 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749649 2025-07-15 04:36:07,038 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749650_8826 src: /192.168.158.1:34294 dest: /192.168.158.4:9866 2025-07-15 04:36:07,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34294, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1164849541_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749650_8826, duration(ns): 25020407 2025-07-15 04:36:07,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749650_8826, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-15 04:36:12,833 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749650_8826 replica FinalizedReplica, blk_1073749650_8826, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 
56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749650 for deletion
2025-07-15 04:36:12,835 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749650_8826 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749650
2025-07-15 04:37:12,038 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749651_8827 src: /192.168.158.7:39138 dest: /192.168.158.4:9866
2025-07-15 04:37:12,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39138, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2136047878_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749651_8827, duration(ns): 18286385
2025-07-15 04:37:12,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749651_8827, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 04:37:18,835 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749651_8827 replica FinalizedReplica, blk_1073749651_8827, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749651 for deletion
2025-07-15 04:37:18,836 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749651_8827 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749651
2025-07-15 04:40:17,032 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749654_8830 src: /192.168.158.1:44016 dest: /192.168.158.4:9866
2025-07-15 04:40:17,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44016, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2117631789_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749654_8830, duration(ns): 22214356
2025-07-15 04:40:17,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749654_8830, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-15 04:40:24,842 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749654_8830 replica FinalizedReplica, blk_1073749654_8830, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749654 for deletion
2025-07-15 04:40:24,843 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749654_8830 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749654
2025-07-15 04:41:17,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749655_8831 src: /192.168.158.1:59676 dest: /192.168.158.4:9866
2025-07-15 04:41:17,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59676, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_129046007_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749655_8831, duration(ns): 22863889
2025-07-15 04:41:17,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749655_8831, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-15 04:41:24,842 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749655_8831 replica FinalizedReplica, blk_1073749655_8831, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749655 for deletion
2025-07-15 04:41:24,843 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749655_8831 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749655
2025-07-15 04:44:27,053 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749658_8834 src: /192.168.158.1:60980 dest: /192.168.158.4:9866
2025-07-15 04:44:27,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60980, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-664573401_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749658_8834, duration(ns): 22799660
2025-07-15 04:44:27,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749658_8834, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-15 04:44:33,847 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749658_8834 replica FinalizedReplica, blk_1073749658_8834, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749658 for deletion
2025-07-15 04:44:33,848 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749658_8834 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749658
2025-07-15 04:45:27,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749659_8835 src: /192.168.158.9:46674 dest: /192.168.158.4:9866
2025-07-15 04:45:27,073 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46674, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_257732444_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749659_8835, duration(ns): 18925986
2025-07-15 04:45:27,074 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749659_8835, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-15 04:45:30,848 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749659_8835 replica FinalizedReplica, blk_1073749659_8835, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749659 for deletion
2025-07-15 04:45:30,849 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749659_8835 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749659
2025-07-15 04:47:32,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749661_8837 src: /192.168.158.8:44664 dest: /192.168.158.4:9866
2025-07-15 04:47:32,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44664, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-964842941_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749661_8837, duration(ns): 21616790
2025-07-15 04:47:32,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749661_8837, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 04:47:36,851 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749661_8837 replica FinalizedReplica, blk_1073749661_8837, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749661 for deletion
2025-07-15 04:47:36,852 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749661_8837 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749661
2025-07-15 04:49:32,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749663_8839 src: /192.168.158.6:36576 dest: /192.168.158.4:9866
2025-07-15 04:49:32,086 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36576, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1430986198_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749663_8839, duration(ns): 19514042
2025-07-15 04:49:32,086 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749663_8839, type=LAST_IN_PIPELINE terminating
2025-07-15 04:49:36,853 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749663_8839 replica FinalizedReplica, blk_1073749663_8839, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749663 for deletion
2025-07-15 04:49:36,854 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749663_8839 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749663
2025-07-15 04:52:37,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749666_8842 src: /192.168.158.6:60768 dest: /192.168.158.4:9866
2025-07-15 04:52:37,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60768, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_173920483_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749666_8842, duration(ns): 19511947
2025-07-15 04:52:37,104 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749666_8842, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 04:52:45,861 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749666_8842 replica FinalizedReplica, blk_1073749666_8842, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749666 for deletion
2025-07-15 04:52:45,862 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749666_8842 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749666
2025-07-15 04:55:42,076 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749669_8845 src: /192.168.158.6:34238 dest: /192.168.158.4:9866
2025-07-15 04:55:42,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34238, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1642682479_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749669_8845, duration(ns): 15663789
2025-07-15 04:55:42,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749669_8845, type=LAST_IN_PIPELINE terminating
2025-07-15 04:55:45,868 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749669_8845 replica FinalizedReplica, blk_1073749669_8845, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749669 for deletion
2025-07-15 04:55:45,869 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749669_8845 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749669
2025-07-15 04:57:42,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749671_8847 src: /192.168.158.8:57306 dest: /192.168.158.4:9866
2025-07-15 04:57:42,098 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57306, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_163333720_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749671_8847, duration(ns): 15616304
2025-07-15 04:57:42,098 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749671_8847, type=LAST_IN_PIPELINE terminating
2025-07-15 04:57:45,870 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749671_8847 replica FinalizedReplica, blk_1073749671_8847, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749671 for deletion
2025-07-15 04:57:45,871 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749671_8847 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749671
2025-07-15 04:58:47,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749672_8848 src: /192.168.158.1:34732 dest: /192.168.158.4:9866
2025-07-15 04:58:47,100 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34732, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2117244551_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749672_8848, duration(ns): 22886705
2025-07-15 04:58:47,101 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749672_8848, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-15 04:58:51,872 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749672_8848 replica FinalizedReplica, blk_1073749672_8848, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749672 for deletion
2025-07-15 04:58:51,873 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749672_8848 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749672
2025-07-15 05:01:52,086 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749675_8851 src: /192.168.158.1:36988 dest: /192.168.158.4:9866
2025-07-15 05:01:52,117 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36988, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_550169141_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749675_8851, duration(ns): 22911758
2025-07-15 05:01:52,117 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749675_8851, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-15 05:02:00,877 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749675_8851 replica FinalizedReplica, blk_1073749675_8851, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749675 for deletion
2025-07-15 05:02:00,878 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749675_8851 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749675
2025-07-15 05:02:57,086 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749676_8852 src: /192.168.158.9:45934 dest: /192.168.158.4:9866
2025-07-15 05:02:57,110 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45934, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2101641764_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749676_8852, duration(ns): 18687391
2025-07-15 05:02:57,110 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749676_8852, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 05:03:00,878 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749676_8852 replica FinalizedReplica, blk_1073749676_8852, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749676 for deletion
2025-07-15 05:03:00,880 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749676_8852 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749676
2025-07-15 05:03:57,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749677_8853 src: /192.168.158.1:51456 dest: /192.168.158.4:9866
2025-07-15 05:03:57,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51456, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1206624156_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749677_8853, duration(ns): 20944781
2025-07-15 05:03:57,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749677_8853, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-15 05:04:03,881 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749677_8853 replica FinalizedReplica, blk_1073749677_8853, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749677 for deletion
2025-07-15 05:04:03,882 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749677_8853 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749677
2025-07-15 05:05:02,082 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749678_8854 src: /192.168.158.6:39232 dest: /192.168.158.4:9866
2025-07-15 05:05:02,110 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39232, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1002194573_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749678_8854, duration(ns): 21802426
2025-07-15 05:05:02,110 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749678_8854, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 05:05:09,884 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749678_8854 replica FinalizedReplica, blk_1073749678_8854, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749678 for deletion
2025-07-15 05:05:09,885 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749678_8854 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749678
2025-07-15 05:06:02,100 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749679_8855 src: /192.168.158.5:40088 dest: /192.168.158.4:9866
2025-07-15 05:06:02,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40088, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1346605744_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749679_8855, duration(ns): 19725439
2025-07-15 05:06:02,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749679_8855, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-15 05:06:09,888 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749679_8855 replica FinalizedReplica, blk_1073749679_8855, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749679 for deletion
2025-07-15 05:06:09,889 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749679_8855 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749679
2025-07-15 05:07:02,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749680_8856 src: /192.168.158.5:40798 dest: /192.168.158.4:9866
2025-07-15 05:07:02,117 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40798, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_477083186_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749680_8856, duration(ns): 17856091
2025-07-15 05:07:02,117 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749680_8856, type=LAST_IN_PIPELINE terminating
2025-07-15 05:07:06,891 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749680_8856 replica FinalizedReplica, blk_1073749680_8856, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749680 for deletion
2025-07-15 05:07:06,892 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749680_8856 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749680
2025-07-15 05:08:07,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749681_8857 src: /192.168.158.7:45160 dest: /192.168.158.4:9866
2025-07-15 05:08:07,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45160, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2027112543_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749681_8857, duration(ns): 15205852
2025-07-15 05:08:07,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749681_8857, type=LAST_IN_PIPELINE terminating
2025-07-15 05:08:12,893 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749681_8857 replica FinalizedReplica, blk_1073749681_8857, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749681 for deletion
2025-07-15 05:08:12,894 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749681_8857 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749681
2025-07-15 05:10:07,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749683_8859 src: /192.168.158.1:59432 dest: /192.168.158.4:9866
2025-07-15 05:10:07,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59432, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_601369333_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749683_8859, duration(ns): 22292771
2025-07-15 05:10:07,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749683_8859, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-15 05:10:12,894 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749683_8859 replica FinalizedReplica, blk_1073749683_8859, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749683 for deletion
2025-07-15 05:10:12,895 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749683_8859 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749683
2025-07-15 05:16:17,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749689_8865 src: /192.168.158.1:51522 dest: /192.168.158.4:9866
2025-07-15 05:16:17,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51522, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2033405381_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749689_8865, duration(ns): 23574650
2025-07-15 05:16:17,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749689_8865, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-15 05:16:21,912 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749689_8865 replica FinalizedReplica, blk_1073749689_8865, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749689 for deletion
2025-07-15 05:16:21,913 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749689_8865 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749689
2025-07-15 05:17:17,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749690_8866 src: /192.168.158.1:37682 dest: /192.168.158.4:9866
2025-07-15 05:17:17,152 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37682, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_590878710_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749690_8866, duration(ns): 18932695
2025-07-15 05:17:17,153 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749690_8866, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-15 05:17:24,916 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749690_8866 replica FinalizedReplica, blk_1073749690_8866, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749690 for deletion
2025-07-15 05:17:24,917 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749690_8866 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749690
2025-07-15 05:22:22,139 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749695_8871 src: /192.168.158.5:49438 dest: /192.168.158.4:9866
2025-07-15 05:22:22,158 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49438, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-314070741_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749695_8871, duration(ns): 16662379
2025-07-15 05:22:22,159 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749695_8871, type=LAST_IN_PIPELINE terminating
2025-07-15 05:22:30,934 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749695_8871 replica FinalizedReplica, blk_1073749695_8871, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749695 for deletion
2025-07-15 05:22:30,935 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749695_8871 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749695
2025-07-15 05:26:37,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749699_8875 src: /192.168.158.6:45602 dest: /192.168.158.4:9866
2025-07-15 05:26:37,143 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45602, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1391712409_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749699_8875, duration(ns): 19329984
2025-07-15 05:26:37,143 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749699_8875, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-15 05:26:45,945 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749699_8875 replica FinalizedReplica, blk_1073749699_8875, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749699 for deletion
2025-07-15 05:26:45,946 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749699_8875 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749699
2025-07-15 05:28:42,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749701_8877 src: /192.168.158.8:42096 dest: /192.168.158.4:9866
2025-07-15 05:28:42,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42096, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1096593558_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749701_8877, duration(ns): 20149892
2025-07-15 05:28:42,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749701_8877, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 05:28:45,948 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749701_8877 replica FinalizedReplica, blk_1073749701_8877, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749701 for deletion
2025-07-15 05:28:45,949 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749701_8877 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749701
2025-07-15 05:29:42,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749702_8878 src: /192.168.158.1:60380 dest: /192.168.158.4:9866
2025-07-15 05:29:42,168 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60380, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1250139869_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749702_8878, duration(ns): 22639887
2025-07-15 05:29:42,168 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749702_8878, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-15 05:29:45,953 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749702_8878 replica FinalizedReplica, blk_1073749702_8878, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749702 for deletion
2025-07-15 05:29:45,954 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749702_8878 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749702
2025-07-15 05:32:57,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749705_8881 src: /192.168.158.7:33828 dest: /192.168.158.4:9866
2025-07-15 05:32:57,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33828, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1213768705_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749705_8881, duration(ns): 16245970
2025-07-15 05:32:57,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749705_8881, type=LAST_IN_PIPELINE terminating
2025-07-15 05:33:00,960 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749705_8881 replica FinalizedReplica, blk_1073749705_8881, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749705 for deletion
2025-07-15 05:33:00,962 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749705_8881 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749705
2025-07-15 05:33:57,152 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749706_8882 src: /192.168.158.7:35846 dest: /192.168.158.4:9866
2025-07-15 05:33:57,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35846, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1747416004_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749706_8882, duration(ns): 22012025 2025-07-15 05:33:57,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749706_8882, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-15 05:34:00,966 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749706_8882 replica FinalizedReplica, blk_1073749706_8882, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749706 for deletion 2025-07-15 05:34:00,967 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749706_8882 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749706 2025-07-15 05:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0 2025-07-15 05:37:02,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749709_8885 src: /192.168.158.8:56766 dest: /192.168.158.4:9866 2025-07-15 05:37:02,140 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56766, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2051355555_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749709_8885, 
duration(ns): 15873889 2025-07-15 05:37:02,141 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749709_8885, type=LAST_IN_PIPELINE terminating 2025-07-15 05:37:09,974 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749709_8885 replica FinalizedReplica, blk_1073749709_8885, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749709 for deletion 2025-07-15 05:37:09,975 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749709_8885 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749709 2025-07-15 05:37:18,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f3b, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5. 
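The block report record above carries the report ID, storage and block counts, and generation/RPC timing in a fixed sentence. A minimal sketch of pulling those numbers out for monitoring; the regular expression below is an assumption derived from this log's wording, which may differ across Hadoop versions, not an official parsing API:

```python
import re

# Matches the "Successfully sent block report" message as it appears in this
# log; the wording is version-specific, so treat this pattern as an assumption.
REPORT_RE = re.compile(
    r"sent block report (?P<report_id>0x[0-9a-f]+), containing "
    r"(?P<storages>\d+) storage report\(s\).*?had (?P<blocks>\d+) total "
    r"blocks.*?took (?P<gen_ms>\d+) msec to generate and (?P<rpc_ms>\d+) msecs"
)

line = ("2025-07-15 05:37:18,978 INFO org.apache.hadoop.hdfs.server.datanode."
        "DataNode: Successfully sent block report 0x12cfd9a757d31f3b, "
        "containing 4 storage report(s), of which we sent 4. The reports had "
        "8 total blocks and used 1 RPC(s). This took 0 msec to generate and "
        "4 msecs for RPC and NN processing. Got back one command: "
        "FinalizeCommand/5.")

m = REPORT_RE.search(line)
# Extracted fields: block count and NameNode RPC latency in milliseconds.
print(m.group("blocks"), m.group("rpc_ms"))  # -> 8 4
```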
2025-07-15 05:37:18,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-15 05:39:02,123 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749711_8887 src: /192.168.158.1:59398 dest: /192.168.158.4:9866
2025-07-15 05:39:02,153 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59398, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-319132990_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749711_8887, duration(ns): 21402967
2025-07-15 05:39:02,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749711_8887, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-15 05:39:09,975 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749711_8887 replica FinalizedReplica, blk_1073749711_8887, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749711 for deletion
2025-07-15 05:39:09,976 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749711_8887 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749711
2025-07-15 05:41:02,136 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749713_8889 src: /192.168.158.5:34146 dest: /192.168.158.4:9866
2025-07-15 05:41:02,153 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34146, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1014330732_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749713_8889, duration(ns): 15604898
2025-07-15 05:41:02,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749713_8889, type=LAST_IN_PIPELINE terminating
2025-07-15 05:41:06,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749713_8889 replica FinalizedReplica, blk_1073749713_8889, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749713 for deletion
2025-07-15 05:41:06,983 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749713_8889 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749713
2025-07-15 05:43:07,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749715_8891 src: /192.168.158.7:44248 dest: /192.168.158.4:9866
2025-07-15 05:43:07,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44248, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-441180309_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749715_8891, duration(ns): 14903556
2025-07-15 05:43:07,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749715_8891, type=LAST_IN_PIPELINE terminating
2025-07-15 05:43:12,985 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749715_8891 replica FinalizedReplica, blk_1073749715_8891, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749715 for deletion
2025-07-15 05:43:12,986 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749715_8891 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749715
2025-07-15 05:45:12,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749717_8893 src: /192.168.158.1:54250 dest: /192.168.158.4:9866
2025-07-15 05:45:12,163 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54250, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1545499086_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749717_8893, duration(ns): 20977174
2025-07-15 05:45:12,163 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749717_8893, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-15 05:45:15,991 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749717_8893 replica FinalizedReplica, blk_1073749717_8893, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749717 for deletion
2025-07-15 05:45:15,992 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749717_8893 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749717
2025-07-15 05:49:22,168 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749721_8897 src: /192.168.158.1:55058 dest: /192.168.158.4:9866
2025-07-15 05:49:22,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55058, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2094430268_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749721_8897, duration(ns): 23115718
2025-07-15 05:49:22,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749721_8897, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-15 05:49:25,000 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749721_8897 replica FinalizedReplica, blk_1073749721_8897, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749721 for deletion
2025-07-15 05:49:25,001 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749721_8897 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749721
2025-07-15 05:50:22,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749722_8898 src: /192.168.158.1:47366 dest: /192.168.158.4:9866
2025-07-15 05:50:22,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47366, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1208307896_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749722_8898, duration(ns): 21030232
2025-07-15 05:50:22,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749722_8898, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-15 05:50:25,004 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749722_8898 replica FinalizedReplica, blk_1073749722_8898, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749722 for deletion
2025-07-15 05:50:25,005 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749722_8898 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749722
2025-07-15 05:55:32,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749727_8903 src: /192.168.158.5:52164 dest: /192.168.158.4:9866
2025-07-15 05:55:32,179 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52164, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-444199749_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749727_8903, duration(ns): 17221748
2025-07-15 05:55:32,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749727_8903, type=LAST_IN_PIPELINE terminating
2025-07-15 05:55:40,012 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749727_8903 replica FinalizedReplica, blk_1073749727_8903, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749727 for deletion
2025-07-15 05:55:40,013 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749727_8903 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749727
2025-07-15 05:58:37,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749730_8906 src: /192.168.158.9:57460 dest: /192.168.158.4:9866
2025-07-15 05:58:37,191 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57460, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1263372563_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749730_8906, duration(ns): 15279117
2025-07-15 05:58:37,192 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749730_8906, type=LAST_IN_PIPELINE terminating
2025-07-15 05:58:43,019 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749730_8906 replica FinalizedReplica, blk_1073749730_8906, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749730 for deletion
2025-07-15 05:58:43,021 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749730_8906 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749730
2025-07-15 05:59:37,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749731_8907 src: /192.168.158.7:46366 dest: /192.168.158.4:9866
2025-07-15 05:59:37,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46366, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1575881590_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749731_8907, duration(ns): 20947670
2025-07-15 05:59:37,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749731_8907, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 05:59:43,022 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749731_8907 replica FinalizedReplica, blk_1073749731_8907, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749731 for deletion
2025-07-15 05:59:43,024 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749731_8907 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749731
2025-07-15 06:01:37,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749733_8909 src: /192.168.158.5:48106 dest: /192.168.158.4:9866
2025-07-15 06:01:37,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48106, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1154978546_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749733_8909, duration(ns): 16352758
2025-07-15 06:01:37,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749733_8909, type=LAST_IN_PIPELINE terminating
2025-07-15 06:01:40,029 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749733_8909 replica FinalizedReplica, blk_1073749733_8909, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749733 for deletion
2025-07-15 06:01:40,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749733_8909 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749733
2025-07-15 06:03:42,186 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749735_8911 src: /192.168.158.7:58660 dest: /192.168.158.4:9866
2025-07-15 06:03:42,205 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58660, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_430462800_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749735_8911, duration(ns): 16056984
2025-07-15 06:03:42,205 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749735_8911, type=LAST_IN_PIPELINE terminating
2025-07-15 06:03:49,033 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749735_8911 replica FinalizedReplica, blk_1073749735_8911, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749735 for deletion
2025-07-15 06:03:49,034 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749735_8911 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749735
2025-07-15 06:04:42,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749736_8912 src: /192.168.158.5:43242 dest: /192.168.158.4:9866
2025-07-15 06:04:42,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:43242, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1455377814_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749736_8912, duration(ns): 15060137
2025-07-15 06:04:42,208 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749736_8912, type=LAST_IN_PIPELINE terminating
2025-07-15 06:04:46,036 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749736_8912 replica FinalizedReplica, blk_1073749736_8912, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749736 for deletion
2025-07-15 06:04:46,038 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749736_8912 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749736
2025-07-15 06:06:47,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749738_8914 src: /192.168.158.1:50630 dest: /192.168.158.4:9866
2025-07-15 06:06:47,208 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50630, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1310901009_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749738_8914, duration(ns): 22299325
2025-07-15 06:06:47,208 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749738_8914, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-15 06:06:55,042 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749738_8914 replica FinalizedReplica, blk_1073749738_8914, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749738 for deletion
2025-07-15 06:06:55,043 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749738_8914 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749738
2025-07-15 06:09:47,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749741_8917 src: /192.168.158.9:38640 dest: /192.168.158.4:9866
2025-07-15 06:09:47,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38640, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-658374700_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749741_8917, duration(ns): 16572241
2025-07-15 06:09:47,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749741_8917, type=LAST_IN_PIPELINE terminating
2025-07-15 06:09:52,051 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749741_8917 replica FinalizedReplica, blk_1073749741_8917, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749741 for deletion
2025-07-15 06:09:52,052 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749741_8917 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749741
2025-07-15 06:10:47,196 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749742_8918 src: /192.168.158.5:33780 dest: /192.168.158.4:9866
2025-07-15 06:10:47,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1103182381_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749742_8918, duration(ns): 15619107
2025-07-15 06:10:47,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749742_8918, type=LAST_IN_PIPELINE terminating
2025-07-15 06:10:55,054 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749742_8918 replica FinalizedReplica, blk_1073749742_8918, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749742 for deletion
2025-07-15 06:10:55,055 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749742_8918 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749742
2025-07-15 06:11:52,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749743_8919 src: /192.168.158.6:58236 dest: /192.168.158.4:9866
2025-07-15 06:11:52,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58236, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1117508663_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749743_8919, duration(ns): 20853973
2025-07-15 06:11:52,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749743_8919, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 06:11:55,058 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749743_8919 replica FinalizedReplica, blk_1073749743_8919, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749743 for deletion
2025-07-15 06:11:55,059 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749743_8919 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749743
2025-07-15 06:12:52,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749744_8920 src: /192.168.158.7:39182 dest: /192.168.158.4:9866
2025-07-15 06:12:52,272 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39182, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-734493013_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749744_8920, duration(ns): 18612997
2025-07-15 06:12:52,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749744_8920, type=LAST_IN_PIPELINE terminating
2025-07-15 06:12:58,059 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749744_8920 replica FinalizedReplica, blk_1073749744_8920, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749744 for deletion
2025-07-15 06:12:58,060 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749744_8920 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749744
2025-07-15 06:13:52,223 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749745_8921 src: /192.168.158.7:41614 dest: /192.168.158.4:9866
2025-07-15 06:13:52,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41614, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1994639720_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749745_8921, duration(ns): 21983519
2025-07-15 06:13:52,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749745_8921, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 06:13:58,061 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749745_8921 replica FinalizedReplica, blk_1073749745_8921, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749745 for deletion
2025-07-15 06:13:58,062 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749745_8921 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749745
2025-07-15 06:14:52,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749746_8922 src: /192.168.158.1:39066 dest: /192.168.158.4:9866
2025-07-15 06:14:52,228 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39066, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-413307247_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749746_8922, duration(ns): 22372183
2025-07-15 06:14:52,228 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749746_8922, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-15 06:14:55,061 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749746_8922 replica FinalizedReplica, blk_1073749746_8922, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749746 for deletion
2025-07-15 06:14:55,062 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749746_8922 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749746
2025-07-15 06:17:02,213 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749748_8924 src: /192.168.158.1:43234 dest: /192.168.158.4:9866
2025-07-15 06:17:02,244 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43234, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_443988978_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749748_8924, duration(ns): 23108054 2025-07-15 06:17:02,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749748_8924, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-15 06:17:07,065 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749748_8924 replica FinalizedReplica, blk_1073749748_8924, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749748 for deletion 2025-07-15 06:17:07,066 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749748_8924 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749748 2025-07-15 06:25:12,215 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749756_8932 src: /192.168.158.1:48100 dest: /192.168.158.4:9866 2025-07-15 06:25:12,247 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48100, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_586107274_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749756_8932, duration(ns): 23311304 2025-07-15 06:25:12,248 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749756_8932, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-15 06:25:16,084 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749756_8932 replica FinalizedReplica, blk_1073749756_8932, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749756 for deletion 2025-07-15 06:25:16,085 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749756_8932 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749756 2025-07-15 06:26:12,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749757_8933 src: /192.168.158.6:39754 dest: /192.168.158.4:9866 2025-07-15 06:26:12,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39754, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_257517486_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749757_8933, duration(ns): 16183712 2025-07-15 06:26:12,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749757_8933, type=LAST_IN_PIPELINE terminating 2025-07-15 06:26:19,087 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749757_8933 replica FinalizedReplica, blk_1073749757_8933, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749757 for deletion 2025-07-15 06:26:19,088 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749757_8933 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749757 2025-07-15 06:28:17,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749759_8935 src: /192.168.158.6:46168 dest: /192.168.158.4:9866 2025-07-15 06:28:17,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46168, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1804904475_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749759_8935, duration(ns): 19214285 2025-07-15 06:28:17,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749759_8935, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-15 06:28:22,091 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749759_8935 replica FinalizedReplica, blk_1073749759_8935, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749759 for deletion 2025-07-15 06:28:22,092 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749759_8935 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073749759 
2025-07-15 06:30:17,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749761_8937 src: /192.168.158.7:37004 dest: /192.168.158.4:9866 2025-07-15 06:30:17,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37004, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1600505521_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749761_8937, duration(ns): 15508070 2025-07-15 06:30:17,238 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749761_8937, type=LAST_IN_PIPELINE terminating 2025-07-15 06:30:25,097 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749761_8937 replica FinalizedReplica, blk_1073749761_8937, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749761 for deletion 2025-07-15 06:30:25,098 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749761_8937 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749761 2025-07-15 06:31:17,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749762_8938 src: /192.168.158.7:41880 dest: /192.168.158.4:9866 2025-07-15 06:31:17,240 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41880, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_498643601_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073749762_8938, duration(ns): 17889852 2025-07-15 06:31:17,240 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749762_8938, type=LAST_IN_PIPELINE terminating 2025-07-15 06:31:25,101 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749762_8938 replica FinalizedReplica, blk_1073749762_8938, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749762 for deletion 2025-07-15 06:31:25,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749762_8938 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749762 2025-07-15 06:32:17,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749763_8939 src: /192.168.158.8:57626 dest: /192.168.158.4:9866 2025-07-15 06:32:17,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57626, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1925238112_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749763_8939, duration(ns): 20963085 2025-07-15 06:32:17,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749763_8939, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-15 06:32:22,101 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749763_8939 replica FinalizedReplica, 
blk_1073749763_8939, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749763 for deletion 2025-07-15 06:32:22,103 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749763_8939 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749763 2025-07-15 06:34:17,225 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749765_8941 src: /192.168.158.7:48170 dest: /192.168.158.4:9866 2025-07-15 06:34:17,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48170, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-725601188_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749765_8941, duration(ns): 19478627 2025-07-15 06:34:17,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749765_8941, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-15 06:34:25,107 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749765_8941 replica FinalizedReplica, blk_1073749765_8941, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749765 for deletion 2025-07-15 06:34:25,108 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 
blk_1073749765_8941 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749765 2025-07-15 06:36:17,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749767_8943 src: /192.168.158.8:59378 dest: /192.168.158.4:9866 2025-07-15 06:36:17,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59378, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_99053375_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749767_8943, duration(ns): 15445363 2025-07-15 06:36:17,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749767_8943, type=LAST_IN_PIPELINE terminating 2025-07-15 06:36:22,109 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749767_8943 replica FinalizedReplica, blk_1073749767_8943, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749767 for deletion 2025-07-15 06:36:22,110 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749767_8943 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749767 2025-07-15 06:39:22,222 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749770_8946 src: /192.168.158.8:53616 dest: /192.168.158.4:9866 2025-07-15 06:39:22,240 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53616, dest: /192.168.158.4:9866, 
bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2032111153_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749770_8946, duration(ns): 16157146 2025-07-15 06:39:22,241 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749770_8946, type=LAST_IN_PIPELINE terminating 2025-07-15 06:39:25,116 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749770_8946 replica FinalizedReplica, blk_1073749770_8946, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749770 for deletion 2025-07-15 06:39:25,117 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749770_8946 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749770 2025-07-15 06:40:22,223 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749771_8947 src: /192.168.158.1:39156 dest: /192.168.158.4:9866 2025-07-15 06:40:22,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39156, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1757483580_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749771_8947, duration(ns): 22174354 2025-07-15 06:40:22,255 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749771_8947, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 
2025-07-15 06:40:25,117 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749771_8947 replica FinalizedReplica, blk_1073749771_8947, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749771 for deletion 2025-07-15 06:40:25,119 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749771_8947 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749771 2025-07-15 06:41:27,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749772_8948 src: /192.168.158.9:56190 dest: /192.168.158.4:9866 2025-07-15 06:41:27,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56190, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1472706491_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749772_8948, duration(ns): 20075732 2025-07-15 06:41:27,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749772_8948, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-15 06:41:31,119 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749772_8948 replica FinalizedReplica, blk_1073749772_8948, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749772 for deletion 
2025-07-15 06:41:31,120 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749772_8948 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749772 2025-07-15 06:43:27,238 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749774_8950 src: /192.168.158.1:45022 dest: /192.168.158.4:9866 2025-07-15 06:43:27,270 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45022, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1410869123_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749774_8950, duration(ns): 23503947 2025-07-15 06:43:27,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749774_8950, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-15 06:43:34,121 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749774_8950 replica FinalizedReplica, blk_1073749774_8950, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749774 for deletion 2025-07-15 06:43:34,122 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749774_8950 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749774 2025-07-15 06:44:32,244 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073749775_8951 src: /192.168.158.7:54062 dest: /192.168.158.4:9866 2025-07-15 06:44:32,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54062, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-410497437_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749775_8951, duration(ns): 18073850 2025-07-15 06:44:32,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749775_8951, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-15 06:44:37,122 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749775_8951 replica FinalizedReplica, blk_1073749775_8951, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749775 for deletion 2025-07-15 06:44:37,123 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749775_8951 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749775 2025-07-15 06:47:32,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749778_8954 src: /192.168.158.5:44356 dest: /192.168.158.4:9866 2025-07-15 06:47:32,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44356, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1755303757_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073749778_8954, duration(ns): 14849797 2025-07-15 06:47:32,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749778_8954, type=LAST_IN_PIPELINE terminating 2025-07-15 06:47:37,126 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749778_8954 replica FinalizedReplica, blk_1073749778_8954, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749778 for deletion 2025-07-15 06:47:37,127 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749778_8954 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749778 2025-07-15 06:52:37,253 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749783_8959 src: /192.168.158.5:33632 dest: /192.168.158.4:9866 2025-07-15 06:52:37,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33632, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1390710923_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749783_8959, duration(ns): 18188973 2025-07-15 06:52:37,274 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749783_8959, type=LAST_IN_PIPELINE terminating 2025-07-15 06:52:43,142 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749783_8959 replica FinalizedReplica, blk_1073749783_8959, FINALIZED getNumBytes() 
= 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749783 for deletion 2025-07-15 06:52:43,143 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749783_8959 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749783 2025-07-15 06:53:37,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749784_8960 src: /192.168.158.1:58520 dest: /192.168.158.4:9866 2025-07-15 06:53:37,298 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58520, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-38497165_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749784_8960, duration(ns): 24438491 2025-07-15 06:53:37,299 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749784_8960, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-15 06:53:43,145 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749784_8960 replica FinalizedReplica, blk_1073749784_8960, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749784 for deletion 2025-07-15 06:53:43,146 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749784_8960 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749784
2025-07-15 06:54:42,249 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749785_8961 src: /192.168.158.9:37184 dest: /192.168.158.4:9866
2025-07-15 06:54:42,275 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37184, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_941202024_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749785_8961, duration(ns): 19549027
2025-07-15 06:54:42,276 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749785_8961, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-15 06:54:46,147 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749785_8961 replica FinalizedReplica, blk_1073749785_8961, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749785 for deletion
2025-07-15 06:54:46,148 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749785_8961 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749785
2025-07-15 06:56:47,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749787_8963 src: /192.168.158.1:40522 dest: /192.168.158.4:9866
2025-07-15 06:56:47,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40522, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-255817067_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749787_8963, duration(ns): 23437993
2025-07-15 06:56:47,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749787_8963, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-15 06:56:52,148 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749787_8963 replica FinalizedReplica, blk_1073749787_8963, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749787 for deletion
2025-07-15 06:56:52,149 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749787_8963 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749787
2025-07-15 06:57:52,274 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749788_8964 src: /192.168.158.9:53038 dest: /192.168.158.4:9866
2025-07-15 06:57:52,300 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53038, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1511958940_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749788_8964, duration(ns): 20189704
2025-07-15 06:57:52,300 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749788_8964, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-15 06:57:55,149 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749788_8964 replica FinalizedReplica, blk_1073749788_8964, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749788 for deletion
2025-07-15 06:57:55,150 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749788_8964 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749788
2025-07-15 06:59:52,276 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749790_8966 src: /192.168.158.7:37480 dest: /192.168.158.4:9866
2025-07-15 06:59:52,296 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37480, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_808086704_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749790_8966, duration(ns): 17187302
2025-07-15 06:59:52,296 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749790_8966, type=LAST_IN_PIPELINE terminating
2025-07-15 06:59:55,152 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749790_8966 replica FinalizedReplica, blk_1073749790_8966, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749790 for deletion
2025-07-15 06:59:55,153 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749790_8966 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749790
2025-07-15 07:00:57,272 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749791_8967 src: /192.168.158.6:37180 dest: /192.168.158.4:9866
2025-07-15 07:00:57,297 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37180, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2108976444_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749791_8967, duration(ns): 19266410
2025-07-15 07:00:57,297 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749791_8967, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 07:01:04,153 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749791_8967 replica FinalizedReplica, blk_1073749791_8967, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749791 for deletion
2025-07-15 07:01:04,154 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749791_8967 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749791
2025-07-15 07:07:02,278 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749797_8973 src: /192.168.158.1:44560 dest: /192.168.158.4:9866
2025-07-15 07:07:02,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44560, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_678074329_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749797_8973, duration(ns): 23084286
2025-07-15 07:07:02,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749797_8973, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-15 07:07:10,162 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749797_8973 replica FinalizedReplica, blk_1073749797_8973, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749797 for deletion
2025-07-15 07:07:10,163 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749797_8973 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749797
2025-07-15 07:08:02,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749798_8974 src: /192.168.158.6:49014 dest: /192.168.158.4:9866
2025-07-15 07:08:02,304 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49014, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_230572603_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749798_8974, duration(ns): 18333949
2025-07-15 07:08:02,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749798_8974, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-15 07:08:10,163 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749798_8974 replica FinalizedReplica, blk_1073749798_8974, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749798 for deletion
2025-07-15 07:08:10,164 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749798_8974 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749798
2025-07-15 07:09:07,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749799_8975 src: /192.168.158.6:56434 dest: /192.168.158.4:9866
2025-07-15 07:09:07,299 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56434, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1608655964_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749799_8975, duration(ns): 14936208
2025-07-15 07:09:07,299 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749799_8975, type=LAST_IN_PIPELINE terminating
2025-07-15 07:09:13,163 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749799_8975 replica FinalizedReplica, blk_1073749799_8975, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749799 for deletion
2025-07-15 07:09:13,164 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749799_8975 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749799
2025-07-15 07:10:07,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749800_8976 src: /192.168.158.1:49842 dest: /192.168.158.4:9866
2025-07-15 07:10:07,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49842, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1835481704_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749800_8976, duration(ns): 22348116
2025-07-15 07:10:07,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749800_8976, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-15 07:10:10,166 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749800_8976 replica FinalizedReplica, blk_1073749800_8976, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749800 for deletion
2025-07-15 07:10:10,167 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749800_8976 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749800
2025-07-15 07:14:17,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749804_8980 src: /192.168.158.1:33034 dest: /192.168.158.4:9866
2025-07-15 07:14:17,301 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33034, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_894993855_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749804_8980, duration(ns): 23183113
2025-07-15 07:14:17,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749804_8980, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-15 07:14:22,172 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749804_8980 replica FinalizedReplica, blk_1073749804_8980, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749804 for deletion
2025-07-15 07:14:22,173 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749804_8980 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749804
2025-07-15 07:15:17,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749805_8981 src: /192.168.158.8:33938 dest: /192.168.158.4:9866
2025-07-15 07:15:17,300 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33938, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_86420249_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749805_8981, duration(ns): 15486409
2025-07-15 07:15:17,300 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749805_8981, type=LAST_IN_PIPELINE terminating
2025-07-15 07:15:22,172 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749805_8981 replica FinalizedReplica, blk_1073749805_8981, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749805 for deletion
2025-07-15 07:15:22,173 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749805_8981 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749805
2025-07-15 07:17:17,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749807_8983 src: /192.168.158.5:54458 dest: /192.168.158.4:9866
2025-07-15 07:17:17,307 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54458, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1422143681_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749807_8983, duration(ns): 16498201
2025-07-15 07:17:17,307 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749807_8983, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 07:17:22,175 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749807_8983 replica FinalizedReplica, blk_1073749807_8983, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749807 for deletion
2025-07-15 07:17:22,176 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749807_8983 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749807
2025-07-15 07:18:17,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749808_8984 src: /192.168.158.7:55752 dest: /192.168.158.4:9866
2025-07-15 07:18:17,307 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55752, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2047602466_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749808_8984, duration(ns): 20535301
2025-07-15 07:18:17,307 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749808_8984, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 07:18:22,176 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749808_8984 replica FinalizedReplica, blk_1073749808_8984, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749808 for deletion
2025-07-15 07:18:22,178 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749808_8984 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749808
2025-07-15 07:19:22,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749809_8985 src: /192.168.158.5:41524 dest: /192.168.158.4:9866
2025-07-15 07:19:22,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41524, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-928350613_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749809_8985, duration(ns): 17787409
2025-07-15 07:19:22,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749809_8985, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 07:19:25,176 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749809_8985 replica FinalizedReplica, blk_1073749809_8985, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749809 for deletion
2025-07-15 07:19:25,177 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749809_8985 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749809
2025-07-15 07:21:27,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749811_8987 src: /192.168.158.9:54984 dest: /192.168.158.4:9866
2025-07-15 07:21:27,326 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54984, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_164043461_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749811_8987, duration(ns): 14847300
2025-07-15 07:21:27,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749811_8987, type=LAST_IN_PIPELINE terminating
2025-07-15 07:21:31,179 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749811_8987 replica FinalizedReplica, blk_1073749811_8987, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749811 for deletion
2025-07-15 07:21:31,180 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749811_8987 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749811
2025-07-15 07:23:32,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749813_8989 src: /192.168.158.8:36242 dest: /192.168.158.4:9866
2025-07-15 07:23:32,315 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36242, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1747578328_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749813_8989, duration(ns): 18878272
2025-07-15 07:23:32,315 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749813_8989, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 07:23:37,182 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749813_8989 replica FinalizedReplica, blk_1073749813_8989, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749813 for deletion
2025-07-15 07:23:37,183 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749813_8989 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749813
2025-07-15 07:24:37,298 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749814_8990 src: /192.168.158.5:50874 dest: /192.168.158.4:9866
2025-07-15 07:24:37,314 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50874, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1627933534_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749814_8990, duration(ns): 13724483
2025-07-15 07:24:37,314 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749814_8990, type=LAST_IN_PIPELINE terminating
2025-07-15 07:24:40,183 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749814_8990 replica FinalizedReplica, blk_1073749814_8990, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749814 for deletion
2025-07-15 07:24:40,184 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749814_8990 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749814
2025-07-15 07:26:37,304 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749816_8992 src: /192.168.158.5:54464 dest: /192.168.158.4:9866
2025-07-15 07:26:37,323 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54464, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1089661493_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749816_8992, duration(ns): 16409605
2025-07-15 07:26:37,323 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749816_8992, type=LAST_IN_PIPELINE terminating
2025-07-15 07:26:40,187 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749816_8992 replica FinalizedReplica, blk_1073749816_8992, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749816 for deletion
2025-07-15 07:26:40,188 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749816_8992 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749816
2025-07-15 07:27:37,312 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749817_8993 src: /192.168.158.9:55046 dest: /192.168.158.4:9866
2025-07-15 07:27:37,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55046, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1880064158_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749817_8993, duration(ns): 17312070
2025-07-15 07:27:37,335 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749817_8993, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 07:27:40,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749817_8993 replica FinalizedReplica, blk_1073749817_8993, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749817 for deletion
2025-07-15 07:27:40,188 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749817_8993 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749817
2025-07-15 07:30:42,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749820_8996 src: /192.168.158.9:34192 dest: /192.168.158.4:9866
2025-07-15 07:30:42,323 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34192, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1592416047_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749820_8996, duration(ns): 16142860
2025-07-15 07:30:42,324 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749820_8996, type=LAST_IN_PIPELINE terminating
2025-07-15 07:30:46,191 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749820_8996 replica FinalizedReplica, blk_1073749820_8996, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749820 for deletion
2025-07-15 07:30:46,192 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749820_8996 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749820
2025-07-15 07:32:42,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749822_8998 src: /192.168.158.6:57628 dest: /192.168.158.4:9866
2025-07-15 07:32:42,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57628, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_309336083_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749822_8998, duration(ns): 16270286
2025-07-15 07:32:42,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749822_8998, type=LAST_IN_PIPELINE terminating
2025-07-15 07:32:49,194 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749822_8998 replica FinalizedReplica, blk_1073749822_8998, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749822 for deletion
2025-07-15 07:32:49,195 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749822_8998 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749822
2025-07-15 07:37:57,337 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749827_9003 src: /192.168.158.1:55524 dest: /192.168.158.4:9866
2025-07-15 07:37:57,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55524, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_949153716_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749827_9003, duration(ns): 25370941
2025-07-15 07:37:57,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749827_9003, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-15 07:38:01,202 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749827_9003 replica FinalizedReplica, blk_1073749827_9003, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749827 for deletion
2025-07-15 07:38:01,203 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749827_9003 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749827
2025-07-15 07:39:02,353 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749828_9004 src: /192.168.158.7:45538 dest: /192.168.158.4:9866
2025-07-15 07:39:02,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45538, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1667092666_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749828_9004, duration(ns): 21055799
2025-07-15 07:39:02,380 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749828_9004, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 07:39:07,202 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749828_9004 replica FinalizedReplica, blk_1073749828_9004, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749828 for deletion
2025-07-15 07:39:07,203 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749828_9004 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749828
2025-07-15 07:40:02,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749829_9005 src: /192.168.158.9:54572 dest: /192.168.158.4:9866
2025-07-15 07:40:02,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54572, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2101954811_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749829_9005, duration(ns): 21099093
2025-07-15 07:40:02,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749829_9005, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 07:40:07,205 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749829_9005 replica FinalizedReplica, blk_1073749829_9005, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749829 for deletion
2025-07-15 07:40:07,206 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749829_9005 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749829
2025-07-15 07:43:12,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749832_9008 src: /192.168.158.5:52132 dest: /192.168.158.4:9866
2025-07-15 07:43:12,328 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52132, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-251772277_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749832_9008, duration(ns): 16928378
2025-07-15 07:43:12,328 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749832_9008, type=LAST_IN_PIPELINE terminating
2025-07-15 07:43:19,208 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749832_9008 replica FinalizedReplica, blk_1073749832_9008, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749832 for deletion
2025-07-15 07:43:19,209 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749832_9008 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749832
2025-07-15 07:45:12,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749834_9010 src: /192.168.158.8:50332 dest: /192.168.158.4:9866
2025-07-15 07:45:12,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50332, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2138269594_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749834_9010, duration(ns): 16085540
2025-07-15 07:45:12,373 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749834_9010, type=LAST_IN_PIPELINE terminating
2025-07-15 07:45:16,211 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749834_9010 replica FinalizedReplica, blk_1073749834_9010, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749834 for deletion
2025-07-15 07:45:16,212 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749834_9010 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749834
2025-07-15 07:49:17,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749838_9014 src: /192.168.158.1:37288 dest: /192.168.158.4:9866
2025-07-15 07:49:17,376 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37288, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-884083882_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749838_9014, duration(ns): 22135607
2025-07-15 07:49:17,377 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749838_9014, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-15 07:49:22,216 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749838_9014 replica FinalizedReplica, blk_1073749838_9014, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749838 for deletion
2025-07-15
07:49:22,218 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749838_9014 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749838 2025-07-15 07:50:17,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749839_9015 src: /192.168.158.1:48954 dest: /192.168.158.4:9866 2025-07-15 07:50:17,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48954, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-276409360_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749839_9015, duration(ns): 24018385 2025-07-15 07:50:17,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749839_9015, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-15 07:50:25,217 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749839_9015 replica FinalizedReplica, blk_1073749839_9015, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749839 for deletion 2025-07-15 07:50:25,218 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749839_9015 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749839 2025-07-15 07:52:17,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073749841_9017 src: /192.168.158.8:40218 dest: /192.168.158.4:9866 2025-07-15 07:52:17,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40218, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1266072058_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749841_9017, duration(ns): 18850300 2025-07-15 07:52:17,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749841_9017, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-15 07:52:22,219 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749841_9017 replica FinalizedReplica, blk_1073749841_9017, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749841 for deletion 2025-07-15 07:52:22,220 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749841_9017 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749841 2025-07-15 07:54:17,362 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749843_9019 src: /192.168.158.1:60722 dest: /192.168.158.4:9866 2025-07-15 07:54:17,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60722, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1077500312_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073749843_9019, duration(ns): 24107054 2025-07-15 07:54:17,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749843_9019, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-15 07:54:22,225 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749843_9019 replica FinalizedReplica, blk_1073749843_9019, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749843 for deletion 2025-07-15 07:54:22,226 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749843_9019 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749843 2025-07-15 07:56:22,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749845_9021 src: /192.168.158.5:44740 dest: /192.168.158.4:9866 2025-07-15 07:56:22,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44740, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_71665129_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749845_9021, duration(ns): 18058381 2025-07-15 07:56:22,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749845_9021, type=LAST_IN_PIPELINE terminating 2025-07-15 07:56:28,227 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749845_9021 replica 
FinalizedReplica, blk_1073749845_9021, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749845 for deletion 2025-07-15 07:56:28,229 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749845_9021 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749845 2025-07-15 07:57:22,376 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749846_9022 src: /192.168.158.7:38336 dest: /192.168.158.4:9866 2025-07-15 07:57:22,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_631264868_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749846_9022, duration(ns): 20860321 2025-07-15 07:57:22,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749846_9022, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-15 07:57:28,229 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749846_9022 replica FinalizedReplica, blk_1073749846_9022, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749846 for deletion 2025-07-15 07:57:28,230 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073749846_9022 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749846 2025-07-15 08:02:27,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749851_9027 src: /192.168.158.1:39856 dest: /192.168.158.4:9866 2025-07-15 08:02:27,417 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39856, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1280566614_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749851_9027, duration(ns): 25078665 2025-07-15 08:02:27,417 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749851_9027, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-15 08:02:31,234 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749851_9027 replica FinalizedReplica, blk_1073749851_9027, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749851 for deletion 2025-07-15 08:02:31,236 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749851_9027 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749851 2025-07-15 08:03:27,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749852_9028 src: /192.168.158.6:60308 dest: /192.168.158.4:9866 2025-07-15 08:03:27,400 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60308, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1889882465_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749852_9028, duration(ns): 19352827 2025-07-15 08:03:27,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749852_9028, type=LAST_IN_PIPELINE terminating 2025-07-15 08:03:31,237 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749852_9028 replica FinalizedReplica, blk_1073749852_9028, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749852 for deletion 2025-07-15 08:03:31,238 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749852_9028 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749852 2025-07-15 08:04:27,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749853_9029 src: /192.168.158.5:53708 dest: /192.168.158.4:9866 2025-07-15 08:04:27,404 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53708, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-407478515_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749853_9029, duration(ns): 18126471 2025-07-15 08:04:27,404 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073749853_9029, type=LAST_IN_PIPELINE terminating 2025-07-15 08:04:31,238 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749853_9029 replica FinalizedReplica, blk_1073749853_9029, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749853 for deletion 2025-07-15 08:04:31,239 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749853_9029 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749853 2025-07-15 08:05:27,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749854_9030 src: /192.168.158.1:42500 dest: /192.168.158.4:9866 2025-07-15 08:05:27,412 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42500, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_324729214_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749854_9030, duration(ns): 21605207 2025-07-15 08:05:27,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749854_9030, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-15 08:05:34,242 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749854_9030 replica FinalizedReplica, blk_1073749854_9030, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749854 for deletion 2025-07-15 08:05:34,243 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749854_9030 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749854 2025-07-15 08:06:32,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749855_9031 src: /192.168.158.5:36886 dest: /192.168.158.4:9866 2025-07-15 08:06:32,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36886, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-563225517_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749855_9031, duration(ns): 17214032 2025-07-15 08:06:32,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749855_9031, type=LAST_IN_PIPELINE terminating 2025-07-15 08:06:40,241 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749855_9031 replica FinalizedReplica, blk_1073749855_9031, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749855 for deletion 2025-07-15 08:06:40,243 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749855_9031 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749855 2025-07-15 08:07:32,362 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749856_9032 src: /192.168.158.6:51898 dest: /192.168.158.4:9866 2025-07-15 08:07:32,387 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1020275468_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749856_9032, duration(ns): 19090837 2025-07-15 08:07:32,387 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749856_9032, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-15 08:07:37,243 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749856_9032 replica FinalizedReplica, blk_1073749856_9032, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749856 for deletion 2025-07-15 08:07:37,245 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749856_9032 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749856 2025-07-15 08:09:37,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749858_9034 src: /192.168.158.7:59966 dest: /192.168.158.4:9866 2025-07-15 08:09:37,390 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59966, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1498759349_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749858_9034, duration(ns): 15869181 2025-07-15 08:09:37,390 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749858_9034, type=LAST_IN_PIPELINE terminating 2025-07-15 08:09:43,247 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749858_9034 replica FinalizedReplica, blk_1073749858_9034, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749858 for deletion 2025-07-15 08:09:43,249 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749858_9034 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749858 2025-07-15 08:11:37,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749860_9036 src: /192.168.158.7:44492 dest: /192.168.158.4:9866 2025-07-15 08:11:37,390 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44492, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1967903877_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749860_9036, duration(ns): 18970056 2025-07-15 08:11:37,390 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749860_9036, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-15 08:11:40,253 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073749860_9036 replica FinalizedReplica, blk_1073749860_9036, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749860 for deletion 2025-07-15 08:11:40,254 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749860_9036 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749860 2025-07-15 08:14:47,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749863_9039 src: /192.168.158.5:33550 dest: /192.168.158.4:9866 2025-07-15 08:14:47,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33550, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1138650038_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749863_9039, duration(ns): 15740481 2025-07-15 08:14:47,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749863_9039, type=LAST_IN_PIPELINE terminating 2025-07-15 08:14:52,259 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749863_9039 replica FinalizedReplica, blk_1073749863_9039, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749863 for deletion 2025-07-15 08:14:52,260 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073749863_9039 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749863 2025-07-15 08:16:52,374 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749865_9041 src: /192.168.158.6:36600 dest: /192.168.158.4:9866 2025-07-15 08:16:52,392 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36600, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_423694604_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749865_9041, duration(ns): 14707074 2025-07-15 08:16:52,392 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749865_9041, type=LAST_IN_PIPELINE terminating 2025-07-15 08:16:55,263 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749865_9041 replica FinalizedReplica, blk_1073749865_9041, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749865 for deletion 2025-07-15 08:16:55,264 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749865_9041 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749865 2025-07-15 08:17:52,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749866_9042 src: /192.168.158.7:36346 dest: /192.168.158.4:9866 2025-07-15 08:17:52,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.7:36346, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1777111617_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749866_9042, duration(ns): 21041541 2025-07-15 08:17:52,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749866_9042, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-15 08:17:58,265 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749866_9042 replica FinalizedReplica, blk_1073749866_9042, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749866 for deletion 2025-07-15 08:17:58,267 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749866_9042 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749866 2025-07-15 08:19:57,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749868_9044 src: /192.168.158.6:33636 dest: /192.168.158.4:9866 2025-07-15 08:19:57,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33636, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1724053335_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749868_9044, duration(ns): 19530837 2025-07-15 08:19:57,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749868_9044, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-15 08:20:01,273 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749868_9044 replica FinalizedReplica, blk_1073749868_9044, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749868 for deletion
2025-07-15 08:20:01,274 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749868_9044 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749868
2025-07-15 08:23:07,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749871_9047 src: /192.168.158.5:53660 dest: /192.168.158.4:9866
2025-07-15 08:23:07,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53660, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_170813673_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749871_9047, duration(ns): 15812135
2025-07-15 08:23:07,412 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749871_9047, type=LAST_IN_PIPELINE terminating
2025-07-15 08:23:10,280 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749871_9047 replica FinalizedReplica, blk_1073749871_9047, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749871 for deletion
2025-07-15 08:23:10,282 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749871_9047 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749871
2025-07-15 08:26:12,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749874_9050 src: /192.168.158.8:53560 dest: /192.168.158.4:9866
2025-07-15 08:26:12,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53560, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_757416981_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749874_9050, duration(ns): 19975758
2025-07-15 08:26:12,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749874_9050, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 08:26:16,288 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749874_9050 replica FinalizedReplica, blk_1073749874_9050, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749874 for deletion
2025-07-15 08:26:16,289 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749874_9050 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749874
2025-07-15 08:27:12,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749875_9051 src: /192.168.158.1:59890 dest: /192.168.158.4:9866
2025-07-15 08:27:12,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59890, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1809521245_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749875_9051, duration(ns): 23997587
2025-07-15 08:27:12,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749875_9051, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-15 08:27:16,289 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749875_9051 replica FinalizedReplica, blk_1073749875_9051, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749875 for deletion
2025-07-15 08:27:16,291 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749875_9051 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749875
2025-07-15 08:29:12,389 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749877_9053 src: /192.168.158.1:55200 dest: /192.168.158.4:9866
2025-07-15 08:29:12,420 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55200, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1278822137_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749877_9053, duration(ns): 22433475
2025-07-15 08:29:12,420 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749877_9053, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-15 08:29:16,293 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749877_9053 replica FinalizedReplica, blk_1073749877_9053, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749877 for deletion
2025-07-15 08:29:16,294 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749877_9053 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749877
2025-07-15 08:33:12,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749881_9057 src: /192.168.158.8:34028 dest: /192.168.158.4:9866
2025-07-15 08:33:12,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34028, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-434912435_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749881_9057, duration(ns): 21270067
2025-07-15 08:33:12,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749881_9057, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 08:33:16,303 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749881_9057 replica FinalizedReplica, blk_1073749881_9057, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749881 for deletion
2025-07-15 08:33:16,304 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749881_9057 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749881
2025-07-15 08:34:12,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749882_9058 src: /192.168.158.8:60414 dest: /192.168.158.4:9866
2025-07-15 08:34:12,427 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60414, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-268793688_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749882_9058, duration(ns): 21373589
2025-07-15 08:34:12,427 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749882_9058, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 08:34:16,307 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749882_9058 replica FinalizedReplica, blk_1073749882_9058, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749882 for deletion
2025-07-15 08:34:16,308 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749882_9058 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749882
2025-07-15 08:35:12,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749883_9059 src: /192.168.158.6:48074 dest: /192.168.158.4:9866
2025-07-15 08:35:12,426 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48074, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-827905716_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749883_9059, duration(ns): 15333501
2025-07-15 08:35:12,427 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749883_9059, type=LAST_IN_PIPELINE terminating
2025-07-15 08:35:16,307 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749883_9059 replica FinalizedReplica, blk_1073749883_9059, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749883 for deletion
2025-07-15 08:35:16,309 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749883_9059 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749883
2025-07-15 08:36:12,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749884_9060 src: /192.168.158.5:56616 dest: /192.168.158.4:9866
2025-07-15 08:36:12,433 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56616, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_802945585_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749884_9060, duration(ns): 17618930
2025-07-15 08:36:12,433 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749884_9060, type=LAST_IN_PIPELINE terminating
2025-07-15 08:36:16,308 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749884_9060 replica FinalizedReplica, blk_1073749884_9060, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749884 for deletion
2025-07-15 08:36:16,310 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749884_9060 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749884
2025-07-15 08:38:12,426 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749886_9062 src: /192.168.158.5:39790 dest: /192.168.158.4:9866
2025-07-15 08:38:12,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39790, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1797283240_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749886_9062, duration(ns): 17256415
2025-07-15 08:38:12,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749886_9062, type=LAST_IN_PIPELINE terminating
2025-07-15 08:38:16,309 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749886_9062 replica FinalizedReplica, blk_1073749886_9062, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749886 for deletion
2025-07-15 08:38:16,310 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749886_9062 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749886
2025-07-15 08:43:17,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749891_9067 src: /192.168.158.1:54712 dest: /192.168.158.4:9866
2025-07-15 08:43:17,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1032114267_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749891_9067, duration(ns): 22842219
2025-07-15 08:43:17,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749891_9067, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-15 08:43:22,317 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749891_9067 replica FinalizedReplica, blk_1073749891_9067, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749891 for deletion
2025-07-15 08:43:22,318 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749891_9067 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749891
2025-07-15 08:44:17,436 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749892_9068 src: /192.168.158.1:56402 dest: /192.168.158.4:9866
2025-07-15 08:44:17,468 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56402, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1772399967_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749892_9068, duration(ns): 23637148
2025-07-15 08:44:17,468 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749892_9068, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-15 08:44:25,322 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749892_9068 replica FinalizedReplica, blk_1073749892_9068, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749892 for deletion
2025-07-15 08:44:25,323 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749892_9068 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749892
2025-07-15 08:45:22,419 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749893_9069 src: /192.168.158.7:47950 dest: /192.168.158.4:9866
2025-07-15 08:45:22,437 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47950, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2037648892_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749893_9069, duration(ns): 15738493
2025-07-15 08:45:22,437 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749893_9069, type=LAST_IN_PIPELINE terminating
2025-07-15 08:45:28,325 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749893_9069 replica FinalizedReplica, blk_1073749893_9069, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749893 for deletion
2025-07-15 08:45:28,326 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749893_9069 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749893
2025-07-15 08:47:27,420 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749895_9071 src: /192.168.158.1:41140 dest: /192.168.158.4:9866
2025-07-15 08:47:27,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41140, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_404358198_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749895_9071, duration(ns): 22323766
2025-07-15 08:47:27,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749895_9071, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-15 08:47:31,329 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749895_9071 replica FinalizedReplica, blk_1073749895_9071, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749895 for deletion
2025-07-15 08:47:31,330 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749895_9071 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749895
2025-07-15 08:49:32,419 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749897_9073 src: /192.168.158.7:48050 dest: /192.168.158.4:9866
2025-07-15 08:49:32,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48050, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1582336784_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749897_9073, duration(ns): 18047255
2025-07-15 08:49:32,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749897_9073, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-15 08:49:37,335 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749897_9073 replica FinalizedReplica, blk_1073749897_9073, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749897 for deletion
2025-07-15 08:49:37,336 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749897_9073 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749897
2025-07-15 08:51:32,425 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749899_9075 src: /192.168.158.1:58052 dest: /192.168.158.4:9866
2025-07-15 08:51:32,457 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58052, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-690687351_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749899_9075, duration(ns): 23332542
2025-07-15 08:51:32,457 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749899_9075, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-15 08:51:37,341 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749899_9075 replica FinalizedReplica, blk_1073749899_9075, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749899 for deletion
2025-07-15 08:51:37,342 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749899_9075 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749899
2025-07-15 08:53:32,427 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749901_9077 src: /192.168.158.1:54752 dest: /192.168.158.4:9866
2025-07-15 08:53:32,460 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54752, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_59249851_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749901_9077, duration(ns): 23735101
2025-07-15 08:53:32,460 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749901_9077, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-15 08:53:37,347 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749901_9077 replica FinalizedReplica, blk_1073749901_9077, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749901 for deletion
2025-07-15 08:53:37,348 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749901_9077 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749901
2025-07-15 09:00:47,461 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749908_9084 src: /192.168.158.8:36720 dest: /192.168.158.4:9866
2025-07-15 09:00:47,478 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36720, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_660399318_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749908_9084, duration(ns): 14991607
2025-07-15 09:00:47,478 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749908_9084, type=LAST_IN_PIPELINE terminating
2025-07-15 09:00:52,369 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749908_9084 replica FinalizedReplica, blk_1073749908_9084, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749908 for deletion
2025-07-15 09:00:52,370 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749908_9084 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749908
2025-07-15 09:09:07,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749916_9092 src: /192.168.158.1:47442 dest: /192.168.158.4:9866
2025-07-15 09:09:07,508 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47442, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_5689626_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749916_9092, duration(ns): 25684107
2025-07-15 09:09:07,508 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749916_9092, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-15 09:09:13,391 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749916_9092 replica FinalizedReplica, blk_1073749916_9092, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749916 for deletion
2025-07-15 09:09:13,392 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749916_9092 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749916
2025-07-15 09:10:12,456 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749917_9093 src: /192.168.158.6:35532 dest: /192.168.158.4:9866
2025-07-15 09:10:12,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35532, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_599193289_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749917_9093, duration(ns): 18438141
2025-07-15 09:10:12,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749917_9093, type=LAST_IN_PIPELINE terminating
2025-07-15 09:10:16,392 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749917_9093 replica FinalizedReplica, blk_1073749917_9093, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749917 for deletion
2025-07-15 09:10:16,393 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749917_9093 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749917
2025-07-15 09:11:12,457 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749918_9094 src: /192.168.158.1:32790 dest: /192.168.158.4:9866
2025-07-15 09:11:12,488 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:32790, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1145788461_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749918_9094, duration(ns): 22160597
2025-07-15 09:11:12,488 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749918_9094, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-15 09:11:19,393 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749918_9094 replica FinalizedReplica, blk_1073749918_9094, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749918 for deletion
2025-07-15 09:11:19,394 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749918_9094 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749918
2025-07-15 09:12:17,453 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749919_9095 src: /192.168.158.1:39710 dest: /192.168.158.4:9866
2025-07-15 09:12:17,487 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39710, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1322194185_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749919_9095, duration(ns): 23730524
2025-07-15 09:12:17,487 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749919_9095, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-15 09:12:22,395 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749919_9095 replica FinalizedReplica, blk_1073749919_9095, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749919 for deletion
2025-07-15 09:12:22,396 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749919_9095 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749919
2025-07-15 09:13:17,457 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749920_9096 src: /192.168.158.9:36884 dest: /192.168.158.4:9866
2025-07-15 09:13:17,484 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36884, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-599261867_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749920_9096, duration(ns): 22081915
2025-07-15 09:13:17,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749920_9096, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-15 09:13:25,397 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749920_9096 replica FinalizedReplica, blk_1073749920_9096, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749920 for deletion
2025-07-15 09:13:25,398 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749920_9096 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749920
2025-07-15 09:14:17,457 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749921_9097 src: /192.168.158.1:33388 dest: /192.168.158.4:9866
2025-07-15 09:14:17,492 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33388, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-84738268_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749921_9097, duration(ns): 26734514
2025-07-15 09:14:17,492 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749921_9097, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-15 09:14:22,399 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749921_9097 replica FinalizedReplica, blk_1073749921_9097, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749921 for deletion
2025-07-15 09:14:22,400 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749921_9097 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749921
2025-07-15 09:15:17,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749922_9098 src: /192.168.158.1:60306 dest: /192.168.158.4:9866
2025-07-15 09:15:17,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60306, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_890267582_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749922_9098, duration(ns): 21964990
2025-07-15 09:15:17,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749922_9098, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-15 09:15:22,401 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749922_9098 replica FinalizedReplica, blk_1073749922_9098, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749922 for deletion
2025-07-15 09:15:22,402 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749922_9098 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749922
2025-07-15 09:23:22,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749930_9106 src: /192.168.158.7:54452 dest: /192.168.158.4:9866
2025-07-15 09:23:22,515 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54452, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1810110895_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749930_9106, duration(ns): 16877088
2025-07-15 09:23:22,515 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749930_9106, type=LAST_IN_PIPELINE terminating
2025-07-15 09:23:28,422 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749930_9106 replica FinalizedReplica, blk_1073749930_9106, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749930 for deletion
2025-07-15 09:23:28,423 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749930_9106 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749930
2025-07-15 09:24:27,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749931_9107 src: /192.168.158.9:37300 dest: /192.168.158.4:9866
2025-07-15 09:24:27,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37300, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-231827113_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749931_9107, duration(ns): 19242398
2025-07-15 09:24:27,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749931_9107, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 09:24:34,422 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749931_9107 replica FinalizedReplica, blk_1073749931_9107, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749931 for deletion
2025-07-15 09:24:34,423 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749931_9107 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749931
2025-07-15 09:26:27,489 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749933_9109 src: 
/192.168.158.9:53120 dest: /192.168.158.4:9866 2025-07-15 09:26:27,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53120, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2043544332_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749933_9109, duration(ns): 13943643 2025-07-15 09:26:27,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749933_9109, type=LAST_IN_PIPELINE terminating 2025-07-15 09:26:34,426 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749933_9109 replica FinalizedReplica, blk_1073749933_9109, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749933 for deletion 2025-07-15 09:26:34,427 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749933_9109 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749933 2025-07-15 09:27:27,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749934_9110 src: /192.168.158.7:58434 dest: /192.168.158.4:9866 2025-07-15 09:27:27,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58434, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_493778060_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749934_9110, duration(ns): 18752931 2025-07-15 09:27:27,517 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749934_9110, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-15 09:27:34,427 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749934_9110 replica FinalizedReplica, blk_1073749934_9110, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749934 for deletion 2025-07-15 09:27:34,429 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749934_9110 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749934 2025-07-15 09:30:32,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749937_9113 src: /192.168.158.5:51014 dest: /192.168.158.4:9866 2025-07-15 09:30:32,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51014, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-146577080_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749937_9113, duration(ns): 21322487 2025-07-15 09:30:32,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749937_9113, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-15 09:30:37,435 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749937_9113 replica FinalizedReplica, blk_1073749937_9113, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 
56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749937 for deletion 2025-07-15 09:30:37,436 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749937_9113 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749937 2025-07-15 09:35:37,501 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749942_9118 src: /192.168.158.5:48566 dest: /192.168.158.4:9866 2025-07-15 09:35:37,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48566, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1299829876_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749942_9118, duration(ns): 15715648 2025-07-15 09:35:37,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749942_9118, type=LAST_IN_PIPELINE terminating 2025-07-15 09:35:40,446 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749942_9118 replica FinalizedReplica, blk_1073749942_9118, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749942 for deletion 2025-07-15 09:35:40,447 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749942_9118 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749942 2025-07-15 09:36:37,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749943_9119 src: /192.168.158.1:58006 dest: /192.168.158.4:9866 2025-07-15 09:36:37,536 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58006, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2096225209_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749943_9119, duration(ns): 24261714 2025-07-15 09:36:37,536 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749943_9119, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-15 09:36:40,449 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749943_9119 replica FinalizedReplica, blk_1073749943_9119, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749943 for deletion 2025-07-15 09:36:40,450 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749943_9119 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749943 2025-07-15 09:39:37,507 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749946_9122 src: /192.168.158.6:53996 dest: /192.168.158.4:9866 2025-07-15 09:39:37,534 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.6:53996, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1247761743_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749946_9122, duration(ns): 19435283 2025-07-15 09:39:37,534 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749946_9122, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-15 09:39:43,454 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749946_9122 replica FinalizedReplica, blk_1073749946_9122, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749946 for deletion 2025-07-15 09:39:43,455 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749946_9122 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749946 2025-07-15 09:44:37,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749951_9127 src: /192.168.158.9:46286 dest: /192.168.158.4:9866 2025-07-15 09:44:37,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46286, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1466917868_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749951_9127, duration(ns): 19412369 2025-07-15 09:44:37,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749951_9127, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-15 09:44:40,460 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749951_9127 replica FinalizedReplica, blk_1073749951_9127, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749951 for deletion 2025-07-15 09:44:40,461 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749951_9127 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749951 2025-07-15 09:45:42,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749952_9128 src: /192.168.158.1:46834 dest: /192.168.158.4:9866 2025-07-15 09:45:42,547 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46834, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-288549159_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749952_9128, duration(ns): 21821606 2025-07-15 09:45:42,547 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749952_9128, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-15 09:45:46,463 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749952_9128 replica FinalizedReplica, blk_1073749952_9128, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749952 for deletion 2025-07-15 09:45:46,464 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749952_9128 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749952 2025-07-15 09:46:42,522 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749953_9129 src: /192.168.158.9:51684 dest: /192.168.158.4:9866 2025-07-15 09:46:42,539 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51684, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1161215177_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749953_9129, duration(ns): 14822729 2025-07-15 09:46:42,540 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749953_9129, type=LAST_IN_PIPELINE terminating 2025-07-15 09:46:46,467 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749953_9129 replica FinalizedReplica, blk_1073749953_9129, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749953 for deletion 2025-07-15 09:46:46,468 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749953_9129 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749953 2025-07-15 09:49:47,516 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749956_9132 src: /192.168.158.1:59940 dest: /192.168.158.4:9866 2025-07-15 09:49:47,551 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59940, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1155838777_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749956_9132, duration(ns): 26866414 2025-07-15 09:49:47,552 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749956_9132, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-15 09:49:52,474 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749956_9132 replica FinalizedReplica, blk_1073749956_9132, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749956 for deletion 2025-07-15 09:49:52,475 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749956_9132 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749956 2025-07-15 09:50:47,524 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749957_9133 src: /192.168.158.9:42186 dest: /192.168.158.4:9866 2025-07-15 09:50:47,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42186, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_644082849_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749957_9133, duration(ns): 14673265 2025-07-15 09:50:47,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749957_9133, type=LAST_IN_PIPELINE terminating 2025-07-15 09:50:52,477 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749957_9133 replica FinalizedReplica, blk_1073749957_9133, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749957 for deletion 2025-07-15 09:50:52,478 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749957_9133 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749957 2025-07-15 09:51:52,523 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749958_9134 src: /192.168.158.1:60562 dest: /192.168.158.4:9866 2025-07-15 09:51:52,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60562, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2115226321_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749958_9134, duration(ns): 23329639 2025-07-15 09:51:52,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749958_9134, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-15 09:51:58,480 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749958_9134 replica FinalizedReplica, blk_1073749958_9134, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749958 for deletion 2025-07-15 09:51:58,481 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749958_9134 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749958 2025-07-15 09:55:57,529 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749962_9138 src: /192.168.158.5:55534 dest: /192.168.158.4:9866 2025-07-15 09:55:57,548 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55534, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1180441149_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749962_9138, duration(ns): 17768667 2025-07-15 09:55:57,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749962_9138, type=LAST_IN_PIPELINE terminating 2025-07-15 09:56:01,490 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749962_9138 replica FinalizedReplica, blk_1073749962_9138, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749962 for deletion 2025-07-15 09:56:01,492 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749962_9138 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749962 2025-07-15 09:56:57,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749963_9139 src: /192.168.158.7:49754 dest: /192.168.158.4:9866 2025-07-15 09:56:57,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49754, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2108536382_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749963_9139, duration(ns): 18436160 2025-07-15 09:56:57,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749963_9139, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-15 09:57:04,493 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749963_9139 replica FinalizedReplica, blk_1073749963_9139, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749963 for deletion 2025-07-15 09:57:04,494 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749963_9139 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749963 2025-07-15 09:59:02,538 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749965_9141 src: 
/192.168.158.5:33340 dest: /192.168.158.4:9866 2025-07-15 09:59:02,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33340, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1649977422_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749965_9141, duration(ns): 19303681 2025-07-15 09:59:02,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749965_9141, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-15 09:59:07,497 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749965_9141 replica FinalizedReplica, blk_1073749965_9141, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749965 for deletion 2025-07-15 09:59:07,498 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749965_9141 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749965 2025-07-15 10:00:02,534 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749966_9142 src: /192.168.158.8:47966 dest: /192.168.158.4:9866 2025-07-15 10:00:02,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47966, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_319487809_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749966_9142, duration(ns): 14421601 2025-07-15 10:00:02,551 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749966_9142, type=LAST_IN_PIPELINE terminating 2025-07-15 10:00:10,500 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749966_9142 replica FinalizedReplica, blk_1073749966_9142, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749966 for deletion 2025-07-15 10:00:10,501 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749966_9142 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749966 2025-07-15 10:02:02,546 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749968_9144 src: /192.168.158.6:32982 dest: /192.168.158.4:9866 2025-07-15 10:02:02,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:32982, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1939294639_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749968_9144, duration(ns): 15679308 2025-07-15 10:02:02,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749968_9144, type=LAST_IN_PIPELINE terminating 2025-07-15 10:02:07,504 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749968_9144 replica FinalizedReplica, blk_1073749968_9144, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749968 for deletion 2025-07-15 10:02:07,505 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749968_9144 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749968 2025-07-15 10:03:02,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749969_9145 src: /192.168.158.6:50800 dest: /192.168.158.4:9866 2025-07-15 10:03:02,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50800, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-512565656_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749969_9145, duration(ns): 19055034 2025-07-15 10:03:02,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749969_9145, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-15 10:03:07,506 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749969_9145 replica FinalizedReplica, blk_1073749969_9145, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749969 for deletion 2025-07-15 10:03:07,507 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749969_9145 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749969 
2025-07-15 10:05:02,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749971_9147 src: /192.168.158.1:41954 dest: /192.168.158.4:9866
2025-07-15 10:05:02,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41954, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1174982112_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749971_9147, duration(ns): 22688646
2025-07-15 10:05:02,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749971_9147, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-15 10:05:07,513 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749971_9147 replica FinalizedReplica, blk_1073749971_9147, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749971 for deletion
2025-07-15 10:05:07,514 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749971_9147 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749971
2025-07-15 10:08:07,569 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749974_9150 src: /192.168.158.1:57492 dest: /192.168.158.4:9866
2025-07-15 10:08:07,599 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57492, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-649996617_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749974_9150, duration(ns): 22034662
2025-07-15 10:08:07,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749974_9150, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-15 10:08:10,515 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749974_9150 replica FinalizedReplica, blk_1073749974_9150, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749974 for deletion
2025-07-15 10:08:10,517 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749974_9150 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749974
2025-07-15 10:10:07,544 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749976_9152 src: /192.168.158.1:37956 dest: /192.168.158.4:9866
2025-07-15 10:10:07,577 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37956, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1613958250_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749976_9152, duration(ns): 24579909
2025-07-15 10:10:07,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749976_9152, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-15 10:10:10,521 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749976_9152 replica FinalizedReplica, blk_1073749976_9152, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749976 for deletion
2025-07-15 10:10:10,522 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749976_9152 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749976
2025-07-15 10:11:07,552 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749977_9153 src: /192.168.158.1:41120 dest: /192.168.158.4:9866
2025-07-15 10:11:07,584 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41120, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1677334381_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749977_9153, duration(ns): 22952640
2025-07-15 10:11:07,584 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749977_9153, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-15 10:11:10,522 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749977_9153 replica FinalizedReplica, blk_1073749977_9153, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749977 for deletion
2025-07-15 10:11:10,524 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749977_9153 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749977
2025-07-15 10:14:07,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749980_9156 src: /192.168.158.1:48954 dest: /192.168.158.4:9866
2025-07-15 10:14:07,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48954, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1769794307_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749980_9156, duration(ns): 25135454
2025-07-15 10:14:07,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749980_9156, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-15 10:14:13,528 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749980_9156 replica FinalizedReplica, blk_1073749980_9156, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749980 for deletion
2025-07-15 10:14:13,530 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749980_9156 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749980
2025-07-15 10:16:12,560 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749982_9158 src: /192.168.158.1:57970 dest: /192.168.158.4:9866
2025-07-15 10:16:12,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57970, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1680294174_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749982_9158, duration(ns): 21085613
2025-07-15 10:16:12,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749982_9158, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-15 10:16:16,534 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749982_9158 replica FinalizedReplica, blk_1073749982_9158, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749982 for deletion
2025-07-15 10:16:16,535 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749982_9158 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749982
2025-07-15 10:22:12,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749988_9164 src: /192.168.158.1:59566 dest: /192.168.158.4:9866
2025-07-15 10:22:12,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59566, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-340485088_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749988_9164, duration(ns): 22136826
2025-07-15 10:22:12,603 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749988_9164, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-15 10:22:16,551 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749988_9164 replica FinalizedReplica, blk_1073749988_9164, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749988 for deletion
2025-07-15 10:22:16,552 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749988_9164 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749988
2025-07-15 10:24:12,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749990_9166 src: /192.168.158.1:44216 dest: /192.168.158.4:9866
2025-07-15 10:24:12,610 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44216, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-633218681_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749990_9166, duration(ns): 24144432
2025-07-15 10:24:12,611 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749990_9166, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-15 10:24:19,554 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749990_9166 replica FinalizedReplica, blk_1073749990_9166, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749990 for deletion
2025-07-15 10:24:19,555 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749990_9166 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749990
2025-07-15 10:28:27,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749994_9170 src: /192.168.158.6:55642 dest: /192.168.158.4:9866
2025-07-15 10:28:27,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55642, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_958012055_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749994_9170, duration(ns): 18931534
2025-07-15 10:28:27,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749994_9170, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-15 10:28:31,565 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749994_9170 replica FinalizedReplica, blk_1073749994_9170, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749994 for deletion
2025-07-15 10:28:31,566 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749994_9170 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749994
2025-07-15 10:30:32,599 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749996_9172 src: /192.168.158.5:56900 dest: /192.168.158.4:9866
2025-07-15 10:30:32,617 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56900, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2131715277_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749996_9172, duration(ns): 16439917
2025-07-15 10:30:32,617 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749996_9172, type=LAST_IN_PIPELINE terminating
2025-07-15 10:30:37,569 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749996_9172 replica FinalizedReplica, blk_1073749996_9172, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749996 for deletion
2025-07-15 10:30:37,570 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749996_9172 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749996
2025-07-15 10:31:32,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749997_9173 src: /192.168.158.6:47364 dest: /192.168.158.4:9866
2025-07-15 10:31:32,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47364, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_340308475_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749997_9173, duration(ns): 19519082
2025-07-15 10:31:32,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749997_9173, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-15 10:31:37,573 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749997_9173 replica FinalizedReplica, blk_1073749997_9173, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749997 for deletion
2025-07-15 10:31:37,574 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749997_9173 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749997
2025-07-15 10:32:37,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749998_9174 src: /192.168.158.9:47850 dest: /192.168.158.4:9866
2025-07-15 10:32:37,611 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47850, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_398883382_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749998_9174, duration(ns): 19462815
2025-07-15 10:32:37,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749998_9174, type=LAST_IN_PIPELINE terminating
2025-07-15 10:32:40,570 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749998_9174 replica FinalizedReplica, blk_1073749998_9174, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749998 for deletion
2025-07-15 10:32:40,572 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749998_9174 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749998
2025-07-15 10:33:37,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073749999_9175 src: /192.168.158.1:45170 dest: /192.168.158.4:9866
2025-07-15 10:33:37,610 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45170, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_270795081_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073749999_9175, duration(ns): 23038530
2025-07-15 10:33:37,610 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073749999_9175, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-15 10:33:40,573 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073749999_9175 replica FinalizedReplica, blk_1073749999_9175, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749999 for deletion
2025-07-15 10:33:40,574 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073749999_9175 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073749999
2025-07-15 10:34:37,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750000_9176 src: /192.168.158.7:43924 dest: /192.168.158.4:9866
2025-07-15 10:34:37,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43924, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1110730558_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750000_9176, duration(ns): 20596094
2025-07-15 10:34:37,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750000_9176, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-15 10:34:40,575 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750000_9176 replica FinalizedReplica, blk_1073750000_9176, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073750000 for deletion
2025-07-15 10:34:40,577 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750000_9176 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073750000
2025-07-15 10:35:37,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750001_9177 src: /192.168.158.7:48794 dest: /192.168.158.4:9866
2025-07-15 10:35:37,617 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48794, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-420329768_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750001_9177, duration(ns): 18197254
2025-07-15 10:35:37,617 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750001_9177, type=LAST_IN_PIPELINE terminating
2025-07-15 10:35:43,578 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750001_9177 replica FinalizedReplica, blk_1073750001_9177, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073750001 for deletion
2025-07-15 10:35:43,579 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750001_9177 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073750001
2025-07-15 10:37:37,603 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750003_9179 src: /192.168.158.1:59444 dest: /192.168.158.4:9866
2025-07-15 10:37:37,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59444, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1495861263_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750003_9179, duration(ns): 22996513
2025-07-15 10:37:37,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750003_9179, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-15 10:37:43,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750003_9179 replica FinalizedReplica, blk_1073750003_9179, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073750003 for deletion
2025-07-15 10:37:43,584 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750003_9179 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073750003
2025-07-15 10:38:37,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750004_9180 src: /192.168.158.1:43720 dest: /192.168.158.4:9866
2025-07-15 10:38:37,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43720, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-960615688_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750004_9180, duration(ns): 27087923
2025-07-15 10:38:37,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750004_9180, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-15 10:38:43,584 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750004_9180 replica FinalizedReplica, blk_1073750004_9180, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073750004 for deletion
2025-07-15 10:38:43,586 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750004_9180 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073750004
2025-07-15 10:39:42,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750005_9181 src: /192.168.158.5:57964 dest: /192.168.158.4:9866
2025-07-15 10:39:42,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57964, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1033164029_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750005_9181, duration(ns): 17827328
2025-07-15 10:39:42,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750005_9181, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 10:39:46,585 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750005_9181 replica FinalizedReplica, blk_1073750005_9181, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073750005 for deletion
2025-07-15 10:39:46,587 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750005_9181 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073750005
2025-07-15 10:43:42,619 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750009_9185 src: /192.168.158.1:33128 dest: /192.168.158.4:9866
2025-07-15 10:43:42,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33128, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-449588206_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750009_9185, duration(ns): 22863966
2025-07-15 10:43:42,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750009_9185, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-15 10:43:49,595 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750009_9185 replica FinalizedReplica, blk_1073750009_9185, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073750009 for deletion
2025-07-15 10:43:49,596 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750009_9185 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073750009
2025-07-15 10:45:47,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750011_9187 src: /192.168.158.1:35480 dest: /192.168.158.4:9866
2025-07-15 10:45:47,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35480, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_805315321_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750011_9187, duration(ns): 24444860
2025-07-15 10:45:47,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750011_9187, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-15 10:45:55,600 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750011_9187 replica FinalizedReplica, blk_1073750011_9187, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073750011 for deletion
2025-07-15 10:45:55,601 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750011_9187 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073750011
2025-07-15 10:47:52,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750013_9189 src: /192.168.158.7:58690 dest: /192.168.158.4:9866
2025-07-15 10:47:52,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58690, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_594151498_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750013_9189, duration(ns): 16007042
2025-07-15 10:47:52,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750013_9189, type=LAST_IN_PIPELINE terminating
2025-07-15 10:47:55,607 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750013_9189 replica FinalizedReplica, blk_1073750013_9189, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073750013 for deletion
2025-07-15 10:47:55,608 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750013_9189 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073750013
2025-07-15 10:48:57,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750014_9190 src: /192.168.158.7:42390 dest: /192.168.158.4:9866
2025-07-15 10:48:57,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42390, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_57917827_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750014_9190, duration(ns): 19797114
2025-07-15 10:48:57,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750014_9190, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 10:49:04,610 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750014_9190 replica FinalizedReplica, blk_1073750014_9190, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073750014 for deletion
2025-07-15 10:49:04,611 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750014_9190 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073750014
2025-07-15 10:54:07,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750019_9195 src: /192.168.158.1:42918 dest: /192.168.158.4:9866
2025-07-15 10:54:07,663 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42918, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1565875067_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750019_9195, duration(ns): 23411386
2025-07-15 10:54:07,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750019_9195, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-15 10:54:13,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750019_9195 replica FinalizedReplica, blk_1073750019_9195, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750019 for deletion
2025-07-15 10:54:13,626 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750019_9195 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750019
2025-07-15 10:55:07,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750020_9196 src: /192.168.158.1:35184 dest: /192.168.158.4:9866
2025-07-15 10:55:07,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35184, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_847035778_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750020_9196, duration(ns): 20250653
2025-07-15 10:55:07,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750020_9196, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-15 10:55:10,628 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750020_9196 replica FinalizedReplica, blk_1073750020_9196, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750020 for deletion
2025-07-15 10:55:10,629 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750020_9196 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750020
2025-07-15 10:56:07,641 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750021_9197 src: /192.168.158.1:46080 dest: /192.168.158.4:9866
2025-07-15 10:56:07,676 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46080, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-657025666_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750021_9197, duration(ns): 25875046
2025-07-15 10:56:07,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750021_9197, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-15 10:56:10,633 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750021_9197 replica FinalizedReplica, blk_1073750021_9197, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750021 for deletion
2025-07-15 10:56:10,634 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750021_9197 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750021
2025-07-15 10:57:07,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750022_9198 src: /192.168.158.1:57132 dest: /192.168.158.4:9866
2025-07-15 10:57:07,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57132, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-377958479_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750022_9198, duration(ns): 23042436
2025-07-15 10:57:07,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode:
PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750022_9198, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-15 10:57:10,635 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750022_9198 replica FinalizedReplica, blk_1073750022_9198, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750022 for deletion 2025-07-15 10:57:10,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750022_9198 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750022 2025-07-15 11:00:07,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750025_9201 src: /192.168.158.1:51920 dest: /192.168.158.4:9866 2025-07-15 11:00:07,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51920, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1114787506_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750025_9201, duration(ns): 23390790 2025-07-15 11:00:07,685 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750025_9201, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-15 11:00:10,645 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750025_9201 replica FinalizedReplica, blk_1073750025_9201, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750025 for deletion 2025-07-15 11:00:10,646 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750025_9201 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750025 2025-07-15 11:01:07,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750026_9202 src: /192.168.158.9:50504 dest: /192.168.158.4:9866 2025-07-15 11:01:07,681 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50504, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2092513409_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750026_9202, duration(ns): 16441958 2025-07-15 11:01:07,681 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750026_9202, type=LAST_IN_PIPELINE terminating 2025-07-15 11:01:10,646 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750026_9202 replica FinalizedReplica, blk_1073750026_9202, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750026 for deletion 2025-07-15 11:01:10,648 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750026_9202 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750026 2025-07-15 11:02:07,660 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750027_9203 src: /192.168.158.5:35544 dest: /192.168.158.4:9866 2025-07-15 11:02:07,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35544, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-363423176_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750027_9203, duration(ns): 17245704 2025-07-15 11:02:07,680 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750027_9203, type=LAST_IN_PIPELINE terminating 2025-07-15 11:02:13,647 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750027_9203 replica FinalizedReplica, blk_1073750027_9203, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750027 for deletion 2025-07-15 11:02:13,649 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750027_9203 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750027 2025-07-15 11:03:07,656 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750028_9204 src: /192.168.158.5:45916 dest: /192.168.158.4:9866 2025-07-15 11:03:07,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45916, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, 
cliID: DFSClient_NONMAPREDUCE_-557561391_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750028_9204, duration(ns): 19580825 2025-07-15 11:03:07,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750028_9204, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-15 11:03:13,652 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750028_9204 replica FinalizedReplica, blk_1073750028_9204, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750028 for deletion 2025-07-15 11:03:13,653 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750028_9204 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750028 2025-07-15 11:08:12,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750033_9209 src: /192.168.158.8:59394 dest: /192.168.158.4:9866 2025-07-15 11:08:12,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59394, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1899564745_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750033_9209, duration(ns): 21568035 2025-07-15 11:08:12,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750033_9209, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-15 
11:08:16,666 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750033_9209 replica FinalizedReplica, blk_1073750033_9209, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750033 for deletion 2025-07-15 11:08:16,667 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750033_9209 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750033 2025-07-15 11:09:17,676 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750034_9210 src: /192.168.158.5:46320 dest: /192.168.158.4:9866 2025-07-15 11:09:17,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46320, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1472241561_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750034_9210, duration(ns): 17030926 2025-07-15 11:09:17,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750034_9210, type=LAST_IN_PIPELINE terminating 2025-07-15 11:09:22,668 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750034_9210 replica FinalizedReplica, blk_1073750034_9210, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750034 for deletion 2025-07-15 11:09:22,669 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750034_9210 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750034 2025-07-15 11:11:17,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750036_9212 src: /192.168.158.7:39106 dest: /192.168.158.4:9866 2025-07-15 11:11:17,712 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39106, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1685779333_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750036_9212, duration(ns): 15672663 2025-07-15 11:11:17,713 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750036_9212, type=LAST_IN_PIPELINE terminating 2025-07-15 11:11:22,677 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750036_9212 replica FinalizedReplica, blk_1073750036_9212, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750036 for deletion 2025-07-15 11:11:22,678 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750036_9212 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750036 2025-07-15 11:16:22,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750041_9217 src: /192.168.158.1:36424 dest: /192.168.158.4:9866 2025-07-15 
11:16:22,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36424, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1402146480_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750041_9217, duration(ns): 25124540 2025-07-15 11:16:22,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750041_9217, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-15 11:16:28,687 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750041_9217 replica FinalizedReplica, blk_1073750041_9217, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750041 for deletion 2025-07-15 11:16:28,688 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750041_9217 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750041 2025-07-15 11:17:22,685 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750042_9218 src: /192.168.158.5:52276 dest: /192.168.158.4:9866 2025-07-15 11:17:22,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52276, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1421208041_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750042_9218, duration(ns): 23724307 2025-07-15 11:17:22,715 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750042_9218, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-15 11:17:25,690 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750042_9218 replica FinalizedReplica, blk_1073750042_9218, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750042 for deletion 2025-07-15 11:17:25,692 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750042_9218 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750042 2025-07-15 11:18:22,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750043_9219 src: /192.168.158.6:51826 dest: /192.168.158.4:9866 2025-07-15 11:18:22,720 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51826, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-750950187_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750043_9219, duration(ns): 20838086 2025-07-15 11:18:22,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750043_9219, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-15 11:18:25,698 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750043_9219 replica FinalizedReplica, blk_1073750043_9219, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 
56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750043 for deletion 2025-07-15 11:18:25,699 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750043_9219 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750043 2025-07-15 11:22:22,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750047_9223 src: /192.168.158.7:60160 dest: /192.168.158.4:9866 2025-07-15 11:22:22,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60160, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1031744236_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750047_9223, duration(ns): 18909158 2025-07-15 11:22:22,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750047_9223, type=LAST_IN_PIPELINE terminating 2025-07-15 11:22:25,709 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750047_9223 replica FinalizedReplica, blk_1073750047_9223, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750047 for deletion 2025-07-15 11:22:25,710 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750047_9223 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750047 2025-07-15 11:23:22,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750048_9224 src: /192.168.158.6:50546 dest: /192.168.158.4:9866 2025-07-15 11:23:22,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50546, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1006003001_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750048_9224, duration(ns): 18959114 2025-07-15 11:23:22,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750048_9224, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-15 11:23:25,710 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750048_9224 replica FinalizedReplica, blk_1073750048_9224, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750048 for deletion 2025-07-15 11:23:25,712 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750048_9224 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750048 2025-07-15 11:28:27,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750053_9229 src: /192.168.158.8:54266 dest: /192.168.158.4:9866 2025-07-15 11:28:27,711 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54266, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_500989132_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750053_9229, duration(ns): 16053813 2025-07-15 11:28:27,712 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750053_9229, type=LAST_IN_PIPELINE terminating 2025-07-15 11:28:34,719 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750053_9229 replica FinalizedReplica, blk_1073750053_9229, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750053 for deletion 2025-07-15 11:28:34,721 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750053_9229 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750053 2025-07-15 11:29:32,712 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750054_9230 src: /192.168.158.5:32870 dest: /192.168.158.4:9866 2025-07-15 11:29:32,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:32870, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1809512252_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750054_9230, duration(ns): 18377757 2025-07-15 11:29:32,736 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750054_9230, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 
2025-07-15 11:29:37,722 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750054_9230 replica FinalizedReplica, blk_1073750054_9230, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750054 for deletion 2025-07-15 11:29:37,723 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750054_9230 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750054 2025-07-15 11:33:42,710 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750058_9234 src: /192.168.158.1:43764 dest: /192.168.158.4:9866 2025-07-15 11:33:42,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43764, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_3637695_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750058_9234, duration(ns): 21445407 2025-07-15 11:33:42,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750058_9234, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-15 11:33:49,733 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750058_9234 replica FinalizedReplica, blk_1073750058_9234, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750058 
for deletion 2025-07-15 11:33:49,734 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750058_9234 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750058 2025-07-15 11:34:42,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750059_9235 src: /192.168.158.9:36868 dest: /192.168.158.4:9866 2025-07-15 11:34:42,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36868, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2097586573_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750059_9235, duration(ns): 19867509 2025-07-15 11:34:42,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750059_9235, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-15 11:34:49,733 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750059_9235 replica FinalizedReplica, blk_1073750059_9235, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750059 for deletion 2025-07-15 11:34:49,735 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750059_9235 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750059 2025-07-15 11:36:13,270 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool 
BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-15 11:37:19,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f3c, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 1 msec to generate and 3 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-15 11:37:19,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-15 11:37:47,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750062_9238 src: /192.168.158.1:48332 dest: /192.168.158.4:9866
2025-07-15 11:37:47,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48332, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-751487976_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750062_9238, duration(ns): 23067933
2025-07-15 11:37:47,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750062_9238, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-15 11:37:55,739 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750062_9238 replica FinalizedReplica, blk_1073750062_9238, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750062 for deletion
2025-07-15 11:37:55,740 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750062_9238 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750062
2025-07-15 11:43:02,760 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750067_9243 src: /192.168.158.6:46732 dest: /192.168.158.4:9866
2025-07-15 11:43:02,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46732, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1736116581_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750067_9243, duration(ns): 19644527
2025-07-15 11:43:02,786 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750067_9243, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 11:43:04,756 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750067_9243 replica FinalizedReplica, blk_1073750067_9243, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750067 for deletion
2025-07-15 11:43:04,757 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750067_9243 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750067
2025-07-15 11:44:07,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750068_9244 src: /192.168.158.8:60672 dest: /192.168.158.4:9866
2025-07-15 11:44:07,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-621032329_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750068_9244, duration(ns): 21606181
2025-07-15 11:44:07,786 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750068_9244, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 11:44:13,757 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750068_9244 replica FinalizedReplica, blk_1073750068_9244, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750068 for deletion
2025-07-15 11:44:13,758 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750068_9244 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750068
2025-07-15 11:45:07,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750069_9245 src: /192.168.158.8:53882 dest: /192.168.158.4:9866
2025-07-15 11:45:07,759 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53882, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-691088361_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750069_9245, duration(ns): 15862074
2025-07-15 11:45:07,759 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750069_9245, type=LAST_IN_PIPELINE terminating
2025-07-15 11:45:10,757 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750069_9245 replica FinalizedReplica, blk_1073750069_9245, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750069 for deletion
2025-07-15 11:45:10,758 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750069_9245 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750069
2025-07-15 11:46:12,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750070_9246 src: /192.168.158.5:54742 dest: /192.168.158.4:9866
2025-07-15 11:46:12,762 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54742, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_955635592_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750070_9246, duration(ns): 17908856
2025-07-15 11:46:12,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750070_9246, type=LAST_IN_PIPELINE terminating
2025-07-15 11:46:16,757 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750070_9246 replica FinalizedReplica, blk_1073750070_9246, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750070 for deletion
2025-07-15 11:46:16,758 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750070_9246 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750070
2025-07-15 11:49:22,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750073_9249 src: /192.168.158.1:37966 dest: /192.168.158.4:9866
2025-07-15 11:49:22,764 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37966, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1408410866_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750073_9249, duration(ns): 22765006
2025-07-15 11:49:22,765 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750073_9249, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-15 11:49:25,764 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750073_9249 replica FinalizedReplica, blk_1073750073_9249, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750073 for deletion
2025-07-15 11:49:25,765 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750073_9249 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750073
2025-07-15 11:53:22,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750077_9253 src: /192.168.158.6:40196 dest: /192.168.158.4:9866
2025-07-15 11:53:22,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40196, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1387734610_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750077_9253, duration(ns): 13944872
2025-07-15 11:53:22,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750077_9253, type=LAST_IN_PIPELINE terminating
2025-07-15 11:53:25,771 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750077_9253 replica FinalizedReplica, blk_1073750077_9253, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750077 for deletion
2025-07-15 11:53:25,772 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750077_9253 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750077
2025-07-15 11:56:27,773 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750080_9256 src: /192.168.158.8:49700 dest: /192.168.158.4:9866
2025-07-15 11:56:27,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49700, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1674845110_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750080_9256, duration(ns): 19950779
2025-07-15 11:56:27,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750080_9256, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 11:56:31,777 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750080_9256 replica FinalizedReplica, blk_1073750080_9256, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750080 for deletion
2025-07-15 11:56:31,778 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750080_9256 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750080
2025-07-15 11:58:27,773 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750082_9258 src: /192.168.158.1:47332 dest: /192.168.158.4:9866
2025-07-15 11:58:27,805 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47332, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2058775796_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750082_9258, duration(ns): 23264462
2025-07-15 11:58:27,806 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750082_9258, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-15 11:58:31,781 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750082_9258 replica FinalizedReplica, blk_1073750082_9258, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750082 for deletion
2025-07-15 11:58:31,782 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750082_9258 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750082
2025-07-15 11:59:32,773 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750083_9259 src: /192.168.158.6:39192 dest: /192.168.158.4:9866
2025-07-15 11:59:32,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39192, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1739830729_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750083_9259, duration(ns): 20475079
2025-07-15 11:59:32,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750083_9259, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-15 11:59:37,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750083_9259 replica FinalizedReplica, blk_1073750083_9259, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750083 for deletion
2025-07-15 11:59:37,784 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750083_9259 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750083
2025-07-15 12:00:32,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750084_9260 src: /192.168.158.5:35042 dest: /192.168.158.4:9866
2025-07-15 12:00:32,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35042, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1286022320_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750084_9260, duration(ns): 19278644
2025-07-15 12:00:32,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750084_9260, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-15 12:00:37,784 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750084_9260 replica FinalizedReplica, blk_1073750084_9260, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750084 for deletion
2025-07-15 12:00:37,786 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750084_9260 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750084
2025-07-15 12:05:42,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750089_9265 src: /192.168.158.1:52580 dest: /192.168.158.4:9866
2025-07-15 12:05:42,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52580, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1564607671_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750089_9265, duration(ns): 24563427
2025-07-15 12:05:42,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750089_9265, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-15 12:05:46,792 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750089_9265 replica FinalizedReplica, blk_1073750089_9265, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750089 for deletion
2025-07-15 12:05:46,794 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750089_9265 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750089
2025-07-15 12:07:42,784 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750091_9267 src: /192.168.158.8:54156 dest: /192.168.158.4:9866
2025-07-15 12:07:42,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54156, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1069244909_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750091_9267, duration(ns): 18109753
2025-07-15 12:07:42,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750091_9267, type=LAST_IN_PIPELINE terminating
2025-07-15 12:07:49,797 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750091_9267 replica FinalizedReplica, blk_1073750091_9267, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750091 for deletion
2025-07-15 12:07:49,799 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750091_9267 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750091
2025-07-15 12:08:42,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750092_9268 src: /192.168.158.8:55198 dest: /192.168.158.4:9866
2025-07-15 12:08:42,805 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55198, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1975250145_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750092_9268, duration(ns): 20539341
2025-07-15 12:08:42,805 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750092_9268, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 12:08:46,800 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750092_9268 replica FinalizedReplica, blk_1073750092_9268, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750092 for deletion
2025-07-15 12:08:46,801 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750092_9268 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750092
2025-07-15 12:09:47,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750093_9269 src: /192.168.158.9:32788 dest: /192.168.158.4:9866
2025-07-15 12:09:47,800 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:32788, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1977426246_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750093_9269, duration(ns): 15250645
2025-07-15 12:09:47,801 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750093_9269, type=LAST_IN_PIPELINE terminating
2025-07-15 12:09:52,803 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750093_9269 replica FinalizedReplica, blk_1073750093_9269, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750093 for deletion
2025-07-15 12:09:52,804 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750093_9269 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750093
2025-07-15 12:10:47,780 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750094_9270 src: /192.168.158.1:45522 dest: /192.168.158.4:9866
2025-07-15 12:10:47,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45522, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_277259618_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750094_9270, duration(ns): 24455467
2025-07-15 12:10:47,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750094_9270, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-15 12:10:49,805 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750094_9270 replica FinalizedReplica, blk_1073750094_9270, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750094 for deletion
2025-07-15 12:10:49,806 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750094_9270 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750094
2025-07-15 12:11:47,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750095_9271 src: /192.168.158.1:45948 dest: /192.168.158.4:9866
2025-07-15 12:11:47,817 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45948, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1027254741_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750095_9271, duration(ns): 22510373
2025-07-15 12:11:47,817 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750095_9271, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-15 12:11:49,808 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750095_9271 replica FinalizedReplica, blk_1073750095_9271, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750095 for deletion
2025-07-15 12:11:49,809 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750095_9271 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750095
2025-07-15 12:17:52,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750101_9277 src: /192.168.158.6:57196 dest: /192.168.158.4:9866
2025-07-15 12:17:52,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57196, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_997527881_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750101_9277, duration(ns): 17532733
2025-07-15 12:17:52,817 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750101_9277, type=LAST_IN_PIPELINE terminating
2025-07-15 12:17:58,819 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750101_9277 replica FinalizedReplica, blk_1073750101_9277, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750101 for deletion
2025-07-15 12:17:58,821 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750101_9277 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750101
2025-07-15 12:18:52,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750102_9278 src: /192.168.158.9:58642 dest: /192.168.158.4:9866
2025-07-15 12:18:52,814 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58642, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-234527011_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750102_9278, duration(ns): 15286891
2025-07-15 12:18:52,814 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750102_9278, type=LAST_IN_PIPELINE terminating
2025-07-15 12:18:55,821 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750102_9278 replica FinalizedReplica, blk_1073750102_9278, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750102 for deletion
2025-07-15 12:18:55,823 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750102_9278 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750102
2025-07-15 12:21:02,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750104_9280 src: /192.168.158.6:41090 dest: /192.168.158.4:9866
2025-07-15 12:21:02,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41090, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-543207253_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750104_9280, duration(ns): 17779531
2025-07-15 12:21:02,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750104_9280, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-15 12:21:04,823 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750104_9280 replica FinalizedReplica, blk_1073750104_9280, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750104 for deletion
2025-07-15 12:21:04,824 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750104_9280 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750104
2025-07-15 12:23:07,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750106_9282 src: /192.168.158.1:52230 dest: /192.168.158.4:9866
2025-07-15 12:23:07,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52230, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1520364156_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750106_9282, duration(ns): 22307596
2025-07-15 12:23:07,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750106_9282, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-15 12:23:10,828 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750106_9282 replica FinalizedReplica, blk_1073750106_9282, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750106 for deletion
2025-07-15 12:23:10,829 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750106_9282 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750106
2025-07-15 12:24:07,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750107_9283 src: /192.168.158.7:55378 dest: /192.168.158.4:9866
2025-07-15 12:24:07,838 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55378, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1861389688_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750107_9283, duration(ns): 17974507
2025-07-15 12:24:07,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750107_9283, type=LAST_IN_PIPELINE terminating
2025-07-15 12:24:13,831 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750107_9283 replica FinalizedReplica, blk_1073750107_9283, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750107 for deletion
2025-07-15 12:24:13,832 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750107_9283 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750107
2025-07-15 12:25:12,807 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750108_9284 src: /192.168.158.6:56750 dest: /192.168.158.4:9866
2025-07-15 12:25:12,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56750, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-825413902_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750108_9284, duration(ns): 18136958
2025-07-15 12:25:12,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750108_9284, type=LAST_IN_PIPELINE terminating
2025-07-15 12:25:16,834 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750108_9284 replica FinalizedReplica, blk_1073750108_9284, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750108 for deletion
2025-07-15 12:25:16,835 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750108_9284 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750108
2025-07-15 12:26:12,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750109_9285 src: /192.168.158.5:59922 dest: /192.168.158.4:9866
2025-07-15 12:26:12,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59922, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1994640066_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750109_9285, duration(ns): 18576632
2025-07-15 12:26:12,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750109_9285, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 12:26:16,836 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750109_9285 replica FinalizedReplica, blk_1073750109_9285, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750109 for deletion
2025-07-15 12:26:16,838 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750109_9285 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750109
2025-07-15 12:29:17,812 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750112_9288 src: /192.168.158.1:57144 dest: /192.168.158.4:9866
2025-07-15 12:29:17,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57144, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1839011267_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750112_9288, duration(ns): 24343483
2025-07-15 12:29:17,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750112_9288, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-15 12:29:19,843 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750112_9288 replica FinalizedReplica, blk_1073750112_9288, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750112 for deletion
2025-07-15 12:29:19,844 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750112_9288 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750112
2025-07-15 12:31:17,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750114_9290 src: /192.168.158.8:47628 dest: /192.168.158.4:9866
2025-07-15 12:31:17,890 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47628, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_949474690_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750114_9290, duration(ns): 18851353
2025-07-15 12:31:17,891 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750114_9290, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 12:31:22,846 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750114_9290 replica FinalizedReplica, blk_1073750114_9290, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750114 for deletion
2025-07-15 12:31:22,847 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750114_9290 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750114
2025-07-15 12:33:17,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750116_9292 src: /192.168.158.5:60666 dest: /192.168.158.4:9866
2025-07-15 12:33:17,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60666, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1356647017_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750116_9292, duration(ns): 17929456
2025-07-15 12:33:17,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073750116_9292, type=LAST_IN_PIPELINE terminating 2025-07-15 12:33:22,849 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750116_9292 replica FinalizedReplica, blk_1073750116_9292, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750116 for deletion 2025-07-15 12:33:22,850 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750116_9292 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750116 2025-07-15 12:34:17,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750117_9293 src: /192.168.158.5:36894 dest: /192.168.158.4:9866 2025-07-15 12:34:17,879 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36894, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1402068368_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750117_9293, duration(ns): 20475304 2025-07-15 12:34:17,879 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750117_9293, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-15 12:34:19,851 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750117_9293 replica FinalizedReplica, blk_1073750117_9293, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750117 for deletion 2025-07-15 12:34:19,852 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750117_9293 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750117 2025-07-15 12:37:22,868 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750120_9296 src: /192.168.158.8:45966 dest: /192.168.158.4:9866 2025-07-15 12:37:22,894 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45966, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2005135115_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750120_9296, duration(ns): 20531253 2025-07-15 12:37:22,894 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750120_9296, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-15 12:37:28,855 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750120_9296 replica FinalizedReplica, blk_1073750120_9296, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750120 for deletion 2025-07-15 12:37:28,857 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750120_9296 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750120 
2025-07-15 12:38:22,848 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750121_9297 src: /192.168.158.7:55420 dest: /192.168.158.4:9866 2025-07-15 12:38:22,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55420, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_403652024_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750121_9297, duration(ns): 17975648 2025-07-15 12:38:22,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750121_9297, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-15 12:38:25,857 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750121_9297 replica FinalizedReplica, blk_1073750121_9297, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750121 for deletion 2025-07-15 12:38:25,858 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750121_9297 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750121 2025-07-15 12:39:27,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750122_9298 src: /192.168.158.6:39490 dest: /192.168.158.4:9866 2025-07-15 12:39:27,879 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39490, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1100795568_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750122_9298, duration(ns): 15661929 2025-07-15 12:39:27,879 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750122_9298, type=LAST_IN_PIPELINE terminating 2025-07-15 12:39:31,858 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750122_9298 replica FinalizedReplica, blk_1073750122_9298, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750122 for deletion 2025-07-15 12:39:31,860 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750122_9298 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750122 2025-07-15 12:41:32,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750124_9300 src: /192.168.158.8:35902 dest: /192.168.158.4:9866 2025-07-15 12:41:32,885 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35902, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_426710237_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750124_9300, duration(ns): 15547913 2025-07-15 12:41:32,885 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750124_9300, type=LAST_IN_PIPELINE terminating 2025-07-15 12:41:37,860 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750124_9300 replica FinalizedReplica, 
blk_1073750124_9300, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750124 for deletion 2025-07-15 12:41:37,861 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750124_9300 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750124 2025-07-15 12:42:32,848 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750125_9301 src: /192.168.158.9:37548 dest: /192.168.158.4:9866 2025-07-15 12:42:32,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37548, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1948826075_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750125_9301, duration(ns): 20645245 2025-07-15 12:42:32,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750125_9301, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-15 12:42:34,865 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750125_9301 replica FinalizedReplica, blk_1073750125_9301, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750125 for deletion 2025-07-15 12:42:34,866 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 
blk_1073750125_9301 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750125 2025-07-15 12:45:32,865 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750128_9304 src: /192.168.158.9:34500 dest: /192.168.158.4:9866 2025-07-15 12:45:32,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34500, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_84856121_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750128_9304, duration(ns): 17008083 2025-07-15 12:45:32,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750128_9304, type=LAST_IN_PIPELINE terminating 2025-07-15 12:45:34,870 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750128_9304 replica FinalizedReplica, blk_1073750128_9304, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750128 for deletion 2025-07-15 12:45:34,871 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750128_9304 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750128 2025-07-15 12:46:32,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750129_9305 src: /192.168.158.1:58794 dest: /192.168.158.4:9866 2025-07-15 12:46:32,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58794, dest: /192.168.158.4:9866, 
bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1542496934_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750129_9305, duration(ns): 22065833 2025-07-15 12:46:32,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750129_9305, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-15 12:46:34,873 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750129_9305 replica FinalizedReplica, blk_1073750129_9305, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750129 for deletion 2025-07-15 12:46:34,874 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750129_9305 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750129 2025-07-15 12:47:32,868 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750130_9306 src: /192.168.158.1:38542 dest: /192.168.158.4:9866 2025-07-15 12:47:32,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38542, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1171100682_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750130_9306, duration(ns): 25125632 2025-07-15 12:47:32,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750130_9306, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-15 12:47:34,876 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750130_9306 replica FinalizedReplica, blk_1073750130_9306, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750130 for deletion 2025-07-15 12:47:34,877 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750130_9306 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750130 2025-07-15 12:49:42,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750132_9308 src: /192.168.158.6:38992 dest: /192.168.158.4:9866 2025-07-15 12:49:42,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38992, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1987804724_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750132_9308, duration(ns): 20539638 2025-07-15 12:49:42,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750132_9308, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-15 12:49:46,880 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750132_9308 replica FinalizedReplica, blk_1073750132_9308, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750132 for deletion 2025-07-15 12:49:46,881 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750132_9308 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750132 2025-07-15 12:51:42,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750134_9310 src: /192.168.158.7:60826 dest: /192.168.158.4:9866 2025-07-15 12:51:42,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60826, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-572860183_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750134_9310, duration(ns): 21421794 2025-07-15 12:51:42,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750134_9310, type=LAST_IN_PIPELINE terminating 2025-07-15 12:51:46,883 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750134_9310 replica FinalizedReplica, blk_1073750134_9310, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750134 for deletion 2025-07-15 12:51:46,884 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750134_9310 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750134 2025-07-15 12:55:52,881 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750138_9314 src: /192.168.158.1:58722 dest: /192.168.158.4:9866 2025-07-15 12:55:52,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58722, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-654125313_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750138_9314, duration(ns): 21727981 2025-07-15 12:55:52,913 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750138_9314, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-15 12:55:55,895 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750138_9314 replica FinalizedReplica, blk_1073750138_9314, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750138 for deletion 2025-07-15 12:55:55,897 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750138_9314 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750138 2025-07-15 12:56:57,888 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750139_9315 src: /192.168.158.9:50878 dest: /192.168.158.4:9866 2025-07-15 12:56:57,906 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50878, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-143159729_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750139_9315, duration(ns): 15457534 2025-07-15 12:56:57,906 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750139_9315, type=LAST_IN_PIPELINE terminating 2025-07-15 12:57:04,895 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750139_9315 replica FinalizedReplica, blk_1073750139_9315, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750139 for deletion 2025-07-15 12:57:04,897 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750139_9315 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750139 2025-07-15 12:58:02,858 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750140_9316 src: /192.168.158.9:37022 dest: /192.168.158.4:9866 2025-07-15 12:58:02,875 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37022, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1313238556_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750140_9316, duration(ns): 15128564 2025-07-15 12:58:02,875 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750140_9316, type=LAST_IN_PIPELINE terminating 2025-07-15 12:58:07,898 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750140_9316 replica FinalizedReplica, 
blk_1073750140_9316, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750140 for deletion 2025-07-15 12:58:07,900 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750140_9316 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750140 2025-07-15 13:01:02,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750143_9319 src: /192.168.158.6:47078 dest: /192.168.158.4:9866 2025-07-15 13:01:02,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47078, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1489980253_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750143_9319, duration(ns): 16313295 2025-07-15 13:01:02,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750143_9319, type=LAST_IN_PIPELINE terminating 2025-07-15 13:01:04,903 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750143_9319 replica FinalizedReplica, blk_1073750143_9319, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750143 for deletion 2025-07-15 13:01:04,904 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750143_9319 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750143 2025-07-15 13:03:07,923 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750145_9321 src: /192.168.158.5:42562 dest: /192.168.158.4:9866 2025-07-15 13:03:07,952 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42562, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1102443098_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750145_9321, duration(ns): 23453359 2025-07-15 13:03:07,953 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750145_9321, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-15 13:03:10,908 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750145_9321 replica FinalizedReplica, blk_1073750145_9321, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750145 for deletion 2025-07-15 13:03:10,909 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750145_9321 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750145 2025-07-15 13:04:07,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750146_9322 src: /192.168.158.1:53010 dest: /192.168.158.4:9866 2025-07-15 13:04:07,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53010, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1122342366_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750146_9322, duration(ns): 22940498
2025-07-15 13:04:07,949 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750146_9322, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-15 13:04:13,913 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750146_9322 replica FinalizedReplica, blk_1073750146_9322, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750146 for deletion
2025-07-15 13:04:13,914 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750146_9322 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750146
2025-07-15 13:06:12,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750148_9324 src: /192.168.158.9:39494 dest: /192.168.158.4:9866
2025-07-15 13:06:12,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39494, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1041030126_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750148_9324, duration(ns): 15819584
2025-07-15 13:06:12,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750148_9324, type=LAST_IN_PIPELINE terminating
2025-07-15 13:06:19,918 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750148_9324 replica FinalizedReplica, blk_1073750148_9324, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750148 for deletion
2025-07-15 13:06:19,919 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750148_9324 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750148
2025-07-15 13:07:12,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750149_9325 src: /192.168.158.9:47178 dest: /192.168.158.4:9866
2025-07-15 13:07:12,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47178, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-47756598_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750149_9325, duration(ns): 14860302
2025-07-15 13:07:12,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750149_9325, type=LAST_IN_PIPELINE terminating
2025-07-15 13:07:16,919 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750149_9325 replica FinalizedReplica, blk_1073750149_9325, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750149 for deletion
2025-07-15 13:07:16,920 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750149_9325 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750149
2025-07-15 13:08:12,935 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750150_9326 src: /192.168.158.7:37450 dest: /192.168.158.4:9866
2025-07-15 13:08:12,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37450, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-508275788_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750150_9326, duration(ns): 19650139
2025-07-15 13:08:12,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750150_9326, type=LAST_IN_PIPELINE terminating
2025-07-15 13:08:16,921 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750150_9326 replica FinalizedReplica, blk_1073750150_9326, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750150 for deletion
2025-07-15 13:08:16,923 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750150_9326 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750150
2025-07-15 13:10:22,919 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750152_9328 src: /192.168.158.1:44870 dest: /192.168.158.4:9866
2025-07-15 13:10:22,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44870, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1412541892_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750152_9328, duration(ns): 22573488
2025-07-15 13:10:22,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750152_9328, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-15 13:10:25,925 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750152_9328 replica FinalizedReplica, blk_1073750152_9328, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750152 for deletion
2025-07-15 13:10:25,926 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750152_9328 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750152
2025-07-15 13:13:27,950 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750155_9331 src: /192.168.158.1:45688 dest: /192.168.158.4:9866
2025-07-15 13:13:27,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45688, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-944812343_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750155_9331, duration(ns): 19903095
2025-07-15 13:13:27,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750155_9331, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-15 13:13:31,933 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750155_9331 replica FinalizedReplica, blk_1073750155_9331, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750155 for deletion
2025-07-15 13:13:31,935 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750155_9331 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750155
2025-07-15 13:15:27,923 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750157_9333 src: /192.168.158.8:36372 dest: /192.168.158.4:9866
2025-07-15 13:15:27,942 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36372, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1929050540_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750157_9333, duration(ns): 17569072
2025-07-15 13:15:27,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750157_9333, type=LAST_IN_PIPELINE terminating
2025-07-15 13:15:31,939 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750157_9333 replica FinalizedReplica, blk_1073750157_9333, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750157 for deletion
2025-07-15 13:15:31,940 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750157_9333 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750157
2025-07-15 13:16:32,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750158_9334 src: /192.168.158.8:40344 dest: /192.168.158.4:9866
2025-07-15 13:16:32,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40344, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-954829355_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750158_9334, duration(ns): 14406587
2025-07-15 13:16:32,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750158_9334, type=LAST_IN_PIPELINE terminating
2025-07-15 13:16:34,940 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750158_9334 replica FinalizedReplica, blk_1073750158_9334, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750158 for deletion
2025-07-15 13:16:34,941 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750158_9334 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750158
2025-07-15 13:21:37,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750163_9339 src: /192.168.158.7:51944 dest: /192.168.158.4:9866
2025-07-15 13:21:37,952 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51944, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-200038298_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750163_9339, duration(ns): 16178847
2025-07-15 13:21:37,952 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750163_9339, type=LAST_IN_PIPELINE terminating
2025-07-15 13:21:40,951 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750163_9339 replica FinalizedReplica, blk_1073750163_9339, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750163 for deletion
2025-07-15 13:21:40,952 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750163_9339 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750163
2025-07-15 13:22:37,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750164_9340 src: /192.168.158.6:57304 dest: /192.168.158.4:9866
2025-07-15 13:22:37,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57304, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1261050569_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750164_9340, duration(ns): 20933718
2025-07-15 13:22:37,975 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750164_9340, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 13:22:40,952 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750164_9340 replica FinalizedReplica, blk_1073750164_9340, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750164 for deletion
2025-07-15 13:22:40,953 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750164_9340 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750164
2025-07-15 13:27:47,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750169_9345 src: /192.168.158.9:45934 dest: /192.168.158.4:9866
2025-07-15 13:27:47,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45934, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_856726950_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750169_9345, duration(ns): 21374747
2025-07-15 13:27:47,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750169_9345, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 13:27:52,962 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750169_9345 replica FinalizedReplica, blk_1073750169_9345, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750169 for deletion
2025-07-15 13:27:52,963 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750169_9345 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750169
2025-07-15 13:29:47,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750171_9347 src: /192.168.158.1:43662 dest: /192.168.158.4:9866
2025-07-15 13:29:48,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43662, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2055748294_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750171_9347, duration(ns): 23911320
2025-07-15 13:29:48,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750171_9347, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-15 13:29:49,966 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750171_9347 replica FinalizedReplica, blk_1073750171_9347, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750171 for deletion
2025-07-15 13:29:49,967 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750171_9347 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750171
2025-07-15 13:32:52,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750174_9350 src: /192.168.158.9:58494 dest: /192.168.158.4:9866
2025-07-15 13:32:52,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58494, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1353934454_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750174_9350, duration(ns): 17519147
2025-07-15 13:32:52,979 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750174_9350, type=LAST_IN_PIPELINE terminating
2025-07-15 13:32:55,972 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750174_9350 replica FinalizedReplica, blk_1073750174_9350, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750174 for deletion
2025-07-15 13:32:55,973 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750174_9350 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750174
2025-07-15 13:34:52,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750176_9352 src: /192.168.158.1:60064 dest: /192.168.158.4:9866
2025-07-15 13:34:52,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60064, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1105273360_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750176_9352, duration(ns): 26503982
2025-07-15 13:34:52,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750176_9352, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-15 13:34:55,978 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750176_9352 replica FinalizedReplica, blk_1073750176_9352, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750176 for deletion
2025-07-15 13:34:55,979 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750176_9352 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750176
2025-07-15 13:35:52,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750177_9353 src: /192.168.158.5:60986 dest: /192.168.158.4:9866
2025-07-15 13:35:52,979 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60986, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1157722619_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750177_9353, duration(ns): 15108074
2025-07-15 13:35:52,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750177_9353, type=LAST_IN_PIPELINE terminating
2025-07-15 13:35:55,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750177_9353 replica FinalizedReplica, blk_1073750177_9353, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750177 for deletion
2025-07-15 13:35:55,982 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750177_9353 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750177
2025-07-15 13:36:57,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750178_9354 src: /192.168.158.6:35976 dest: /192.168.158.4:9866
2025-07-15 13:36:57,965 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35976, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-775748476_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750178_9354, duration(ns): 21382889
2025-07-15 13:36:57,965 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750178_9354, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 13:37:01,982 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750178_9354 replica FinalizedReplica, blk_1073750178_9354, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750178 for deletion
2025-07-15 13:37:01,984 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750178_9354 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750178
2025-07-15 13:39:57,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750181_9357 src: /192.168.158.1:45908 dest: /192.168.158.4:9866
2025-07-15 13:39:57,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45908, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_874082251_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750181_9357, duration(ns): 22562754
2025-07-15 13:39:57,990 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750181_9357, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-15 13:40:01,987 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750181_9357 replica FinalizedReplica, blk_1073750181_9357, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750181 for deletion
2025-07-15 13:40:01,989 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750181_9357 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750181
2025-07-15 13:43:02,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750184_9360 src: /192.168.158.1:42996 dest: /192.168.158.4:9866
2025-07-15 13:43:02,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42996, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1230272974_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750184_9360, duration(ns): 21587391
2025-07-15 13:43:02,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750184_9360, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-15 13:43:07,993 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750184_9360 replica FinalizedReplica, blk_1073750184_9360, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750184 for deletion
2025-07-15 13:43:07,994 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750184_9360 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750184
2025-07-15 13:44:02,950 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750185_9361 src: /192.168.158.1:52870 dest: /192.168.158.4:9866
2025-07-15 13:44:02,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52870, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-635167947_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750185_9361, duration(ns): 23359954
2025-07-15 13:44:02,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750185_9361, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-15 13:44:07,998 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750185_9361 replica FinalizedReplica, blk_1073750185_9361, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750185 for deletion
2025-07-15 13:44:07,999 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750185_9361 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750185
2025-07-15 13:45:07,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750186_9362 src: /192.168.158.1:35482 dest: /192.168.158.4:9866
2025-07-15 13:45:07,979 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35482, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1382327299_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750186_9362, duration(ns): 26453365
2025-07-15 13:45:07,979 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750186_9362, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-15 13:45:11,000 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750186_9362 replica FinalizedReplica, blk_1073750186_9362, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750186 for deletion
2025-07-15 13:45:11,001 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750186_9362 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750186
2025-07-15 13:47:07,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750188_9364 src: /192.168.158.1:34210 dest: /192.168.158.4:9866
2025-07-15 13:47:07,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34210, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_329001910_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750188_9364, duration(ns): 22042775
2025-07-15 13:47:07,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750188_9364, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-15 13:47:11,007 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750188_9364 replica FinalizedReplica, blk_1073750188_9364, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750188 for deletion
2025-07-15 13:47:11,008 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750188_9364 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750188
2025-07-15 13:49:07,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750190_9366 src: /192.168.158.7:52332 dest: /192.168.158.4:9866
2025-07-15 13:49:07,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52332, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1436536279_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750190_9366, duration(ns): 16738674
2025-07-15 13:49:07,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750190_9366, type=LAST_IN_PIPELINE terminating
2025-07-15 13:49:14,011 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750190_9366 replica FinalizedReplica, blk_1073750190_9366, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750190 for deletion
2025-07-15 13:49:14,012 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750190_9366 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750190
2025-07-15 13:53:12,970 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750194_9370 src: /192.168.158.5:35202 dest: /192.168.158.4:9866
2025-07-15 13:53:12,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35202, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_55577663_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750194_9370, duration(ns): 15473751
2025-07-15 13:53:12,988 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750194_9370, type=LAST_IN_PIPELINE terminating
2025-07-15 13:53:17,018 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750194_9370 replica FinalizedReplica, blk_1073750194_9370, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750194 for deletion
2025-07-15 13:53:17,019 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750194_9370 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750194
2025-07-15 13:54:17,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750195_9371 src: /192.168.158.1:44650 dest: /192.168.158.4:9866
2025-07-15 13:54:17,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44650, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-383575106_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750195_9371, duration(ns): 22268779
2025-07-15 13:54:17,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750195_9371, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-15 13:54:20,020 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750195_9371 replica FinalizedReplica, blk_1073750195_9371, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750195 for deletion
2025-07-15 13:54:20,022 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750195_9371 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750195
2025-07-15 13:56:17,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750197_9373 src: /192.168.158.7:49474 dest: /192.168.158.4:9866
2025-07-15 13:56:17,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49474, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-53276126_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750197_9373, duration(ns): 18548138
2025-07-15 13:56:17,996 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750197_9373, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-15 13:56:20,027 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750197_9373 replica FinalizedReplica, blk_1073750197_9373, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750197 for deletion
2025-07-15 13:56:20,029 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750197_9373 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750197
2025-07-15 13:57:17,970 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750198_9374 src: /192.168.158.8:50686 dest: /192.168.158.4:9866
2025-07-15 13:57:17,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50686, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-346242997_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750198_9374, duration(ns): 19785025
2025-07-15 13:57:17,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750198_9374, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 13:57:20,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750198_9374 replica FinalizedReplica, blk_1073750198_9374, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750198 for deletion
2025-07-15 13:57:20,033 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750198_9374 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750198
2025-07-15 13:58:17,975 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750199_9375 src: /192.168.158.7:34750 dest: /192.168.158.4:9866
2025-07-15 13:58:17,996 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34750, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-921291939_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750199_9375, duration(ns): 18685612
2025-07-15 13:58:17,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750199_9375, type=LAST_IN_PIPELINE terminating
2025-07-15 13:58:20,030 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750199_9375 replica FinalizedReplica, blk_1073750199_9375, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750199 for deletion
2025-07-15 13:58:20,032 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750199_9375 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750199
2025-07-15 14:01:17,988 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750202_9378 src: /192.168.158.7:49716 dest: /192.168.158.4:9866
2025-07-15
14:01:18,013 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49716, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1973834337_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750202_9378, duration(ns): 19341827 2025-07-15 14:01:18,013 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750202_9378, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-15 14:01:20,037 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750202_9378 replica FinalizedReplica, blk_1073750202_9378, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750202 for deletion 2025-07-15 14:01:20,038 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750202_9378 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750202 2025-07-15 14:02:17,979 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750203_9379 src: /192.168.158.1:44390 dest: /192.168.158.4:9866 2025-07-15 14:02:18,013 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44390, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_863758352_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750203_9379, duration(ns): 24196590 2025-07-15 14:02:18,013 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750203_9379, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-15 14:02:20,040 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750203_9379 replica FinalizedReplica, blk_1073750203_9379, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750203 for deletion 2025-07-15 14:02:20,041 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750203_9379 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750203 2025-07-15 14:04:27,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750205_9381 src: /192.168.158.1:60922 dest: /192.168.158.4:9866 2025-07-15 14:04:28,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60922, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1491758982_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750205_9381, duration(ns): 24744509 2025-07-15 14:04:28,012 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750205_9381, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-15 14:04:35,047 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750205_9381 replica FinalizedReplica, blk_1073750205_9381, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750205 for deletion 2025-07-15 14:04:35,049 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750205_9381 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750205 2025-07-15 14:05:27,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750206_9382 src: /192.168.158.7:46976 dest: /192.168.158.4:9866 2025-07-15 14:05:28,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46976, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-130669033_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750206_9382, duration(ns): 19269355 2025-07-15 14:05:28,008 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750206_9382, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-15 14:05:32,050 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750206_9382 replica FinalizedReplica, blk_1073750206_9382, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750206 for deletion 2025-07-15 14:05:32,051 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750206_9382 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750206 2025-07-15 14:06:32,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750207_9383 src: /192.168.158.8:54620 dest: /192.168.158.4:9866 2025-07-15 14:06:33,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54620, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_386726877_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750207_9383, duration(ns): 16247116 2025-07-15 14:06:33,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750207_9383, type=LAST_IN_PIPELINE terminating 2025-07-15 14:06:38,050 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750207_9383 replica FinalizedReplica, blk_1073750207_9383, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750207 for deletion 2025-07-15 14:06:38,051 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750207_9383 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750207 2025-07-15 14:08:37,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750209_9385 src: /192.168.158.1:37176 dest: /192.168.158.4:9866 2025-07-15 14:08:38,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37176, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, 
cliID: DFSClient_NONMAPREDUCE_-1366111401_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750209_9385, duration(ns): 22652896 2025-07-15 14:08:38,012 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750209_9385, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-15 14:08:41,055 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750209_9385 replica FinalizedReplica, blk_1073750209_9385, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750209 for deletion 2025-07-15 14:08:41,056 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750209_9385 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750209 2025-07-15 14:10:37,988 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750211_9387 src: /192.168.158.6:54734 dest: /192.168.158.4:9866 2025-07-15 14:10:38,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54734, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1245728107_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750211_9387, duration(ns): 19837771 2025-07-15 14:10:38,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750211_9387, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] 
terminating 2025-07-15 14:10:44,062 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750211_9387 replica FinalizedReplica, blk_1073750211_9387, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750211 for deletion 2025-07-15 14:10:44,063 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750211_9387 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750211 2025-07-15 14:11:43,012 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750212_9388 src: /192.168.158.8:40330 dest: /192.168.158.4:9866 2025-07-15 14:11:43,037 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40330, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1527714059_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750212_9388, duration(ns): 20000403 2025-07-15 14:11:43,037 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750212_9388, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-15 14:11:47,063 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750212_9388 replica FinalizedReplica, blk_1073750212_9388, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750212 for 
deletion 2025-07-15 14:11:47,064 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750212_9388 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750212 2025-07-15 14:12:43,000 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750213_9389 src: /192.168.158.7:33176 dest: /192.168.158.4:9866 2025-07-15 14:12:43,026 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33176, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-509779428_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750213_9389, duration(ns): 21211963 2025-07-15 14:12:43,027 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750213_9389, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-15 14:12:47,063 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750213_9389 replica FinalizedReplica, blk_1073750213_9389, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750213 for deletion 2025-07-15 14:12:47,064 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750213_9389 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750213 2025-07-15 14:14:42,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073750215_9391 src: /192.168.158.1:53470 dest: /192.168.158.4:9866 2025-07-15 14:14:43,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53470, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_475263386_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750215_9391, duration(ns): 22789382 2025-07-15 14:14:43,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750215_9391, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-15 14:14:47,066 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750215_9391 replica FinalizedReplica, blk_1073750215_9391, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750215 for deletion 2025-07-15 14:14:47,067 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750215_9391 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750215 2025-07-15 14:17:52,998 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750218_9394 src: /192.168.158.8:33352 dest: /192.168.158.4:9866 2025-07-15 14:17:53,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33352, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1332073614_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073750218_9394, duration(ns): 19490811 2025-07-15 14:17:53,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750218_9394, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-15 14:17:56,072 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750218_9394 replica FinalizedReplica, blk_1073750218_9394, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750218 for deletion 2025-07-15 14:17:56,073 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750218_9394 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750218 2025-07-15 14:18:53,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750219_9395 src: /192.168.158.1:47278 dest: /192.168.158.4:9866 2025-07-15 14:18:53,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47278, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_815612486_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750219_9395, duration(ns): 24143099 2025-07-15 14:18:53,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750219_9395, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-15 14:18:56,071 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750219_9395 replica FinalizedReplica, blk_1073750219_9395, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750219 for deletion 2025-07-15 14:18:56,072 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750219_9395 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750219 2025-07-15 14:19:53,013 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750220_9396 src: /192.168.158.8:46756 dest: /192.168.158.4:9866 2025-07-15 14:19:53,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46756, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-590348800_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750220_9396, duration(ns): 18005616 2025-07-15 14:19:53,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750220_9396, type=LAST_IN_PIPELINE terminating 2025-07-15 14:19:59,072 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750220_9396 replica FinalizedReplica, blk_1073750220_9396, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750220 for deletion 2025-07-15 14:19:59,073 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750220_9396 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750220 2025-07-15 14:20:53,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750221_9397 src: /192.168.158.1:46632 dest: /192.168.158.4:9866 2025-07-15 14:20:53,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46632, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1931032680_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750221_9397, duration(ns): 24731100 2025-07-15 14:20:53,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750221_9397, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-15 14:20:56,071 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750221_9397 replica FinalizedReplica, blk_1073750221_9397, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750221 for deletion 2025-07-15 14:20:56,073 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750221_9397 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750221 2025-07-15 14:21:58,012 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750222_9398 
src: /192.168.158.5:34340 dest: /192.168.158.4:9866 2025-07-15 14:21:58,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34340, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1214780536_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750222_9398, duration(ns): 15130554 2025-07-15 14:21:58,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750222_9398, type=LAST_IN_PIPELINE terminating 2025-07-15 14:22:02,073 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750222_9398 replica FinalizedReplica, blk_1073750222_9398, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750222 for deletion 2025-07-15 14:22:02,074 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750222_9398 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750222 2025-07-15 14:22:58,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750223_9399 src: /192.168.158.1:54586 dest: /192.168.158.4:9866 2025-07-15 14:22:58,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54586, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1825749558_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750223_9399, duration(ns): 22640988 2025-07-15 14:22:58,064 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750223_9399, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-15 14:23:05,076 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750223_9399 replica FinalizedReplica, blk_1073750223_9399, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750223 for deletion 2025-07-15 14:23:05,077 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750223_9399 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750223 2025-07-15 14:23:58,013 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750224_9400 src: /192.168.158.5:60834 dest: /192.168.158.4:9866 2025-07-15 14:23:58,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60834, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1179308024_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750224_9400, duration(ns): 18349092 2025-07-15 14:23:58,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750224_9400, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-15 14:24:02,078 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750224_9400 replica FinalizedReplica, blk_1073750224_9400, FINALIZED getNumBytes() = 56 
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750224 for deletion
2025-07-15 14:24:02,079 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750224_9400 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750224
2025-07-15 14:24:58,017 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750225_9401 src: /192.168.158.5:34394 dest: /192.168.158.4:9866
2025-07-15 14:24:58,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34394, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2061619826_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750225_9401, duration(ns): 15150581
2025-07-15 14:24:58,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750225_9401, type=LAST_IN_PIPELINE terminating
2025-07-15 14:25:02,080 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750225_9401 replica FinalizedReplica, blk_1073750225_9401, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750225 for deletion
2025-07-15 14:25:02,081 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750225_9401 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750225
2025-07-15 14:26:58,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750227_9403 src: /192.168.158.1:48902 dest: /192.168.158.4:9866
2025-07-15 14:26:58,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48902, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-930575147_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750227_9403, duration(ns): 22782915
2025-07-15 14:26:58,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750227_9403, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-15 14:27:05,085 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750227_9403 replica FinalizedReplica, blk_1073750227_9403, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750227 for deletion
2025-07-15 14:27:05,086 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750227_9403 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750227
2025-07-15 14:29:03,073 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750229_9405 src: /192.168.158.9:44134 dest: /192.168.158.4:9866
2025-07-15 14:29:03,100 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44134, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1506488546_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750229_9405, duration(ns): 21490379
2025-07-15 14:29:03,100 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750229_9405, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-15 14:29:08,086 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750229_9405 replica FinalizedReplica, blk_1073750229_9405, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750229 for deletion
2025-07-15 14:29:08,087 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750229_9405 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750229
2025-07-15 14:30:03,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750230_9406 src: /192.168.158.1:42290 dest: /192.168.158.4:9866
2025-07-15 14:30:03,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42290, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1214806997_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750230_9406, duration(ns): 24399985
2025-07-15 14:30:03,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750230_9406, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-15 14:30:05,089 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750230_9406 replica FinalizedReplica, blk_1073750230_9406, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750230 for deletion
2025-07-15 14:30:05,091 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750230_9406 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750230
2025-07-15 14:31:03,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750231_9407 src: /192.168.158.5:33888 dest: /192.168.158.4:9866
2025-07-15 14:31:03,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33888, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_973531997_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750231_9407, duration(ns): 24346526
2025-07-15 14:31:03,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750231_9407, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 14:31:08,090 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750231_9407 replica FinalizedReplica, blk_1073750231_9407, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750231 for deletion
2025-07-15 14:31:08,091 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750231_9407 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750231
2025-07-15 14:34:08,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750234_9410 src: /192.168.158.5:39712 dest: /192.168.158.4:9866
2025-07-15 14:34:08,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-549527970_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750234_9410, duration(ns): 19070764
2025-07-15 14:34:08,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750234_9410, type=LAST_IN_PIPELINE terminating
2025-07-15 14:34:11,093 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750234_9410 replica FinalizedReplica, blk_1073750234_9410, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750234 for deletion
2025-07-15 14:34:11,094 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750234_9410 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750234
2025-07-15 14:35:08,053 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750235_9411 src: /192.168.158.8:40848 dest: /192.168.158.4:9866
2025-07-15 14:35:08,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40848, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_530122545_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750235_9411, duration(ns): 16231813
2025-07-15 14:35:08,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750235_9411, type=LAST_IN_PIPELINE terminating
2025-07-15 14:35:14,095 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750235_9411 replica FinalizedReplica, blk_1073750235_9411, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750235 for deletion
2025-07-15 14:35:14,097 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750235_9411 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750235
2025-07-15 14:36:08,059 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750236_9412 src: /192.168.158.9:59474 dest: /192.168.158.4:9866
2025-07-15 14:36:08,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59474, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-564232864_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750236_9412, duration(ns): 19073316
2025-07-15 14:36:08,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750236_9412, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 14:36:11,097 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750236_9412 replica FinalizedReplica, blk_1073750236_9412, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750236 for deletion
2025-07-15 14:36:11,098 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750236_9412 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750236
2025-07-15 14:37:08,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750237_9413 src: /192.168.158.1:33016 dest: /192.168.158.4:9866
2025-07-15 14:37:08,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33016, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1071734573_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750237_9413, duration(ns): 25073394
2025-07-15 14:37:08,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750237_9413, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-15 14:37:14,099 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750237_9413 replica FinalizedReplica, blk_1073750237_9413, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750237 for deletion
2025-07-15 14:37:14,101 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750237_9413 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750237
2025-07-15 14:39:13,051 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750239_9415 src: /192.168.158.1:43258 dest: /192.168.158.4:9866
2025-07-15 14:39:13,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43258, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1111739689_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750239_9415, duration(ns): 23371342
2025-07-15 14:39:13,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750239_9415, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-15 14:39:20,104 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750239_9415 replica FinalizedReplica, blk_1073750239_9415, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750239 for deletion
2025-07-15 14:39:20,105 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750239_9415 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750239
2025-07-15 14:43:18,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750243_9419 src: /192.168.158.6:58562 dest: /192.168.158.4:9866
2025-07-15 14:43:18,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58562, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_657696474_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750243_9419, duration(ns): 17060613
2025-07-15 14:43:18,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750243_9419, type=LAST_IN_PIPELINE terminating
2025-07-15 14:43:23,112 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750243_9419 replica FinalizedReplica, blk_1073750243_9419, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750243 for deletion
2025-07-15 14:43:23,113 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750243_9419 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750243
2025-07-15 14:46:33,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750246_9422 src: /192.168.158.6:50026 dest: /192.168.158.4:9866
2025-07-15 14:46:33,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50026, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-650953376_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750246_9422, duration(ns): 15868964
2025-07-15 14:46:33,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750246_9422, type=LAST_IN_PIPELINE terminating
2025-07-15 14:46:35,118 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750246_9422 replica FinalizedReplica, blk_1073750246_9422, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750246 for deletion
2025-07-15 14:46:35,119 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750246_9422 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750246
2025-07-15 14:48:33,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750248_9424 src: /192.168.158.1:39702 dest: /192.168.158.4:9866
2025-07-15 14:48:33,096 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39702, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-179125033_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750248_9424, duration(ns): 23572322
2025-07-15 14:48:33,096 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750248_9424, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-15 14:48:35,122 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750248_9424 replica FinalizedReplica, blk_1073750248_9424, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750248 for deletion
2025-07-15 14:48:35,123 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750248_9424 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750248
2025-07-15 14:50:38,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750250_9426 src: /192.168.158.5:52622 dest: /192.168.158.4:9866
2025-07-15 14:50:38,111 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52622, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-124236067_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750250_9426, duration(ns): 17599247
2025-07-15 14:50:38,112 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750250_9426, type=LAST_IN_PIPELINE terminating
2025-07-15 14:50:41,129 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750250_9426 replica FinalizedReplica, blk_1073750250_9426, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750250 for deletion
2025-07-15 14:50:41,130 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750250_9426 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750250
2025-07-15 14:51:38,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750251_9427 src: /192.168.158.1:42178 dest: /192.168.158.4:9866
2025-07-15 14:51:38,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42178, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1569902911_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750251_9427, duration(ns): 21042475
2025-07-15 14:51:38,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750251_9427, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-15 14:51:44,134 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750251_9427 replica FinalizedReplica, blk_1073750251_9427, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750251 for deletion
2025-07-15 14:51:44,135 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750251_9427 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750251
2025-07-15 14:52:38,058 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750252_9428 src: /192.168.158.6:47824 dest: /192.168.158.4:9866
2025-07-15 14:52:38,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47824, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-714265382_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750252_9428, duration(ns): 21337217
2025-07-15 14:52:38,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750252_9428, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 14:52:41,137 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750252_9428 replica FinalizedReplica, blk_1073750252_9428, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750252 for deletion
2025-07-15 14:52:41,138 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750252_9428 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750252
2025-07-15 14:54:43,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750254_9430 src: /192.168.158.9:43664 dest: /192.168.158.4:9866
2025-07-15 14:54:43,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43664, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1696262936_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750254_9430, duration(ns): 15772007
2025-07-15 14:54:43,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750254_9430, type=LAST_IN_PIPELINE terminating
2025-07-15 14:54:47,141 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750254_9430 replica FinalizedReplica, blk_1073750254_9430, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750254 for deletion
2025-07-15 14:54:47,142 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750254_9430 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750254
2025-07-15 14:56:48,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750256_9432 src: /192.168.158.6:53798 dest: /192.168.158.4:9866
2025-07-15 14:56:48,111 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53798, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-607008767_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750256_9432, duration(ns): 20760555
2025-07-15 14:56:48,111 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750256_9432, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 14:56:50,148 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750256_9432 replica FinalizedReplica, blk_1073750256_9432, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750256 for deletion
2025-07-15 14:56:50,149 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750256_9432 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750256
2025-07-15 14:57:48,205 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750257_9433 src: /192.168.158.8:60246 dest: /192.168.158.4:9866
2025-07-15 14:57:48,230 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60246, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2118491653_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750257_9433, duration(ns): 19602078
2025-07-15 14:57:48,230 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750257_9433, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 14:57:50,152 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750257_9433 replica FinalizedReplica, blk_1073750257_9433, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750257 for deletion
2025-07-15 14:57:50,153 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750257_9433 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750257
2025-07-15 14:59:48,111 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750259_9435 src: /192.168.158.9:39388 dest: /192.168.158.4:9866
2025-07-15 14:59:48,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39388, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-145509899_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750259_9435, duration(ns): 21254462
2025-07-15 14:59:48,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750259_9435, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 14:59:50,154 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750259_9435 replica FinalizedReplica, blk_1073750259_9435, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750259 for deletion
2025-07-15 14:59:50,155 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750259_9435 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750259
2025-07-15 15:02:48,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750262_9438 src: /192.168.158.8:58514 dest: /192.168.158.4:9866
2025-07-15 15:02:48,101 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58514, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1243279718_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750262_9438, duration(ns): 15087997
2025-07-15 15:02:48,102 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750262_9438, type=LAST_IN_PIPELINE terminating
2025-07-15 15:02:53,157 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750262_9438 replica FinalizedReplica, blk_1073750262_9438, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750262 for deletion
2025-07-15 15:02:53,159 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750262_9438 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750262
2025-07-15 15:03:48,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750263_9439 src: /192.168.158.6:41254 dest: /192.168.158.4:9866
2025-07-15 15:03:48,113 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41254, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-559342793_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750263_9439, duration(ns): 19395598
2025-07-15 15:03:48,113 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750263_9439, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 15:03:50,159 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750263_9439 replica FinalizedReplica, blk_1073750263_9439, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750263 for deletion
2025-07-15 15:03:50,160 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750263_9439 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750263
2025-07-15 15:06:48,082 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750266_9442 src: /192.168.158.8:32926 dest: /192.168.158.4:9866
2025-07-15 15:06:48,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:32926, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-599973815_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750266_9442, duration(ns): 21184671
2025-07-15 15:06:48,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750266_9442, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 15:06:53,168 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750266_9442 replica FinalizedReplica, blk_1073750266_9442, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750266 for deletion
2025-07-15 15:06:53,169 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750266_9442 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750266
2025-07-15 15:07:48,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750267_9443 src: /192.168.158.8:58708 dest: /192.168.158.4:9866
2025-07-15 15:07:48,102 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58708, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-956669868_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750267_9443, duration(ns): 15155970
2025-07-15 15:07:48,102 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750267_9443, type=LAST_IN_PIPELINE terminating
2025-07-15 15:07:53,171 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750267_9443 replica FinalizedReplica, blk_1073750267_9443, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750267 for deletion
2025-07-15 15:07:53,172 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750267_9443 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750267
2025-07-15 15:09:48,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750269_9445 src: /192.168.158.1:38872 dest: /192.168.158.4:9866
2025-07-15 15:09:48,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38872, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2129221055_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750269_9445, duration(ns): 26125519
2025-07-15 15:09:48,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750269_9445, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-15 15:09:50,174 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750269_9445 replica FinalizedReplica, blk_1073750269_9445, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750269 for deletion
2025-07-15 15:09:50,176 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750269_9445 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073750269
2025-07-15 15:12:53,104 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750272_9448 src: /192.168.158.6:59170 dest: /192.168.158.4:9866
2025-07-15 15:12:53,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59170, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-118772255_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750272_9448, duration(ns): 19796640
2025-07-15 15:12:53,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750272_9448, type=LAST_IN_PIPELINE terminating
2025-07-15 15:12:56,182 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750272_9448 replica FinalizedReplica, blk_1073750272_9448, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750272 for deletion
2025-07-15 15:12:56,183 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750272_9448 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750272
2025-07-15 15:13:58,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750273_9449 src: /192.168.158.7:59542 dest: /192.168.158.4:9866
2025-07-15 15:13:58,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59542, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1638050071_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750273_9449, duration(ns): 17839983
2025-07-15 15:13:58,104 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750273_9449, type=HAS_DOWNSTREAM_IN_PIPELINE,
downstreams=1:[192.168.158.5:9866] terminating 2025-07-15 15:14:05,184 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750273_9449 replica FinalizedReplica, blk_1073750273_9449, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750273 for deletion 2025-07-15 15:14:05,185 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750273_9449 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750273 2025-07-15 15:15:03,098 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750274_9450 src: /192.168.158.5:56046 dest: /192.168.158.4:9866 2025-07-15 15:15:03,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56046, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-272777082_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750274_9450, duration(ns): 15506873 2025-07-15 15:15:03,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750274_9450, type=LAST_IN_PIPELINE terminating 2025-07-15 15:15:05,185 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750274_9450 replica FinalizedReplica, blk_1073750274_9450, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750274 for deletion 
2025-07-15 15:15:05,187 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750274_9450 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750274 2025-07-15 15:16:03,100 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750275_9451 src: /192.168.158.8:41722 dest: /192.168.158.4:9866 2025-07-15 15:16:03,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41722, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-480856894_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750275_9451, duration(ns): 19544915 2025-07-15 15:16:03,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750275_9451, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-15 15:16:08,190 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750275_9451 replica FinalizedReplica, blk_1073750275_9451, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750275 for deletion 2025-07-15 15:16:08,191 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750275_9451 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750275 2025-07-15 15:17:03,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073750276_9452 src: /192.168.158.5:37708 dest: /192.168.158.4:9866 2025-07-15 15:17:03,121 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37708, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1648871559_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750276_9452, duration(ns): 18120372 2025-07-15 15:17:03,121 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750276_9452, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-15 15:17:08,190 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750276_9452 replica FinalizedReplica, blk_1073750276_9452, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750276 for deletion 2025-07-15 15:17:08,191 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750276_9452 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750276 2025-07-15 15:18:03,100 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750277_9453 src: /192.168.158.1:52096 dest: /192.168.158.4:9866 2025-07-15 15:18:03,132 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52096, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-531853752_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073750277_9453, duration(ns): 23733801 2025-07-15 15:18:03,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750277_9453, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-15 15:18:05,191 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750277_9453 replica FinalizedReplica, blk_1073750277_9453, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750277 for deletion 2025-07-15 15:18:05,192 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750277_9453 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750277 2025-07-15 15:21:13,140 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750280_9456 src: /192.168.158.5:50946 dest: /192.168.158.4:9866 2025-07-15 15:21:13,158 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50946, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_13304170_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750280_9456, duration(ns): 15966316 2025-07-15 15:21:13,159 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750280_9456, type=LAST_IN_PIPELINE terminating 2025-07-15 15:21:17,196 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750280_9456 replica 
FinalizedReplica, blk_1073750280_9456, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750280 for deletion 2025-07-15 15:21:17,197 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750280_9456 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750280 2025-07-15 15:23:18,141 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750282_9458 src: /192.168.158.6:37036 dest: /192.168.158.4:9866 2025-07-15 15:23:18,158 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37036, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1017071646_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750282_9458, duration(ns): 15921839 2025-07-15 15:23:18,159 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750282_9458, type=LAST_IN_PIPELINE terminating 2025-07-15 15:23:23,202 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750282_9458 replica FinalizedReplica, blk_1073750282_9458, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750282 for deletion 2025-07-15 15:23:23,203 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750282_9458 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750282 2025-07-15 15:25:18,141 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750284_9460 src: /192.168.158.7:37206 dest: /192.168.158.4:9866 2025-07-15 15:25:18,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37206, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1032047894_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750284_9460, duration(ns): 16857583 2025-07-15 15:25:18,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750284_9460, type=LAST_IN_PIPELINE terminating 2025-07-15 15:25:20,207 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750284_9460 replica FinalizedReplica, blk_1073750284_9460, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750284 for deletion 2025-07-15 15:25:20,208 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750284_9460 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750284 2025-07-15 15:27:23,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750286_9462 src: /192.168.158.9:33788 dest: /192.168.158.4:9866 2025-07-15 15:27:23,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33788, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, 
cliID: DFSClient_NONMAPREDUCE_226703436_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750286_9462, duration(ns): 16854543 2025-07-15 15:27:23,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750286_9462, type=LAST_IN_PIPELINE terminating 2025-07-15 15:27:26,211 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750286_9462 replica FinalizedReplica, blk_1073750286_9462, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750286 for deletion 2025-07-15 15:27:26,212 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750286_9462 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750286 2025-07-15 15:28:23,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750287_9463 src: /192.168.158.1:47272 dest: /192.168.158.4:9866 2025-07-15 15:28:23,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47272, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1215535744_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750287_9463, duration(ns): 21623606 2025-07-15 15:28:23,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750287_9463, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-15 15:28:29,215 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750287_9463 replica FinalizedReplica, blk_1073750287_9463, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750287 for deletion 2025-07-15 15:28:29,216 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750287_9463 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750287 2025-07-15 15:29:23,143 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750288_9464 src: /192.168.158.6:53262 dest: /192.168.158.4:9866 2025-07-15 15:29:23,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53262, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2080977295_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750288_9464, duration(ns): 18072433 2025-07-15 15:29:23,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750288_9464, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-15 15:29:29,219 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750288_9464 replica FinalizedReplica, blk_1073750288_9464, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750288 for deletion 2025-07-15 15:29:29,220 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750288_9464 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750288 2025-07-15 15:31:28,140 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750290_9466 src: /192.168.158.1:38426 dest: /192.168.158.4:9866 2025-07-15 15:31:28,172 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38426, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-430951110_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750290_9466, duration(ns): 24249669 2025-07-15 15:31:28,173 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750290_9466, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-15 15:31:32,223 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750290_9466 replica FinalizedReplica, blk_1073750290_9466, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750290 for deletion 2025-07-15 15:31:32,224 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750290_9466 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750290 2025-07-15 15:33:14,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750292_9468 
src: /192.168.158.1:48250 dest: /192.168.158.4:9866 2025-07-15 15:33:14,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48250, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1602694035_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750292_9468, duration(ns): 28230454 2025-07-15 15:33:14,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750292_9468, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-15 15:33:20,224 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750292_9468 replica FinalizedReplica, blk_1073750292_9468, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750292 for deletion 2025-07-15 15:33:20,225 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750292_9468 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750292 2025-07-15 15:35:24,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750294_9470 src: /192.168.158.6:39080 dest: /192.168.158.4:9866 2025-07-15 15:35:24,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39080, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1897595931_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750294_9470, duration(ns): 16622492 
2025-07-15 15:35:24,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750294_9470, type=LAST_IN_PIPELINE terminating 2025-07-15 15:35:26,226 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750294_9470 replica FinalizedReplica, blk_1073750294_9470, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750294 for deletion 2025-07-15 15:35:26,227 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750294_9470 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750294 2025-07-15 15:37:24,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750296_9472 src: /192.168.158.1:54836 dest: /192.168.158.4:9866 2025-07-15 15:37:24,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54836, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1982285683_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750296_9472, duration(ns): 22922198 2025-07-15 15:37:24,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750296_9472, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-15 15:37:29,231 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750296_9472 replica FinalizedReplica, blk_1073750296_9472, FINALIZED getNumBytes() = 56 getBytesOnDisk() 
= 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750296 for deletion 2025-07-15 15:37:29,232 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750296_9472 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750296 2025-07-15 15:40:24,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750299_9475 src: /192.168.158.1:43760 dest: /192.168.158.4:9866 2025-07-15 15:40:24,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43760, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1785401747_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750299_9475, duration(ns): 21574982 2025-07-15 15:40:24,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750299_9475, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-15 15:40:26,240 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750299_9475 replica FinalizedReplica, blk_1073750299_9475, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750299 for deletion 2025-07-15 15:40:26,241 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750299_9475 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750299 2025-07-15 15:42:29,337 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750301_9477 src: /192.168.158.7:38214 dest: /192.168.158.4:9866 2025-07-15 15:42:29,357 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38214, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_986535738_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750301_9477, duration(ns): 17857842 2025-07-15 15:42:29,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750301_9477, type=LAST_IN_PIPELINE terminating 2025-07-15 15:42:32,242 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750301_9477 replica FinalizedReplica, blk_1073750301_9477, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750301 for deletion 2025-07-15 15:42:32,243 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750301_9477 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750301 2025-07-15 15:43:29,317 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750302_9478 src: /192.168.158.1:46326 dest: /192.168.158.4:9866 2025-07-15 15:43:29,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46326, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, 
cliID: DFSClient_NONMAPREDUCE_-101142113_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750302_9478, duration(ns): 18964079
2025-07-15 15:43:29,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750302_9478, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-15 15:43:32,243 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750302_9478 replica FinalizedReplica, blk_1073750302_9478, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750302 for deletion
2025-07-15 15:43:32,246 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750302_9478 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750302
2025-07-15 15:45:29,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750304_9480 src: /192.168.158.1:40592 dest: /192.168.158.4:9866
2025-07-15 15:45:29,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40592, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_357228443_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750304_9480, duration(ns): 26293927
2025-07-15 15:45:29,389 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750304_9480, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-15 15:45:32,246 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750304_9480 replica FinalizedReplica, blk_1073750304_9480, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750304 for deletion
2025-07-15 15:45:32,247 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750304_9480 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750304
2025-07-15 15:46:34,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750305_9481 src: /192.168.158.8:55608 dest: /192.168.158.4:9866
2025-07-15 15:46:34,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55608, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1120305813_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750305_9481, duration(ns): 14831058
2025-07-15 15:46:34,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750305_9481, type=LAST_IN_PIPELINE terminating
2025-07-15 15:46:41,246 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750305_9481 replica FinalizedReplica, blk_1073750305_9481, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750305 for deletion
2025-07-15 15:46:41,248 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750305_9481 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750305
2025-07-15 15:48:34,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750307_9483 src: /192.168.158.8:54252 dest: /192.168.158.4:9866
2025-07-15 15:48:34,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54252, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-319031200_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750307_9483, duration(ns): 16025680
2025-07-15 15:48:34,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750307_9483, type=LAST_IN_PIPELINE terminating
2025-07-15 15:48:38,252 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750307_9483 replica FinalizedReplica, blk_1073750307_9483, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750307 for deletion
2025-07-15 15:48:38,253 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750307_9483 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750307
2025-07-15 15:49:34,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750308_9484 src: /192.168.158.8:33030 dest: /192.168.158.4:9866
2025-07-15 15:49:34,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33030, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_743478548_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750308_9484, duration(ns): 15224435
2025-07-15 15:49:34,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750308_9484, type=LAST_IN_PIPELINE terminating
2025-07-15 15:49:41,254 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750308_9484 replica FinalizedReplica, blk_1073750308_9484, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750308 for deletion
2025-07-15 15:49:41,255 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750308_9484 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750308
2025-07-15 15:50:39,371 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750309_9485 src: /192.168.158.7:49970 dest: /192.168.158.4:9866
2025-07-15 15:50:39,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49970, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1217579061_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750309_9485, duration(ns): 21857720
2025-07-15 15:50:39,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750309_9485, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 15:50:44,257 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750309_9485 replica FinalizedReplica, blk_1073750309_9485, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750309 for deletion
2025-07-15 15:50:44,258 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750309_9485 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750309
2025-07-15 15:51:44,330 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750310_9486 src: /192.168.158.1:55716 dest: /192.168.158.4:9866
2025-07-15 15:51:44,361 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55716, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_159362155_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750310_9486, duration(ns): 21952249
2025-07-15 15:51:44,361 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750310_9486, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-15 15:51:47,259 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750310_9486 replica FinalizedReplica, blk_1073750310_9486, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750310 for deletion
2025-07-15 15:51:47,260 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750310_9486 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750310
2025-07-15 15:53:44,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750312_9488 src: /192.168.158.6:50090 dest: /192.168.158.4:9866
2025-07-15 15:53:44,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50090, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1971386374_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750312_9488, duration(ns): 12889796
2025-07-15 15:53:44,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750312_9488, type=LAST_IN_PIPELINE terminating
2025-07-15 15:53:47,268 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750312_9488 replica FinalizedReplica, blk_1073750312_9488, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750312 for deletion
2025-07-15 15:53:47,269 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750312_9488 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750312
2025-07-15 15:59:54,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750318_9494 src: /192.168.158.8:57466 dest: /192.168.158.4:9866
2025-07-15 15:59:54,380 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57466, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1502410257_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750318_9494, duration(ns): 14568261
2025-07-15 15:59:54,380 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750318_9494, type=LAST_IN_PIPELINE terminating
2025-07-15 15:59:56,285 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750318_9494 replica FinalizedReplica, blk_1073750318_9494, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750318 for deletion
2025-07-15 15:59:56,286 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750318_9494 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750318
2025-07-15 16:01:54,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750320_9496 src: /192.168.158.1:34854 dest: /192.168.158.4:9866
2025-07-15 16:01:54,376 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34854, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-418244197_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750320_9496, duration(ns): 21182529
2025-07-15 16:01:54,376 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750320_9496, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-15 16:01:56,290 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750320_9496 replica FinalizedReplica, blk_1073750320_9496, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750320 for deletion
2025-07-15 16:01:56,291 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750320_9496 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750320
2025-07-15 16:02:54,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750321_9497 src: /192.168.158.7:42492 dest: /192.168.158.4:9866
2025-07-15 16:02:54,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42492, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1238684649_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750321_9497, duration(ns): 22734157
2025-07-15 16:02:54,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750321_9497, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 16:02:56,294 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750321_9497 replica FinalizedReplica, blk_1073750321_9497, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750321 for deletion
2025-07-15 16:02:56,295 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750321_9497 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750321
2025-07-15 16:03:54,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750322_9498 src: /192.168.158.1:36670 dest: /192.168.158.4:9866
2025-07-15 16:03:54,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36670, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_982100314_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750322_9498, duration(ns): 22562030
2025-07-15 16:03:54,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750322_9498, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-15 16:03:56,298 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750322_9498 replica FinalizedReplica, blk_1073750322_9498, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750322 for deletion
2025-07-15 16:03:56,299 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750322_9498 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750322
2025-07-15 16:05:54,374 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750324_9500 src: /192.168.158.9:60252 dest: /192.168.158.4:9866
2025-07-15 16:05:54,392 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60252, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1442814068_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750324_9500, duration(ns): 16113534
2025-07-15 16:05:54,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750324_9500, type=LAST_IN_PIPELINE terminating
2025-07-15 16:05:59,298 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750324_9500 replica FinalizedReplica, blk_1073750324_9500, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750324 for deletion
2025-07-15 16:05:59,299 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750324_9500 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750324
2025-07-15 16:09:04,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750327_9503 src: /192.168.158.1:56330 dest: /192.168.158.4:9866
2025-07-15 16:09:04,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56330, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1518462613_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750327_9503, duration(ns): 26650870
2025-07-15 16:09:04,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750327_9503, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-15 16:09:08,309 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750327_9503 replica FinalizedReplica, blk_1073750327_9503, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750327 for deletion
2025-07-15 16:09:08,310 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750327_9503 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750327
2025-07-15 16:13:14,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750331_9507 src: /192.168.158.6:57990 dest: /192.168.158.4:9866
2025-07-15 16:13:14,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57990, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1547667657_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750331_9507, duration(ns): 15627644
2025-07-15 16:13:14,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750331_9507, type=LAST_IN_PIPELINE terminating
2025-07-15 16:13:17,318 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750331_9507 replica FinalizedReplica, blk_1073750331_9507, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750331 for deletion
2025-07-15 16:13:17,319 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750331_9507 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750331
2025-07-15 16:14:19,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750332_9508 src: /192.168.158.6:47730 dest: /192.168.158.4:9866
2025-07-15 16:14:19,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47730, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_176004233_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750332_9508, duration(ns): 15293189
2025-07-15 16:14:19,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750332_9508, type=LAST_IN_PIPELINE terminating
2025-07-15 16:14:23,319 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750332_9508 replica FinalizedReplica, blk_1073750332_9508, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750332 for deletion
2025-07-15 16:14:23,320 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750332_9508 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750332
2025-07-15 16:16:19,387 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750334_9510 src: /192.168.158.9:60744 dest: /192.168.158.4:9866
2025-07-15 16:16:19,412 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60744, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1601594631_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750334_9510, duration(ns): 19429295
2025-07-15 16:16:19,412 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750334_9510, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 16:16:26,325 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750334_9510 replica FinalizedReplica, blk_1073750334_9510, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750334 for deletion
2025-07-15 16:16:26,327 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750334_9510 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750334
2025-07-15 16:18:19,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750336_9512 src: /192.168.158.1:39694 dest: /192.168.158.4:9866
2025-07-15 16:18:19,437 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39694, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-358755240_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750336_9512, duration(ns): 22673602
2025-07-15 16:18:19,437 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750336_9512, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-15 16:18:23,333 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750336_9512 replica FinalizedReplica, blk_1073750336_9512, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750336 for deletion
2025-07-15 16:18:23,334 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750336_9512 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750336
2025-07-15 16:21:24,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750339_9515 src: /192.168.158.1:50110 dest: /192.168.158.4:9866
2025-07-15 16:21:24,435 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50110, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1533850462_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750339_9515, duration(ns): 21172561
2025-07-15 16:21:24,436 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750339_9515, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-15 16:21:29,340 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750339_9515 replica FinalizedReplica, blk_1073750339_9515, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750339 for deletion
2025-07-15 16:21:29,341 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750339_9515 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750339
2025-07-15 16:23:24,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750341_9517 src: /192.168.158.5:35520 dest: /192.168.158.4:9866
2025-07-15 16:23:24,428 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35520, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-552050019_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750341_9517, duration(ns): 16300288
2025-07-15 16:23:24,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750341_9517, type=LAST_IN_PIPELINE terminating
2025-07-15 16:23:29,343 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750341_9517 replica FinalizedReplica, blk_1073750341_9517, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750341 for deletion
2025-07-15 16:23:29,344 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750341_9517 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750341
2025-07-15 16:25:34,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750343_9519 src: /192.168.158.1:46326 dest: /192.168.158.4:9866
2025-07-15 16:25:34,440 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46326, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1879679809_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750343_9519, duration(ns): 21550169
2025-07-15 16:25:34,440 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750343_9519, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-15 16:25:38,347 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750343_9519 replica FinalizedReplica, blk_1073750343_9519, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750343 for deletion
2025-07-15 16:25:38,348 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750343_9519 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750343
2025-07-15 16:27:34,424 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750345_9521 src: /192.168.158.1:35336 dest: /192.168.158.4:9866
2025-07-15 16:27:34,460 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1645004954_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750345_9521, duration(ns): 25237383
2025-07-15 16:27:34,460 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750345_9521, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-15 16:27:41,353 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750345_9521 replica FinalizedReplica, blk_1073750345_9521, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750345 for deletion
2025-07-15 16:27:41,354 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750345_9521 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750345
2025-07-15 16:28:39,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750346_9522 src: /192.168.158.1:60070 dest: /192.168.158.4:9866
2025-07-15 16:28:39,427 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60070, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_6746735_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750346_9522, duration(ns): 23379960
2025-07-15 16:28:39,428 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750346_9522, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-15 16:28:44,355 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750346_9522 replica FinalizedReplica, blk_1073750346_9522, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750346 for deletion
2025-07-15 16:28:44,356 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750346_9522 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750346
2025-07-15 16:31:54,407 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750349_9525 src: /192.168.158.1:42628 dest: /192.168.158.4:9866
2025-07-15 16:31:54,437 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42628, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1151519570_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750349_9525, duration(ns): 21686473
2025-07-15 16:31:54,437 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750349_9525, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-15 16:31:56,361 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750349_9525 replica FinalizedReplica, blk_1073750349_9525, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750349 for deletion
2025-07-15 16:31:56,362 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750349_9525 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750349
2025-07-15 16:33:54,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750351_9527 src: /192.168.158.8:59996 dest: /192.168.158.4:9866
2025-07-15 16:33:54,436 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59996, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-520566773_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750351_9527, duration(ns): 15658239
2025-07-15 16:33:54,437 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750351_9527, type=LAST_IN_PIPELINE terminating
2025-07-15 16:33:56,365 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750351_9527 replica FinalizedReplica, blk_1073750351_9527, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750351 for deletion
2025-07-15 16:33:56,366 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750351_9527 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750351
2025-07-15 16:35:54,419 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750353_9529 src: /192.168.158.9:39944 dest: /192.168.158.4:9866
2025-07-15 16:35:54,436 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39944, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_6201907_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750353_9529, duration(ns): 14842674
2025-07-15 16:35:54,436 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750353_9529, type=LAST_IN_PIPELINE terminating
2025-07-15 16:35:56,368 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750353_9529 replica FinalizedReplica, blk_1073750353_9529, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750353 for deletion
2025-07-15 16:35:56,369 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750353_9529 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750353
2025-07-15 16:36:59,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750354_9530 src: /192.168.158.7:43990 dest: /192.168.158.4:9866
2025-07-15 16:36:59,437 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43990, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1869693502_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750354_9530, duration(ns): 15568841
2025-07-15 16:36:59,437 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750354_9530, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-15 16:37:02,371 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750354_9530 replica FinalizedReplica, blk_1073750354_9530, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750354 for deletion
2025-07-15 16:37:02,372 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750354_9530 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750354
2025-07-15 16:41:04,427 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750358_9534 src: /192.168.158.5:38952 dest: /192.168.158.4:9866
2025-07-15 16:41:04,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38952, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1453386988_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750358_9534, duration(ns): 14321498 2025-07-15 16:41:04,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750358_9534, type=LAST_IN_PIPELINE terminating 2025-07-15 16:41:11,374 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750358_9534 replica FinalizedReplica, blk_1073750358_9534, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750358 for deletion 2025-07-15 16:41:11,375 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750358_9534 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750358 2025-07-15 16:45:04,433 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750362_9538 src: /192.168.158.5:33824 dest: /192.168.158.4:9866 2025-07-15 16:45:04,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33824, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1456225742_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750362_9538, duration(ns): 14082746 2025-07-15 16:45:04,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750362_9538, type=LAST_IN_PIPELINE terminating 2025-07-15 16:45:08,379 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750362_9538 replica FinalizedReplica, blk_1073750362_9538, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750362 for deletion 2025-07-15 16:45:08,380 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750362_9538 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750362 2025-07-15 16:48:09,466 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750365_9541 src: /192.168.158.7:38894 dest: /192.168.158.4:9866 2025-07-15 16:48:09,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38894, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_482994372_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750365_9541, duration(ns): 15706718 2025-07-15 16:48:09,487 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750365_9541, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-15 16:48:11,388 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750365_9541 replica FinalizedReplica, blk_1073750365_9541, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750365 for deletion 2025-07-15 16:48:11,390 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750365_9541 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750365 2025-07-15 16:50:09,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750367_9543 src: /192.168.158.1:40406 dest: /192.168.158.4:9866 2025-07-15 16:50:09,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40406, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_136294589_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750367_9543, duration(ns): 25593340 2025-07-15 16:50:09,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750367_9543, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-15 16:50:11,394 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750367_9543 replica FinalizedReplica, blk_1073750367_9543, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750367 for deletion 2025-07-15 16:50:11,395 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750367_9543 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750367 2025-07-15 16:51:14,454 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750368_9544 
src: /192.168.158.8:56688 dest: /192.168.158.4:9866 2025-07-15 16:51:14,478 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56688, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1569859918_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750368_9544, duration(ns): 18991779 2025-07-15 16:51:14,478 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750368_9544, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-15 16:51:17,397 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750368_9544 replica FinalizedReplica, blk_1073750368_9544, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750368 for deletion 2025-07-15 16:51:17,398 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750368_9544 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750368 2025-07-15 16:53:19,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750370_9546 src: /192.168.158.6:57898 dest: /192.168.158.4:9866 2025-07-15 16:53:19,472 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1822868185_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750370_9546, duration(ns): 19197147 2025-07-15 16:53:19,473 
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750370_9546, type=LAST_IN_PIPELINE terminating 2025-07-15 16:53:23,403 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750370_9546 replica FinalizedReplica, blk_1073750370_9546, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750370 for deletion 2025-07-15 16:53:23,405 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750370_9546 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750370 2025-07-15 16:55:19,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750372_9548 src: /192.168.158.1:35096 dest: /192.168.158.4:9866 2025-07-15 16:55:19,471 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35096, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1458645857_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750372_9548, duration(ns): 24059131 2025-07-15 16:55:19,471 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750372_9548, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-15 16:55:23,409 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750372_9548 replica FinalizedReplica, blk_1073750372_9548, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 
56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750372 for deletion 2025-07-15 16:55:23,410 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750372_9548 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750372 2025-07-15 16:56:24,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750373_9549 src: /192.168.158.8:46464 dest: /192.168.158.4:9866 2025-07-15 16:56:24,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46464, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1913597255_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750373_9549, duration(ns): 13103528 2025-07-15 16:56:24,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750373_9549, type=LAST_IN_PIPELINE terminating 2025-07-15 16:56:26,412 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750373_9549 replica FinalizedReplica, blk_1073750373_9549, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750373 for deletion 2025-07-15 16:56:26,413 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750373_9549 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750373 
2025-07-15 17:00:24,466 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750377_9553 src: /192.168.158.8:47420 dest: /192.168.158.4:9866 2025-07-15 17:00:24,489 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47420, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_611909340_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750377_9553, duration(ns): 17684825 2025-07-15 17:00:24,489 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750377_9553, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-15 17:00:26,420 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750377_9553 replica FinalizedReplica, blk_1073750377_9553, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750377 for deletion 2025-07-15 17:00:26,421 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750377_9553 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750377 2025-07-15 17:02:29,446 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750379_9555 src: /192.168.158.1:57678 dest: /192.168.158.4:9866 2025-07-15 17:02:29,478 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57678, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1452515225_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750379_9555, duration(ns): 23677147 2025-07-15 17:02:29,479 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750379_9555, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-15 17:02:32,422 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750379_9555 replica FinalizedReplica, blk_1073750379_9555, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750379 for deletion 2025-07-15 17:02:32,423 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750379_9555 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750379 2025-07-15 17:03:29,484 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750380_9556 src: /192.168.158.8:43650 dest: /192.168.158.4:9866 2025-07-15 17:03:29,506 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43650, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-935776818_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750380_9556, duration(ns): 16633416 2025-07-15 17:03:29,506 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750380_9556, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-15 17:03:32,424 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750380_9556 replica FinalizedReplica, blk_1073750380_9556, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750380 for deletion 2025-07-15 17:03:32,425 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750380_9556 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750380 2025-07-15 17:06:39,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750383_9559 src: /192.168.158.9:40794 dest: /192.168.158.4:9866 2025-07-15 17:06:39,489 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40794, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1939167337_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750383_9559, duration(ns): 16948362 2025-07-15 17:06:39,489 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750383_9559, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-15 17:06:41,432 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750383_9559 replica FinalizedReplica, blk_1073750383_9559, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750383 for deletion 2025-07-15 17:06:41,433 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750383_9559 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750383 2025-07-15 17:07:44,461 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750384_9560 src: /192.168.158.6:53784 dest: /192.168.158.4:9866 2025-07-15 17:07:44,487 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53784, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1594790895_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750384_9560, duration(ns): 21388239 2025-07-15 17:07:44,488 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750384_9560, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-15 17:07:47,433 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750384_9560 replica FinalizedReplica, blk_1073750384_9560, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750384 for deletion 2025-07-15 17:07:47,434 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750384_9560 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750384 2025-07-15 17:08:49,455 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750385_9561 src: 
/192.168.158.1:40706 dest: /192.168.158.4:9866 2025-07-15 17:08:49,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40706, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2048580102_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750385_9561, duration(ns): 21592835 2025-07-15 17:08:49,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750385_9561, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-15 17:08:56,438 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750385_9561 replica FinalizedReplica, blk_1073750385_9561, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750385 for deletion 2025-07-15 17:08:56,439 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750385_9561 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750385 2025-07-15 17:09:49,468 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750386_9562 src: /192.168.158.8:38132 dest: /192.168.158.4:9866 2025-07-15 17:09:49,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38132, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1262911049_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750386_9562, duration(ns): 16819894 2025-07-15 
17:09:49,492 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750386_9562, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-15 17:09:56,441 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750386_9562 replica FinalizedReplica, blk_1073750386_9562, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750386 for deletion 2025-07-15 17:09:56,442 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750386_9562 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750386 2025-07-15 17:11:54,480 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750388_9564 src: /192.168.158.8:54336 dest: /192.168.158.4:9866 2025-07-15 17:11:54,501 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1008876583_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750388_9564, duration(ns): 18262466 2025-07-15 17:11:54,501 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750388_9564, type=LAST_IN_PIPELINE terminating 2025-07-15 17:11:56,445 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750388_9564 replica FinalizedReplica, blk_1073750388_9564, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 
getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750388 for deletion 2025-07-15 17:11:56,446 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750388_9564 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750388 2025-07-15 17:13:59,475 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750390_9566 src: /192.168.158.1:38684 dest: /192.168.158.4:9866 2025-07-15 17:13:59,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38684, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1723151556_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750390_9566, duration(ns): 21218219 2025-07-15 17:13:59,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750390_9566, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-15 17:14:02,451 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750390_9566 replica FinalizedReplica, blk_1073750390_9566, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750390 for deletion 2025-07-15 17:14:02,452 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750390_9566 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750390
2025-07-15 17:16:04,468 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750392_9568 src: /192.168.158.9:50078 dest: /192.168.158.4:9866
2025-07-15 17:16:04,492 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50078, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-808455111_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750392_9568, duration(ns): 18445461
2025-07-15 17:16:04,492 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750392_9568, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-15 17:16:08,454 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750392_9568 replica FinalizedReplica, blk_1073750392_9568, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750392 for deletion
2025-07-15 17:16:08,456 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750392_9568 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750392
2025-07-15 17:20:14,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750396_9572 src: /192.168.158.5:39610 dest: /192.168.158.4:9866
2025-07-15 17:20:14,506 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39610, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_748134402_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750396_9572, duration(ns): 18926427
2025-07-15 17:20:14,506 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750396_9572, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 17:20:20,462 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750396_9572 replica FinalizedReplica, blk_1073750396_9572, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750396 for deletion
2025-07-15 17:20:20,464 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750396_9572 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750396
2025-07-15 17:21:19,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750397_9573 src: /192.168.158.1:45250 dest: /192.168.158.4:9866
2025-07-15 17:21:19,523 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45250, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-137067289_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750397_9573, duration(ns): 19416653
2025-07-15 17:21:19,523 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750397_9573, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-15 17:21:23,465 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750397_9573 replica FinalizedReplica, blk_1073750397_9573, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750397 for deletion
2025-07-15 17:21:23,466 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750397_9573 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750397
2025-07-15 17:22:19,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750398_9574 src: /192.168.158.8:41678 dest: /192.168.158.4:9866
2025-07-15 17:22:19,527 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41678, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_665403009_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750398_9574, duration(ns): 19446143
2025-07-15 17:22:19,528 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750398_9574, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 17:22:23,468 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750398_9574 replica FinalizedReplica, blk_1073750398_9574, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750398 for deletion
2025-07-15 17:22:23,469 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750398_9574 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750398
2025-07-15 17:25:24,488 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750401_9577 src: /192.168.158.8:34658 dest: /192.168.158.4:9866
2025-07-15 17:25:24,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34658, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1681583289_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750401_9577, duration(ns): 14087130
2025-07-15 17:25:24,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750401_9577, type=LAST_IN_PIPELINE terminating
2025-07-15 17:25:26,474 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750401_9577 replica FinalizedReplica, blk_1073750401_9577, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750401 for deletion
2025-07-15 17:25:26,475 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750401_9577 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750401
2025-07-15 17:26:29,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750402_9578 src: /192.168.158.1:40792 dest: /192.168.158.4:9866
2025-07-15 17:26:29,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40792, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-5143658_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750402_9578, duration(ns): 19741357
2025-07-15 17:26:29,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750402_9578, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-15 17:26:32,476 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750402_9578 replica FinalizedReplica, blk_1073750402_9578, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750402 for deletion
2025-07-15 17:26:32,477 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750402_9578 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750402
2025-07-15 17:29:44,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750405_9581 src: /192.168.158.1:56092 dest: /192.168.158.4:9866
2025-07-15 17:29:44,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56092, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2046059581_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750405_9581, duration(ns): 23182432
2025-07-15 17:29:44,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750405_9581, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-15 17:29:47,483 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750405_9581 replica FinalizedReplica, blk_1073750405_9581, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750405 for deletion
2025-07-15 17:29:47,484 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750405_9581 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750405
2025-07-15 17:30:49,522 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750406_9582 src: /192.168.158.1:36198 dest: /192.168.158.4:9866
2025-07-15 17:30:49,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36198, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-599232329_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750406_9582, duration(ns): 24046622
2025-07-15 17:30:49,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750406_9582, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-15 17:30:53,485 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750406_9582 replica FinalizedReplica, blk_1073750406_9582, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750406 for deletion
2025-07-15 17:30:53,487 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750406_9582 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750406
2025-07-15 17:33:49,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750409_9585 src: /192.168.158.1:44770 dest: /192.168.158.4:9866
2025-07-15 17:33:49,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44770, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-67220905_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750409_9585, duration(ns): 24331134
2025-07-15 17:33:49,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750409_9585, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-15 17:33:53,494 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750409_9585 replica FinalizedReplica, blk_1073750409_9585, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750409 for deletion
2025-07-15 17:33:53,495 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750409_9585 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750409
2025-07-15 17:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-15 17:37:20,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f3d, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-15 17:37:20,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-15 17:37:54,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750413_9589 src: /192.168.158.1:47090 dest: /192.168.158.4:9866
2025-07-15 17:37:54,548 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47090, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1801999443_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750413_9589, duration(ns): 20638533
2025-07-15 17:37:54,548 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750413_9589, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-15 17:37:56,503 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750413_9589 replica FinalizedReplica, blk_1073750413_9589, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750413 for deletion
2025-07-15 17:37:56,504 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750413_9589 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750413
2025-07-15 17:41:59,522 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750417_9593 src: /192.168.158.6:50794 dest: /192.168.158.4:9866
2025-07-15 17:41:59,539 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50794, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1987568753_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750417_9593, duration(ns): 14878594
2025-07-15 17:41:59,540 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750417_9593, type=LAST_IN_PIPELINE terminating
2025-07-15 17:42:05,511 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750417_9593 replica FinalizedReplica, blk_1073750417_9593, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750417 for deletion
2025-07-15 17:42:05,512 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750417_9593 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750417
2025-07-15 17:47:04,538 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750422_9598 src: /192.168.158.1:51228 dest: /192.168.158.4:9866
2025-07-15 17:47:04,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51228, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1666120564_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750422_9598, duration(ns): 20099625
2025-07-15 17:47:04,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750422_9598, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-15 17:47:11,523 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750422_9598 replica FinalizedReplica, blk_1073750422_9598, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750422 for deletion
2025-07-15 17:47:11,524 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750422_9598 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750422
2025-07-15 17:48:04,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750423_9599 src: /192.168.158.6:37962 dest: /192.168.158.4:9866
2025-07-15 17:48:04,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37962, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1231721259_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750423_9599, duration(ns): 14229249
2025-07-15 17:48:04,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750423_9599, type=LAST_IN_PIPELINE terminating
2025-07-15 17:48:08,525 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750423_9599 replica FinalizedReplica, blk_1073750423_9599, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750423 for deletion
2025-07-15 17:48:08,526 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750423_9599 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750423
2025-07-15 17:49:04,527 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750424_9600 src: /192.168.158.6:44446 dest: /192.168.158.4:9866
2025-07-15 17:49:04,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44446, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1274356721_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750424_9600, duration(ns): 17618216
2025-07-15 17:49:04,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750424_9600, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-15 17:49:08,528 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750424_9600 replica FinalizedReplica, blk_1073750424_9600, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750424 for deletion
2025-07-15 17:49:08,529 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750424_9600 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750424
2025-07-15 17:52:04,534 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750427_9603 src: /192.168.158.7:58112 dest: /192.168.158.4:9866
2025-07-15 17:52:04,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58112, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1102884865_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750427_9603, duration(ns): 19070503
2025-07-15 17:52:04,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750427_9603, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 17:52:08,533 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750427_9603 replica FinalizedReplica, blk_1073750427_9603, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750427 for deletion
2025-07-15 17:52:08,534 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750427_9603 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750427
2025-07-15 17:57:14,539 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750432_9608 src: /192.168.158.7:35512 dest: /192.168.158.4:9866
2025-07-15 17:57:14,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35512, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1193008511_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750432_9608, duration(ns): 20406411
2025-07-15 17:57:14,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750432_9608, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-15 17:57:17,544 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750432_9608 replica FinalizedReplica, blk_1073750432_9608, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750432 for deletion
2025-07-15 17:57:17,545 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750432_9608 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750432
2025-07-15 18:00:14,560 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750435_9611 src: /192.168.158.6:58420 dest: /192.168.158.4:9866
2025-07-15 18:00:14,586 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58420, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1418782374_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750435_9611, duration(ns): 20055431
2025-07-15 18:00:14,586 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750435_9611, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 18:00:20,550 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750435_9611 replica FinalizedReplica, blk_1073750435_9611, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750435 for deletion
2025-07-15 18:00:20,551 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750435_9611 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750435
2025-07-15 18:03:19,557 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750438_9614 src: /192.168.158.1:39402 dest: /192.168.158.4:9866
2025-07-15 18:03:19,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39402, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_627561332_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750438_9614, duration(ns): 25550239
2025-07-15 18:03:19,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750438_9614, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-15 18:03:26,557 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750438_9614 replica FinalizedReplica, blk_1073750438_9614, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750438 for deletion
2025-07-15 18:03:26,558 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750438_9614 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750438
2025-07-15 18:04:24,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750439_9615 src: /192.168.158.1:60188 dest: /192.168.158.4:9866
2025-07-15 18:04:24,604 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60188, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_469844913_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750439_9615, duration(ns): 23076148
2025-07-15 18:04:24,604 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750439_9615, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-15 18:04:29,560 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750439_9615 replica FinalizedReplica, blk_1073750439_9615, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750439 for deletion
2025-07-15 18:04:29,561 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750439_9615 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750439
2025-07-15 18:08:34,558 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750443_9619 src: /192.168.158.1:36994 dest: /192.168.158.4:9866
2025-07-15 18:08:34,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36994, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_944621062_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750443_9619, duration(ns): 24398518
2025-07-15 18:08:34,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750443_9619, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-15 18:08:38,568 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750443_9619 replica FinalizedReplica, blk_1073750443_9619, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750443 for deletion
2025-07-15 18:08:38,569 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750443_9619 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750443
2025-07-15 18:09:34,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750444_9620 src: /192.168.158.8:38850 dest: /192.168.158.4:9866
2025-07-15 18:09:34,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38850, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1192786896_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750444_9620, duration(ns): 12981485
2025-07-15 18:09:34,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750444_9620, type=LAST_IN_PIPELINE terminating
2025-07-15 18:09:41,569 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750444_9620 replica FinalizedReplica, blk_1073750444_9620, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750444 for deletion
2025-07-15 18:09:41,570 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750444_9620 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750444
2025-07-15 18:11:34,576 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750446_9622 src: /192.168.158.7:53056 dest: /192.168.158.4:9866
2025-07-15 18:11:34,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53056, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_987954498_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750446_9622, duration(ns): 14088237
2025-07-15 18:11:34,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750446_9622, type=LAST_IN_PIPELINE terminating
2025-07-15 18:11:38,572 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750446_9622 replica FinalizedReplica, blk_1073750446_9622, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750446 for deletion
2025-07-15 18:11:38,573 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750446_9622 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750446
2025-07-15 18:13:34,569 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750448_9624 src: /192.168.158.8:38446 dest: /192.168.158.4:9866
2025-07-15 18:13:34,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38446, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-780299181_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750448_9624, duration(ns): 14677185
2025-07-15 18:13:34,586 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750448_9624, type=LAST_IN_PIPELINE terminating
2025-07-15 18:13:41,574 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750448_9624 replica FinalizedReplica, blk_1073750448_9624, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750448 for deletion
2025-07-15 18:13:41,575 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750448_9624 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750448
2025-07-15 18:16:39,575 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750451_9627 src: /192.168.158.7:49400 dest: /192.168.158.4:9866
2025-07-15 18:16:39,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49400, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1707651222_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750451_9627, duration(ns): 17430593
2025-07-15 18:16:39,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750451_9627, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-15 18:16:41,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750451_9627 replica FinalizedReplica, blk_1073750451_9627, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750451 for deletion
2025-07-15 18:16:41,584 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750451_9627 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750451
2025-07-15 18:21:39,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750456_9632 src: /192.168.158.5:51600 dest: /192.168.158.4:9866
2025-07-15 18:21:39,609 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51600, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_723257011_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750456_9632, duration(ns): 14522628
2025-07-15 18:21:39,609 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750456_9632, type=LAST_IN_PIPELINE terminating
2025-07-15 18:21:44,593 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750456_9632 replica FinalizedReplica, blk_1073750456_9632, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750456 for deletion
2025-07-15 18:21:44,594 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750456_9632 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750456
2025-07-15 18:23:39,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750458_9634 src: /192.168.158.8:52070 dest: /192.168.158.4:9866
2025-07-15 18:23:39,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52070, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-834595027_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750458_9634, duration(ns): 16110826
2025-07-15 18:23:39,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750458_9634, type=LAST_IN_PIPELINE terminating
2025-07-15 18:23:44,594 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750458_9634 replica FinalizedReplica, blk_1073750458_9634, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750458 for deletion
2025-07-15 18:23:44,596 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750458_9634 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750458
2025-07-15 18:25:39,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750460_9636 src: /192.168.158.8:49364 dest: /192.168.158.4:9866
2025-07-15 18:25:39,624 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49364, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE,
cliID: DFSClient_NONMAPREDUCE_435211907_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750460_9636, duration(ns): 19638269 2025-07-15 18:25:39,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750460_9636, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-15 18:25:41,601 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750460_9636 replica FinalizedReplica, blk_1073750460_9636, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750460 for deletion 2025-07-15 18:25:41,602 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750460_9636 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750460 2025-07-15 18:28:39,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750463_9639 src: /192.168.158.7:57684 dest: /192.168.158.4:9866 2025-07-15 18:28:39,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57684, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-580032828_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750463_9639, duration(ns): 17679674 2025-07-15 18:28:39,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750463_9639, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-15 
18:28:41,609 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750463_9639 replica FinalizedReplica, blk_1073750463_9639, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750463 for deletion 2025-07-15 18:28:41,610 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750463_9639 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750463 2025-07-15 18:30:39,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750465_9641 src: /192.168.158.1:38694 dest: /192.168.158.4:9866 2025-07-15 18:30:39,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38694, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1566229499_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750465_9641, duration(ns): 22488388 2025-07-15 18:30:39,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750465_9641, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-15 18:30:44,613 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750465_9641 replica FinalizedReplica, blk_1073750465_9641, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750465 for 
deletion 2025-07-15 18:30:44,615 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750465_9641 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750465 2025-07-15 18:31:39,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750466_9642 src: /192.168.158.8:42648 dest: /192.168.158.4:9866 2025-07-15 18:31:39,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42648, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-196516025_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750466_9642, duration(ns): 14794545 2025-07-15 18:31:39,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750466_9642, type=LAST_IN_PIPELINE terminating 2025-07-15 18:31:41,616 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750466_9642 replica FinalizedReplica, blk_1073750466_9642, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750466 for deletion 2025-07-15 18:31:41,617 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750466_9642 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750466 2025-07-15 18:32:39,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750467_9643 src: /192.168.158.7:42910 
dest: /192.168.158.4:9866 2025-07-15 18:32:39,623 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42910, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2043932426_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750467_9643, duration(ns): 14875676 2025-07-15 18:32:39,623 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750467_9643, type=LAST_IN_PIPELINE terminating 2025-07-15 18:32:41,619 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750467_9643 replica FinalizedReplica, blk_1073750467_9643, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750467 for deletion 2025-07-15 18:32:41,620 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750467_9643 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750467 2025-07-15 18:33:39,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750468_9644 src: /192.168.158.1:33194 dest: /192.168.158.4:9866 2025-07-15 18:33:39,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33194, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1788904675_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750468_9644, duration(ns): 22720588 2025-07-15 18:33:39,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073750468_9644, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-15 18:33:41,623 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750468_9644 replica FinalizedReplica, blk_1073750468_9644, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750468 for deletion 2025-07-15 18:33:41,624 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750468_9644 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750468 2025-07-15 18:34:44,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750469_9645 src: /192.168.158.1:36508 dest: /192.168.158.4:9866 2025-07-15 18:34:44,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36508, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-589999691_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750469_9645, duration(ns): 22987894 2025-07-15 18:34:44,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750469_9645, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-15 18:34:50,628 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750469_9645 replica FinalizedReplica, blk_1073750469_9645, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 
getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750469 for deletion 2025-07-15 18:34:50,629 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750469_9645 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750469 2025-07-15 18:37:44,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750472_9648 src: /192.168.158.9:40990 dest: /192.168.158.4:9866 2025-07-15 18:37:44,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40990, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1578373441_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750472_9648, duration(ns): 17681878 2025-07-15 18:37:44,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750472_9648, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-15 18:37:50,634 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750472_9648 replica FinalizedReplica, blk_1073750472_9648, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750472 for deletion 2025-07-15 18:37:50,635 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750472_9648 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750472 2025-07-15 18:38:44,620 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750473_9649 src: /192.168.158.7:34784 dest: /192.168.158.4:9866 2025-07-15 18:38:44,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34784, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1612254990_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750473_9649, duration(ns): 12854940 2025-07-15 18:38:44,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750473_9649, type=LAST_IN_PIPELINE terminating 2025-07-15 18:38:47,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750473_9649 replica FinalizedReplica, blk_1073750473_9649, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750473 for deletion 2025-07-15 18:38:47,637 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750473_9649 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750473 2025-07-15 18:41:49,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750476_9652 src: /192.168.158.1:55110 dest: /192.168.158.4:9866 2025-07-15 18:41:49,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55110, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1054233833_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750476_9652, duration(ns): 21612504 2025-07-15 18:41:49,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750476_9652, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-15 18:41:56,641 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750476_9652 replica FinalizedReplica, blk_1073750476_9652, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750476 for deletion 2025-07-15 18:41:56,642 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750476_9652 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750476 2025-07-15 18:43:49,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750478_9654 src: /192.168.158.9:35560 dest: /192.168.158.4:9866 2025-07-15 18:43:49,647 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35560, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1950053559_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750478_9654, duration(ns): 16029738 2025-07-15 18:43:49,647 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750478_9654, type=LAST_IN_PIPELINE terminating 2025-07-15 18:43:56,644 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750478_9654 replica FinalizedReplica, blk_1073750478_9654, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750478 for deletion 2025-07-15 18:43:56,646 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750478_9654 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750478 2025-07-15 18:44:49,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750479_9655 src: /192.168.158.1:54578 dest: /192.168.158.4:9866 2025-07-15 18:44:49,657 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54578, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2065591016_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750479_9655, duration(ns): 21889293 2025-07-15 18:44:49,658 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750479_9655, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-15 18:44:56,646 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750479_9655 replica FinalizedReplica, blk_1073750479_9655, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750479 for deletion 
2025-07-15 18:44:56,648 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750479_9655 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750479 2025-07-15 18:45:49,663 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750480_9656 src: /192.168.158.5:49162 dest: /192.168.158.4:9866 2025-07-15 18:45:49,680 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49162, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-488273517_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750480_9656, duration(ns): 14922997 2025-07-15 18:45:49,680 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750480_9656, type=LAST_IN_PIPELINE terminating 2025-07-15 18:45:53,649 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750480_9656 replica FinalizedReplica, blk_1073750480_9656, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750480 for deletion 2025-07-15 18:45:53,650 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750480_9656 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750480 2025-07-15 18:47:54,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750482_9658 src: /192.168.158.9:59022 dest: 
/192.168.158.4:9866 2025-07-15 18:47:54,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59022, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-858370080_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750482_9658, duration(ns): 24359336 2025-07-15 18:47:54,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750482_9658, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-15 18:47:59,653 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750482_9658 replica FinalizedReplica, blk_1073750482_9658, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750482 for deletion 2025-07-15 18:47:59,655 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750482_9658 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750482 2025-07-15 18:48:54,638 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750483_9659 src: /192.168.158.1:40586 dest: /192.168.158.4:9866 2025-07-15 18:48:54,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40586, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1142648161_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750483_9659, duration(ns): 21786124 2025-07-15 18:48:54,669 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750483_9659, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-15 18:48:59,656 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750483_9659 replica FinalizedReplica, blk_1073750483_9659, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750483 for deletion 2025-07-15 18:48:59,657 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750483_9659 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750483 2025-07-15 18:49:54,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750484_9660 src: /192.168.158.1:39252 dest: /192.168.158.4:9866 2025-07-15 18:49:54,663 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39252, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2054880592_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750484_9660, duration(ns): 21185324 2025-07-15 18:49:54,663 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750484_9660, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-15 18:49:56,660 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750484_9660 replica FinalizedReplica, blk_1073750484_9660, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750484 for deletion 2025-07-15 18:49:56,661 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750484_9660 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750484 2025-07-15 18:55:54,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750490_9666 src: /192.168.158.1:44354 dest: /192.168.158.4:9866 2025-07-15 18:55:54,675 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44354, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1498518537_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750490_9666, duration(ns): 22129671 2025-07-15 18:55:54,675 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750490_9666, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-15 18:55:59,672 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750490_9666 replica FinalizedReplica, blk_1073750490_9666, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750490 for deletion 2025-07-15 18:55:59,673 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750490_9666 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750490
2025-07-15 18:57:59,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750492_9668 src: /192.168.158.5:35800 dest: /192.168.158.4:9866
2025-07-15 18:57:59,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35800, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1733660752_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750492_9668, duration(ns): 13985752
2025-07-15 18:57:59,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750492_9668, type=LAST_IN_PIPELINE terminating
2025-07-15 18:58:05,671 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750492_9668 replica FinalizedReplica, blk_1073750492_9668, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750492 for deletion
2025-07-15 18:58:05,672 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750492_9668 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750492
2025-07-15 19:03:04,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750497_9673 src: /192.168.158.5:50540 dest: /192.168.158.4:9866
2025-07-15 19:03:04,686 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50540, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1365545034_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750497_9673, duration(ns): 15158201
2025-07-15 19:03:04,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750497_9673, type=LAST_IN_PIPELINE terminating
2025-07-15 19:03:08,676 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750497_9673 replica FinalizedReplica, blk_1073750497_9673, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750497 for deletion
2025-07-15 19:03:08,677 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750497_9673 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750497
2025-07-15 19:05:09,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750499_9675 src: /192.168.158.1:58878 dest: /192.168.158.4:9866
2025-07-15 19:05:09,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58878, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1127212707_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750499_9675, duration(ns): 19675887
2025-07-15 19:05:09,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750499_9675, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-15 19:05:11,682 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750499_9675 replica FinalizedReplica, blk_1073750499_9675, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750499 for deletion
2025-07-15 19:05:11,683 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750499_9675 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750499
2025-07-15 19:07:09,661 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750501_9677 src: /192.168.158.1:44348 dest: /192.168.158.4:9866
2025-07-15 19:07:09,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44348, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-614735702_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750501_9677, duration(ns): 21846879
2025-07-15 19:07:09,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750501_9677, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-15 19:07:11,686 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750501_9677 replica FinalizedReplica, blk_1073750501_9677, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750501 for deletion
2025-07-15 19:07:11,687 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750501_9677 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750501
2025-07-15 19:08:09,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750502_9678 src: /192.168.158.6:50628 dest: /192.168.158.4:9866
2025-07-15 19:08:09,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50628, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1675619313_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750502_9678, duration(ns): 15820779
2025-07-15 19:08:09,690 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750502_9678, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-15 19:08:11,687 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750502_9678 replica FinalizedReplica, blk_1073750502_9678, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750502 for deletion
2025-07-15 19:08:11,689 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750502_9678 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750502
2025-07-15 19:10:09,676 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750504_9680 src: /192.168.158.8:42472 dest: /192.168.158.4:9866
2025-07-15 19:10:09,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1658284316_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750504_9680, duration(ns): 19494975
2025-07-15 19:10:09,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750504_9680, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-15 19:10:14,691 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750504_9680 replica FinalizedReplica, blk_1073750504_9680, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750504 for deletion
2025-07-15 19:10:14,692 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750504_9680 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750504
2025-07-15 19:11:09,672 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750505_9681 src: /192.168.158.1:38850 dest: /192.168.158.4:9866
2025-07-15 19:11:09,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38850, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1939090271_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750505_9681, duration(ns): 20296236
2025-07-15 19:11:09,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750505_9681, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-15 19:11:11,693 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750505_9681 replica FinalizedReplica, blk_1073750505_9681, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750505 for deletion
2025-07-15 19:11:11,694 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750505_9681 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750505
2025-07-15 19:12:09,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750506_9682 src: /192.168.158.1:34320 dest: /192.168.158.4:9866
2025-07-15 19:12:09,698 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34320, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2107916706_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750506_9682, duration(ns): 19699311
2025-07-15 19:12:09,699 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750506_9682, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-15 19:12:14,694 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750506_9682 replica FinalizedReplica, blk_1073750506_9682, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750506 for deletion
2025-07-15 19:12:14,695 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750506_9682 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750506
2025-07-15 19:13:09,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750507_9683 src: /192.168.158.7:52994 dest: /192.168.158.4:9866
2025-07-15 19:13:09,712 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52994, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-182023772_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750507_9683, duration(ns): 17479435
2025-07-15 19:13:09,712 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750507_9683, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-15 19:13:11,698 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750507_9683 replica FinalizedReplica, blk_1073750507_9683, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750507 for deletion
2025-07-15 19:13:11,699 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750507_9683 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750507
2025-07-15 19:14:14,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750508_9684 src: /192.168.158.7:38948 dest: /192.168.158.4:9866
2025-07-15 19:14:14,708 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38948, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1234235335_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750508_9684, duration(ns): 16946309
2025-07-15 19:14:14,708 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750508_9684, type=LAST_IN_PIPELINE terminating
2025-07-15 19:14:17,699 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750508_9684 replica FinalizedReplica, blk_1073750508_9684, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750508 for deletion
2025-07-15 19:14:17,700 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750508_9684 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750508
2025-07-15 19:15:19,686 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750509_9685 src: /192.168.158.8:41672 dest: /192.168.158.4:9866
2025-07-15 19:15:19,710 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-829147147_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750509_9685, duration(ns): 19194461
2025-07-15 19:15:19,711 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750509_9685, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 19:15:23,700 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750509_9685 replica FinalizedReplica, blk_1073750509_9685, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750509 for deletion
2025-07-15 19:15:23,701 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750509_9685 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750509
2025-07-15 19:16:19,688 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750510_9686 src: /192.168.158.8:40458 dest: /192.168.158.4:9866
2025-07-15 19:16:19,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40458, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1688232030_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750510_9686, duration(ns): 16324559
2025-07-15 19:16:19,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750510_9686, type=LAST_IN_PIPELINE terminating
2025-07-15 19:16:20,703 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750510_9686 replica FinalizedReplica, blk_1073750510_9686, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750510 for deletion
2025-07-15 19:16:20,704 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750510_9686 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750510
2025-07-15 19:17:24,680 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750511_9687 src: /192.168.158.1:58540 dest: /192.168.158.4:9866
2025-07-15 19:17:24,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58540, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1187417134_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750511_9687, duration(ns): 25641005
2025-07-15 19:17:24,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750511_9687, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-15 19:17:29,706 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750511_9687 replica FinalizedReplica, blk_1073750511_9687, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750511 for deletion
2025-07-15 19:17:29,707 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750511_9687 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750511
2025-07-15 19:20:24,720 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750514_9690 src: /192.168.158.1:48002 dest: /192.168.158.4:9866
2025-07-15 19:20:24,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48002, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1585196321_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750514_9690, duration(ns): 19445994
2025-07-15 19:20:24,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750514_9690, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-15 19:20:26,713 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750514_9690 replica FinalizedReplica, blk_1073750514_9690, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750514 for deletion
2025-07-15 19:20:26,714 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750514_9690 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750514
2025-07-15 19:22:24,698 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750516_9692 src: /192.168.158.7:55424 dest: /192.168.158.4:9866
2025-07-15 19:22:24,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55424, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_487242398_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750516_9692, duration(ns): 21249730
2025-07-15 19:22:24,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750516_9692, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 19:22:26,717 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750516_9692 replica FinalizedReplica, blk_1073750516_9692, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750516 for deletion
2025-07-15 19:22:26,718 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750516_9692 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750516
2025-07-15 19:24:24,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750518_9694 src: /192.168.158.6:34150 dest: /192.168.158.4:9866
2025-07-15 19:24:24,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34150, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1505909188_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750518_9694, duration(ns): 21552300
2025-07-15 19:24:24,736 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750518_9694, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-15 19:24:29,721 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750518_9694 replica FinalizedReplica, blk_1073750518_9694, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750518 for deletion
2025-07-15 19:24:29,722 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750518_9694 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750518
2025-07-15 19:25:29,708 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750519_9695 src: /192.168.158.1:37328 dest: /192.168.158.4:9866
2025-07-15 19:25:29,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37328, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1865856702_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750519_9695, duration(ns): 20963810
2025-07-15 19:25:29,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750519_9695, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-15 19:25:35,723 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750519_9695 replica FinalizedReplica, blk_1073750519_9695, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750519 for deletion
2025-07-15 19:25:35,724 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750519_9695 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750519
2025-07-15 19:27:34,705 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750521_9697 src: /192.168.158.1:56328 dest: /192.168.158.4:9866
2025-07-15 19:27:34,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56328, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_229284032_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750521_9697, duration(ns): 22126427
2025-07-15 19:27:34,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750521_9697, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-15 19:27:38,725 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750521_9697 replica FinalizedReplica, blk_1073750521_9697, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750521 for deletion
2025-07-15 19:27:38,727 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750521_9697 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750521
2025-07-15 19:29:34,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750523_9699 src: /192.168.158.9:54698 dest: /192.168.158.4:9866
2025-07-15 19:29:34,764 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54698, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1021310708_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750523_9699, duration(ns): 22420601
2025-07-15 19:29:34,764 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750523_9699, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-15 19:29:35,727 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750523_9699 replica FinalizedReplica, blk_1073750523_9699, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750523 for deletion
2025-07-15 19:29:35,728 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750523_9699 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750523
2025-07-15 19:32:34,720 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750526_9702 src: /192.168.158.5:47640 dest: /192.168.158.4:9866
2025-07-15 19:32:34,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47640, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_463707594_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750526_9702, duration(ns): 15300833
2025-07-15 19:32:34,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750526_9702, type=LAST_IN_PIPELINE terminating
2025-07-15 19:32:35,731 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750526_9702 replica FinalizedReplica, blk_1073750526_9702, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750526 for deletion
2025-07-15 19:32:35,732 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750526_9702 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073750526
2025-07-15 19:35:34,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750529_9705 src: /192.168.158.9:41388 dest: /192.168.158.4:9866
2025-07-15 19:35:34,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41388, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_530278799_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750529_9705, duration(ns): 20251539
2025-07-15 19:35:34,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750529_9705, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 19:35:35,735 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750529_9705 replica FinalizedReplica, blk_1073750529_9705, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750529 for deletion
2025-07-15 19:35:35,736 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750529_9705 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750529
2025-07-15 19:38:39,733 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750532_9708 src: /192.168.158.7:57312 dest: /192.168.158.4:9866
2025-07-15 19:38:39,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57312, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-803925376_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750532_9708, duration(ns): 13943482
2025-07-15 19:38:39,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750532_9708, type=LAST_IN_PIPELINE terminating
2025-07-15 19:38:44,741 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750532_9708 replica FinalizedReplica, blk_1073750532_9708, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750532 for deletion
2025-07-15 19:38:44,742 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750532_9708 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750532
2025-07-15 19:42:49,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750536_9712 src: /192.168.158.5:56844 dest: /192.168.158.4:9866
2025-07-15 19:42:49,759 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56844, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1046729846_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750536_9712, duration(ns): 14698886
2025-07-15 19:42:49,759 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750536_9712, type=LAST_IN_PIPELINE terminating
2025-07-15 19:42:50,741 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750536_9712 replica FinalizedReplica, blk_1073750536_9712, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750536 for deletion
2025-07-15 19:42:50,743 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750536_9712 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750536
2025-07-15 19:43:49,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750537_9713 src: /192.168.158.5:44134 dest: /192.168.158.4:9866
2025-07-15 19:43:49,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44134, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1870010972_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750537_9713, duration(ns): 20307368
2025-07-15 19:43:49,768 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750537_9713, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-15 19:43:50,743 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750537_9713 replica FinalizedReplica, blk_1073750537_9713, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750537 for deletion
2025-07-15 19:43:50,744 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750537_9713 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750537
2025-07-15 19:47:59,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750541_9717 src: /192.168.158.1:54962 dest: /192.168.158.4:9866
2025-07-15 19:47:59,781 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54962, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1658006127_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750541_9717, duration(ns): 22730483
2025-07-15 19:47:59,781 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750541_9717, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-15 19:48:02,748 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750541_9717 replica FinalizedReplica, blk_1073750541_9717, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750541 for deletion
2025-07-15 19:48:02,749 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750541_9717 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750541
2025-07-15 19:53:04,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750546_9722 src: /192.168.158.1:38640 dest: /192.168.158.4:9866
2025-07-15 19:53:04,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38640, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1262870726_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750546_9722, duration(ns): 24367617
2025-07-15 19:53:04,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750546_9722, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-15 19:53:08,761 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750546_9722 replica FinalizedReplica, blk_1073750546_9722, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750546 for deletion
2025-07-15 19:53:08,762 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750546_9722 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750546
2025-07-15 19:56:09,775 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750549_9725 src: /192.168.158.1:51402 dest: /192.168.158.4:9866
2025-07-15 19:56:09,807 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51402, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-622760310_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750549_9725, duration(ns): 22226030
2025-07-15 19:56:09,807 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750549_9725, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-15 19:56:11,767 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750549_9725 replica FinalizedReplica, blk_1073750549_9725, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750549 for deletion
2025-07-15
19:56:11,768 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750549_9725 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750549 2025-07-15 20:00:09,770 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750553_9729 src: /192.168.158.1:59566 dest: /192.168.158.4:9866 2025-07-15 20:00:09,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59566, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-957404796_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750553_9729, duration(ns): 20446360 2025-07-15 20:00:09,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750553_9729, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-15 20:00:11,775 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750553_9729 replica FinalizedReplica, blk_1073750553_9729, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750553 for deletion 2025-07-15 20:00:11,776 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750553_9729 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750553 2025-07-15 20:01:14,781 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073750554_9730 src: /192.168.158.9:57158 dest: /192.168.158.4:9866 2025-07-15 20:01:14,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57158, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1394355609_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750554_9730, duration(ns): 16988308 2025-07-15 20:01:14,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750554_9730, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-15 20:01:17,778 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750554_9730 replica FinalizedReplica, blk_1073750554_9730, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750554 for deletion 2025-07-15 20:01:17,779 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750554_9730 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750554 2025-07-15 20:03:14,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750556_9732 src: /192.168.158.8:55218 dest: /192.168.158.4:9866 2025-07-15 20:03:14,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55218, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_47856962_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073750556_9732, duration(ns): 18401657 2025-07-15 20:03:14,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750556_9732, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-15 20:03:20,781 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750556_9732 replica FinalizedReplica, blk_1073750556_9732, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750556 for deletion 2025-07-15 20:03:20,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750556_9732 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750556 2025-07-15 20:07:19,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750560_9736 src: /192.168.158.6:39266 dest: /192.168.158.4:9866 2025-07-15 20:07:19,820 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39266, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1243484155_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750560_9736, duration(ns): 17604826 2025-07-15 20:07:19,820 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750560_9736, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-15 20:07:20,784 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling 
blk_1073750560_9736 replica FinalizedReplica, blk_1073750560_9736, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750560 for deletion 2025-07-15 20:07:20,785 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750560_9736 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750560 2025-07-15 20:09:24,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750562_9738 src: /192.168.158.7:53770 dest: /192.168.158.4:9866 2025-07-15 20:09:24,817 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53770, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2071827992_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750562_9738, duration(ns): 17517548 2025-07-15 20:09:24,817 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750562_9738, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-15 20:09:29,788 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750562_9738 replica FinalizedReplica, blk_1073750562_9738, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750562 for deletion 2025-07-15 20:09:29,789 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073750562_9738 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750562 2025-07-15 20:11:29,789 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750564_9740 src: /192.168.158.7:60690 dest: /192.168.158.4:9866 2025-07-15 20:11:29,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60690, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1653972288_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750564_9740, duration(ns): 18978490 2025-07-15 20:11:29,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750564_9740, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-15 20:11:32,789 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750564_9740 replica FinalizedReplica, blk_1073750564_9740, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750564 for deletion 2025-07-15 20:11:32,790 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750564_9740 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750564 2025-07-15 20:12:29,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750565_9741 src: /192.168.158.1:40588 dest: /192.168.158.4:9866 2025-07-15 20:12:29,827 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40588, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-272100339_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750565_9741, duration(ns): 22487932 2025-07-15 20:12:29,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750565_9741, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-15 20:12:32,790 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750565_9741 replica FinalizedReplica, blk_1073750565_9741, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750565 for deletion 2025-07-15 20:12:32,792 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750565_9741 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750565 2025-07-15 20:15:39,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750568_9744 src: /192.168.158.1:49124 dest: /192.168.158.4:9866 2025-07-15 20:15:39,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49124, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1885366560_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750568_9744, duration(ns): 24122185 2025-07-15 20:15:39,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750568_9744, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-15 20:15:41,802 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750568_9744 replica FinalizedReplica, blk_1073750568_9744, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750568 for deletion 2025-07-15 20:15:41,803 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750568_9744 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750568 2025-07-15 20:16:39,801 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750569_9745 src: /192.168.158.7:42130 dest: /192.168.158.4:9866 2025-07-15 20:16:39,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42130, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1975366031_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750569_9745, duration(ns): 20085284 2025-07-15 20:16:39,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750569_9745, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-15 20:16:41,801 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750569_9745 replica FinalizedReplica, blk_1073750569_9745, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 
getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750569 for deletion 2025-07-15 20:16:41,802 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750569_9745 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750569 2025-07-15 20:17:44,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750570_9746 src: /192.168.158.9:53118 dest: /192.168.158.4:9866 2025-07-15 20:17:44,824 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53118, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-312337809_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750570_9746, duration(ns): 20306344 2025-07-15 20:17:44,824 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750570_9746, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-15 20:17:47,803 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750570_9746 replica FinalizedReplica, blk_1073750570_9746, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750570 for deletion 2025-07-15 20:17:47,804 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750570_9746 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750570 2025-07-15 20:19:44,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750572_9748 src: /192.168.158.8:60266 dest: /192.168.158.4:9866 2025-07-15 20:19:44,823 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60266, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1867030876_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750572_9748, duration(ns): 17947749 2025-07-15 20:19:44,823 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750572_9748, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-15 20:19:50,807 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750572_9748 replica FinalizedReplica, blk_1073750572_9748, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750572 for deletion 2025-07-15 20:19:50,808 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750572_9748 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750572 2025-07-15 20:21:44,810 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750574_9750 src: /192.168.158.9:52988 dest: /192.168.158.4:9866 2025-07-15 20:21:44,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52988, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1538600250_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750574_9750, duration(ns): 15586869 2025-07-15 20:21:44,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750574_9750, type=LAST_IN_PIPELINE terminating 2025-07-15 20:21:47,811 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750574_9750 replica FinalizedReplica, blk_1073750574_9750, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750574 for deletion 2025-07-15 20:21:47,812 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750574_9750 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750574 2025-07-15 20:22:44,814 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750575_9751 src: /192.168.158.1:34260 dest: /192.168.158.4:9866 2025-07-15 20:22:44,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34260, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_311580323_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750575_9751, duration(ns): 22426278 2025-07-15 20:22:44,846 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750575_9751, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] 
terminating 2025-07-15 20:22:47,811 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750575_9751 replica FinalizedReplica, blk_1073750575_9751, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750575 for deletion 2025-07-15 20:22:47,813 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750575_9751 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750575 2025-07-15 20:26:44,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750579_9755 src: /192.168.158.8:51510 dest: /192.168.158.4:9866 2025-07-15 20:26:44,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51510, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2106979215_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750579_9755, duration(ns): 17213191 2025-07-15 20:26:44,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750579_9755, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-15 20:26:47,820 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750579_9755 replica FinalizedReplica, blk_1073750579_9755, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750579 for 
deletion 2025-07-15 20:26:47,821 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750579_9755 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750579 2025-07-15 20:27:49,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750580_9756 src: /192.168.158.1:40332 dest: /192.168.158.4:9866 2025-07-15 20:27:49,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40332, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1045009454_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750580_9756, duration(ns): 19826024 2025-07-15 20:27:49,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750580_9756, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-15 20:27:50,821 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750580_9756 replica FinalizedReplica, blk_1073750580_9756, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750580 for deletion 2025-07-15 20:27:50,822 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750580_9756 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750580 2025-07-15 20:28:54,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073750581_9757 src: /192.168.158.6:41056 dest: /192.168.158.4:9866 2025-07-15 20:28:54,857 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41056, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_269771726_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750581_9757, duration(ns): 18548781 2025-07-15 20:28:54,857 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750581_9757, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-15 20:28:56,823 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750581_9757 replica FinalizedReplica, blk_1073750581_9757, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750581 for deletion 2025-07-15 20:28:56,824 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750581_9757 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750581 2025-07-15 20:29:54,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750582_9758 src: /192.168.158.1:60704 dest: /192.168.158.4:9866 2025-07-15 20:29:54,865 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60704, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1020399982_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073750582_9758, duration(ns): 22493070
2025-07-15 20:29:54,865 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750582_9758, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-15 20:29:59,825 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750582_9758 replica FinalizedReplica, blk_1073750582_9758, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750582 for deletion
2025-07-15 20:29:59,826 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750582_9758 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750582
2025-07-15 20:31:59,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750584_9760 src: /192.168.158.1:49850 dest: /192.168.158.4:9866
2025-07-15 20:31:59,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49850, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_950293429_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750584_9760, duration(ns): 20897876
2025-07-15 20:31:59,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750584_9760, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-15 20:32:05,827 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750584_9760 replica FinalizedReplica, blk_1073750584_9760, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750584 for deletion
2025-07-15 20:32:05,828 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750584_9760 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750584
2025-07-15 20:32:59,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750585_9761 src: /192.168.158.5:46208 dest: /192.168.158.4:9866
2025-07-15 20:32:59,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46208, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_200806190_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750585_9761, duration(ns): 15977194
2025-07-15 20:32:59,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750585_9761, type=LAST_IN_PIPELINE terminating
2025-07-15 20:33:02,832 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750585_9761 replica FinalizedReplica, blk_1073750585_9761, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750585 for deletion
2025-07-15 20:33:02,833 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750585_9761 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750585
2025-07-15 20:34:00,620 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750586_9762 src: /192.168.158.1:42516 dest: /192.168.158.4:9866
2025-07-15 20:34:00,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42516, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1621399965_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750586_9762, duration(ns): 24300288
2025-07-15 20:34:00,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750586_9762, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-15 20:34:05,833 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750586_9762 replica FinalizedReplica, blk_1073750586_9762, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750586 for deletion
2025-07-15 20:34:05,835 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750586_9762 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750586
2025-07-15 20:34:59,847 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750587_9763 src: /192.168.158.5:57738 dest: /192.168.158.4:9866
2025-07-15 20:34:59,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57738, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1762759722_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750587_9763, duration(ns): 16961937
2025-07-15 20:34:59,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750587_9763, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 20:35:02,834 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750587_9763 replica FinalizedReplica, blk_1073750587_9763, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750587 for deletion
2025-07-15 20:35:02,835 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750587_9763 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750587
2025-07-15 20:36:04,858 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750588_9764 src: /192.168.158.9:60582 dest: /192.168.158.4:9866
2025-07-15 20:36:04,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60582, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1056360334_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750588_9764, duration(ns): 16888751
2025-07-15 20:36:04,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750588_9764, type=LAST_IN_PIPELINE terminating
2025-07-15 20:36:05,835 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750588_9764 replica FinalizedReplica, blk_1073750588_9764, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750588 for deletion
2025-07-15 20:36:05,836 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750588_9764 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750588
2025-07-15 20:37:04,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750589_9765 src: /192.168.158.5:54378 dest: /192.168.158.4:9866
2025-07-15 20:37:04,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54378, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1856460925_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750589_9765, duration(ns): 13879508
2025-07-15 20:37:04,872 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750589_9765, type=LAST_IN_PIPELINE terminating
2025-07-15 20:37:05,837 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750589_9765 replica FinalizedReplica, blk_1073750589_9765, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750589 for deletion
2025-07-15 20:37:05,840 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750589_9765 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750589
2025-07-15 20:39:09,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750591_9767 src: /192.168.158.9:41664 dest: /192.168.158.4:9866
2025-07-15 20:39:09,865 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41664, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1530472395_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750591_9767, duration(ns): 14159311
2025-07-15 20:39:09,865 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750591_9767, type=LAST_IN_PIPELINE terminating
2025-07-15 20:39:11,841 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750591_9767 replica FinalizedReplica, blk_1073750591_9767, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750591 for deletion
2025-07-15 20:39:11,842 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750591_9767 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750591
2025-07-15 20:45:14,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750597_9773 src: /192.168.158.1:58796 dest: /192.168.158.4:9866
2025-07-15 20:45:14,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58796, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1244744340_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750597_9773, duration(ns): 20938291
2025-07-15 20:45:14,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750597_9773, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-15 20:45:20,849 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750597_9773 replica FinalizedReplica, blk_1073750597_9773, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750597 for deletion
2025-07-15 20:45:20,850 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750597_9773 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750597
2025-07-15 20:46:14,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750598_9774 src: /192.168.158.7:49780 dest: /192.168.158.4:9866
2025-07-15 20:46:14,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2054836723_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750598_9774, duration(ns): 15823548
2025-07-15 20:46:14,885 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750598_9774, type=LAST_IN_PIPELINE terminating
2025-07-15 20:46:20,849 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750598_9774 replica FinalizedReplica, blk_1073750598_9774, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750598 for deletion
2025-07-15 20:46:20,850 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750598_9774 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750598
2025-07-15 20:47:19,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750599_9775 src: /192.168.158.7:34458 dest: /192.168.158.4:9866
2025-07-15 20:47:19,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34458, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-167664092_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750599_9775, duration(ns): 20217690
2025-07-15 20:47:19,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750599_9775, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 20:47:20,850 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750599_9775 replica FinalizedReplica, blk_1073750599_9775, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750599 for deletion
2025-07-15 20:47:20,850 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750599_9775 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750599
2025-07-15 20:48:19,870 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750600_9776 src: /192.168.158.5:53020 dest: /192.168.158.4:9866
2025-07-15 20:48:19,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53020, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1794171586_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750600_9776, duration(ns): 19305641
2025-07-15 20:48:19,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750600_9776, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 20:48:20,850 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750600_9776 replica FinalizedReplica, blk_1073750600_9776, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750600 for deletion
2025-07-15 20:48:20,851 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750600_9776 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750600
2025-07-15 20:49:19,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750601_9777 src: /192.168.158.6:59136 dest: /192.168.158.4:9866
2025-07-15 20:49:19,893 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1973450606_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750601_9777, duration(ns): 17971117
2025-07-15 20:49:19,893 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750601_9777, type=LAST_IN_PIPELINE terminating
2025-07-15 20:49:23,853 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750601_9777 replica FinalizedReplica, blk_1073750601_9777, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750601 for deletion
2025-07-15 20:49:23,854 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750601_9777 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750601
2025-07-15 20:50:19,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750602_9778 src: /192.168.158.1:40084 dest: /192.168.158.4:9866
2025-07-15 20:50:19,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40084, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1341113764_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750602_9778, duration(ns): 20653910
2025-07-15 20:50:19,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750602_9778, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-15 20:50:20,853 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750602_9778 replica FinalizedReplica, blk_1073750602_9778, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750602 for deletion
2025-07-15 20:50:20,854 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750602_9778 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750602
2025-07-15 20:51:24,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750603_9779 src: /192.168.158.6:56788 dest: /192.168.158.4:9866
2025-07-15 20:51:24,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56788, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1687121484_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750603_9779, duration(ns): 15857938
2025-07-15 20:51:24,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750603_9779, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 20:51:29,854 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750603_9779 replica FinalizedReplica, blk_1073750603_9779, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750603 for deletion
2025-07-15 20:51:29,856 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750603_9779 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750603
2025-07-15 20:52:29,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750604_9780 src: /192.168.158.9:36514 dest: /192.168.158.4:9866
2025-07-15 20:52:29,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36514, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1337389813_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750604_9780, duration(ns): 17490473
2025-07-15 20:52:29,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750604_9780, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 20:52:35,858 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750604_9780 replica FinalizedReplica, blk_1073750604_9780, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750604 for deletion
2025-07-15 20:52:35,859 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750604_9780 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750604
2025-07-15 20:53:34,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750605_9781 src: /192.168.158.5:53840 dest: /192.168.158.4:9866
2025-07-15 20:53:34,915 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53840, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2077344621_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750605_9781, duration(ns): 17922545
2025-07-15 20:53:34,915 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750605_9781, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 20:53:38,858 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750605_9781 replica FinalizedReplica, blk_1073750605_9781, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750605 for deletion
2025-07-15 20:53:38,859 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750605_9781 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750605
2025-07-15 20:55:34,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750607_9783 src: /192.168.158.1:60998 dest: /192.168.158.4:9866
2025-07-15 20:55:34,915 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60998, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_394993517_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750607_9783, duration(ns): 20650272
2025-07-15 20:55:34,915 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750607_9783, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-15 20:55:35,860 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750607_9783 replica FinalizedReplica, blk_1073750607_9783, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750607 for deletion
2025-07-15 20:55:35,861 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750607_9783 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750607
2025-07-15 21:00:44,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750612_9788 src: /192.168.158.8:38264 dest: /192.168.158.4:9866
2025-07-15 21:00:44,924 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38264, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1422086247_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750612_9788, duration(ns): 18207855
2025-07-15 21:00:44,924 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750612_9788, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 21:00:47,872 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750612_9788 replica FinalizedReplica, blk_1073750612_9788, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750612 for deletion
2025-07-15 21:00:47,873 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750612_9788 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750612
2025-07-15 21:02:44,915 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750614_9790 src: /192.168.158.1:47230 dest: /192.168.158.4:9866
2025-07-15 21:02:44,949 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47230, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_297527296_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750614_9790, duration(ns): 25147661
2025-07-15 21:02:44,950 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750614_9790, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-15 21:02:50,878 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750614_9790 replica FinalizedReplica, blk_1073750614_9790, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750614 for deletion
2025-07-15 21:02:50,879 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750614_9790 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750614
2025-07-15 21:03:44,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750615_9791 src: /192.168.158.5:49448 dest: /192.168.158.4:9866
2025-07-15 21:03:44,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49448, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_645970254_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750615_9791, duration(ns): 18735767
2025-07-15 21:03:44,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750615_9791, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 21:03:47,879 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750615_9791 replica FinalizedReplica, blk_1073750615_9791, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750615 for deletion
2025-07-15 21:03:47,880 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750615_9791 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750615
2025-07-15 21:04:44,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750616_9792 src: /192.168.158.1:37290 dest: /192.168.158.4:9866
2025-07-15 21:04:44,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37290, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1050104504_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750616_9792, duration(ns): 22671901
2025-07-15 21:04:44,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750616_9792, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-15 21:04:47,883 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750616_9792 replica FinalizedReplica, blk_1073750616_9792, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750616 for deletion
2025-07-15 21:04:47,884 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750616_9792 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750616
2025-07-15 21:05:44,923 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750617_9793 src: /192.168.158.9:57796 dest: /192.168.158.4:9866
2025-07-15 21:05:44,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57796, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1183316209_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750617_9793, duration(ns): 17212683
2025-07-15 21:05:44,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750617_9793, type=LAST_IN_PIPELINE terminating
2025-07-15 21:05:50,885 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750617_9793 replica FinalizedReplica, blk_1073750617_9793, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750617 for deletion
2025-07-15 21:05:50,886 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750617_9793 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750617
2025-07-15 21:08:44,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750620_9796 src: /192.168.158.9:49238 dest: /192.168.158.4:9866
2025-07-15 21:08:44,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49238, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-461954208_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750620_9796, duration(ns): 14935450
2025-07-15 21:08:44,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750620_9796, type=LAST_IN_PIPELINE terminating
2025-07-15 21:08:50,890 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750620_9796 replica FinalizedReplica, blk_1073750620_9796, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750620 for deletion
2025-07-15 21:08:50,891 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750620_9796 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750620
2025-07-15 21:09:44,931 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750621_9797 src: /192.168.158.7:33580 dest: /192.168.158.4:9866
2025-07-15 21:09:44,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33580, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2127806873_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750621_9797, duration(ns): 17669153
2025-07-15 21:09:44,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750621_9797, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-15 21:09:47,894 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750621_9797 replica FinalizedReplica, blk_1073750621_9797, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750621 for deletion
2025-07-15 21:09:47,895 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750621_9797 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750621
2025-07-15 21:10:44,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750622_9798 src: /192.168.158.5:48106 dest: /192.168.158.4:9866
2025-07-15 21:10:44,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48106, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1788941962_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750622_9798, duration(ns): 15773841
2025-07-15 21:10:44,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750622_9798, type=LAST_IN_PIPELINE terminating
2025-07-15 21:10:50,896 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750622_9798 replica FinalizedReplica, blk_1073750622_9798, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750622 for deletion
2025-07-15 21:10:50,897 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750622_9798 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750622
2025-07-15 21:11:44,939 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750623_9799 src: /192.168.158.1:34166 dest: /192.168.158.4:9866
2025-07-15 21:11:44,966 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34166, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_615304396_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750623_9799, duration(ns): 18187915
2025-07-15 21:11:44,966 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750623_9799, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-15 21:11:50,897 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750623_9799 replica FinalizedReplica, blk_1073750623_9799, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750623 for deletion
2025-07-15 21:11:50,898 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750623_9799 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750623
2025-07-15 21:12:49,949 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750624_9800
src: /192.168.158.5:58376 dest: /192.168.158.4:9866 2025-07-15 21:12:49,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58376, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1475360292_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750624_9800, duration(ns): 15828228 2025-07-15 21:12:49,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750624_9800, type=LAST_IN_PIPELINE terminating 2025-07-15 21:12:50,898 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750624_9800 replica FinalizedReplica, blk_1073750624_9800, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750624 for deletion 2025-07-15 21:12:50,899 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750624_9800 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750624 2025-07-15 21:14:49,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750626_9802 src: /192.168.158.6:59002 dest: /192.168.158.4:9866 2025-07-15 21:14:49,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59002, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1077956105_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750626_9802, duration(ns): 17644970 2025-07-15 21:14:49,957 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750626_9802, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-15 21:14:53,905 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750626_9802 replica FinalizedReplica, blk_1073750626_9802, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750626 for deletion 2025-07-15 21:14:53,906 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750626_9802 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750626 2025-07-15 21:16:49,907 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750628_9804 src: /192.168.158.9:48800 dest: /192.168.158.4:9866 2025-07-15 21:16:49,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48800, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-751363547_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750628_9804, duration(ns): 21094057 2025-07-15 21:16:49,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750628_9804, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-15 21:16:53,906 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750628_9804 replica FinalizedReplica, blk_1073750628_9804, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 
56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750628 for deletion 2025-07-15 21:16:53,906 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750628_9804 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750628 2025-07-15 21:18:54,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750630_9806 src: /192.168.158.1:48608 dest: /192.168.158.4:9866 2025-07-15 21:18:54,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48608, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1860457874_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750630_9806, duration(ns): 19344076 2025-07-15 21:18:54,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750630_9806, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-15 21:18:56,907 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750630_9806 replica FinalizedReplica, blk_1073750630_9806, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750630 for deletion 2025-07-15 21:18:56,908 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750630_9806 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750630 2025-07-15 21:19:54,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750631_9807 src: /192.168.158.7:55852 dest: /192.168.158.4:9866 2025-07-15 21:19:54,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55852, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-780135570_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750631_9807, duration(ns): 21998748 2025-07-15 21:19:54,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750631_9807, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-15 21:19:59,909 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750631_9807 replica FinalizedReplica, blk_1073750631_9807, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750631 for deletion 2025-07-15 21:19:59,910 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750631_9807 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750631 2025-07-15 21:20:54,939 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750632_9808 src: /192.168.158.1:35832 dest: /192.168.158.4:9866 2025-07-15 21:20:54,970 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35832, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-556392136_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750632_9808, duration(ns): 22429688 2025-07-15 21:20:54,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750632_9808, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-15 21:20:59,911 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750632_9808 replica FinalizedReplica, blk_1073750632_9808, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750632 for deletion 2025-07-15 21:20:59,912 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750632_9808 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750632 2025-07-15 21:21:54,949 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750633_9809 src: /192.168.158.6:43510 dest: /192.168.158.4:9866 2025-07-15 21:21:54,965 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43510, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-496334669_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750633_9809, duration(ns): 13875667 2025-07-15 21:21:54,966 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750633_9809, type=LAST_IN_PIPELINE 
terminating 2025-07-15 21:21:59,916 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750633_9809 replica FinalizedReplica, blk_1073750633_9809, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750633 for deletion 2025-07-15 21:21:59,917 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750633_9809 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750633 2025-07-15 21:23:59,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750635_9811 src: /192.168.158.1:58824 dest: /192.168.158.4:9866 2025-07-15 21:23:59,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58824, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_866920347_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750635_9811, duration(ns): 21160588 2025-07-15 21:23:59,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750635_9811, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-15 21:24:05,921 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750635_9811 replica FinalizedReplica, blk_1073750635_9811, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750635 for deletion 2025-07-15 21:24:05,922 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750635_9811 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750635 2025-07-15 21:26:59,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750638_9814 src: /192.168.158.7:41456 dest: /192.168.158.4:9866 2025-07-15 21:26:59,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41456, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_728788534_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750638_9814, duration(ns): 14261471 2025-07-15 21:26:59,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750638_9814, type=LAST_IN_PIPELINE terminating 2025-07-15 21:27:02,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750638_9814 replica FinalizedReplica, blk_1073750638_9814, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750638 for deletion 2025-07-15 21:27:02,929 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750638_9814 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750638 2025-07-15 21:29:59,955 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750641_9817 src: /192.168.158.1:40934 dest: /192.168.158.4:9866 2025-07-15 21:29:59,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40934, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_743665897_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750641_9817, duration(ns): 21226049 2025-07-15 21:29:59,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750641_9817, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-15 21:30:02,937 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750641_9817 replica FinalizedReplica, blk_1073750641_9817, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750641 for deletion 2025-07-15 21:30:02,938 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750641_9817 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750641 2025-07-15 21:34:09,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750645_9821 src: /192.168.158.8:49998 dest: /192.168.158.4:9866 2025-07-15 21:34:09,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49998, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1434010833_106, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750645_9821, duration(ns): 17571526 2025-07-15 21:34:09,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750645_9821, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-15 21:34:14,947 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750645_9821 replica FinalizedReplica, blk_1073750645_9821, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750645 for deletion 2025-07-15 21:34:14,948 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750645_9821 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750645 2025-07-15 21:35:09,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750646_9822 src: /192.168.158.8:55000 dest: /192.168.158.4:9866 2025-07-15 21:35:09,998 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55000, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_444710530_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750646_9822, duration(ns): 15997984 2025-07-15 21:35:09,998 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750646_9822, type=LAST_IN_PIPELINE terminating 2025-07-15 21:35:11,949 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling 
blk_1073750646_9822 replica FinalizedReplica, blk_1073750646_9822, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750646 for deletion 2025-07-15 21:35:11,950 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750646_9822 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750646 2025-07-15 21:39:19,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750650_9826 src: /192.168.158.5:40216 dest: /192.168.158.4:9866 2025-07-15 21:39:20,018 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40216, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1904025350_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750650_9826, duration(ns): 20089338 2025-07-15 21:39:20,018 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750650_9826, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-15 21:39:23,959 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750650_9826 replica FinalizedReplica, blk_1073750650_9826, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750650 for deletion 2025-07-15 21:39:23,960 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073750650_9826 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750650 2025-07-15 21:40:19,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750651_9827 src: /192.168.158.6:34508 dest: /192.168.158.4:9866 2025-07-15 21:40:19,998 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34508, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1573289098_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750651_9827, duration(ns): 14206966 2025-07-15 21:40:19,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750651_9827, type=LAST_IN_PIPELINE terminating 2025-07-15 21:40:20,961 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750651_9827 replica FinalizedReplica, blk_1073750651_9827, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750651 for deletion 2025-07-15 21:40:20,963 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750651_9827 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750651 2025-07-15 21:44:34,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750655_9831 src: /192.168.158.7:34322 dest: /192.168.158.4:9866 2025-07-15 21:44:35,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.7:34322, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1037350710_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750655_9831, duration(ns): 18343419 2025-07-15 21:44:35,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750655_9831, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-15 21:44:35,971 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750655_9831 replica FinalizedReplica, blk_1073750655_9831, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750655 for deletion 2025-07-15 21:44:35,972 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750655_9831 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750655 2025-07-15 21:46:40,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750657_9833 src: /192.168.158.8:57368 dest: /192.168.158.4:9866 2025-07-15 21:46:40,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57368, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_747183293_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750657_9833, duration(ns): 19613600 2025-07-15 21:46:40,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750657_9833, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-15 21:46:44,975 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750657_9833 replica FinalizedReplica, blk_1073750657_9833, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750657 for deletion 2025-07-15 21:46:44,976 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750657_9833 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750657 2025-07-15 21:50:49,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750661_9837 src: /192.168.158.8:60206 dest: /192.168.158.4:9866 2025-07-15 21:50:50,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60206, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1057444608_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750661_9837, duration(ns): 18155833 2025-07-15 21:50:50,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750661_9837, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-15 21:50:50,986 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750661_9837 replica FinalizedReplica, blk_1073750661_9837, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750661 for deletion 2025-07-15 21:50:50,987 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750661_9837 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750661 2025-07-15 21:51:50,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750662_9838 src: /192.168.158.5:40648 dest: /192.168.158.4:9866 2025-07-15 21:51:50,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40648, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1952725635_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750662_9838, duration(ns): 18454197 2025-07-15 21:51:50,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750662_9838, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-15 21:51:53,986 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750662_9838 replica FinalizedReplica, blk_1073750662_9838, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750662 for deletion 2025-07-15 21:51:53,988 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750662_9838 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750662 
2025-07-15 21:52:50,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750663_9839 src: /192.168.158.9:42026 dest: /192.168.158.4:9866
2025-07-15 21:52:50,047 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42026, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1144108287_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750663_9839, duration(ns): 15076581
2025-07-15 21:52:50,048 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750663_9839, type=LAST_IN_PIPELINE terminating
2025-07-15 21:52:50,988 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750663_9839 replica FinalizedReplica, blk_1073750663_9839, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750663 for deletion
2025-07-15 21:52:50,989 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750663_9839 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750663
2025-07-15 21:53:50,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750664_9840 src: /192.168.158.5:39336 dest: /192.168.158.4:9866
2025-07-15 21:53:50,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1635644310_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750664_9840, duration(ns): 14895110
2025-07-15 21:53:50,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750664_9840, type=LAST_IN_PIPELINE terminating
2025-07-15 21:53:50,990 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750664_9840 replica FinalizedReplica, blk_1073750664_9840, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750664 for deletion
2025-07-15 21:53:50,991 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750664_9840 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750664
2025-07-15 21:54:55,029 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750665_9841 src: /192.168.158.1:46372 dest: /192.168.158.4:9866
2025-07-15 21:54:55,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46372, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1097997763_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750665_9841, duration(ns): 21609022
2025-07-15 21:54:55,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750665_9841, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-15 21:54:59,995 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750665_9841 replica FinalizedReplica, blk_1073750665_9841, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750665 for deletion
2025-07-15 21:54:59,996 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750665_9841 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750665
2025-07-15 21:55:55,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750666_9842 src: /192.168.158.9:50772 dest: /192.168.158.4:9866
2025-07-15 21:55:55,065 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50772, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_951827308_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750666_9842, duration(ns): 18230716
2025-07-15 21:55:55,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750666_9842, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 21:55:59,999 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750666_9842 replica FinalizedReplica, blk_1073750666_9842, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750666 for deletion
2025-07-15 22:56:00,000 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750666_9842 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750666
2025-07-15 22:01:05,038 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750671_9847 src: /192.168.158.9:57618 dest: /192.168.158.4:9866
2025-07-15 22:01:05,053 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57618, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1838424768_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750671_9847, duration(ns): 12892221
2025-07-15 22:01:05,054 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750671_9847, type=LAST_IN_PIPELINE terminating
2025-07-15 22:01:06,010 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750671_9847 replica FinalizedReplica, blk_1073750671_9847, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750671 for deletion
2025-07-15 22:01:06,011 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750671_9847 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750671
2025-07-15 22:08:10,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750678_9854 src: /192.168.158.6:42828 dest: /192.168.158.4:9866
2025-07-15 22:08:10,055 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42828, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-513220625_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750678_9854, duration(ns): 16195560
2025-07-15 22:08:10,055 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750678_9854, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 22:08:12,029 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750678_9854 replica FinalizedReplica, blk_1073750678_9854, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750678 for deletion
2025-07-15 22:08:12,030 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750678_9854 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750678
2025-07-15 22:09:10,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750679_9855 src: /192.168.158.8:53928 dest: /192.168.158.4:9866
2025-07-15 22:09:10,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53928, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1652650109_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750679_9855, duration(ns): 19828606
2025-07-15 22:09:10,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750679_9855, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-15 22:09:12,030 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750679_9855 replica FinalizedReplica, blk_1073750679_9855, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750679 for deletion
2025-07-15 22:09:12,032 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750679_9855 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750679
2025-07-15 22:10:10,040 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750680_9856 src: /192.168.158.1:40212 dest: /192.168.158.4:9866
2025-07-15 22:10:10,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40212, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_286298694_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750680_9856, duration(ns): 23811148
2025-07-15 22:10:10,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750680_9856, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-15 22:10:15,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750680_9856 replica FinalizedReplica, blk_1073750680_9856, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750680 for deletion
2025-07-15 22:10:15,032 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750680_9856 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750680
2025-07-15 22:12:10,043 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750682_9858 src: /192.168.158.5:40452 dest: /192.168.158.4:9866
2025-07-15 22:12:10,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40452, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1316899291_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750682_9858, duration(ns): 14498723
2025-07-15 22:12:10,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750682_9858, type=LAST_IN_PIPELINE terminating
2025-07-15 22:12:12,034 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750682_9858 replica FinalizedReplica, blk_1073750682_9858, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750682 for deletion
2025-07-15 22:12:12,035 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750682_9858 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750682
2025-07-15 22:13:10,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750683_9859 src: /192.168.158.1:57114 dest: /192.168.158.4:9866
2025-07-15 22:13:10,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57114, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1802391594_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750683_9859, duration(ns): 20782063
2025-07-15 22:13:10,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750683_9859, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-15 22:13:12,034 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750683_9859 replica FinalizedReplica, blk_1073750683_9859, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750683 for deletion
2025-07-15 22:13:12,035 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750683_9859 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750683
2025-07-15 22:15:15,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750685_9861 src: /192.168.158.5:44624 dest: /192.168.158.4:9866
2025-07-15 22:15:15,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44624, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_146141447_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750685_9861, duration(ns): 17553811
2025-07-15 22:15:15,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750685_9861, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 22:15:21,037 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750685_9861 replica FinalizedReplica, blk_1073750685_9861, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750685 for deletion
2025-07-15 22:15:21,038 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750685_9861 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750685
2025-07-15 22:16:15,056 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750686_9862 src: /192.168.158.7:58184 dest: /192.168.158.4:9866
2025-07-15 22:16:15,074 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58184, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1020465789_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750686_9862, duration(ns): 15975151
2025-07-15 22:16:15,074 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750686_9862, type=LAST_IN_PIPELINE terminating
2025-07-15 22:16:21,037 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750686_9862 replica FinalizedReplica, blk_1073750686_9862, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750686 for deletion
2025-07-15 22:16:21,038 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750686_9862 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750686
2025-07-15 22:17:15,074 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750687_9863 src: /192.168.158.8:49242 dest: /192.168.158.4:9866
2025-07-15 22:17:15,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49242, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_134160855_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750687_9863, duration(ns): 15234069
2025-07-15 22:17:15,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750687_9863, type=LAST_IN_PIPELINE terminating
2025-07-15 22:17:18,039 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750687_9863 replica FinalizedReplica, blk_1073750687_9863, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750687 for deletion
2025-07-15 22:17:18,040 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750687_9863 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750687
2025-07-15 22:20:20,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750690_9866 src: /192.168.158.1:54386 dest: /192.168.158.4:9866
2025-07-15 22:20:20,100 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54386, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1615808882_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750690_9866, duration(ns): 22808287
2025-07-15 22:20:20,100 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750690_9866, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-15 22:20:21,046 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750690_9866 replica FinalizedReplica, blk_1073750690_9866, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750690 for deletion
2025-07-15 22:20:21,047 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750690_9866 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750690
2025-07-15 22:21:25,073 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750691_9867 src: /192.168.158.9:51084 dest: /192.168.158.4:9866
2025-07-15 22:21:25,096 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51084, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1057691612_106, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750691_9867, duration(ns): 17597247
2025-07-15 22:21:25,096 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750691_9867, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 22:21:30,048 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750691_9867 replica FinalizedReplica, blk_1073750691_9867, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750691 for deletion
2025-07-15 22:21:30,049 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750691_9867 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750691
2025-07-15 22:27:27,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750697_9873 src: /192.168.158.6:36638 dest: /192.168.158.4:9866
2025-07-15 22:27:27,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36638, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2004845568_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750697_9873, duration(ns): 38344402
2025-07-15 22:27:27,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750697_9873, type=LAST_IN_PIPELINE terminating
2025-07-15 22:27:30,065 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750697_9873 replica FinalizedReplica, blk_1073750697_9873, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750697 for deletion
2025-07-15 22:27:30,066 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750697_9873 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750697
2025-07-15 22:28:27,691 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750698_9874 src: /192.168.158.1:46230 dest: /192.168.158.4:9866
2025-07-15 22:28:27,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46230, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_847172142_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750698_9874, duration(ns): 25878725
2025-07-15 22:28:27,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750698_9874, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-15 22:28:33,066 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750698_9874 replica FinalizedReplica, blk_1073750698_9874, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750698 for deletion
2025-07-15 22:28:33,067 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750698_9874 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750698
2025-07-15 22:29:27,704 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750699_9875 src: /192.168.158.1:59888 dest: /192.168.158.4:9866
2025-07-15 22:29:27,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59888, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1495137510_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750699_9875, duration(ns): 24070290
2025-07-15 22:29:27,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750699_9875, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-15 22:29:33,068 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750699_9875 replica FinalizedReplica, blk_1073750699_9875, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750699 for deletion
2025-07-15 22:29:33,069 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750699_9875 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750699
2025-07-15 22:30:27,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750700_9876 src: /192.168.158.8:42548 dest: /192.168.158.4:9866
2025-07-15 22:30:27,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42548, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1959316485_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750700_9876, duration(ns): 20126418
2025-07-15 22:30:27,690 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750700_9876, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 22:30:30,071 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750700_9876 replica FinalizedReplica, blk_1073750700_9876, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750700 for deletion
2025-07-15 22:30:30,072 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750700_9876 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750700
2025-07-15 22:31:27,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750701_9877 src: /192.168.158.6:56796 dest: /192.168.158.4:9866
2025-07-15 22:31:27,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56796, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1553878718_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750701_9877, duration(ns): 15774346
2025-07-15 22:31:27,688 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750701_9877, type=LAST_IN_PIPELINE terminating
2025-07-15 22:31:30,072 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750701_9877 replica FinalizedReplica, blk_1073750701_9877, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750701 for deletion
2025-07-15 22:31:30,073 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750701_9877 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750701
2025-07-15 22:33:37,688 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750703_9879 src: /192.168.158.7:56152 dest: /192.168.158.4:9866
2025-07-15 22:33:37,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56152, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-533704915_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750703_9879, duration(ns): 15227444
2025-07-15 22:33:37,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750703_9879, type=LAST_IN_PIPELINE terminating
2025-07-15 22:33:42,078 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750703_9879 replica FinalizedReplica, blk_1073750703_9879, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750703 for deletion
2025-07-15 22:33:42,080 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750703_9879 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750703
2025-07-15 22:34:42,655 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750704_9880 src: /192.168.158.8:60000 dest: /192.168.158.4:9866
2025-07-15 22:34:42,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60000, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1337464875_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750704_9880, duration(ns): 18330976
2025-07-15 22:34:42,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750704_9880, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 22:34:45,081 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750704_9880 replica FinalizedReplica, blk_1073750704_9880, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750704 for deletion
2025-07-15 22:34:45,082 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750704_9880 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750704
2025-07-15 22:37:42,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750707_9883 src: /192.168.158.1:56828 dest: /192.168.158.4:9866
2025-07-15 22:37:42,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56828, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-582125469_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750707_9883, duration(ns): 23631291
2025-07-15 22:37:42,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750707_9883, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-15 22:37:48,082 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750707_9883 replica FinalizedReplica, blk_1073750707_9883, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750707 for deletion
2025-07-15 22:37:48,084 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750707_9883 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750707
2025-07-15 22:39:42,655 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750709_9885 src: /192.168.158.1:46972 dest: /192.168.158.4:9866
2025-07-15 22:39:42,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46972, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_241493625_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750709_9885, duration(ns): 20761920
2025-07-15 22:39:42,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750709_9885, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-15 22:39:45,085 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750709_9885 replica FinalizedReplica, blk_1073750709_9885, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750709 for deletion
2025-07-15 22:39:45,086 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750709_9885 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750709
2025-07-15 22:41:47,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750711_9887 src: /192.168.158.5:57888 dest: /192.168.158.4:9866
2025-07-15 22:41:47,712 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57888, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_759978712_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750711_9887, duration(ns): 15722174
2025-07-15 22:41:47,712 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750711_9887, type=LAST_IN_PIPELINE terminating
2025-07-15 22:41:48,089 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750711_9887 replica FinalizedReplica, blk_1073750711_9887, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750711 for deletion
2025-07-15 22:41:48,090 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750711_9887 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750711
2025-07-15 22:43:47,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750713_9889 src: /192.168.158.6:51280 dest: /192.168.158.4:9866
2025-07-15 22:43:47,699 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51280, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1565013728_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750713_9889, duration(ns): 18670981
2025-07-15 22:43:47,699 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750713_9889, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 22:43:48,091 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750713_9889 replica FinalizedReplica, blk_1073750713_9889, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750713 for deletion
2025-07-15 22:43:48,093 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750713_9889 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750713
2025-07-15 22:44:52,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750714_9890 src: /192.168.158.9:47274 dest: /192.168.158.4:9866
2025-07-15 22:44:52,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47274, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1824710169_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750714_9890, duration(ns): 22389679
2025-07-15 22:44:52,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750714_9890, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-15 22:44:57,093 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750714_9890 replica FinalizedReplica, blk_1073750714_9890, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750714 for deletion
2025-07-15 22:44:57,094 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750714_9890 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750714
2025-07-15 22:45:57,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750715_9891 src: /192.168.158.1:58502 dest: /192.168.158.4:9866
2025-07-15 22:45:57,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58502, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_715619965_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750715_9891, duration(ns): 26640350
2025-07-15 22:45:57,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750715_9891, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-15 22:46:00,095 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750715_9891 replica FinalizedReplica, blk_1073750715_9891, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750715 for deletion
2025-07-15 22:46:00,097 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750715_9891 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750715
2025-07-15 22:48:02,704 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750717_9893 src: /192.168.158.6:55740 dest: /192.168.158.4:9866
2025-07-15 22:48:02,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55740, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1889223723_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750717_9893, duration(ns): 16485150
2025-07-15 22:48:02,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750717_9893, type=LAST_IN_PIPELINE terminating
2025-07-15 22:48:06,101 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750717_9893 replica FinalizedReplica, blk_1073750717_9893, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750717 for deletion
2025-07-15 22:48:06,103 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750717_9893 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750717
2025-07-15 22:50:12,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750719_9895 src: /192.168.158.5:37460 dest: /192.168.158.4:9866
2025-07-15 22:50:12,698 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37460, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-765767425_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750719_9895, duration(ns): 14385083
2025-07-15 22:50:12,699 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750719_9895, type=LAST_IN_PIPELINE terminating
2025-07-15 22:50:15,103 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750719_9895 replica FinalizedReplica, blk_1073750719_9895, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750719 for deletion
2025-07-15 22:50:15,105 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750719_9895 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750719
2025-07-15 22:51:00,106 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService
java.io.EOFException: End of File Exception between local host is: "dmidlkprdls04.svr.luc.edu/192.168.158.4"; destination host is: "dmidlkprdls01.svr.luc.edu":8022; : java.io.EOFException; For more details see: http://wiki.apache.org/hadoop/EOFException
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:892)
	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:846)
	at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1620)
	at org.apache.hadoop.ipc.Client.call(Client.java:1562)
	at org.apache.hadoop.ipc.Client.call(Client.java:1459)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)
	at com.sun.proxy.$Proxy24.sendHeartbeat(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.sendHeartbeat(DatanodeProtocolClientSideTranslatorPB.java:166)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.sendHeartBeat(BPServiceActor.java:553)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:694)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:894)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.io.EOFException
	at java.io.DataInputStream.readInt(DataInputStream.java:392)
	at org.apache.hadoop.ipc.Client$IpcStreams.readResponse(Client.java:1950)
	at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1245)
	at org.apache.hadoop.ipc.Client$Connection.run(Client.java:1141)
2025-07-15 22:51:02,111 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-15 22:51:03,112 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-15 22:51:04,113 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-15 22:51:05,114 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-15 22:51:06,115 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-15 22:51:07,116 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-15 22:51:08,117 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-15 22:51:09,118 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-15 22:51:10,120 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-15 22:51:10,328 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: RemoteException in offerService
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.RetriableException): NameNode still not started
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.checkNNStartup(NameNodeRpcServer.java:2281)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.cacheReport(NameNodeRpcServer.java:1609)
	at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.cacheReport(DatanodeProtocolServerSideTranslatorPB.java:201)
	at org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:31542)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:533)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1070)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:994)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:922)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1910)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2899)
	at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1616)
	at org.apache.hadoop.ipc.Client.call(Client.java:1562)
	at org.apache.hadoop.ipc.Client.call(Client.java:1459)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)
	at com.sun.proxy.$Proxy24.cacheReport(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.cacheReport(DatanodeProtocolClientSideTranslatorPB.java:236)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.cacheReport(BPServiceActor.java:499)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:740)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:894)
	at java.lang.Thread.run(Thread.java:750)
2025-07-15 22:51:11,407 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeCommand action : DNA_REGISTER from dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 with active state
2025-07-15 22:51:11,418 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: RemoteException in offerService
org.apache.hadoop.ipc.RemoteException(java.io.IOException): processCacheReport from dead or unregistered datanode: null
	at org.apache.hadoop.hdfs.server.namenode.CacheManager.processCacheReport(CacheManager.java:1021)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.cacheReport(NameNodeRpcServer.java:1615)
	at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.cacheReport(DatanodeProtocolServerSideTranslatorPB.java:201)
	at org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:31542)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:533)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1070)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:994)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:922)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1910)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2899)
	at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1616)
	at org.apache.hadoop.ipc.Client.call(Client.java:1562)
	at org.apache.hadoop.ipc.Client.call(Client.java:1459)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)
	at com.sun.proxy.$Proxy24.cacheReport(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.cacheReport(DatanodeProtocolClientSideTranslatorPB.java:236)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.cacheReport(BPServiceActor.java:499)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:740)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:894)
	at java.lang.Thread.run(Thread.java:750)
2025-07-15 22:51:11,441 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool BP-1059995147-192.168.158.1-1752101929360 (Datanode Uuid be50c32a-aa23-4b9d-aa7f-05816b6e5f1a) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 beginning handshake with NN
2025-07-15 22:51:11,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool Block pool BP-1059995147-192.168.158.1-1752101929360 (Datanode Uuid be50c32a-aa23-4b9d-aa7f-05816b6e5f1a) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 successfully registered with NN
2025-07-15 22:51:12,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f3e, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 75 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-15 22:51:12,501 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-15 22:52:22,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750720_9896 src: /192.168.158.1:38908 dest: /192.168.158.4:9866
2025-07-15 22:52:22,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38908, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1872292614_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750720_9896, duration(ns): 23952967
2025-07-15 22:52:22,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750720_9896, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-15 22:53:22,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750721_9897 src: /192.168.158.1:54372 dest: /192.168.158.4:9866
2025-07-15 22:53:22,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54372, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1774355434_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750721_9897, duration(ns): 20797346
2025-07-15 22:53:22,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750721_9897, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-15 22:54:22,704 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750722_9898 src: /192.168.158.9:47510 dest: /192.168.158.4:9866
2025-07-15 22:54:22,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47510, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1017793728_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750722_9898, duration(ns): 19968391
2025-07-15 22:54:22,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750722_9898, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-15 22:55:22,699 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750723_9899 src: /192.168.158.6:35762 dest: /192.168.158.4:9866
2025-07-15 22:55:22,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35762, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1455340217_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750723_9899, duration(ns): 16030361
2025-07-15 22:55:22,718 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750723_9899, type=LAST_IN_PIPELINE terminating
2025-07-15 22:59:22,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750727_9903 src: /192.168.158.8:50214 dest: /192.168.158.4:9866
2025-07-15 22:59:22,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50214, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_150399195_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750727_9903, duration(ns): 16482327
2025-07-15 22:59:22,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750727_9903, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 23:02:22,718 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750730_9906 src: /192.168.158.7:55062 dest: /192.168.158.4:9866
2025-07-15 23:02:22,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55062, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1413212064_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750730_9906, duration(ns): 15108817
2025-07-15 23:02:22,736 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750730_9906, type=LAST_IN_PIPELINE terminating
2025-07-15 23:03:22,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750731_9907 src: /192.168.158.7:44316 dest: /192.168.158.4:9866
2025-07-15 23:03:22,746 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44316, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-348317751_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750731_9907, duration(ns): 17958400
2025-07-15 23:03:22,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750731_9907, type=LAST_IN_PIPELINE terminating
2025-07-15 23:06:27,730 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750734_9910 src: /192.168.158.9:59128 dest: /192.168.158.4:9866
2025-07-15 23:06:27,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59128, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1547378080_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750734_9910, duration(ns): 17268257
2025-07-15 23:06:27,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750734_9910, type=LAST_IN_PIPELINE terminating
2025-07-15 23:07:27,734 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750735_9911 src: /192.168.158.7:47908 dest: /192.168.158.4:9866
2025-07-15 23:07:27,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47908, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1680623854_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750735_9911, duration(ns): 21736286
2025-07-15 23:07:27,762 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750735_9911, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-15 23:11:32,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750739_9915 src: /192.168.158.5:56804 dest: /192.168.158.4:9866
2025-07-15 23:11:32,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56804, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1514763831_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750739_9915, duration(ns): 22374433
2025-07-15 23:11:32,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750739_9915, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-15 23:13:32,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750741_9917 src: /192.168.158.9:59556 dest: /192.168.158.4:9866
2025-07-15 23:13:32,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59556, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1171275937_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750741_9917, duration(ns): 13948743
2025-07-15 23:13:32,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750741_9917, type=LAST_IN_PIPELINE terminating
2025-07-15 23:16:32,765 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750744_9920 src: /192.168.158.9:35172 dest: /192.168.158.4:9866
2025-07-15 23:16:32,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35172, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-876355323_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750744_9920, duration(ns): 15251705
2025-07-15 23:16:32,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750744_9920, type=LAST_IN_PIPELINE terminating
2025-07-15 23:17:32,736 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750745_9921 src: /192.168.158.1:46068 dest: /192.168.158.4:9866
2025-07-15 23:17:32,768 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46068, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1024336568_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750745_9921, duration(ns): 22457517
2025-07-15 23:17:32,768 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750745_9921, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-15 23:18:37,756 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750746_9922 src: /192.168.158.9:57060 dest: /192.168.158.4:9866
2025-07-15 23:18:37,774 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57060, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1393383119_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750746_9922, duration(ns): 15874246
2025-07-15 23:18:37,775 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750746_9922, type=LAST_IN_PIPELINE terminating
2025-07-15 23:19:37,736 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750747_9923 src: /192.168.158.8:55190 dest: /192.168.158.4:9866
2025-07-15 23:19:37,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55190, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1887200360_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750747_9923, duration(ns): 20994238
2025-07-15 23:19:37,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750747_9923, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 23:21:37,739 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750749_9925 src: /192.168.158.9:48226 dest: /192.168.158.4:9866
2025-07-15 23:21:37,762 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48226, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-776108133_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750749_9925, duration(ns): 17528183
2025-07-15 23:21:37,762 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750749_9925, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-15 23:22:37,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750750_9926 src: /192.168.158.6:49566 dest: /192.168.158.4:9866
2025-07-15 23:22:37,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49566, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1674642148_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750750_9926, duration(ns): 11414029
2025-07-15 23:22:37,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750750_9926, type=LAST_IN_PIPELINE terminating
2025-07-15 23:24:37,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750752_9928 src: /192.168.158.8:34428 dest: /192.168.158.4:9866
2025-07-15 23:24:37,770 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34428, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_346585863_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750752_9928, duration(ns): 15610702
2025-07-15 23:24:37,770 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750752_9928, type=LAST_IN_PIPELINE terminating
2025-07-15 23:25:37,762 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750753_9929 src: /192.168.158.6:42928 dest: /192.168.158.4:9866
2025-07-15 23:25:37,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42928, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1749003178_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750753_9929, duration(ns): 14631886
2025-07-15 23:25:37,780 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750753_9929, type=LAST_IN_PIPELINE terminating
2025-07-15 23:27:37,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750755_9931 src: /192.168.158.1:36436 dest: /192.168.158.4:9866
2025-07-15 23:27:37,778 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36436, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-906564700_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750755_9931, duration(ns): 21014581
2025-07-15 23:27:37,778 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073750755_9931, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-15 23:30:42,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750758_9934 src: /192.168.158.5:56022 dest: /192.168.158.4:9866 2025-07-15 23:30:42,784 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56022, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-237681668_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750758_9934, duration(ns): 15974310 2025-07-15 23:30:42,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750758_9934, type=LAST_IN_PIPELINE terminating 2025-07-15 23:35:42,781 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750763_9939 src: /192.168.158.7:57510 dest: /192.168.158.4:9866 2025-07-15 23:35:42,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57510, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_74352137_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750763_9939, duration(ns): 14332944 2025-07-15 23:35:42,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750763_9939, type=LAST_IN_PIPELINE terminating 2025-07-15 23:36:13,271 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 30, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0 2025-07-15 23:36:42,808 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750764_9940 src: /192.168.158.9:49430 dest: /192.168.158.4:9866 2025-07-15 23:36:42,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49430, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_434194304_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750764_9940, duration(ns): 17683269 2025-07-15 23:36:42,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750764_9940, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-15 23:38:42,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750766_9942 src: /192.168.158.7:57648 dest: /192.168.158.4:9866 2025-07-15 23:38:42,806 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57648, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1720996058_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750766_9942, duration(ns): 16779731 2025-07-15 23:38:42,807 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750766_9942, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-15 23:40:52,780 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750768_9944 src: /192.168.158.9:53690 dest: /192.168.158.4:9866 2025-07-15 23:40:52,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53690, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: 
DFSClient_NONMAPREDUCE_-1207622294_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750768_9944, duration(ns): 18003994 2025-07-15 23:40:52,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750768_9944, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-15 23:41:52,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750769_9945 src: /192.168.158.6:42908 dest: /192.168.158.4:9866 2025-07-15 23:41:52,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42908, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-30197704_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750769_9945, duration(ns): 14485963 2025-07-15 23:41:52,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750769_9945, type=LAST_IN_PIPELINE terminating 2025-07-15 23:44:57,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750772_9948 src: /192.168.158.7:58284 dest: /192.168.158.4:9866 2025-07-15 23:44:57,820 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58284, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1523854116_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750772_9948, duration(ns): 17130067 2025-07-15 23:44:57,820 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750772_9948, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 
2025-07-15 23:47:02,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750774_9950 src: /192.168.158.7:53116 dest: /192.168.158.4:9866 2025-07-15 23:47:02,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53116, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_441684279_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750774_9950, duration(ns): 20412164 2025-07-15 23:47:02,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750774_9950, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-15 23:51:02,821 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750778_9954 src: /192.168.158.8:48894 dest: /192.168.158.4:9866 2025-07-15 23:51:02,846 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48894, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-538963251_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750778_9954, duration(ns): 19421609 2025-07-15 23:51:02,846 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750778_9954, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-15 23:51:12,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750720_9896 replica FinalizedReplica, blk_1073750720_9896, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750720 for deletion 2025-07-15 23:51:12,584 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750721_9897 replica FinalizedReplica, blk_1073750721_9897, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750721 for deletion 2025-07-15 23:51:12,584 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750720_9896 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750720 2025-07-15 23:51:12,585 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750722_9898 replica FinalizedReplica, blk_1073750722_9898, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750722 for deletion 2025-07-15 23:51:12,585 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750721_9897 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750721 2025-07-15 23:51:12,586 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750723_9899 replica FinalizedReplica, blk_1073750723_9899, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750723 for deletion 2025-07-15 23:51:12,586 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750722_9898 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750722 2025-07-15 23:51:12,587 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750727_9903 replica FinalizedReplica, blk_1073750727_9903, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750727 for deletion 2025-07-15 23:51:12,587 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750723_9899 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750723 2025-07-15 23:51:12,587 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750730_9906 replica FinalizedReplica, blk_1073750730_9906, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750730 for deletion 2025-07-15 23:51:12,587 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750727_9903 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750727 2025-07-15 23:51:12,588 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750730_9906 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750730 2025-07-15 23:51:12,588 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750731_9907 replica FinalizedReplica, blk_1073750731_9907, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750731 for deletion 2025-07-15 23:51:12,589 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750734_9910 replica FinalizedReplica, blk_1073750734_9910, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750734 for deletion 2025-07-15 23:51:12,589 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750731_9907 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750731 2025-07-15 23:51:12,589 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750734_9910 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750734 2025-07-15 23:51:12,589 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750735_9911 replica FinalizedReplica, blk_1073750735_9911, FINALIZED getNumBytes() = 56 getBytesOnDisk() 
= 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750735 for deletion 2025-07-15 23:51:12,590 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750735_9911 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750735 2025-07-15 23:51:12,590 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750739_9915 replica FinalizedReplica, blk_1073750739_9915, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750739 for deletion 2025-07-15 23:51:12,590 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750741_9917 replica FinalizedReplica, blk_1073750741_9917, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750741 for deletion 2025-07-15 23:51:12,590 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750739_9915 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750739 2025-07-15 23:51:12,591 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750741_9917 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750741 2025-07-15 23:51:12,591 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750744_9920 replica FinalizedReplica, blk_1073750744_9920, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750744 for deletion 2025-07-15 23:51:12,591 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750745_9921 replica FinalizedReplica, blk_1073750745_9921, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750745 for deletion 2025-07-15 23:51:12,591 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750744_9920 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750744 2025-07-15 23:51:12,592 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750745_9921 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750745 2025-07-15 23:51:12,592 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750746_9922 replica FinalizedReplica, blk_1073750746_9922, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750746 for deletion 2025-07-15 23:51:12,592 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750747_9923 replica FinalizedReplica, blk_1073750747_9923, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750747 for deletion 2025-07-15 23:51:12,592 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750746_9922 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750746 2025-07-15 23:51:12,593 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750747_9923 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750747 2025-07-15 23:51:12,593 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750749_9925 replica FinalizedReplica, blk_1073750749_9925, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750749 for deletion 2025-07-15 23:51:12,593 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750750_9926 replica FinalizedReplica, blk_1073750750_9926, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750750 for deletion 2025-07-15 23:51:12,593 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750749_9925 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750749 2025-07-15 23:51:12,594 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750750_9926 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750750 2025-07-15 23:51:12,594 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750752_9928 replica FinalizedReplica, blk_1073750752_9928, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750752 for deletion 2025-07-15 23:51:12,594 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750753_9929 replica FinalizedReplica, blk_1073750753_9929, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750753 for deletion 2025-07-15 23:51:12,594 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750752_9928 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750752 2025-07-15 23:51:12,595 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750755_9931 replica FinalizedReplica, blk_1073750755_9931, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750755 for deletion 2025-07-15 23:51:12,595 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750753_9929 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750753 2025-07-15 23:51:12,595 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750755_9931 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750755 2025-07-15 23:51:12,595 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750758_9934 replica FinalizedReplica, blk_1073750758_9934, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750758 for deletion 2025-07-15 23:51:12,596 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750763_9939 replica FinalizedReplica, blk_1073750763_9939, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750763 for deletion 2025-07-15 23:51:12,596 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750758_9934 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750758 2025-07-15 23:51:12,596 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750763_9939 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750763 2025-07-15 23:51:12,596 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750764_9940 replica FinalizedReplica, blk_1073750764_9940, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750764 for deletion 2025-07-15 23:51:12,597 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750766_9942 replica FinalizedReplica, blk_1073750766_9942, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750766 for deletion 2025-07-15 23:51:12,597 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750764_9940 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750764 2025-07-15 23:51:12,597 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750768_9944 replica FinalizedReplica, blk_1073750768_9944, FINALIZED getNumBytes() = 56 getBytesOnDisk() 
= 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750768 for deletion
2025-07-15 23:51:12,597 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750766_9942 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750766
2025-07-15 23:51:12,598 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750769_9945 replica FinalizedReplica, blk_1073750769_9945, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750769 for deletion
2025-07-15 23:51:12,598 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750768_9944 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750768
2025-07-15 23:51:12,599 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750769_9945 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750769
2025-07-15 23:51:12,599 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750772_9948 replica FinalizedReplica, blk_1073750772_9948, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750772 for deletion
2025-07-15 23:51:12,599 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750774_9950 replica FinalizedReplica, blk_1073750774_9950, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750774 for deletion
2025-07-15 23:51:12,599 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750772_9948 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750772
2025-07-15 23:51:12,600 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750774_9950 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750774
2025-07-15 23:51:12,600 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750778_9954 replica FinalizedReplica, blk_1073750778_9954, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750778 for deletion
2025-07-15 23:51:12,600 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750778_9954 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750778
2025-07-15 23:55:07,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750782_9958 src: /192.168.158.9:38508 dest: /192.168.158.4:9866
2025-07-15 23:55:07,847 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38508, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2095753926_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750782_9958, duration(ns): 14912242
2025-07-15 23:55:07,848 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750782_9958, type=LAST_IN_PIPELINE terminating
2025-07-15 23:55:15,589 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750782_9958 replica FinalizedReplica, blk_1073750782_9958, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750782 for deletion
2025-07-15 23:55:15,590 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750782_9958 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073750782
2025-07-15 23:57:07,823 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750784_9960 src: /192.168.158.1:57716 dest: /192.168.158.4:9866
2025-07-15 23:57:07,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57716, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1480469766_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750784_9960, duration(ns): 22070894
2025-07-15 23:57:07,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750784_9960, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-15 23:57:12,593 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750784_9960 replica FinalizedReplica, blk_1073750784_9960, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750784 for deletion
2025-07-15 23:57:12,594 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750784_9960 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750784
2025-07-16 00:03:07,868 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750790_9966 src: /192.168.158.6:38674 dest: /192.168.158.4:9866
2025-07-16 00:03:07,893 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38674, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-994817129_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750790_9966, duration(ns): 19058844
2025-07-16 00:03:07,893 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750790_9966, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 00:03:12,605 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750790_9966 replica FinalizedReplica, blk_1073750790_9966, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750790 for deletion
2025-07-16 00:03:12,606 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750790_9966 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750790
2025-07-16 00:05:07,857 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750792_9968 src: /192.168.158.7:46766 dest: /192.168.158.4:9866
2025-07-16 00:05:07,876 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46766, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_63306297_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750792_9968, duration(ns): 16774426
2025-07-16 00:05:07,876 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750792_9968, type=LAST_IN_PIPELINE terminating
2025-07-16 00:05:15,610 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750792_9968 replica FinalizedReplica, blk_1073750792_9968, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750792 for deletion
2025-07-16 00:05:15,611 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750792_9968 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750792
2025-07-16 00:07:07,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750794_9970 src: /192.168.158.1:42028 dest: /192.168.158.4:9866
2025-07-16 00:07:07,881 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42028, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_740716019_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750794_9970, duration(ns): 22428685
2025-07-16 00:07:07,881 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750794_9970, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-16 00:07:15,613 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750794_9970 replica FinalizedReplica, blk_1073750794_9970, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750794 for deletion
2025-07-16 00:07:15,614 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750794_9970 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750794
2025-07-16 00:08:07,857 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750795_9971 src: /192.168.158.7:49644 dest: /192.168.158.4:9866
2025-07-16 00:08:07,883 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49644, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1546995215_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750795_9971, duration(ns): 20497693
2025-07-16 00:08:07,883 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750795_9971, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 00:08:12,615 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750795_9971 replica FinalizedReplica, blk_1073750795_9971, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750795 for deletion
2025-07-16 00:08:12,616 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750795_9971 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750795
2025-07-16 00:09:07,867 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750796_9972 src: /192.168.158.1:46650 dest: /192.168.158.4:9866
2025-07-16 00:09:07,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46650, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1324388884_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750796_9972, duration(ns): 22422116
2025-07-16 00:09:07,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750796_9972, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-16 00:09:15,617 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750796_9972 replica FinalizedReplica, blk_1073750796_9972, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750796 for deletion
2025-07-16 00:09:15,618 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750796_9972 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750796
2025-07-16 00:11:12,859 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750798_9974 src: /192.168.158.1:44370 dest: /192.168.158.4:9866
2025-07-16 00:11:12,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44370, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_561833435_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750798_9974, duration(ns): 22694192
2025-07-16 00:11:12,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750798_9974, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-16 00:11:15,620 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750798_9974 replica FinalizedReplica, blk_1073750798_9974, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750798 for deletion
2025-07-16 00:11:15,621 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750798_9974 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750798
2025-07-16 00:12:12,870 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750799_9975 src: /192.168.158.1:34978 dest: /192.168.158.4:9866
2025-07-16 00:12:12,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34978, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-474253573_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750799_9975, duration(ns): 22586764
2025-07-16 00:12:12,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750799_9975, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-16 00:12:15,623 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750799_9975 replica FinalizedReplica, blk_1073750799_9975, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750799 for deletion
2025-07-16 00:12:15,624 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750799_9975 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750799
2025-07-16 00:14:12,875 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750801_9977 src: /192.168.158.9:42438 dest: /192.168.158.4:9866
2025-07-16 00:14:12,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42438, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1356893018_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750801_9977, duration(ns): 14501868
2025-07-16 00:14:12,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750801_9977, type=LAST_IN_PIPELINE terminating
2025-07-16 00:14:15,623 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750801_9977 replica FinalizedReplica, blk_1073750801_9977, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750801 for deletion
2025-07-16 00:14:15,624 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750801_9977 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750801
2025-07-16 00:16:12,893 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750803_9979 src: /192.168.158.9:40246 dest: /192.168.158.4:9866
2025-07-16 00:16:12,913 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40246, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1305805323_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750803_9979, duration(ns): 17866305
2025-07-16 00:16:12,913 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750803_9979, type=LAST_IN_PIPELINE terminating
2025-07-16 00:16:15,628 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750803_9979 replica FinalizedReplica, blk_1073750803_9979, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750803 for deletion
2025-07-16 00:16:15,629 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750803_9979 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750803
2025-07-16 00:17:12,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750804_9980 src: /192.168.158.9:60814 dest: /192.168.158.4:9866
2025-07-16 00:17:12,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60814, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2099984763_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750804_9980, duration(ns): 16756915
2025-07-16 00:17:12,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750804_9980, type=LAST_IN_PIPELINE terminating
2025-07-16 00:17:15,632 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750804_9980 replica FinalizedReplica, blk_1073750804_9980, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750804 for deletion
2025-07-16 00:17:15,633 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750804_9980 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750804
2025-07-16 00:18:12,881 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750805_9981 src: /192.168.158.1:59668 dest: /192.168.158.4:9866
2025-07-16 00:18:12,911 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59668, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1409864460_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750805_9981, duration(ns): 21622171
2025-07-16 00:18:12,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750805_9981, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-16 00:18:18,633 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750805_9981 replica FinalizedReplica, blk_1073750805_9981, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750805 for deletion
2025-07-16 00:18:18,634 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750805_9981 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750805
2025-07-16 00:19:12,922 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750806_9982 src: /192.168.158.9:52152 dest: /192.168.158.4:9866
2025-07-16 00:19:12,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52152, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-738626291_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750806_9982, duration(ns): 17055853
2025-07-16 00:19:12,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750806_9982, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 00:19:15,635 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750806_9982 replica FinalizedReplica, blk_1073750806_9982, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750806 for deletion
2025-07-16 00:19:15,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750806_9982 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750806
2025-07-16 00:21:17,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750808_9984 src: /192.168.158.6:42118 dest: /192.168.158.4:9866
2025-07-16 00:21:17,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42118, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2050167276_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750808_9984, duration(ns): 16278499
2025-07-16 00:21:17,911 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750808_9984, type=LAST_IN_PIPELINE terminating
2025-07-16 00:21:21,639 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750808_9984 replica FinalizedReplica, blk_1073750808_9984, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750808 for deletion
2025-07-16 00:21:21,640 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750808_9984 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750808
2025-07-16 00:22:22,928 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750809_9985 src: /192.168.158.8:45690 dest: /192.168.158.4:9866
2025-07-16 00:22:22,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45690, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_658640540_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750809_9985, duration(ns): 15451900
2025-07-16 00:22:22,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750809_9985, type=LAST_IN_PIPELINE terminating
2025-07-16 00:22:30,641 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750809_9985 replica FinalizedReplica, blk_1073750809_9985, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750809 for deletion
2025-07-16 00:22:30,643 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750809_9985 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750809
2025-07-16 00:23:22,907 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750810_9986 src: /192.168.158.1:44636 dest: /192.168.158.4:9866
2025-07-16 00:23:22,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44636, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_282723919_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750810_9986, duration(ns): 21040445
2025-07-16 00:23:22,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750810_9986, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-16 00:23:30,644 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750810_9986 replica FinalizedReplica, blk_1073750810_9986, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750810 for deletion
2025-07-16 00:23:30,645 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750810_9986 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750810
2025-07-16 00:24:27,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750811_9987 src: /192.168.158.8:56042 dest: /192.168.158.4:9866
2025-07-16 00:24:27,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56042, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-911121045_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750811_9987, duration(ns): 17790615
2025-07-16 00:24:27,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750811_9987, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 00:24:30,644 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750811_9987 replica FinalizedReplica, blk_1073750811_9987, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750811 for deletion
2025-07-16 00:24:30,645 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750811_9987 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750811
2025-07-16 00:26:27,896 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750813_9989 src: /192.168.158.1:36138 dest: /192.168.158.4:9866
2025-07-16 00:26:27,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36138, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_776728646_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750813_9989, duration(ns): 24135715
2025-07-16 00:26:27,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750813_9989, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-16 00:26:30,647 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750813_9989 replica FinalizedReplica, blk_1073750813_9989, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750813 for deletion
2025-07-16 00:26:30,648 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750813_9989 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750813
2025-07-16 00:27:33,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750814_9990 src: /192.168.158.9:60752 dest: /192.168.158.4:9866
2025-07-16 00:27:33,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60752, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2118174763_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750814_9990, duration(ns): 20292554
2025-07-16 00:27:33,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750814_9990, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 00:27:36,651 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750814_9990 replica FinalizedReplica, blk_1073750814_9990, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750814 for deletion
2025-07-16 00:27:36,652 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750814_9990 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750814
2025-07-16 00:29:37,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750816_9992 src: /192.168.158.9:33490 dest: /192.168.158.4:9866
2025-07-16 00:29:37,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33490, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1475999601_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750816_9992, duration(ns): 17172487
2025-07-16 00:29:37,915 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750816_9992, type=LAST_IN_PIPELINE terminating
2025-07-16 00:29:42,656 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750816_9992 replica FinalizedReplica, blk_1073750816_9992, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750816 for deletion
2025-07-16 00:29:42,657 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750816_9992 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750816
2025-07-16 00:32:37,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750819_9995 src: /192.168.158.7:33488 dest: /192.168.158.4:9866
2025-07-16 00:32:37,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33488, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1476372660_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750819_9995, duration(ns): 19527114
2025-07-16 00:32:37,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750819_9995, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 00:32:42,661 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750819_9995 replica FinalizedReplica, blk_1073750819_9995, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750819 for deletion
2025-07-16 00:32:42,662 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750819_9995 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750819
2025-07-16 00:36:42,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750823_9999 src: /192.168.158.6:50084 dest: /192.168.158.4:9866
2025-07-16 00:36:42,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50084, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1842861857_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750823_9999, duration(ns): 15487363
2025-07-16 00:36:42,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750823_9999, type=LAST_IN_PIPELINE terminating
2025-07-16 00:36:48,674 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750823_9999 replica FinalizedReplica, blk_1073750823_9999, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750823 for deletion
2025-07-16 00:36:48,675 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750823_9999 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750823
2025-07-16 00:37:42,931 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750824_10000 src: /192.168.158.7:41982 dest: /192.168.158.4:9866
2025-07-16 00:37:42,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41982, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1225477399_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750824_10000, duration(ns): 26411211
2025-07-16 00:37:42,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750824_10000, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 00:37:48,677 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750824_10000 replica FinalizedReplica, blk_1073750824_10000, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750824 for deletion
2025-07-16 00:37:48,678 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750824_10000 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750824
2025-07-16 00:38:42,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750825_10001 src: /192.168.158.1:44480 dest: /192.168.158.4:9866
2025-07-16 00:38:42,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44480, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-23137696_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750825_10001, duration(ns): 22111143
2025-07-16 00:38:42,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750825_10001, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866,
192.168.158.9:9866] terminating 2025-07-16 00:38:45,678 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750825_10001 replica FinalizedReplica, blk_1073750825_10001, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750825 for deletion 2025-07-16 00:38:45,679 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750825_10001 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750825 2025-07-16 00:40:42,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750827_10003 src: /192.168.158.6:60956 dest: /192.168.158.4:9866 2025-07-16 00:40:42,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60956, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-389406006_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750827_10003, duration(ns): 19208905 2025-07-16 00:40:42,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750827_10003, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-16 00:40:45,681 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750827_10003 replica FinalizedReplica, blk_1073750827_10003, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750827 for deletion 2025-07-16 00:40:45,682 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750827_10003 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750827 2025-07-16 00:41:42,935 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750828_10004 src: /192.168.158.1:34044 dest: /192.168.158.4:9866 2025-07-16 00:41:42,965 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34044, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_279843428_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750828_10004, duration(ns): 21419706 2025-07-16 00:41:42,965 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750828_10004, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-16 00:41:45,684 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750828_10004 replica FinalizedReplica, blk_1073750828_10004, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750828 for deletion 2025-07-16 00:41:45,685 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750828_10004 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750828 2025-07-16 00:43:47,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750830_10006 src: /192.168.158.1:45730 dest: /192.168.158.4:9866 2025-07-16 00:43:47,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45730, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1354892209_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750830_10006, duration(ns): 25335997 2025-07-16 00:43:47,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750830_10006, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-16 00:43:51,688 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750830_10006 replica FinalizedReplica, blk_1073750830_10006, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750830 for deletion 2025-07-16 00:43:51,690 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750830_10006 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750830 2025-07-16 00:44:47,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750831_10007 src: /192.168.158.6:55420 dest: /192.168.158.4:9866 2025-07-16 00:44:47,953 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.6:55420, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_309226735_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750831_10007, duration(ns): 14670344 2025-07-16 00:44:47,953 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750831_10007, type=LAST_IN_PIPELINE terminating 2025-07-16 00:44:54,689 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750831_10007 replica FinalizedReplica, blk_1073750831_10007, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750831 for deletion 2025-07-16 00:44:54,690 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750831_10007 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750831 2025-07-16 00:45:52,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750832_10008 src: /192.168.158.9:50836 dest: /192.168.158.4:9866 2025-07-16 00:45:52,950 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50836, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1341529743_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750832_10008, duration(ns): 18071643 2025-07-16 00:45:52,952 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750832_10008, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=1:[192.168.158.7:9866] terminating 2025-07-16 00:45:57,690 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750832_10008 replica FinalizedReplica, blk_1073750832_10008, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750832 for deletion 2025-07-16 00:45:57,691 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750832_10008 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750832 2025-07-16 00:47:57,931 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750834_10010 src: /192.168.158.8:39904 dest: /192.168.158.4:9866 2025-07-16 00:47:57,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39904, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_836107117_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750834_10010, duration(ns): 15045658 2025-07-16 00:47:57,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750834_10010, type=LAST_IN_PIPELINE terminating 2025-07-16 00:48:03,694 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750834_10010 replica FinalizedReplica, blk_1073750834_10010, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750834 for 
deletion 2025-07-16 00:48:03,695 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750834_10010 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750834 2025-07-16 00:53:07,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750839_10015 src: /192.168.158.6:58428 dest: /192.168.158.4:9866 2025-07-16 00:53:07,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58428, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1986977299_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750839_10015, duration(ns): 15337362 2025-07-16 00:53:07,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750839_10015, type=LAST_IN_PIPELINE terminating 2025-07-16 00:53:15,707 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750839_10015 replica FinalizedReplica, blk_1073750839_10015, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750839 for deletion 2025-07-16 00:53:15,708 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750839_10015 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750839 2025-07-16 00:54:07,942 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750840_10016 src: 
/192.168.158.1:43100 dest: /192.168.158.4:9866 2025-07-16 00:54:07,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43100, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1352003576_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750840_10016, duration(ns): 22296714 2025-07-16 00:54:07,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750840_10016, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-16 00:54:15,709 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750840_10016 replica FinalizedReplica, blk_1073750840_10016, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750840 for deletion 2025-07-16 00:54:15,710 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750840_10016 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750840 2025-07-16 00:57:12,960 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750843_10019 src: /192.168.158.7:53170 dest: /192.168.158.4:9866 2025-07-16 00:57:12,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53170, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_855046741_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750843_10019, duration(ns): 14971269 
2025-07-16 00:57:12,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750843_10019, type=LAST_IN_PIPELINE terminating 2025-07-16 00:57:18,714 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750843_10019 replica FinalizedReplica, blk_1073750843_10019, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750843 for deletion 2025-07-16 00:57:18,715 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750843_10019 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750843 2025-07-16 00:58:12,964 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750844_10020 src: /192.168.158.5:52482 dest: /192.168.158.4:9866 2025-07-16 00:58:12,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52482, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2016285519_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750844_10020, duration(ns): 16917333 2025-07-16 00:58:12,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750844_10020, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-16 00:58:15,717 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750844_10020 replica FinalizedReplica, blk_1073750844_10020, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750844 for deletion 2025-07-16 00:58:15,718 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750844_10020 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750844 2025-07-16 00:59:12,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750845_10021 src: /192.168.158.9:40050 dest: /192.168.158.4:9866 2025-07-16 00:59:12,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40050, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_648191114_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750845_10021, duration(ns): 18528648 2025-07-16 00:59:12,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750845_10021, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-16 00:59:15,718 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750845_10021 replica FinalizedReplica, blk_1073750845_10021, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750845 for deletion 2025-07-16 00:59:15,721 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750845_10021 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750845 2025-07-16 01:00:17,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750846_10022 src: /192.168.158.1:58950 dest: /192.168.158.4:9866 2025-07-16 01:00:17,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58950, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-607743333_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750846_10022, duration(ns): 22718043 2025-07-16 01:00:17,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750846_10022, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-16 01:00:24,721 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750846_10022 replica FinalizedReplica, blk_1073750846_10022, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750846 for deletion 2025-07-16 01:00:24,722 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750846_10022 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750846 2025-07-16 01:04:22,965 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750850_10026 src: /192.168.158.6:57154 dest: /192.168.158.4:9866 2025-07-16 01:04:22,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.6:57154, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-853481977_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750850_10026, duration(ns): 17672156 2025-07-16 01:04:22,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750850_10026, type=LAST_IN_PIPELINE terminating 2025-07-16 01:04:27,725 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750850_10026 replica FinalizedReplica, blk_1073750850_10026, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750850 for deletion 2025-07-16 01:04:27,726 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750850_10026 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750850 2025-07-16 01:05:22,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750851_10027 src: /192.168.158.1:35248 dest: /192.168.158.4:9866 2025-07-16 01:05:22,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35248, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1975600049_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750851_10027, duration(ns): 23854134 2025-07-16 01:05:22,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750851_10027, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-16 01:05:27,729 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750851_10027 replica FinalizedReplica, blk_1073750851_10027, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750851 for deletion 2025-07-16 01:05:27,731 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750851_10027 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750851 2025-07-16 01:16:42,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750862_10038 src: /192.168.158.6:34514 dest: /192.168.158.4:9866 2025-07-16 01:16:42,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34514, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-132019775_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750862_10038, duration(ns): 19281877 2025-07-16 01:16:42,993 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750862_10038, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-16 01:16:45,758 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750862_10038 replica FinalizedReplica, blk_1073750862_10038, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750862 for deletion 2025-07-16 01:16:45,759 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750862_10038 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750862 2025-07-16 01:17:42,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750863_10039 src: /192.168.158.9:52930 dest: /192.168.158.4:9866 2025-07-16 01:17:42,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52930, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-927337627_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750863_10039, duration(ns): 16710400 2025-07-16 01:17:42,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750863_10039, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-16 01:17:48,759 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750863_10039 replica FinalizedReplica, blk_1073750863_10039, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750863 for deletion 2025-07-16 01:17:48,760 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750863_10039 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750863 
2025-07-16 01:18:47,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750864_10040 src: /192.168.158.1:35870 dest: /192.168.158.4:9866
2025-07-16 01:18:47,990 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35870, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1967582771_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750864_10040, duration(ns): 22523325
2025-07-16 01:18:47,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750864_10040, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-16 01:18:51,762 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750864_10040 replica FinalizedReplica, blk_1073750864_10040, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750864 for deletion
2025-07-16 01:18:51,763 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750864_10040 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750864
2025-07-16 01:19:47,966 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750865_10041 src: /192.168.158.8:40148 dest: /192.168.158.4:9866
2025-07-16 01:19:47,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40148, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1529905078_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750865_10041, duration(ns): 18678215
2025-07-16 01:19:47,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750865_10041, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-16 01:19:54,766 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750865_10041 replica FinalizedReplica, blk_1073750865_10041, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750865 for deletion
2025-07-16 01:19:54,767 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750865_10041 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750865
2025-07-16 01:23:57,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750869_10045 src: /192.168.158.8:47920 dest: /192.168.158.4:9866
2025-07-16 01:23:57,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47920, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_588220916_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750869_10045, duration(ns): 16444849
2025-07-16 01:23:57,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750869_10045, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-16 01:24:00,778 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750869_10045 replica FinalizedReplica, blk_1073750869_10045, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750869 for deletion
2025-07-16 01:24:00,779 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750869_10045 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750869
2025-07-16 01:25:02,975 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750870_10046 src: /192.168.158.6:51994 dest: /192.168.158.4:9866
2025-07-16 01:25:02,998 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51994, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1337217996_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750870_10046, duration(ns): 17867162
2025-07-16 01:25:02,998 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750870_10046, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 01:25:06,782 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750870_10046 replica FinalizedReplica, blk_1073750870_10046, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750870 for deletion
2025-07-16 01:25:06,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750870_10046 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750870
2025-07-16 01:26:02,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750871_10047 src: /192.168.158.7:43610 dest: /192.168.158.4:9866
2025-07-16 01:26:03,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43610, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1287806257_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750871_10047, duration(ns): 13854267
2025-07-16 01:26:03,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750871_10047, type=LAST_IN_PIPELINE terminating
2025-07-16 01:26:09,784 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750871_10047 replica FinalizedReplica, blk_1073750871_10047, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750871 for deletion
2025-07-16 01:26:09,785 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750871_10047 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750871
2025-07-16 01:28:02,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750873_10049 src: /192.168.158.6:47356 dest: /192.168.158.4:9866
2025-07-16 01:28:03,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47356, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1975774974_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750873_10049, duration(ns): 17603198
2025-07-16 01:28:03,012 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750873_10049, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 01:28:09,789 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750873_10049 replica FinalizedReplica, blk_1073750873_10049, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750873 for deletion
2025-07-16 01:28:09,790 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750873_10049 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750873
2025-07-16 01:32:07,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750877_10053 src: /192.168.158.1:36582 dest: /192.168.158.4:9866
2025-07-16 01:32:08,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36582, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_928882854_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750877_10053, duration(ns): 22856621
2025-07-16 01:32:08,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750877_10053, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-16 01:32:12,804 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750877_10053 replica FinalizedReplica, blk_1073750877_10053, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750877 for deletion
2025-07-16 01:32:12,805 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750877_10053 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750877
2025-07-16 01:33:07,988 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750878_10054 src: /192.168.158.1:44312 dest: /192.168.158.4:9866
2025-07-16 01:33:08,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44312, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-661396042_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750878_10054, duration(ns): 21481291
2025-07-16 01:33:08,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750878_10054, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-16 01:33:12,806 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750878_10054 replica FinalizedReplica, blk_1073750878_10054, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750878 for deletion
2025-07-16 01:33:12,807 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750878_10054 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750878
2025-07-16 01:34:12,990 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750879_10055 src: /192.168.158.1:55516 dest: /192.168.158.4:9866
2025-07-16 01:34:13,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55516, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_130356204_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750879_10055, duration(ns): 20378478
2025-07-16 01:34:13,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750879_10055, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-16 01:34:18,810 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750879_10055 replica FinalizedReplica, blk_1073750879_10055, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750879 for deletion
2025-07-16 01:34:18,811 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750879_10055 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750879
2025-07-16 01:36:12,993 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750881_10057 src: /192.168.158.5:42114 dest: /192.168.158.4:9866
2025-07-16 01:36:13,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42114, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1225713116_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750881_10057, duration(ns): 17126298
2025-07-16 01:36:13,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750881_10057, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 01:36:18,815 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750881_10057 replica FinalizedReplica, blk_1073750881_10057, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750881 for deletion
2025-07-16 01:36:18,816 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750881_10057 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750881
2025-07-16 01:38:12,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750883_10059 src: /192.168.158.1:57692 dest: /192.168.158.4:9866
2025-07-16 01:38:13,021 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57692, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_309752853_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750883_10059, duration(ns): 22901041
2025-07-16 01:38:13,021 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750883_10059, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-16 01:38:18,823 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750883_10059 replica FinalizedReplica, blk_1073750883_10059, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750883 for deletion
2025-07-16 01:38:18,824 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750883_10059 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750883
2025-07-16 01:39:12,996 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750884_10060 src: /192.168.158.1:41776 dest: /192.168.158.4:9866
2025-07-16 01:39:13,026 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41776, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1124350029_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750884_10060, duration(ns): 21170426
2025-07-16 01:39:13,026 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750884_10060, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-16 01:39:15,824 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750884_10060 replica FinalizedReplica, blk_1073750884_10060, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750884 for deletion
2025-07-16 01:39:15,825 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750884_10060 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750884
2025-07-16 01:40:13,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750885_10061 src: /192.168.158.8:37460 dest: /192.168.158.4:9866
2025-07-16 01:40:13,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37460, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1968966370_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750885_10061, duration(ns): 13876865
2025-07-16 01:40:13,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750885_10061, type=LAST_IN_PIPELINE terminating
2025-07-16 01:40:15,824 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750885_10061 replica FinalizedReplica, blk_1073750885_10061, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750885 for deletion
2025-07-16 01:40:15,825 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750885_10061 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750885
2025-07-16 01:41:12,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750886_10062 src: /192.168.158.1:36032 dest: /192.168.158.4:9866
2025-07-16 01:41:13,032 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36032, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_737342759_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750886_10062, duration(ns): 26806305
2025-07-16 01:41:13,032 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750886_10062, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-16 01:41:18,827 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750886_10062 replica FinalizedReplica, blk_1073750886_10062, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750886 for deletion
2025-07-16 01:41:18,828 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750886_10062 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750886
2025-07-16 01:45:13,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750890_10066 src: /192.168.158.8:43542 dest: /192.168.158.4:9866
2025-07-16 01:45:13,029 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43542, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1426978288_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750890_10066, duration(ns): 16254316
2025-07-16 01:45:13,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750890_10066, type=LAST_IN_PIPELINE terminating
2025-07-16 01:45:18,839 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750890_10066 replica FinalizedReplica, blk_1073750890_10066, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750890 for deletion
2025-07-16 01:45:18,840 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750890_10066 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750890
2025-07-16 01:50:13,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750895_10071 src: /192.168.158.9:39864 dest: /192.168.158.4:9866
2025-07-16 01:50:13,047 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39864, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-965163495_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750895_10071, duration(ns): 18084236
2025-07-16 01:50:13,047 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750895_10071, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-16 01:50:18,856 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750895_10071 replica FinalizedReplica, blk_1073750895_10071, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750895 for deletion
2025-07-16 01:50:18,857 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750895_10071 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750895
2025-07-16 01:51:13,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750896_10072 src: /192.168.158.6:43374 dest: /192.168.158.4:9866
2025-07-16 01:51:13,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43374, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-262723452_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750896_10072, duration(ns): 15984324
2025-07-16 01:51:13,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750896_10072, type=LAST_IN_PIPELINE terminating
2025-07-16 01:51:18,858 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750896_10072 replica FinalizedReplica, blk_1073750896_10072, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750896 for deletion
2025-07-16 01:51:18,860 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750896_10072 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750896
2025-07-16 01:58:18,038 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750903_10079 src: /192.168.158.1:45398 dest: /192.168.158.4:9866
2025-07-16 01:58:18,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45398, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-376271910_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750903_10079, duration(ns): 22292176
2025-07-16 01:58:18,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750903_10079, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-16 01:58:21,877 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750903_10079 replica FinalizedReplica, blk_1073750903_10079, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750903 for deletion
2025-07-16 01:58:21,878 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750903_10079 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750903
2025-07-16 01:59:18,044 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750904_10080 src: /192.168.158.8:60174 dest: /192.168.158.4:9866
2025-07-16 01:59:18,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60174, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1842277046_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750904_10080, duration(ns): 17674852
2025-07-16 01:59:18,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750904_10080, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-16 01:59:21,881 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750904_10080 replica FinalizedReplica, blk_1073750904_10080, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750904 for deletion
2025-07-16 01:59:21,882 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750904_10080 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750904
2025-07-16 02:00:18,048 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750905_10081 src: /192.168.158.1:57944 dest: /192.168.158.4:9866
2025-07-16 02:00:18,078 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57944, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-443303723_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750905_10081, duration(ns): 20979422
2025-07-16 02:00:18,078 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750905_10081, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-16 02:00:21,883 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750905_10081 replica FinalizedReplica, blk_1073750905_10081, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750905 for deletion
2025-07-16 02:00:21,884 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750905_10081 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750905
2025-07-16 02:00:54,888 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f3f, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 1 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-16 02:00:54,888 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-16 02:01:23,045 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750906_10082 src: /192.168.158.8:45990 dest: /192.168.158.4:9866
2025-07-16 02:01:23,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45990, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-77717569_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750906_10082, duration(ns): 18750698
2025-07-16 02:01:23,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750906_10082, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-16 02:01:27,884 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750906_10082 replica FinalizedReplica, blk_1073750906_10082, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750906 for deletion
2025-07-16 02:01:27,885 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750906_10082 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750906
2025-07-16 02:02:23,054 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750907_10083 src: /192.168.158.8:40442 dest: /192.168.158.4:9866
2025-07-16 02:02:23,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40442, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-141992308_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750907_10083, duration(ns): 19983336
2025-07-16 02:02:23,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750907_10083, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 02:02:30,885 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750907_10083 replica FinalizedReplica, blk_1073750907_10083, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750907 for deletion
2025-07-16 02:02:30,886 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750907_10083 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750907
2025-07-16 02:07:28,073 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750912_10088 src: /192.168.158.8:57948 dest: /192.168.158.4:9866
2025-07-16 02:07:28,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57948, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_920960077_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750912_10088, duration(ns): 15732889
2025-07-16 02:07:28,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750912_10088, type=LAST_IN_PIPELINE terminating
2025-07-16 02:07:33,893 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750912_10088 replica FinalizedReplica, blk_1073750912_10088, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750912 for deletion
2025-07-16 02:07:33,894 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750912_10088 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750912
2025-07-16 02:08:28,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750913_10089 src: /192.168.158.5:49324 dest: /192.168.158.4:9866
2025-07-16 02:08:28,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49324, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1452770792_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750913_10089, duration(ns): 17654425
2025-07-16 02:08:28,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750913_10089, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 02:08:33,896 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750913_10089 replica FinalizedReplica, blk_1073750913_10089, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750913 for deletion
2025-07-16 02:08:33,897 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750913_10089 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750913
2025-07-16 02:10:28,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750915_10091 src: /192.168.158.1:56692 dest: /192.168.158.4:9866
2025-07-16 02:10:28,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56692, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1987693197_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750915_10091, duration(ns): 22326616
2025-07-16 02:10:28,108 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750915_10091, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-16 02:10:30,897 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750915_10091 replica FinalizedReplica, blk_1073750915_10091, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750915 for deletion
2025-07-16 02:10:30,898 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750915_10091 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750915
2025-07-16 02:17:33,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750922_10098 src: /192.168.158.6:36014 dest: /192.168.158.4:9866
2025-07-16 02:17:33,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36014, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_320446415_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750922_10098, duration(ns): 19391534
2025-07-16 02:17:33,104 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750922_10098, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 02:17:36,905 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750922_10098 replica FinalizedReplica, blk_1073750922_10098, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750922 for deletion
2025-07-16 02:17:36,907 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750922_10098 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750922
2025-07-16 02:20:43,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750925_10101 src: /192.168.158.5:40316 dest: /192.168.158.4:9866
2025-07-16 02:20:43,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40316, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_447506497_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750925_10101, duration(ns): 14319181 2025-07-16 02:20:43,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750925_10101, type=LAST_IN_PIPELINE terminating 2025-07-16 02:20:45,910 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750925_10101 replica FinalizedReplica, blk_1073750925_10101, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750925 for deletion 2025-07-16 02:20:45,911 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750925_10101 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750925 2025-07-16 02:24:43,110 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750929_10105 src: /192.168.158.6:56668 dest: /192.168.158.4:9866 2025-07-16 02:24:43,129 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56668, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-633613886_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750929_10105, duration(ns): 16344538 2025-07-16 02:24:43,129 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750929_10105, type=LAST_IN_PIPELINE terminating 2025-07-16 02:24:45,917 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750929_10105 replica FinalizedReplica, blk_1073750929_10105, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750929 for deletion 2025-07-16 02:24:45,918 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750929_10105 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750929 2025-07-16 02:25:43,112 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750930_10106 src: /192.168.158.8:44630 dest: /192.168.158.4:9866 2025-07-16 02:25:43,134 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44630, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-738399244_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750930_10106, duration(ns): 16931418 2025-07-16 02:25:43,134 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750930_10106, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-16 02:25:45,917 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750930_10106 replica FinalizedReplica, blk_1073750930_10106, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750930 for deletion 2025-07-16 02:25:45,918 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750930_10106 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750930 2025-07-16 02:27:48,110 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750932_10108 src: /192.168.158.5:44406 dest: /192.168.158.4:9866 2025-07-16 02:27:48,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44406, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1196990315_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750932_10108, duration(ns): 17981936 2025-07-16 02:27:48,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750932_10108, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-16 02:27:51,921 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750932_10108 replica FinalizedReplica, blk_1073750932_10108, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750932 for deletion 2025-07-16 02:27:51,941 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750932_10108 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750932 2025-07-16 02:30:53,113 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750935_10111 src: 
/192.168.158.1:51602 dest: /192.168.158.4:9866 2025-07-16 02:30:53,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51602, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1260338710_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750935_10111, duration(ns): 22312436 2025-07-16 02:30:53,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750935_10111, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-16 02:30:57,923 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750935_10111 replica FinalizedReplica, blk_1073750935_10111, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750935 for deletion 2025-07-16 02:30:57,924 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750935_10111 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750935 2025-07-16 02:31:53,119 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750936_10112 src: /192.168.158.5:36502 dest: /192.168.158.4:9866 2025-07-16 02:31:53,136 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36502, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1513802014_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750936_10112, duration(ns): 15173513 
2025-07-16 02:31:53,136 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750936_10112, type=LAST_IN_PIPELINE terminating 2025-07-16 02:31:57,926 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750936_10112 replica FinalizedReplica, blk_1073750936_10112, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750936 for deletion 2025-07-16 02:31:57,927 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750936_10112 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750936 2025-07-16 02:32:53,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750937_10113 src: /192.168.158.9:49066 dest: /192.168.158.4:9866 2025-07-16 02:32:53,143 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49066, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-658197801_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750937_10113, duration(ns): 19648846 2025-07-16 02:32:53,143 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750937_10113, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-16 02:32:57,926 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750937_10113 replica FinalizedReplica, blk_1073750937_10113, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750937 for deletion 2025-07-16 02:32:57,927 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750937_10113 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750937 2025-07-16 02:34:53,121 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750939_10115 src: /192.168.158.1:34508 dest: /192.168.158.4:9866 2025-07-16 02:34:53,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34508, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1875849908_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750939_10115, duration(ns): 22680638 2025-07-16 02:34:53,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750939_10115, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-16 02:34:57,932 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750939_10115 replica FinalizedReplica, blk_1073750939_10115, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750939 for deletion 2025-07-16 02:34:57,934 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750939_10115 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750939 2025-07-16 02:38:53,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750943_10119 src: /192.168.158.1:43340 dest: /192.168.158.4:9866 2025-07-16 02:38:53,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43340, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1207840489_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750943_10119, duration(ns): 20219261 2025-07-16 02:38:53,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750943_10119, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-16 02:38:57,940 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750943_10119 replica FinalizedReplica, blk_1073750943_10119, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750943 for deletion 2025-07-16 02:38:57,941 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750943_10119 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750943 2025-07-16 02:39:53,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750944_10120 src: /192.168.158.6:34062 dest: /192.168.158.4:9866 2025-07-16 02:39:53,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.6:34062, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1221190645_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750944_10120, duration(ns): 20382128 2025-07-16 02:39:53,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750944_10120, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-16 02:39:57,943 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750944_10120 replica FinalizedReplica, blk_1073750944_10120, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750944 for deletion 2025-07-16 02:39:57,944 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750944_10120 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750944 2025-07-16 02:40:58,134 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750945_10121 src: /192.168.158.9:48518 dest: /192.168.158.4:9866 2025-07-16 02:40:58,157 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48518, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1642680927_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750945_10121, duration(ns): 17507520 2025-07-16 02:40:58,157 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750945_10121, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-16 02:41:00,945 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750945_10121 replica FinalizedReplica, blk_1073750945_10121, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750945 for deletion 2025-07-16 02:41:00,946 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750945_10121 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750945 2025-07-16 02:43:58,145 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750948_10124 src: /192.168.158.6:57462 dest: /192.168.158.4:9866 2025-07-16 02:43:58,165 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57462, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-707735664_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750948_10124, duration(ns): 17883445 2025-07-16 02:43:58,165 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750948_10124, type=LAST_IN_PIPELINE terminating 2025-07-16 02:44:03,950 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750948_10124 replica FinalizedReplica, blk_1073750948_10124, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750948 for deletion 2025-07-16 02:44:03,951 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750948_10124 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750948 2025-07-16 02:45:58,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750950_10126 src: /192.168.158.5:41400 dest: /192.168.158.4:9866 2025-07-16 02:45:58,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41400, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1971394511_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750950_10126, duration(ns): 17145989 2025-07-16 02:45:58,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750950_10126, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-16 02:46:00,957 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750950_10126 replica FinalizedReplica, blk_1073750950_10126, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750950 for deletion 2025-07-16 02:46:00,958 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750950_10126 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750950 
2025-07-16 02:46:58,157 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750951_10127 src: /192.168.158.1:41348 dest: /192.168.158.4:9866 2025-07-16 02:46:58,186 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41348, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1989467149_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750951_10127, duration(ns): 20842477 2025-07-16 02:46:58,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750951_10127, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-16 02:47:00,960 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750951_10127 replica FinalizedReplica, blk_1073750951_10127, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750951 for deletion 2025-07-16 02:47:00,961 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750951_10127 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750951 2025-07-16 02:50:58,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750955_10131 src: /192.168.158.9:50906 dest: /192.168.158.4:9866 2025-07-16 02:50:58,192 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50906, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_661598882_107, 
offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750955_10131, duration(ns): 19796941 2025-07-16 02:50:58,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750955_10131, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-16 02:51:00,968 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750955_10131 replica FinalizedReplica, blk_1073750955_10131, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750955 for deletion 2025-07-16 02:51:00,970 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750955_10131 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750955 2025-07-16 02:55:58,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750960_10136 src: /192.168.158.9:35866 dest: /192.168.158.4:9866 2025-07-16 02:55:58,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35866, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_870649171_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750960_10136, duration(ns): 18507433 2025-07-16 02:55:58,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750960_10136, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-16 02:56:00,983 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750960_10136 replica FinalizedReplica, blk_1073750960_10136, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750960 for deletion 2025-07-16 02:56:00,984 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750960_10136 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750960 2025-07-16 02:56:58,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750961_10137 src: /192.168.158.7:47330 dest: /192.168.158.4:9866 2025-07-16 02:56:58,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47330, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1703570413_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750961_10137, duration(ns): 15266096 2025-07-16 02:56:58,215 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750961_10137, type=LAST_IN_PIPELINE terminating 2025-07-16 02:57:00,984 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750961_10137 replica FinalizedReplica, blk_1073750961_10137, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750961 for deletion 2025-07-16 02:57:00,986 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750961_10137 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750961
2025-07-16 02:57:58,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750962_10138 src: /192.168.158.1:49898 dest: /192.168.158.4:9866
2025-07-16 02:57:58,216 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-569432791_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750962_10138, duration(ns): 25666607
2025-07-16 02:57:58,216 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750962_10138, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-16 02:58:03,988 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750962_10138 replica FinalizedReplica, blk_1073750962_10138, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750962 for deletion
2025-07-16 02:58:03,990 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750962_10138 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750962
2025-07-16 03:00:03,185 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750964_10140 src: /192.168.158.1:41078 dest: /192.168.158.4:9866
2025-07-16 03:00:03,216 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41078, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_237432260_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750964_10140, duration(ns): 22159082
2025-07-16 03:00:03,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750964_10140, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-16 03:00:09,995 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750964_10140 replica FinalizedReplica, blk_1073750964_10140, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750964 for deletion
2025-07-16 03:00:09,996 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750964_10140 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750964
2025-07-16 03:02:03,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750966_10142 src: /192.168.158.7:58870 dest: /192.168.158.4:9866
2025-07-16 03:02:03,213 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58870, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-747722049_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750966_10142, duration(ns): 17641681
2025-07-16 03:02:03,216 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750966_10142, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-16 03:02:09,997 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750966_10142 replica FinalizedReplica, blk_1073750966_10142, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750966 for deletion
2025-07-16 03:02:09,998 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750966_10142 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750966
2025-07-16 03:06:08,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750970_10146 src: /192.168.158.9:57084 dest: /192.168.158.4:9866
2025-07-16 03:06:08,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57084, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_617436171_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750970_10146, duration(ns): 20825761
2025-07-16 03:06:08,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750970_10146, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-16 03:06:13,004 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750970_10146 replica FinalizedReplica, blk_1073750970_10146, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750970 for deletion
2025-07-16 03:06:13,005 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750970_10146 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750970
2025-07-16 03:07:08,213 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750971_10147 src: /192.168.158.1:33722 dest: /192.168.158.4:9866
2025-07-16 03:07:08,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33722, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1703405773_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750971_10147, duration(ns): 23316700
2025-07-16 03:07:08,246 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750971_10147, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-16 03:07:16,005 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750971_10147 replica FinalizedReplica, blk_1073750971_10147, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750971 for deletion
2025-07-16 03:07:16,006 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750971_10147 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750971
2025-07-16 03:08:08,205 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750972_10148 src: /192.168.158.6:42898 dest: /192.168.158.4:9866
2025-07-16 03:08:08,222 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1691621692_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750972_10148, duration(ns): 14987902
2025-07-16 03:08:08,223 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750972_10148, type=LAST_IN_PIPELINE terminating
2025-07-16 03:08:16,008 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750972_10148 replica FinalizedReplica, blk_1073750972_10148, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750972 for deletion
2025-07-16 03:08:16,009 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750972_10148 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750972
2025-07-16 03:10:08,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750974_10150 src: /192.168.158.9:41554 dest: /192.168.158.4:9866
2025-07-16 03:10:08,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41554, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2123413218_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750974_10150, duration(ns): 19134514
2025-07-16 03:10:08,244 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750974_10150, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-16 03:10:13,009 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750974_10150 replica FinalizedReplica, blk_1073750974_10150, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750974 for deletion
2025-07-16 03:10:13,010 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750974_10150 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750974
2025-07-16 03:13:13,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750977_10153 src: /192.168.158.6:50324 dest: /192.168.158.4:9866
2025-07-16 03:13:13,238 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50324, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-28991198_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750977_10153, duration(ns): 17351893
2025-07-16 03:13:13,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750977_10153, type=LAST_IN_PIPELINE terminating
2025-07-16 03:13:16,013 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750977_10153 replica FinalizedReplica, blk_1073750977_10153, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750977 for deletion
2025-07-16 03:13:16,014 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750977_10153 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750977
2025-07-16 03:14:13,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750978_10154 src: /192.168.158.8:58642 dest: /192.168.158.4:9866
2025-07-16 03:14:13,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58642, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1990871733_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750978_10154, duration(ns): 14822779
2025-07-16 03:14:13,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750978_10154, type=LAST_IN_PIPELINE terminating
2025-07-16 03:14:16,014 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750978_10154 replica FinalizedReplica, blk_1073750978_10154, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750978 for deletion
2025-07-16 03:14:16,016 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750978_10154 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750978
2025-07-16 03:15:13,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750979_10155 src: /192.168.158.8:41934 dest: /192.168.158.4:9866
2025-07-16 03:15:13,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41934, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1375668236_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750979_10155, duration(ns): 16771565
2025-07-16 03:15:13,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750979_10155, type=LAST_IN_PIPELINE terminating
2025-07-16 03:15:16,019 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750979_10155 replica FinalizedReplica, blk_1073750979_10155, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750979 for deletion
2025-07-16 03:15:16,020 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750979_10155 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750979
2025-07-16 03:17:18,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750981_10157 src: /192.168.158.1:33310 dest: /192.168.158.4:9866
2025-07-16 03:17:18,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33310, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1738519024_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750981_10157, duration(ns): 21393298
2025-07-16 03:17:18,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750981_10157, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-16 03:17:22,025 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750981_10157 replica FinalizedReplica, blk_1073750981_10157, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750981 for deletion
2025-07-16 03:17:22,026 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750981_10157 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750981
2025-07-16 03:18:18,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750982_10158 src: /192.168.158.7:44392 dest: /192.168.158.4:9866
2025-07-16 03:18:18,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44392, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_532197616_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750982_10158, duration(ns): 17534777
2025-07-16 03:18:18,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750982_10158, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 03:18:25,026 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750982_10158 replica FinalizedReplica, blk_1073750982_10158, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750982 for deletion
2025-07-16 03:18:25,027 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750982_10158 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750982
2025-07-16 03:19:18,217 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750983_10159 src: /192.168.158.1:54902 dest: /192.168.158.4:9866
2025-07-16 03:19:18,247 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54902, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_246278114_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750983_10159, duration(ns): 20751921
2025-07-16 03:19:18,247 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750983_10159, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-16 03:19:22,027 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750983_10159 replica FinalizedReplica, blk_1073750983_10159, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750983 for deletion
2025-07-16 03:19:22,028 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750983_10159 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750983
2025-07-16 03:20:18,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750984_10160 src: /192.168.158.7:39620 dest: /192.168.158.4:9866
2025-07-16 03:20:18,246 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39620, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-75740177_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750984_10160, duration(ns): 14843255
2025-07-16 03:20:18,246 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750984_10160, type=LAST_IN_PIPELINE terminating
2025-07-16 03:20:22,028 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750984_10160 replica FinalizedReplica, blk_1073750984_10160, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750984 for deletion
2025-07-16 03:20:22,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750984_10160 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750984
2025-07-16 03:22:18,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750986_10162 src: /192.168.158.6:52582 dest: /192.168.158.4:9866
2025-07-16 03:22:18,253 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52582, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1020817172_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750986_10162, duration(ns): 20629741
2025-07-16 03:22:18,253 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750986_10162, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 03:22:22,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750986_10162 replica FinalizedReplica, blk_1073750986_10162, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750986 for deletion
2025-07-16 03:22:22,032 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750986_10162 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750986
2025-07-16 03:24:18,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750988_10164 src: /192.168.158.8:54160 dest: /192.168.158.4:9866
2025-07-16 03:24:18,249 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54160, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1723665238_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750988_10164, duration(ns): 14177507
2025-07-16 03:24:18,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750988_10164, type=LAST_IN_PIPELINE terminating
2025-07-16 03:24:22,034 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750988_10164 replica FinalizedReplica, blk_1073750988_10164, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750988 for deletion
2025-07-16 03:24:22,036 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750988_10164 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750988
2025-07-16 03:25:18,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750989_10165 src: /192.168.158.1:54176 dest: /192.168.158.4:9866
2025-07-16 03:25:18,268 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54176, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679223869_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750989_10165, duration(ns): 22043032
2025-07-16 03:25:18,268 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750989_10165, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-16 03:25:25,038 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750989_10165 replica FinalizedReplica, blk_1073750989_10165, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750989 for deletion
2025-07-16 03:25:25,040 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750989_10165 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750989
2025-07-16 03:26:18,241 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750990_10166 src: /192.168.158.1:60626 dest: /192.168.158.4:9866
2025-07-16 03:26:18,276 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60626, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1111327583_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750990_10166, duration(ns): 25490259
2025-07-16 03:26:18,276 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750990_10166, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-16 03:26:25,041 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750990_10166 replica FinalizedReplica, blk_1073750990_10166, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750990 for deletion
2025-07-16 03:26:25,042 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750990_10166 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750990
2025-07-16 03:27:18,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750991_10167 src: /192.168.158.7:48544 dest: /192.168.158.4:9866
2025-07-16 03:27:18,275 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48544, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1671449133_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750991_10167, duration(ns): 18080960
2025-07-16 03:27:18,276 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750991_10167, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-16 03:27:22,046 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750991_10167 replica FinalizedReplica, blk_1073750991_10167, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750991 for deletion
2025-07-16 03:27:22,047 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750991_10167 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750991
2025-07-16 03:28:18,253 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750992_10168 src: /192.168.158.5:36222 dest: /192.168.158.4:9866
2025-07-16 03:28:18,276 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36222, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1310684249_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750992_10168, duration(ns): 17850110
2025-07-16 03:28:18,276 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750992_10168, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 03:28:22,050 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750992_10168 replica FinalizedReplica, blk_1073750992_10168, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750992 for deletion
2025-07-16 03:28:22,051 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750992_10168 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750992
2025-07-16 03:30:18,255 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750994_10170 src: /192.168.158.5:39658 dest: /192.168.158.4:9866
2025-07-16 03:30:18,274 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39658, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1898958188_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750994_10170, duration(ns): 16493799
2025-07-16 03:30:18,274 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750994_10170, type=LAST_IN_PIPELINE terminating
2025-07-16 03:30:22,054 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750994_10170 replica FinalizedReplica, blk_1073750994_10170, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750994 for deletion
2025-07-16 03:30:22,055 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750994_10170 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750994
2025-07-16 03:31:18,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750995_10171 src: /192.168.158.7:35974 dest: /192.168.158.4:9866
2025-07-16 03:31:18,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35974, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1070869478_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750995_10171, duration(ns): 18273212
2025-07-16 03:31:18,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750995_10171, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 03:31:25,056 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750995_10171 replica FinalizedReplica, blk_1073750995_10171, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750995 for deletion
2025-07-16 03:31:25,058 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750995_10171 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750995
2025-07-16 03:32:18,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750996_10172 src: /192.168.158.6:51136 dest: /192.168.158.4:9866
2025-07-16 03:32:18,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-532051926_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750996_10172, duration(ns): 15933637
2025-07-16 03:32:18,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750996_10172, type=LAST_IN_PIPELINE terminating
2025-07-16 03:32:22,060 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750996_10172 replica FinalizedReplica, blk_1073750996_10172, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750996 for deletion
2025-07-16 03:32:22,061 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750996_10172 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750996
2025-07-16 03:33:18,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750997_10173 src: /192.168.158.7:33242 dest: /192.168.158.4:9866
2025-07-16 03:33:18,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33242, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_204204681_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750997_10173, duration(ns): 53001802
2025-07-16 03:33:18,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750997_10173, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 03:33:22,062 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750997_10173 replica FinalizedReplica, blk_1073750997_10173, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750997 for deletion
2025-07-16 03:33:22,063 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750997_10173 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750997
2025-07-16 03:34:18,295 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073750998_10174 src: /192.168.158.6:58814 dest: /192.168.158.4:9866
2025-07-16 03:34:18,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58814, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1277743856_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073750998_10174, duration(ns): 22115703
2025-07-16 03:34:18,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073750998_10174, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 03:34:22,067 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073750998_10174 replica FinalizedReplica, blk_1073750998_10174, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750998 for deletion
2025-07-16 03:34:22,068 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073750998_10174 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073750998
2025-07-16 03:40:23,284 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751004_10180 src: /192.168.158.1:47874 dest: /192.168.158.4:9866
2025-07-16 03:40:23,315 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47874, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2057058920_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751004_10180, duration(ns): 21391982
2025-07-16 03:40:23,315 
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751004_10180, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-16 03:40:25,080 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751004_10180 replica FinalizedReplica, blk_1073751004_10180, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751004 for deletion 2025-07-16 03:40:25,081 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751004_10180 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751004 2025-07-16 03:41:23,296 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751005_10181 src: /192.168.158.1:41660 dest: /192.168.158.4:9866 2025-07-16 03:41:23,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41660, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_383518127_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751005_10181, duration(ns): 22094978 2025-07-16 03:41:23,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751005_10181, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-16 03:41:25,082 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751005_10181 replica FinalizedReplica, 
blk_1073751005_10181, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751005 for deletion 2025-07-16 03:41:25,083 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751005_10181 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751005 2025-07-16 03:42:23,307 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751006_10182 src: /192.168.158.9:34028 dest: /192.168.158.4:9866 2025-07-16 03:42:23,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34028, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1046156253_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751006_10182, duration(ns): 17156519 2025-07-16 03:42:23,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751006_10182, type=LAST_IN_PIPELINE terminating 2025-07-16 03:42:25,084 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751006_10182 replica FinalizedReplica, blk_1073751006_10182, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751006 for deletion 2025-07-16 03:42:25,085 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751006_10182 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751006 2025-07-16 03:44:28,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751008_10184 src: /192.168.158.6:56654 dest: /192.168.158.4:9866 2025-07-16 03:44:28,326 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56654, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-340904438_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751008_10184, duration(ns): 15810161 2025-07-16 03:44:28,326 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751008_10184, type=LAST_IN_PIPELINE terminating 2025-07-16 03:44:31,086 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751008_10184 replica FinalizedReplica, blk_1073751008_10184, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751008 for deletion 2025-07-16 03:44:31,087 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751008_10184 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751008 2025-07-16 03:45:28,320 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751009_10185 src: /192.168.158.9:33800 dest: /192.168.158.4:9866 2025-07-16 03:45:28,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33800, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1451111655_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751009_10185, duration(ns): 17994437 2025-07-16 03:45:28,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751009_10185, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-16 03:45:31,089 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751009_10185 replica FinalizedReplica, blk_1073751009_10185, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751009 for deletion 2025-07-16 03:45:31,090 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751009_10185 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751009 2025-07-16 03:46:28,301 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751010_10186 src: /192.168.158.1:37408 dest: /192.168.158.4:9866 2025-07-16 03:46:28,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37408, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1147936905_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751010_10186, duration(ns): 20909518 2025-07-16 03:46:28,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751010_10186, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 
192.168.158.9:9866] terminating 2025-07-16 03:46:31,089 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751010_10186 replica FinalizedReplica, blk_1073751010_10186, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751010 for deletion 2025-07-16 03:46:31,090 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751010_10186 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751010 2025-07-16 03:48:33,301 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751012_10188 src: /192.168.158.1:52752 dest: /192.168.158.4:9866 2025-07-16 03:48:33,332 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52752, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1696427085_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751012_10188, duration(ns): 22345409 2025-07-16 03:48:33,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751012_10188, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-16 03:48:40,093 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751012_10188 replica FinalizedReplica, blk_1073751012_10188, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751012 for deletion 2025-07-16 03:48:40,094 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751012_10188 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751012 2025-07-16 03:50:33,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751014_10190 src: /192.168.158.6:42976 dest: /192.168.158.4:9866 2025-07-16 03:50:33,338 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42976, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-71805019_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751014_10190, duration(ns): 19021356 2025-07-16 03:50:33,338 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751014_10190, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-16 03:50:37,095 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751014_10190 replica FinalizedReplica, blk_1073751014_10190, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751014 for deletion 2025-07-16 03:50:37,097 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751014_10190 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751014 
2025-07-16 03:52:33,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751016_10192 src: /192.168.158.5:42686 dest: /192.168.158.4:9866 2025-07-16 03:52:33,341 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42686, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-671387471_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751016_10192, duration(ns): 16735165 2025-07-16 03:52:33,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751016_10192, type=LAST_IN_PIPELINE terminating 2025-07-16 03:52:37,103 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751016_10192 replica FinalizedReplica, blk_1073751016_10192, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751016 for deletion 2025-07-16 03:52:37,105 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751016_10192 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751016 2025-07-16 03:54:33,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751018_10194 src: /192.168.158.5:35326 dest: /192.168.158.4:9866 2025-07-16 03:54:33,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35326, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1089861230_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073751018_10194, duration(ns): 18684644 2025-07-16 03:54:33,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751018_10194, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-16 03:54:37,109 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751018_10194 replica FinalizedReplica, blk_1073751018_10194, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751018 for deletion 2025-07-16 03:54:37,110 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751018_10194 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751018 2025-07-16 03:55:33,328 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751019_10195 src: /192.168.158.5:41386 dest: /192.168.158.4:9866 2025-07-16 03:55:33,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41386, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-782182800_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751019_10195, duration(ns): 18089610 2025-07-16 03:55:33,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751019_10195, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-16 03:55:40,111 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073751019_10195 replica FinalizedReplica, blk_1073751019_10195, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751019 for deletion 2025-07-16 03:55:40,112 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751019_10195 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751019 2025-07-16 03:57:38,331 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751021_10197 src: /192.168.158.8:34580 dest: /192.168.158.4:9866 2025-07-16 03:57:38,348 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34580, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-180299448_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751021_10197, duration(ns): 15504143 2025-07-16 03:57:38,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751021_10197, type=LAST_IN_PIPELINE terminating 2025-07-16 03:57:40,114 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751021_10197 replica FinalizedReplica, blk_1073751021_10197, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751021 for deletion 2025-07-16 03:57:40,115 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073751021_10197 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751021 2025-07-16 03:58:38,330 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751022_10198 src: /192.168.158.7:40844 dest: /192.168.158.4:9866 2025-07-16 03:58:38,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40844, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1163121183_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751022_10198, duration(ns): 18380258 2025-07-16 03:58:38,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751022_10198, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-16 03:58:40,114 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751022_10198 replica FinalizedReplica, blk_1073751022_10198, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751022 for deletion 2025-07-16 03:58:40,115 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751022_10198 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751022 2025-07-16 03:59:38,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751023_10199 src: /192.168.158.8:59666 dest: /192.168.158.4:9866 2025-07-16 03:59:38,343 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59666, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1580046789_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751023_10199, duration(ns): 14492864 2025-07-16 03:59:38,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751023_10199, type=LAST_IN_PIPELINE terminating 2025-07-16 03:59:40,117 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751023_10199 replica FinalizedReplica, blk_1073751023_10199, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751023 for deletion 2025-07-16 03:59:40,118 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751023_10199 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751023 2025-07-16 04:01:43,341 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751025_10201 src: /192.168.158.5:48708 dest: /192.168.158.4:9866 2025-07-16 04:01:43,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48708, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1342577745_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751025_10201, duration(ns): 15405964 2025-07-16 04:01:43,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073751025_10201, type=LAST_IN_PIPELINE terminating 2025-07-16 04:01:46,119 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751025_10201 replica FinalizedReplica, blk_1073751025_10201, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751025 for deletion 2025-07-16 04:01:46,120 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751025_10201 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751025 2025-07-16 04:03:43,324 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751027_10203 src: /192.168.158.1:35308 dest: /192.168.158.4:9866 2025-07-16 04:03:43,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35308, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_685093399_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751027_10203, duration(ns): 23410566 2025-07-16 04:03:43,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751027_10203, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-16 04:03:46,124 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751027_10203 replica FinalizedReplica, blk_1073751027_10203, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751027 for deletion 2025-07-16 04:03:46,125 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751027_10203 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751027 2025-07-16 04:04:43,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751028_10204 src: /192.168.158.1:33032 dest: /192.168.158.4:9866 2025-07-16 04:04:43,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33032, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_333558431_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751028_10204, duration(ns): 22715166 2025-07-16 04:04:43,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751028_10204, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-16 04:04:46,124 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751028_10204 replica FinalizedReplica, blk_1073751028_10204, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751028 for deletion 2025-07-16 04:04:46,126 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751028_10204 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751028 2025-07-16 04:08:48,362 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751032_10208 src: /192.168.158.1:48686 dest: /192.168.158.4:9866 2025-07-16 04:08:48,391 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48686, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_796448537_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751032_10208, duration(ns): 20344913 2025-07-16 04:08:48,391 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751032_10208, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-16 04:08:52,132 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751032_10208 replica FinalizedReplica, blk_1073751032_10208, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751032 for deletion 2025-07-16 04:08:52,133 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751032_10208 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751032 2025-07-16 04:09:53,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751033_10209 src: /192.168.158.8:35690 dest: /192.168.158.4:9866 2025-07-16 04:09:53,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.8:35690, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_224991409_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751033_10209, duration(ns): 21118509
2025-07-16 04:09:53,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751033_10209, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 04:09:55,133 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751033_10209 replica FinalizedReplica, blk_1073751033_10209, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751033 for deletion
2025-07-16 04:09:55,134 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751033_10209 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751033
2025-07-16 04:10:53,357 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751034_10210 src: /192.168.158.5:56836 dest: /192.168.158.4:9866
2025-07-16 04:10:53,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56836, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1714246912_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751034_10210, duration(ns): 20251552
2025-07-16 04:10:53,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751034_10210, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 04:10:58,136 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751034_10210 replica FinalizedReplica, blk_1073751034_10210, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751034 for deletion
2025-07-16 04:10:58,137 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751034_10210 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751034
2025-07-16 04:12:53,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751036_10212 src: /192.168.158.9:59566 dest: /192.168.158.4:9866
2025-07-16 04:12:53,389 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59566, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_676131794_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751036_10212, duration(ns): 17477156
2025-07-16 04:12:53,389 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751036_10212, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 04:12:58,138 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751036_10212 replica FinalizedReplica, blk_1073751036_10212, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751036 for deletion
2025-07-16 04:12:58,139 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751036_10212 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751036
2025-07-16 04:13:53,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751037_10213 src: /192.168.158.1:51648 dest: /192.168.158.4:9866
2025-07-16 04:13:53,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51648, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1827240891_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751037_10213, duration(ns): 22569538
2025-07-16 04:13:53,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751037_10213, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-16 04:13:55,141 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751037_10213 replica FinalizedReplica, blk_1073751037_10213, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751037 for deletion
2025-07-16 04:13:55,142 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751037_10213 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073751037
2025-07-16 04:17:58,367 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751041_10217 src: /192.168.158.1:47744 dest: /192.168.158.4:9866
2025-07-16 04:17:58,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47744, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1548936439_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751041_10217, duration(ns): 22281538
2025-07-16 04:17:58,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751041_10217, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-16 04:18:01,150 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751041_10217 replica FinalizedReplica, blk_1073751041_10217, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751041 for deletion
2025-07-16 04:18:01,151 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751041_10217 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751041
2025-07-16 04:18:58,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751042_10218 src: /192.168.158.9:53432 dest: /192.168.158.4:9866
2025-07-16 04:18:58,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53432, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1428178012_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751042_10218, duration(ns): 15383989
2025-07-16 04:18:58,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751042_10218, type=LAST_IN_PIPELINE terminating
2025-07-16 04:19:01,152 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751042_10218 replica FinalizedReplica, blk_1073751042_10218, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751042 for deletion
2025-07-16 04:19:01,154 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751042_10218 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751042
2025-07-16 04:21:03,377 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751044_10220 src: /192.168.158.5:44036 dest: /192.168.158.4:9866
2025-07-16 04:21:03,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44036, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1743773706_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751044_10220, duration(ns): 18002588
2025-07-16 04:21:03,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751044_10220, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-16 04:21:10,159 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751044_10220 replica FinalizedReplica, blk_1073751044_10220, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751044 for deletion
2025-07-16 04:21:10,160 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751044_10220 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751044
2025-07-16 04:22:03,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751045_10221 src: /192.168.158.1:38644 dest: /192.168.158.4:9866
2025-07-16 04:22:03,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38644, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1680588558_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751045_10221, duration(ns): 20229459
2025-07-16 04:22:03,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751045_10221, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-16 04:22:07,160 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751045_10221 replica FinalizedReplica, blk_1073751045_10221, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751045 for deletion
2025-07-16 04:22:07,161 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751045_10221 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751045
2025-07-16 04:25:03,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751048_10224 src: /192.168.158.1:44062 dest: /192.168.158.4:9866
2025-07-16 04:25:03,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44062, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1212962091_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751048_10224, duration(ns): 19433431
2025-07-16 04:25:03,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751048_10224, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-16 04:25:07,171 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751048_10224 replica FinalizedReplica, blk_1073751048_10224, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751048 for deletion
2025-07-16 04:25:07,172 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751048_10224 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751048
2025-07-16 04:26:03,377 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751049_10225 src: /192.168.158.8:39094 dest: /192.168.158.4:9866
2025-07-16 04:26:03,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39094, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_32850425_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751049_10225, duration(ns): 20086945
2025-07-16 04:26:03,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751049_10225, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 04:26:07,173 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751049_10225 replica FinalizedReplica, blk_1073751049_10225, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751049 for deletion
2025-07-16 04:26:07,175 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751049_10225 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751049
2025-07-16 04:27:08,374 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751050_10226 src: /192.168.158.1:47744 dest: /192.168.158.4:9866
2025-07-16 04:27:08,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47744, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1846232023_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751050_10226, duration(ns): 22837531
2025-07-16 04:27:08,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751050_10226, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-16 04:27:10,175 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751050_10226 replica FinalizedReplica, blk_1073751050_10226, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751050 for deletion
2025-07-16 04:27:10,176 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751050_10226 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751050
2025-07-16 04:28:08,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751051_10227 src: /192.168.158.9:52618 dest: /192.168.158.4:9866
2025-07-16 04:28:08,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52618, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_279023814_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751051_10227, duration(ns): 20790562
2025-07-16 04:28:08,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751051_10227, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 04:28:13,179 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751051_10227 replica FinalizedReplica, blk_1073751051_10227, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751051 for deletion
2025-07-16 04:28:13,180 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751051_10227 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751051
2025-07-16 04:29:08,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751052_10228 src: /192.168.158.1:34954 dest: /192.168.158.4:9866
2025-07-16 04:29:08,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34954, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1458319656_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751052_10228, duration(ns): 21672232
2025-07-16 04:29:08,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751052_10228, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-16 04:29:10,179 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751052_10228 replica FinalizedReplica, blk_1073751052_10228, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751052 for deletion
2025-07-16 04:29:10,180 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751052_10228 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751052
2025-07-16 04:30:08,404 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751053_10229 src: /192.168.158.1:50230 dest: /192.168.158.4:9866
2025-07-16 04:30:08,435 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50230, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1131499282_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751053_10229, duration(ns): 22881187
2025-07-16 04:30:08,435 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751053_10229, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-16 04:30:10,179 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751053_10229 replica FinalizedReplica, blk_1073751053_10229, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751053 for deletion
2025-07-16 04:30:10,180 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751053_10229 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751053
2025-07-16 04:31:13,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751054_10230 src: /192.168.158.8:37334 dest: /192.168.158.4:9866
2025-07-16 04:31:13,419 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37334, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_526407619_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751054_10230, duration(ns): 21007378
2025-07-16 04:31:13,419 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751054_10230, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-16 04:31:19,183 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751054_10230 replica FinalizedReplica, blk_1073751054_10230, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751054 for deletion
2025-07-16 04:31:19,184 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751054_10230 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751054
2025-07-16 04:34:18,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751057_10233 src: /192.168.158.5:45482 dest: /192.168.158.4:9866
2025-07-16 04:34:18,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45482, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1870616512_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751057_10233, duration(ns): 15477241
2025-07-16 04:34:18,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751057_10233, type=LAST_IN_PIPELINE terminating
2025-07-16 04:34:25,185 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751057_10233 replica FinalizedReplica, blk_1073751057_10233, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751057 for deletion
2025-07-16 04:34:25,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751057_10233 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751057
2025-07-16 04:37:18,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751060_10236 src: /192.168.158.1:39704 dest: /192.168.158.4:9866
2025-07-16 04:37:18,428 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39704, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-671764067_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751060_10236, duration(ns): 21049595
2025-07-16 04:37:18,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751060_10236, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-16 04:37:22,194 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751060_10236 replica FinalizedReplica, blk_1073751060_10236, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751060 for deletion
2025-07-16 04:37:22,195 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751060_10236 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751060
2025-07-16 04:39:23,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751062_10238 src: /192.168.158.7:51902 dest: /192.168.158.4:9866
2025-07-16 04:39:23,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51902, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1101117733_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751062_10238, duration(ns): 18083283
2025-07-16 04:39:23,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751062_10238, type=LAST_IN_PIPELINE terminating
2025-07-16 04:39:28,199 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751062_10238 replica FinalizedReplica, blk_1073751062_10238, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751062 for deletion
2025-07-16 04:39:28,201 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751062_10238 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751062
2025-07-16 04:40:28,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751063_10239 src: /192.168.158.5:37036 dest: /192.168.158.4:9866
2025-07-16 04:40:28,421 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37036, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-696611517_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751063_10239, duration(ns): 19570209
2025-07-16 04:40:28,421 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751063_10239, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 04:40:31,199 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751063_10239 replica FinalizedReplica, blk_1073751063_10239, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751063 for deletion
2025-07-16 04:40:31,200 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751063_10239 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751063
2025-07-16 04:45:28,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751068_10244 src: /192.168.158.6:47782 dest: /192.168.158.4:9866
2025-07-16 04:45:28,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47782, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1216967546_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751068_10244, duration(ns): 17453633
2025-07-16 04:45:28,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751068_10244, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 04:45:31,208 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751068_10244 replica FinalizedReplica, blk_1073751068_10244, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751068 for deletion
2025-07-16 04:45:31,209 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751068_10244 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751068
2025-07-16 04:46:28,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751069_10245 src: /192.168.158.1:43320 dest: /192.168.158.4:9866
2025-07-16 04:46:28,426 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43320, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_900231041_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751069_10245, duration(ns): 21672525
2025-07-16 04:46:28,426 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751069_10245, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-16 04:46:31,215 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751069_10245 replica FinalizedReplica, blk_1073751069_10245, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751069 for deletion
2025-07-16 04:46:31,216 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751069_10245 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751069
2025-07-16 04:49:28,448 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751072_10248 src: /192.168.158.1:60860 dest: /192.168.158.4:9866
2025-07-16 04:49:28,480 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60860, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_573764003_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751072_10248, duration(ns): 22731335
2025-07-16 04:49:28,481 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751072_10248, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-16 04:49:31,221 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751072_10248 replica FinalizedReplica, blk_1073751072_10248, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751072 for deletion
2025-07-16 04:49:31,223 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751072_10248 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751072
2025-07-16 04:51:28,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751074_10250 src: /192.168.158.1:46294 dest: /192.168.158.4:9866
2025-07-16 04:51:28,448 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46294, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1868868585_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751074_10250, duration(ns): 23053682
2025-07-16 04:51:28,448 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751074_10250, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-16 04:51:31,223 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751074_10250 replica FinalizedReplica, blk_1073751074_10250, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751074 for deletion
2025-07-16 04:51:31,224 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751074_10250 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751074
2025-07-16 04:55:28,428 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751078_10254 src: /192.168.158.7:58282 dest: /192.168.158.4:9866
2025-07-16 04:55:28,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58282, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1086675982_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751078_10254, duration(ns): 15714994
2025-07-16 04:55:28,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751078_10254, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 04:55:34,229 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751078_10254 replica FinalizedReplica, blk_1073751078_10254, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751078 for deletion
2025-07-16 04:55:34,230 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751078_10254 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751078
2025-07-16 05:00:28,432 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751083_10259 src: /192.168.158.7:37244 dest: /192.168.158.4:9866
2025-07-16 05:00:28,458 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37244, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_942471220_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751083_10259, duration(ns): 21050768
2025-07-16 05:00:28,459 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751083_10259, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 05:00:31,246 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751083_10259 replica FinalizedReplica, blk_1073751083_10259, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751083 for deletion
2025-07-16 05:00:31,248 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751083_10259 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751083
2025-07-16 05:01:28,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751084_10260 src: /192.168.158.7:60820 dest: /192.168.158.4:9866
2025-07-16 05:01:28,459 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60820, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_330136906_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751084_10260, duration(ns): 14877489
2025-07-16 05:01:28,460 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751084_10260, type=LAST_IN_PIPELINE terminating
2025-07-16 05:01:31,247 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751084_10260 replica FinalizedReplica, blk_1073751084_10260, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751084 for deletion
2025-07-16 05:01:31,248 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751084_10260 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751084
2025-07-16 05:03:33,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751086_10262 src: /192.168.158.7:42308 dest: /192.168.158.4:9866
2025-07-16 05:03:33,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42308, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1890882584_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751086_10262, duration(ns): 19586480
2025-07-16 05:03:33,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751086_10262, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 05:03:37,250 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751086_10262 replica
FinalizedReplica, blk_1073751086_10262, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751086 for deletion 2025-07-16 05:03:37,252 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751086_10262 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751086 2025-07-16 05:05:38,440 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751088_10264 src: /192.168.158.7:52802 dest: /192.168.158.4:9866 2025-07-16 05:05:38,458 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52802, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-347492797_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751088_10264, duration(ns): 15348329 2025-07-16 05:05:38,458 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751088_10264, type=LAST_IN_PIPELINE terminating 2025-07-16 05:05:40,255 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751088_10264 replica FinalizedReplica, blk_1073751088_10264, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751088 for deletion 2025-07-16 05:05:40,256 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751088_10264 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751088 2025-07-16 05:06:38,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751089_10265 src: /192.168.158.1:54178 dest: /192.168.158.4:9866 2025-07-16 05:06:38,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54178, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1592051779_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751089_10265, duration(ns): 21584985 2025-07-16 05:06:38,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751089_10265, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-16 05:06:43,256 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751089_10265 replica FinalizedReplica, blk_1073751089_10265, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751089 for deletion 2025-07-16 05:06:43,257 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751089_10265 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751089 2025-07-16 05:07:38,437 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751090_10266 src: /192.168.158.1:34752 dest: /192.168.158.4:9866 2025-07-16 05:07:38,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:34752, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_58979164_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751090_10266, duration(ns): 21393692 2025-07-16 05:07:38,468 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751090_10266, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-16 05:07:40,260 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751090_10266 replica FinalizedReplica, blk_1073751090_10266, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751090 for deletion 2025-07-16 05:07:40,262 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751090_10266 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751090 2025-07-16 05:09:38,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751092_10268 src: /192.168.158.1:58084 dest: /192.168.158.4:9866 2025-07-16 05:09:38,481 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58084, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_501966073_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751092_10268, duration(ns): 21986619 2025-07-16 05:09:38,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073751092_10268, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-16 05:09:40,265 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751092_10268 replica FinalizedReplica, blk_1073751092_10268, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751092 for deletion 2025-07-16 05:09:40,266 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751092_10268 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751092 2025-07-16 05:10:38,456 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751093_10269 src: /192.168.158.6:52394 dest: /192.168.158.4:9866 2025-07-16 05:10:38,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52394, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-318145125_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751093_10269, duration(ns): 17221029 2025-07-16 05:10:38,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751093_10269, type=LAST_IN_PIPELINE terminating 2025-07-16 05:10:40,267 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751093_10269 replica FinalizedReplica, blk_1073751093_10269, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751093 for deletion 2025-07-16 05:10:40,268 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751093_10269 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751093 2025-07-16 05:12:43,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751095_10271 src: /192.168.158.6:34136 dest: /192.168.158.4:9866 2025-07-16 05:12:43,477 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-926417987_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751095_10271, duration(ns): 20060654 2025-07-16 05:12:43,477 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751095_10271, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-16 05:12:46,270 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751095_10271 replica FinalizedReplica, blk_1073751095_10271, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751095 for deletion 2025-07-16 05:12:46,271 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751095_10271 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751095 
2025-07-16 05:14:53,455 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751097_10273 src: /192.168.158.1:47578 dest: /192.168.158.4:9866 2025-07-16 05:14:53,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47578, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1405927200_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751097_10273, duration(ns): 21157029 2025-07-16 05:14:53,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751097_10273, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-16 05:14:58,277 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751097_10273 replica FinalizedReplica, blk_1073751097_10273, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751097 for deletion 2025-07-16 05:14:58,278 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751097_10273 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751097 2025-07-16 05:17:58,455 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751100_10276 src: /192.168.158.6:59736 dest: /192.168.158.4:9866 2025-07-16 05:17:58,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59736, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1264869688_107, 
offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751100_10276, duration(ns): 15874087 2025-07-16 05:17:58,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751100_10276, type=LAST_IN_PIPELINE terminating 2025-07-16 05:18:04,283 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751100_10276 replica FinalizedReplica, blk_1073751100_10276, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751100 for deletion 2025-07-16 05:18:04,284 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751100_10276 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751100 2025-07-16 05:18:58,475 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751101_10277 src: /192.168.158.9:52032 dest: /192.168.158.4:9866 2025-07-16 05:18:58,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52032, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_252340593_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751101_10277, duration(ns): 16889742 2025-07-16 05:18:58,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751101_10277, type=LAST_IN_PIPELINE terminating 2025-07-16 05:19:04,283 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751101_10277 
replica FinalizedReplica, blk_1073751101_10277, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751101 for deletion 2025-07-16 05:19:04,284 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751101_10277 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751101 2025-07-16 05:20:58,457 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751103_10279 src: /192.168.158.1:39016 dest: /192.168.158.4:9866 2025-07-16 05:20:58,488 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39016, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-301691701_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751103_10279, duration(ns): 21951706 2025-07-16 05:20:58,488 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751103_10279, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-16 05:21:04,289 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751103_10279 replica FinalizedReplica, blk_1073751103_10279, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751103 for deletion 2025-07-16 05:21:04,290 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073751103_10279 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751103 2025-07-16 05:21:58,459 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751104_10280 src: /192.168.158.5:47010 dest: /192.168.158.4:9866 2025-07-16 05:21:58,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47010, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1031394716_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751104_10280, duration(ns): 18616853 2025-07-16 05:21:58,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751104_10280, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-16 05:22:01,290 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751104_10280 replica FinalizedReplica, blk_1073751104_10280, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751104 for deletion 2025-07-16 05:22:01,291 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751104_10280 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751104 2025-07-16 05:22:58,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751105_10281 src: /192.168.158.6:53844 dest: /192.168.158.4:9866 2025-07-16 05:22:58,487 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53844, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1274850752_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751105_10281, duration(ns): 14714518 2025-07-16 05:22:58,487 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751105_10281, type=LAST_IN_PIPELINE terminating 2025-07-16 05:23:01,293 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751105_10281 replica FinalizedReplica, blk_1073751105_10281, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751105 for deletion 2025-07-16 05:23:01,294 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751105_10281 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751105 2025-07-16 05:23:58,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751106_10282 src: /192.168.158.1:52488 dest: /192.168.158.4:9866 2025-07-16 05:23:58,496 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52488, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-946230972_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751106_10282, duration(ns): 20731826 2025-07-16 05:23:58,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073751106_10282, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-16 05:24:01,294 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751106_10282 replica FinalizedReplica, blk_1073751106_10282, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751106 for deletion 2025-07-16 05:24:01,296 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751106_10282 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751106 2025-07-16 05:33:08,492 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751115_10291 src: /192.168.158.1:46050 dest: /192.168.158.4:9866 2025-07-16 05:33:08,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46050, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1741691495_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751115_10291, duration(ns): 23379659 2025-07-16 05:33:08,526 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751115_10291, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-16 05:33:13,322 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751115_10291 replica FinalizedReplica, blk_1073751115_10291, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751115 for deletion 2025-07-16 05:33:13,324 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751115_10291 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751115 2025-07-16 05:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0 2025-07-16 05:37:13,510 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751119_10295 src: /192.168.158.5:44622 dest: /192.168.158.4:9866 2025-07-16 05:37:13,527 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44622, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_577492175_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751119_10295, duration(ns): 15064186 2025-07-16 05:37:13,528 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751119_10295, type=LAST_IN_PIPELINE terminating 2025-07-16 05:37:16,334 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751119_10295 replica FinalizedReplica, blk_1073751119_10295, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751119 for deletion 2025-07-16 05:37:16,335 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751119_10295 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751119 2025-07-16 05:39:13,501 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751121_10297 src: /192.168.158.1:54778 dest: /192.168.158.4:9866 2025-07-16 05:39:13,531 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54778, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-668121125_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751121_10297, duration(ns): 21435334 2025-07-16 05:39:13,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751121_10297, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-16 05:39:16,337 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751121_10297 replica FinalizedReplica, blk_1073751121_10297, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751121 for deletion 2025-07-16 05:39:16,338 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751121_10297 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751121 2025-07-16 05:40:13,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073751122_10298 src: /192.168.158.8:58948 dest: /192.168.158.4:9866
2025-07-16 05:40:13,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58948, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-784893542_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751122_10298, duration(ns): 16167350
2025-07-16 05:40:13,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751122_10298, type=LAST_IN_PIPELINE terminating
2025-07-16 05:40:16,338 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751122_10298 replica FinalizedReplica, blk_1073751122_10298, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751122 for deletion
2025-07-16 05:40:16,339 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751122_10298 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751122
2025-07-16 05:44:13,523 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751126_10302 src: /192.168.158.1:38862 dest: /192.168.158.4:9866
2025-07-16 05:44:13,553 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38862, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-324323701_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751126_10302, duration(ns): 21710481
2025-07-16 05:44:13,554 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751126_10302, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-16 05:44:16,350 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751126_10302 replica FinalizedReplica, blk_1073751126_10302, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751126 for deletion
2025-07-16 05:44:16,351 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751126_10302 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751126
2025-07-16 05:45:13,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751127_10303 src: /192.168.158.9:59440 dest: /192.168.158.4:9866
2025-07-16 05:45:13,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59440, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-230424345_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751127_10303, duration(ns): 18160573
2025-07-16 05:45:13,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751127_10303, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-16 05:45:19,355 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751127_10303 replica FinalizedReplica, blk_1073751127_10303, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751127 for deletion
2025-07-16 05:45:19,356 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751127_10303 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751127
2025-07-16 05:46:13,535 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751128_10304 src: /192.168.158.1:43714 dest: /192.168.158.4:9866
2025-07-16 05:46:13,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43714, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1988195994_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751128_10304, duration(ns): 22777944
2025-07-16 05:46:13,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751128_10304, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-16 05:46:16,357 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751128_10304 replica FinalizedReplica, blk_1073751128_10304, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751128 for deletion
2025-07-16 05:46:16,358 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751128_10304 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751128
2025-07-16 05:48:13,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751130_10306 src: /192.168.158.1:34662 dest: /192.168.158.4:9866
2025-07-16 05:48:13,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34662, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1764933123_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751130_10306, duration(ns): 22930323
2025-07-16 05:48:13,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751130_10306, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-16 05:48:19,358 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751130_10306 replica FinalizedReplica, blk_1073751130_10306, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751130 for deletion
2025-07-16 05:48:19,359 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751130_10306 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751130
2025-07-16 05:49:13,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751131_10307 src: /192.168.158.7:47462 dest: /192.168.158.4:9866
2025-07-16 05:49:13,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47462, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1305988435_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751131_10307, duration(ns): 19215777
2025-07-16 05:49:13,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751131_10307, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 05:49:16,357 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751131_10307 replica FinalizedReplica, blk_1073751131_10307, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751131 for deletion
2025-07-16 05:49:16,359 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751131_10307 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751131
2025-07-16 05:50:13,526 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751132_10308 src: /192.168.158.1:53926 dest: /192.168.158.4:9866
2025-07-16 05:50:13,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53926, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1972038477_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751132_10308, duration(ns): 21156469
2025-07-16 05:50:13,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751132_10308, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-16 05:50:16,360 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751132_10308 replica FinalizedReplica, blk_1073751132_10308, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751132 for deletion
2025-07-16 05:50:16,361 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751132_10308 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751132
2025-07-16 05:52:13,531 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751134_10310 src: /192.168.158.9:52666 dest: /192.168.158.4:9866
2025-07-16 05:52:13,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52666, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_586761424_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751134_10310, duration(ns): 17472525
2025-07-16 05:52:13,551 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751134_10310, type=LAST_IN_PIPELINE terminating
2025-07-16 05:52:16,360 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751134_10310 replica FinalizedReplica, blk_1073751134_10310, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751134 for deletion
2025-07-16 05:52:16,362 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751134_10310 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751134
2025-07-16 05:57:18,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751139_10315 src: /192.168.158.6:39038 dest: /192.168.158.4:9866
2025-07-16 05:57:18,577 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39038, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_317925781_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751139_10315, duration(ns): 15862198
2025-07-16 05:57:18,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751139_10315, type=LAST_IN_PIPELINE terminating
2025-07-16 05:57:22,365 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751139_10315 replica FinalizedReplica, blk_1073751139_10315, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751139 for deletion
2025-07-16 05:57:22,366 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751139_10315 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751139
2025-07-16 05:58:18,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751140_10316 src: /192.168.158.6:36680 dest: /192.168.158.4:9866
2025-07-16 05:58:18,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36680, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1083288097_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751140_10316, duration(ns): 18424142
2025-07-16 05:58:18,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751140_10316, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 05:58:22,366 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751140_10316 replica FinalizedReplica, blk_1073751140_10316, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751140 for deletion
2025-07-16 05:58:22,367 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751140_10316 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751140
2025-07-16 06:01:18,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751143_10319 src: /192.168.158.9:45996 dest: /192.168.158.4:9866
2025-07-16 06:01:18,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45996, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1904593633_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751143_10319, duration(ns): 14630941
2025-07-16 06:01:18,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751143_10319, type=LAST_IN_PIPELINE terminating
2025-07-16 06:01:22,371 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751143_10319 replica FinalizedReplica, blk_1073751143_10319, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751143 for deletion
2025-07-16 06:01:22,373 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751143_10319 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751143
2025-07-16 06:02:18,546 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751144_10320 src: /192.168.158.1:36276 dest: /192.168.158.4:9866
2025-07-16 06:02:18,577 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36276, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1547078484_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751144_10320, duration(ns): 22750492
2025-07-16 06:02:18,577 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751144_10320, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-16 06:02:22,374 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751144_10320 replica FinalizedReplica, blk_1073751144_10320, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751144 for deletion
2025-07-16 06:02:22,375 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751144_10320 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751144
2025-07-16 06:04:18,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751146_10322 src: /192.168.158.1:55036 dest: /192.168.158.4:9866
2025-07-16 06:04:18,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55036, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1791045468_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751146_10322, duration(ns): 20947440
2025-07-16 06:04:18,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751146_10322, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-16 06:04:22,383 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751146_10322 replica FinalizedReplica, blk_1073751146_10322, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751146 for deletion
2025-07-16 06:04:22,385 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751146_10322 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751146
2025-07-16 06:07:18,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751149_10325 src: /192.168.158.5:50584 dest: /192.168.158.4:9866
2025-07-16 06:07:18,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50584, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1596526233_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751149_10325, duration(ns): 14347081
2025-07-16 06:07:18,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751149_10325, type=LAST_IN_PIPELINE terminating
2025-07-16 06:07:22,389 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751149_10325 replica FinalizedReplica, blk_1073751149_10325, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751149 for deletion
2025-07-16 06:07:22,391 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751149_10325 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751149
2025-07-16 06:11:23,570 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751153_10329 src: /192.168.158.9:45882 dest: /192.168.158.4:9866
2025-07-16 06:11:23,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45882, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2078321302_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751153_10329, duration(ns): 16354294
2025-07-16 06:11:23,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751153_10329, type=LAST_IN_PIPELINE terminating
2025-07-16 06:11:28,398 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751153_10329 replica FinalizedReplica, blk_1073751153_10329, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751153 for deletion
2025-07-16 06:11:28,399 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751153_10329 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751153
2025-07-16 06:14:23,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751156_10332 src: /192.168.158.6:38722 dest: /192.168.158.4:9866
2025-07-16 06:14:23,577 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38722, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1719441424_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751156_10332, duration(ns): 14992543
2025-07-16 06:14:23,577 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751156_10332, type=LAST_IN_PIPELINE terminating
2025-07-16 06:14:28,407 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751156_10332 replica FinalizedReplica, blk_1073751156_10332, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751156 for deletion
2025-07-16 06:14:28,408 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751156_10332 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751156
2025-07-16 06:15:23,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751157_10333 src: /192.168.158.8:44906 dest: /192.168.158.4:9866
2025-07-16 06:15:23,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44906, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1069006158_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751157_10333, duration(ns): 17089646
2025-07-16 06:15:23,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751157_10333, type=LAST_IN_PIPELINE terminating
2025-07-16 06:15:25,408 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751157_10333 replica FinalizedReplica, blk_1073751157_10333, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751157 for deletion
2025-07-16 06:15:25,409 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751157_10333 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751157
2025-07-16 06:16:23,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751158_10334 src: /192.168.158.7:47160 dest: /192.168.158.4:9866
2025-07-16 06:16:23,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47160, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1830323245_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751158_10334, duration(ns): 17034010
2025-07-16 06:16:23,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751158_10334, type=LAST_IN_PIPELINE terminating
2025-07-16 06:16:25,409 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751158_10334 replica FinalizedReplica, blk_1073751158_10334, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751158 for deletion
2025-07-16 06:16:25,411 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751158_10334 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751158
2025-07-16 06:17:28,574 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751159_10335 src: /192.168.158.7:49356 dest: /192.168.158.4:9866
2025-07-16 06:17:28,591 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49356, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1016676426_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751159_10335, duration(ns): 14926400
2025-07-16 06:17:28,591 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751159_10335, type=LAST_IN_PIPELINE terminating
2025-07-16 06:17:34,413 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751159_10335 replica FinalizedReplica, blk_1073751159_10335, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751159 for deletion
2025-07-16 06:17:34,414 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751159_10335 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751159
2025-07-16 06:18:33,610 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751160_10336 src: /192.168.158.9:36440 dest: /192.168.158.4:9866
2025-07-16 06:18:33,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36440, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_122189270_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751160_10336, duration(ns): 17796784
2025-07-16 06:18:33,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751160_10336, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-16 06:18:40,418 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751160_10336 replica FinalizedReplica, blk_1073751160_10336, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751160 for deletion
2025-07-16 06:18:40,419 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751160_10336 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751160
2025-07-16 06:19:38,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751161_10337 src: /192.168.158.6:58032 dest: /192.168.158.4:9866
2025-07-16 06:19:38,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58032, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-53080056_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751161_10337, duration(ns): 14380308
2025-07-16 06:19:38,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751161_10337, type=LAST_IN_PIPELINE terminating
2025-07-16 06:19:40,421 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751161_10337 replica FinalizedReplica, blk_1073751161_10337, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751161 for deletion
2025-07-16 06:19:40,422 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751161_10337 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751161
2025-07-16 06:23:38,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751165_10341 src: /192.168.158.8:57156 dest: /192.168.158.4:9866
2025-07-16 06:23:38,619 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57156, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-396915570_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751165_10341, duration(ns): 16765341
2025-07-16 06:23:38,619 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751165_10341, type=LAST_IN_PIPELINE terminating
2025-07-16 06:23:40,430 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751165_10341 replica FinalizedReplica, blk_1073751165_10341, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751165 for deletion
2025-07-16 06:23:40,431 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751165_10341 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751165
2025-07-16 06:26:38,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751168_10344 src: /192.168.158.7:42332 dest: /192.168.158.4:9866
2025-07-16 06:26:38,584 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42332, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1037178442_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751168_10344, duration(ns): 18072157
2025-07-16 06:26:38,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751168_10344, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 06:26:43,440 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751168_10344 replica FinalizedReplica, blk_1073751168_10344, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751168 for deletion
2025-07-16 06:26:43,441 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751168_10344 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751168
2025-07-16 06:27:38,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751169_10345 src: /192.168.158.1:49504 dest: /192.168.158.4:9866
2025-07-16 06:27:38,594 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49504, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-506049023_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751169_10345, duration(ns): 22430035
2025-07-16 06:27:38,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751169_10345, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-16 06:27:40,443 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751169_10345 replica FinalizedReplica, blk_1073751169_10345, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751169 for deletion
2025-07-16 06:27:40,444 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751169_10345 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751169
2025-07-16 06:29:43,617 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751171_10347 src: /192.168.158.9:38910 dest: /192.168.158.4:9866
2025-07-16 06:29:43,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38910, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_377827601_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751171_10347, duration(ns): 16869237
2025-07-16 06:29:43,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751171_10347, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-16 06:29:46,445 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751171_10347 replica FinalizedReplica, blk_1073751171_10347, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751171 for deletion
2025-07-16 06:29:46,446 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751171_10347 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751171
2025-07-16 06:30:43,610 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751172_10348 src: /192.168.158.6:46806 dest: /192.168.158.4:9866
2025-07-16 06:30:43,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46806, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1712167149_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751172_10348, duration(ns): 17783628
2025-07-16 06:30:43,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751172_10348, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-16 06:30:46,448 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751172_10348 replica FinalizedReplica, blk_1073751172_10348, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751172 for deletion
2025-07-16 06:30:46,449 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751172_10348 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751172
2025-07-16 06:31:43,611 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751173_10349 src: /192.168.158.9:50540 dest: /192.168.158.4:9866
2025-07-16 06:31:43,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50540, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-799632192_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751173_10349, duration(ns): 16339104
2025-07-16 06:31:43,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751173_10349, type=LAST_IN_PIPELINE terminating
2025-07-16 06:31:46,452 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751173_10349 replica FinalizedReplica, blk_1073751173_10349, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751173 for deletion
2025-07-16 06:31:46,453 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751173_10349 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751173
2025-07-16 06:32:48,604 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751174_10350 src: /192.168.158.9:33282 dest: /192.168.158.4:9866 2025-07-16 06:32:48,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33282, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-81443710_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751174_10350, duration(ns): 18725566 2025-07-16 06:32:48,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751174_10350, type=LAST_IN_PIPELINE terminating 2025-07-16 06:32:55,453 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751174_10350 replica FinalizedReplica, blk_1073751174_10350, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751174 for deletion 2025-07-16 06:32:55,455 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751174_10350 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751174 2025-07-16 06:33:48,599 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751175_10351 src: /192.168.158.1:56114 dest: /192.168.158.4:9866 2025-07-16 06:33:48,632 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56114, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-954932552_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073751175_10351, duration(ns): 24499548 2025-07-16 06:33:48,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751175_10351, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-16 06:33:55,454 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751175_10351 replica FinalizedReplica, blk_1073751175_10351, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751175 for deletion 2025-07-16 06:33:55,456 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751175_10351 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751175 2025-07-16 06:35:48,604 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751177_10353 src: /192.168.158.1:39344 dest: /192.168.158.4:9866 2025-07-16 06:35:48,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39344, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1499010383_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751177_10353, duration(ns): 27145943 2025-07-16 06:35:48,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751177_10353, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-16 06:35:52,460 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751177_10353 replica FinalizedReplica, blk_1073751177_10353, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751177 for deletion 2025-07-16 06:35:52,461 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751177_10353 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751177 2025-07-16 06:36:48,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751178_10354 src: /192.168.158.8:47490 dest: /192.168.158.4:9866 2025-07-16 06:36:48,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47490, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-366154102_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751178_10354, duration(ns): 16627074 2025-07-16 06:36:48,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751178_10354, type=LAST_IN_PIPELINE terminating 2025-07-16 06:36:52,461 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751178_10354 replica FinalizedReplica, blk_1073751178_10354, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751178 for deletion 2025-07-16 06:36:52,463 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751178_10354 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751178 2025-07-16 06:39:48,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751181_10357 src: /192.168.158.8:41306 dest: /192.168.158.4:9866 2025-07-16 06:39:48,646 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41306, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-664756086_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751181_10357, duration(ns): 20265372 2025-07-16 06:39:48,647 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751181_10357, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-16 06:39:52,468 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751181_10357 replica FinalizedReplica, blk_1073751181_10357, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751181 for deletion 2025-07-16 06:39:52,469 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751181_10357 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751181 2025-07-16 06:41:48,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751183_10359 src: 
/192.168.158.6:58158 dest: /192.168.158.4:9866 2025-07-16 06:41:48,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58158, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2003231373_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751183_10359, duration(ns): 15126757 2025-07-16 06:41:48,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751183_10359, type=LAST_IN_PIPELINE terminating 2025-07-16 06:41:52,472 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751183_10359 replica FinalizedReplica, blk_1073751183_10359, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751183 for deletion 2025-07-16 06:41:52,473 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751183_10359 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751183 2025-07-16 06:42:48,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751184_10360 src: /192.168.158.1:45828 dest: /192.168.158.4:9866 2025-07-16 06:42:48,632 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45828, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1254599388_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751184_10360, duration(ns): 21751252 2025-07-16 06:42:48,632 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751184_10360, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-16 06:42:52,475 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751184_10360 replica FinalizedReplica, blk_1073751184_10360, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751184 for deletion 2025-07-16 06:42:52,476 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751184_10360 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751184 2025-07-16 06:44:48,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751186_10362 src: /192.168.158.6:38756 dest: /192.168.158.4:9866 2025-07-16 06:44:48,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38756, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_244524943_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751186_10362, duration(ns): 16274608 2025-07-16 06:44:48,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751186_10362, type=LAST_IN_PIPELINE terminating 2025-07-16 06:44:49,483 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751186_10362 replica FinalizedReplica, blk_1073751186_10362, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751186 for deletion 2025-07-16 06:44:49,484 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751186_10362 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751186 2025-07-16 06:45:48,609 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751187_10363 src: /192.168.158.5:56448 dest: /192.168.158.4:9866 2025-07-16 06:45:48,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56448, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1139289448_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751187_10363, duration(ns): 17792107 2025-07-16 06:45:48,632 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751187_10363, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-16 06:45:52,485 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751187_10363 replica FinalizedReplica, blk_1073751187_10363, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751187 for deletion 2025-07-16 06:45:52,486 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751187_10363 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751187 2025-07-16 06:51:58,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751193_10369 src: /192.168.158.9:40284 dest: /192.168.158.4:9866 2025-07-16 06:51:58,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40284, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_323282560_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751193_10369, duration(ns): 15648760 2025-07-16 06:51:58,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751193_10369, type=LAST_IN_PIPELINE terminating 2025-07-16 06:52:01,504 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751193_10369 replica FinalizedReplica, blk_1073751193_10369, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751193 for deletion 2025-07-16 06:52:01,505 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751193_10369 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751193 2025-07-16 06:52:58,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751194_10370 src: /192.168.158.9:37498 dest: /192.168.158.4:9866 2025-07-16 06:52:58,699 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37498, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-32643609_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751194_10370, duration(ns): 23343868 2025-07-16 06:52:58,699 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751194_10370, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-16 06:53:04,507 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751194_10370 replica FinalizedReplica, blk_1073751194_10370, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751194 for deletion 2025-07-16 06:53:04,508 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751194_10370 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751194 2025-07-16 06:53:58,647 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751195_10371 src: /192.168.158.8:36754 dest: /192.168.158.4:9866 2025-07-16 06:53:58,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36754, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1814927425_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751195_10371, duration(ns): 16413046 2025-07-16 06:53:58,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751195_10371, type=LAST_IN_PIPELINE terminating 2025-07-16 06:54:01,509 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751195_10371 replica FinalizedReplica, blk_1073751195_10371, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751195 for deletion 2025-07-16 06:54:01,510 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751195_10371 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751195 2025-07-16 06:55:58,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751197_10373 src: /192.168.158.8:56666 dest: /192.168.158.4:9866 2025-07-16 06:55:58,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56666, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-42014148_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751197_10373, duration(ns): 18973132 2025-07-16 06:55:58,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751197_10373, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-16 06:56:01,515 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751197_10373 replica FinalizedReplica, blk_1073751197_10373, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751197 for deletion 2025-07-16 06:56:01,516 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751197_10373 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751197 2025-07-16 06:57:58,663 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751199_10375 src: /192.168.158.9:34720 dest: /192.168.158.4:9866 2025-07-16 06:57:58,688 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34720, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1163941182_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751199_10375, duration(ns): 19240269 2025-07-16 06:57:58,688 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751199_10375, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-16 06:58:04,519 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751199_10375 replica FinalizedReplica, blk_1073751199_10375, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751199 for deletion 2025-07-16 06:58:04,520 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751199_10375 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751199 2025-07-16 06:58:58,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751200_10376 src: 
/192.168.158.9:36654 dest: /192.168.158.4:9866 2025-07-16 06:58:58,688 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36654, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1778295064_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751200_10376, duration(ns): 15893281 2025-07-16 06:58:58,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751200_10376, type=LAST_IN_PIPELINE terminating 2025-07-16 06:59:04,524 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751200_10376 replica FinalizedReplica, blk_1073751200_10376, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751200 for deletion 2025-07-16 06:59:04,525 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751200_10376 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751200 2025-07-16 07:01:58,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751203_10379 src: /192.168.158.1:41662 dest: /192.168.158.4:9866 2025-07-16 07:01:58,704 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41662, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2056516632_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751203_10379, duration(ns): 27095783 2025-07-16 07:01:58,704 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751203_10379, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-16 07:02:01,528 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751203_10379 replica FinalizedReplica, blk_1073751203_10379, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751203 for deletion 2025-07-16 07:02:01,529 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751203_10379 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751203 2025-07-16 07:03:58,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751205_10381 src: /192.168.158.9:58898 dest: /192.168.158.4:9866 2025-07-16 07:03:58,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1997436797_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751205_10381, duration(ns): 19281128 2025-07-16 07:03:58,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751205_10381, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-16 07:04:04,534 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751205_10381 replica FinalizedReplica, blk_1073751205_10381, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751205 for deletion
2025-07-16 07:04:04,536 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751205_10381 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751205
2025-07-16 07:04:58,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751206_10382 src: /192.168.158.7:51540 dest: /192.168.158.4:9866
2025-07-16 07:04:58,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51540, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1109418477_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751206_10382, duration(ns): 15838495
2025-07-16 07:04:58,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751206_10382, type=LAST_IN_PIPELINE terminating
2025-07-16 07:05:04,537 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751206_10382 replica FinalizedReplica, blk_1073751206_10382, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751206 for deletion
2025-07-16 07:05:04,538 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751206_10382 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751206
2025-07-16 07:05:58,658 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751207_10383 src: /192.168.158.8:40912 dest: /192.168.158.4:9866
2025-07-16 07:05:58,678 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40912, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-680833277_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751207_10383, duration(ns): 17846920
2025-07-16 07:05:58,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751207_10383, type=LAST_IN_PIPELINE terminating
2025-07-16 07:06:04,541 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751207_10383 replica FinalizedReplica, blk_1073751207_10383, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751207 for deletion
2025-07-16 07:06:04,542 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751207_10383 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751207
2025-07-16 07:15:03,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751216_10392 src: /192.168.158.9:51050 dest: /192.168.158.4:9866
2025-07-16 07:15:03,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51050, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1986402117_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751216_10392, duration(ns): 17090891
2025-07-16 07:15:03,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751216_10392, type=LAST_IN_PIPELINE terminating
2025-07-16 07:15:04,557 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751216_10392 replica FinalizedReplica, blk_1073751216_10392, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751216 for deletion
2025-07-16 07:15:04,558 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751216_10392 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751216
2025-07-16 07:18:03,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751219_10395 src: /192.168.158.1:32890 dest: /192.168.158.4:9866
2025-07-16 07:18:03,705 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:32890, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1094559838_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751219_10395, duration(ns): 20672720
2025-07-16 07:18:03,705 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751219_10395, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-16 07:18:04,557 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751219_10395 replica FinalizedReplica, blk_1073751219_10395, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751219 for deletion
2025-07-16 07:18:04,558 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751219_10395 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751219
2025-07-16 07:20:03,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751221_10397 src: /192.168.158.5:48296 dest: /192.168.158.4:9866
2025-07-16 07:20:03,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48296, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2115071556_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751221_10397, duration(ns): 15598112
2025-07-16 07:20:03,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751221_10397, type=LAST_IN_PIPELINE terminating
2025-07-16 07:20:07,558 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751221_10397 replica FinalizedReplica, blk_1073751221_10397, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751221 for deletion
2025-07-16 07:20:07,559 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751221_10397 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751221
2025-07-16 07:23:03,691 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751224_10400 src: /192.168.158.8:41406 dest: /192.168.158.4:9866
2025-07-16 07:23:03,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41406, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-386351515_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751224_10400, duration(ns): 20554385
2025-07-16 07:23:03,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751224_10400, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 07:23:07,560 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751224_10400 replica FinalizedReplica, blk_1073751224_10400, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751224 for deletion
2025-07-16 07:23:07,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751224_10400 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751224
2025-07-16 07:24:03,691 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751225_10401 src: /192.168.158.8:36788 dest: /192.168.158.4:9866
2025-07-16 07:24:03,708 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36788, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2049333674_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751225_10401, duration(ns): 15364000
2025-07-16 07:24:03,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751225_10401, type=LAST_IN_PIPELINE terminating
2025-07-16 07:24:04,560 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751225_10401 replica FinalizedReplica, blk_1073751225_10401, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751225 for deletion
2025-07-16 07:24:04,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751225_10401 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751225
2025-07-16 07:25:03,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751226_10402 src: /192.168.158.5:59178 dest: /192.168.158.4:9866
2025-07-16 07:25:03,720 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59178, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1518113629_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751226_10402, duration(ns): 19685449
2025-07-16 07:25:03,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751226_10402, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-16 07:25:07,561 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751226_10402 replica FinalizedReplica, blk_1073751226_10402, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751226 for deletion
2025-07-16 07:25:07,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751226_10402 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751226
2025-07-16 07:26:03,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751227_10403 src: /192.168.158.8:49570 dest: /192.168.158.4:9866
2025-07-16 07:26:03,716 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49570, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1424342847_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751227_10403, duration(ns): 17308925
2025-07-16 07:26:03,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751227_10403, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-16 07:26:04,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751227_10403 replica FinalizedReplica, blk_1073751227_10403, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751227 for deletion
2025-07-16 07:26:04,564 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751227_10403 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751227
2025-07-16 07:30:03,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751231_10407 src: /192.168.158.5:50610 dest: /192.168.158.4:9866
2025-07-16 07:30:03,730 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50610, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_921646366_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751231_10407, duration(ns): 18821846
2025-07-16 07:30:03,730 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751231_10407, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 07:30:04,572 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751231_10407 replica FinalizedReplica, blk_1073751231_10407, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751231 for deletion
2025-07-16 07:30:04,573 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751231_10407 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751231
2025-07-16 07:31:03,705 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751232_10408 src: /192.168.158.5:34636 dest: /192.168.158.4:9866
2025-07-16 07:31:03,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34636, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_710762548_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751232_10408, duration(ns): 19949812
2025-07-16 07:31:03,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751232_10408, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-16 07:31:04,573 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751232_10408 replica FinalizedReplica, blk_1073751232_10408, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751232 for deletion
2025-07-16 07:31:04,575 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751232_10408 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751232
2025-07-16 07:32:03,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751233_10409 src: /192.168.158.8:55900 dest: /192.168.158.4:9866
2025-07-16 07:32:03,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55900, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1439829684_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751233_10409, duration(ns): 14701830
2025-07-16 07:32:03,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751233_10409, type=LAST_IN_PIPELINE terminating
2025-07-16 07:32:07,576 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751233_10409 replica FinalizedReplica, blk_1073751233_10409, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751233 for deletion
2025-07-16 07:32:07,577 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751233_10409 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751233
2025-07-16 07:36:08,711 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751237_10413 src: /192.168.158.1:53812 dest: /192.168.158.4:9866
2025-07-16 07:36:08,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53812, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1101617422_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751237_10413, duration(ns): 23367133
2025-07-16 07:36:08,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751237_10413, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-16 07:36:13,582 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751237_10413 replica FinalizedReplica, blk_1073751237_10413, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751237 for deletion
2025-07-16 07:36:13,584 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751237_10413 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751237
2025-07-16 07:37:08,718 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751238_10414 src: /192.168.158.8:40976 dest: /192.168.158.4:9866
2025-07-16 07:37:08,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40976, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-465139519_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751238_10414, duration(ns): 19728263
2025-07-16 07:37:08,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751238_10414, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-16 07:37:10,584 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751238_10414 replica FinalizedReplica, blk_1073751238_10414, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751238 for deletion
2025-07-16 07:37:10,585 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751238_10414 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751238
2025-07-16 07:42:13,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751243_10419 src: /192.168.158.1:34188 dest: /192.168.158.4:9866
2025-07-16 07:42:13,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34188, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-910569685_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751243_10419, duration(ns): 22623163
2025-07-16 07:42:13,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751243_10419, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-16 07:42:16,586 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751243_10419 replica FinalizedReplica, blk_1073751243_10419, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751243 for deletion
2025-07-16 07:42:16,588 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751243_10419 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751243
2025-07-16 07:43:13,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751244_10420 src: /192.168.158.9:49840 dest: /192.168.158.4:9866
2025-07-16 07:43:13,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49840, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1980954385_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751244_10420, duration(ns): 15365275
2025-07-16 07:43:13,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751244_10420, type=LAST_IN_PIPELINE terminating
2025-07-16 07:43:19,588 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751244_10420 replica FinalizedReplica, blk_1073751244_10420, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751244 for deletion
2025-07-16 07:43:19,589 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751244_10420 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751244
2025-07-16 07:44:13,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751245_10421 src: /192.168.158.1:51294 dest: /192.168.158.4:9866
2025-07-16 07:44:13,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51294, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_333454973_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751245_10421, duration(ns): 22825734
2025-07-16 07:44:13,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751245_10421, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-16 07:44:16,589 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751245_10421 replica FinalizedReplica, blk_1073751245_10421, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751245 for deletion
2025-07-16 07:44:16,590 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751245_10421 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751245
2025-07-16 07:46:18,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751247_10423 src: /192.168.158.5:40856 dest: /192.168.158.4:9866
2025-07-16 07:46:18,770 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40856, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-208301746_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751247_10423, duration(ns): 21281102
2025-07-16 07:46:18,770 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751247_10423, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-16 07:46:22,588 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751247_10423 replica FinalizedReplica, blk_1073751247_10423, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751247 for deletion
2025-07-16 07:46:22,590 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751247_10423 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751247
2025-07-16 07:48:18,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751249_10425 src: /192.168.158.1:48488 dest: /192.168.158.4:9866
2025-07-16 07:48:18,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48488, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_72273989_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751249_10425, duration(ns): 23453329
2025-07-16 07:48:18,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751249_10425, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-16 07:48:19,589 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751249_10425 replica FinalizedReplica, blk_1073751249_10425, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751249 for deletion
2025-07-16 07:48:19,591 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751249_10425 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751249
2025-07-16 07:49:18,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751250_10426 src: /192.168.158.6:34804 dest: /192.168.158.4:9866
2025-07-16 07:49:18,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34804, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1311603283_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751250_10426, duration(ns): 19900501
2025-07-16 07:49:18,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751250_10426, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 07:49:19,592 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751250_10426 replica FinalizedReplica, blk_1073751250_10426, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751250 for deletion
2025-07-16 07:49:19,593 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751250_10426 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751250
2025-07-16 07:50:18,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751251_10427 src: /192.168.158.9:32962 dest: /192.168.158.4:9866
2025-07-16 07:50:18,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:32962, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-911583924_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751251_10427, duration(ns): 19810262
2025-07-16 07:50:18,770 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751251_10427, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 07:50:19,593 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751251_10427 replica FinalizedReplica, blk_1073751251_10427, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751251 for deletion
2025-07-16 07:50:19,594 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751251_10427 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751251
2025-07-16 07:52:18,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751253_10429 src: /192.168.158.1:38732 dest: /192.168.158.4:9866
2025-07-16 07:52:18,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38732, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1217038669_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751253_10429, duration(ns): 23147529
2025-07-16 07:52:18,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751253_10429, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-16 07:52:22,595 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751253_10429 replica FinalizedReplica, blk_1073751253_10429, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751253 for deletion
2025-07-16 07:52:22,596 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751253_10429 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751253
2025-07-16 07:53:18,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751254_10430 src: /192.168.158.1:52462 dest: /192.168.158.4:9866
2025-07-16 07:53:18,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52462, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1666322626_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751254_10430, duration(ns): 28934303
2025-07-16 07:53:18,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751254_10430, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-16 07:53:19,596 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751254_10430 replica FinalizedReplica, blk_1073751254_10430, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751254 for deletion
2025-07-16 07:53:19,597 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751254_10430 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751254
2025-07-16 07:56:23,770 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751257_10433 src: /192.168.158.7:40712 dest: /192.168.158.4:9866
2025-07-16 07:56:23,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_243653454_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751257_10433, duration(ns): 15290085
2025-07-16 07:56:23,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751257_10433, type=LAST_IN_PIPELINE terminating
2025-07-16 07:56:25,599 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751257_10433 replica FinalizedReplica, blk_1073751257_10433, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751257 for deletion
2025-07-16 07:56:25,600 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751257_10433 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751257
2025-07-16 07:58:23,764 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751259_10435 src: /192.168.158.8:50932 dest: /192.168.158.4:9866
2025-07-16 07:58:23,790 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50932, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_328231443_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751259_10435, duration(ns): 20835495
2025-07-16 07:58:23,790 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751259_10435, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-16 07:58:28,605 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751259_10435 replica FinalizedReplica, blk_1073751259_10435, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751259 for deletion
2025-07-16 07:58:28,606 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751259_10435 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751259
2025-07-16 07:59:28,756 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751260_10436 src: /192.168.158.1:54306 dest: /192.168.158.4:9866
2025-07-16 07:59:28,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54306, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1178757944_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751260_10436, duration(ns): 22853029
2025-07-16 07:59:28,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751260_10436, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-16 07:59:34,610 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751260_10436 replica FinalizedReplica, blk_1073751260_10436, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751260 for deletion
2025-07-16 07:59:34,611 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751260_10436 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751260
2025-07-16 08:00:55,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f40, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 1 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-16 08:00:55,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360 2025-07-16 08:01:28,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751262_10438 src: /192.168.158.1:45408 dest: /192.168.158.4:9866 2025-07-16 08:01:28,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45408, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2065737970_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751262_10438, duration(ns): 21066060 2025-07-16 08:01:28,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751262_10438, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-16 08:01:34,617 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751262_10438 replica FinalizedReplica, blk_1073751262_10438, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751262 for deletion 2025-07-16 08:01:34,618 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751262_10438 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751262 2025-07-16 08:04:33,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751265_10441 src: /192.168.158.1:55472 dest: /192.168.158.4:9866 2025-07-16 08:04:33,787 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_606077911_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751265_10441, duration(ns): 21338618 2025-07-16 08:04:33,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751265_10441, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-16 08:04:34,624 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751265_10441 replica FinalizedReplica, blk_1073751265_10441, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751265 for deletion 2025-07-16 08:04:34,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751265_10441 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751265 2025-07-16 08:05:33,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751266_10442 src: /192.168.158.9:56948 dest: /192.168.158.4:9866 2025-07-16 08:05:33,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56948, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_402042933_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751266_10442, duration(ns): 13831990 2025-07-16 08:05:33,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751266_10442, type=LAST_IN_PIPELINE terminating 2025-07-16 08:05:34,626 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751266_10442 replica FinalizedReplica, blk_1073751266_10442, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751266 for deletion 2025-07-16 08:05:34,627 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751266_10442 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751266 2025-07-16 08:07:33,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751268_10444 src: /192.168.158.1:42072 dest: /192.168.158.4:9866 2025-07-16 08:07:33,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42072, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1262556809_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751268_10444, duration(ns): 20031218 2025-07-16 08:07:33,793 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751268_10444, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-16 08:07:37,634 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751268_10444 replica FinalizedReplica, blk_1073751268_10444, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn 
getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751268 for deletion 2025-07-16 08:07:37,635 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751268_10444 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751268 2025-07-16 08:09:33,780 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751270_10446 src: /192.168.158.6:52096 dest: /192.168.158.4:9866 2025-07-16 08:09:33,800 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52096, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_812155220_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751270_10446, duration(ns): 17304329 2025-07-16 08:09:33,800 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751270_10446, type=LAST_IN_PIPELINE terminating 2025-07-16 08:09:37,638 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751270_10446 replica FinalizedReplica, blk_1073751270_10446, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751270 for deletion 2025-07-16 08:09:37,639 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751270_10446 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751270 2025-07-16 08:13:33,770 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751274_10450 src: /192.168.158.1:47630 dest: /192.168.158.4:9866 2025-07-16 08:13:33,800 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47630, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1569040752_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751274_10450, duration(ns): 21963929 2025-07-16 08:13:33,800 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751274_10450, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-16 08:13:34,647 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751274_10450 replica FinalizedReplica, blk_1073751274_10450, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751274 for deletion 2025-07-16 08:13:34,648 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751274_10450 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751274 2025-07-16 08:18:38,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751279_10455 src: /192.168.158.6:37832 dest: /192.168.158.4:9866 2025-07-16 08:18:38,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37832, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2135699321_107, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751279_10455, duration(ns): 18836491 2025-07-16 08:18:38,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751279_10455, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-16 08:18:40,657 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751279_10455 replica FinalizedReplica, blk_1073751279_10455, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751279 for deletion 2025-07-16 08:18:40,658 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751279_10455 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751279 2025-07-16 08:19:43,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751280_10456 src: /192.168.158.5:43742 dest: /192.168.158.4:9866 2025-07-16 08:19:43,826 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:43742, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-504749997_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751280_10456, duration(ns): 20975399 2025-07-16 08:19:43,826 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751280_10456, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-16 08:19:46,660 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751280_10456 replica FinalizedReplica, blk_1073751280_10456, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751280 for deletion 2025-07-16 08:19:46,662 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751280_10456 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751280 2025-07-16 08:20:48,782 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751281_10457 src: /192.168.158.1:54790 dest: /192.168.158.4:9866 2025-07-16 08:20:48,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54790, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1557249075_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751281_10457, duration(ns): 23359528 2025-07-16 08:20:48,814 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751281_10457, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-16 08:20:52,662 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751281_10457 replica FinalizedReplica, blk_1073751281_10457, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751281 for deletion 
2025-07-16 08:20:52,663 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751281_10457 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751281 2025-07-16 08:23:48,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751284_10460 src: /192.168.158.7:58616 dest: /192.168.158.4:9866 2025-07-16 08:23:48,806 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58616, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1351103166_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751284_10460, duration(ns): 16651093 2025-07-16 08:23:48,806 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751284_10460, type=LAST_IN_PIPELINE terminating 2025-07-16 08:23:49,669 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751284_10460 replica FinalizedReplica, blk_1073751284_10460, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751284 for deletion 2025-07-16 08:23:49,670 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751284_10460 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751284 2025-07-16 08:31:48,805 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751292_10468 src: /192.168.158.7:58904 
dest: /192.168.158.4:9866 2025-07-16 08:31:48,821 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58904, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-648338755_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751292_10468, duration(ns): 13719481 2025-07-16 08:31:48,821 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751292_10468, type=LAST_IN_PIPELINE terminating 2025-07-16 08:31:52,709 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751292_10468 replica FinalizedReplica, blk_1073751292_10468, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751292 for deletion 2025-07-16 08:31:52,711 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751292_10468 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751292 2025-07-16 08:32:48,807 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751293_10469 src: /192.168.158.5:37696 dest: /192.168.158.4:9866 2025-07-16 08:32:48,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37696, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1372462745_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751293_10469, duration(ns): 18706973 2025-07-16 08:32:48,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751293_10469, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-16 08:32:49,696 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751293_10469 replica FinalizedReplica, blk_1073751293_10469, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751293 for deletion 2025-07-16 08:32:49,697 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751293_10469 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751293 2025-07-16 08:34:53,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751295_10471 src: /192.168.158.5:42774 dest: /192.168.158.4:9866 2025-07-16 08:34:53,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42774, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_392033204_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751295_10471, duration(ns): 15831968 2025-07-16 08:34:53,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751295_10471, type=LAST_IN_PIPELINE terminating 2025-07-16 08:34:55,704 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751295_10471 replica FinalizedReplica, blk_1073751295_10471, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751295 for deletion 2025-07-16 08:34:55,706 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751295_10471 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073751295 2025-07-16 08:35:53,814 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751296_10472 src: /192.168.158.9:50830 dest: /192.168.158.4:9866 2025-07-16 08:35:53,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50830, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1754868895_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751296_10472, duration(ns): 15651344 2025-07-16 08:35:53,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751296_10472, type=LAST_IN_PIPELINE terminating 2025-07-16 08:35:55,705 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751296_10472 replica FinalizedReplica, blk_1073751296_10472, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751296 for deletion 2025-07-16 08:35:55,706 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751296_10472 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751296 2025-07-16 08:36:53,814 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751297_10473 src: /192.168.158.9:42924 dest: /192.168.158.4:9866 2025-07-16 08:36:53,838 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42924, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1360233545_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751297_10473, duration(ns): 18370918 2025-07-16 08:36:53,838 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751297_10473, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-16 08:36:58,707 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751297_10473 replica FinalizedReplica, blk_1073751297_10473, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751297 for deletion 2025-07-16 08:36:58,708 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751297_10473 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751297 2025-07-16 08:40:53,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751301_10477 src: /192.168.158.8:49604 dest: /192.168.158.4:9866 2025-07-16 08:40:53,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49604, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1299249346_107, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751301_10477, duration(ns): 17977822 2025-07-16 08:40:53,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751301_10477, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-16 08:40:58,714 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751301_10477 replica FinalizedReplica, blk_1073751301_10477, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751301 for deletion 2025-07-16 08:40:58,715 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751301_10477 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751301 2025-07-16 08:42:53,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751303_10479 src: /192.168.158.9:45578 dest: /192.168.158.4:9866 2025-07-16 08:42:53,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45578, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_117992068_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751303_10479, duration(ns): 17255330 2025-07-16 08:42:53,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751303_10479, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-16 08:42:58,720 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751303_10479 replica FinalizedReplica, blk_1073751303_10479, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751303 for deletion
2025-07-16 08:42:58,721 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751303_10479 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751303
2025-07-16 08:43:53,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751304_10480 src: /192.168.158.1:47938 dest: /192.168.158.4:9866
2025-07-16 08:43:53,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47938, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1110444346_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751304_10480, duration(ns): 21000220
2025-07-16 08:43:53,856 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751304_10480, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-16 08:43:55,721 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751304_10480 replica FinalizedReplica, blk_1073751304_10480, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751304 for deletion
2025-07-16 08:43:55,722 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751304_10480 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751304
2025-07-16 08:44:53,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751305_10481 src: /192.168.158.7:53988 dest: /192.168.158.4:9866
2025-07-16 08:44:53,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53988, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_65882890_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751305_10481, duration(ns): 19692029
2025-07-16 08:44:53,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751305_10481, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 08:44:55,723 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751305_10481 replica FinalizedReplica, blk_1073751305_10481, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751305 for deletion
2025-07-16 08:44:55,724 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751305_10481 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751305
2025-07-16 08:45:53,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751306_10482 src: /192.168.158.7:34204 dest: /192.168.158.4:9866
2025-07-16 08:45:53,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34204, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-711903719_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751306_10482, duration(ns): 16575856
2025-07-16 08:45:53,851 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751306_10482, type=LAST_IN_PIPELINE terminating
2025-07-16 08:45:58,726 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751306_10482 replica FinalizedReplica, blk_1073751306_10482, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751306 for deletion
2025-07-16 08:45:58,727 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751306_10482 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751306
2025-07-16 08:46:53,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751307_10483 src: /192.168.158.5:51790 dest: /192.168.158.4:9866
2025-07-16 08:46:53,865 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51790, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1257677419_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751307_10483, duration(ns): 18174652
2025-07-16 08:46:53,865 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751307_10483, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 08:46:55,726 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751307_10483 replica FinalizedReplica, blk_1073751307_10483, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751307 for deletion
2025-07-16 08:46:55,727 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751307_10483 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751307
2025-07-16 08:47:53,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751308_10484 src: /192.168.158.1:54538 dest: /192.168.158.4:9866
2025-07-16 08:47:53,859 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54538, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1796843163_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751308_10484, duration(ns): 20942198
2025-07-16 08:47:53,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751308_10484, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-16 08:47:58,728 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751308_10484 replica FinalizedReplica, blk_1073751308_10484, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751308 for deletion
2025-07-16 08:47:58,729 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751308_10484 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751308
2025-07-16 08:48:53,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751309_10485 src: /192.168.158.7:41564 dest: /192.168.158.4:9866
2025-07-16 08:48:53,856 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41564, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-627630802_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751309_10485, duration(ns): 18933626
2025-07-16 08:48:53,857 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751309_10485, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-16 08:48:58,732 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751309_10485 replica FinalizedReplica, blk_1073751309_10485, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751309 for deletion
2025-07-16 08:48:58,733 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751309_10485 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751309
2025-07-16 08:49:53,838 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751310_10486 src: /192.168.158.6:40636 dest: /192.168.158.4:9866
2025-07-16 08:49:53,862 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40636, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_513640856_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751310_10486, duration(ns): 18122347
2025-07-16 08:49:53,862 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751310_10486, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 08:49:55,734 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751310_10486 replica FinalizedReplica, blk_1073751310_10486, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751310 for deletion
2025-07-16 08:49:55,735 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751310_10486 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751310
2025-07-16 08:51:58,838 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751312_10488 src: /192.168.158.1:58906 dest: /192.168.158.4:9866
2025-07-16 08:51:58,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58906, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1964051996_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751312_10488, duration(ns): 27293911
2025-07-16 08:51:58,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751312_10488, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-16 08:52:01,737 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751312_10488 replica FinalizedReplica, blk_1073751312_10488, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751312 for deletion
2025-07-16 08:52:01,738 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751312_10488 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751312
2025-07-16 08:52:58,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751313_10489 src: /192.168.158.5:36750 dest: /192.168.158.4:9866
2025-07-16 08:52:58,863 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36750, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2092697397_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751313_10489, duration(ns): 18471575
2025-07-16 08:52:58,863 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751313_10489, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-16 08:53:01,739 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751313_10489 replica FinalizedReplica, blk_1073751313_10489, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751313 for deletion
2025-07-16 08:53:01,741 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751313_10489 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751313
2025-07-16 08:53:58,838 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751314_10490 src: /192.168.158.1:36126 dest: /192.168.158.4:9866
2025-07-16 08:53:58,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36126, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_912362986_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751314_10490, duration(ns): 22331394
2025-07-16 08:53:58,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751314_10490, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-16 08:54:01,740 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751314_10490 replica FinalizedReplica, blk_1073751314_10490, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751314 for deletion
2025-07-16 08:54:01,741 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751314_10490 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751314
2025-07-16 08:54:58,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751315_10491 src: /192.168.158.9:42922 dest: /192.168.158.4:9866
2025-07-16 08:54:58,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42922, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1736026874_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751315_10491, duration(ns): 20203664
2025-07-16 08:54:58,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751315_10491, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-16 08:55:01,743 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751315_10491 replica FinalizedReplica, blk_1073751315_10491, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751315 for deletion
2025-07-16 08:55:01,744 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751315_10491 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751315
2025-07-16 08:55:58,848 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751316_10492 src: /192.168.158.7:51528 dest: /192.168.158.4:9866
2025-07-16 08:55:58,872 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51528, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1542679053_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751316_10492, duration(ns): 19098814
2025-07-16 08:55:58,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751316_10492, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 08:56:04,744 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751316_10492 replica FinalizedReplica, blk_1073751316_10492, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751316 for deletion
2025-07-16 08:56:04,745 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751316_10492 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751316
2025-07-16 09:00:58,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751321_10497 src: /192.168.158.8:37732 dest: /192.168.158.4:9866
2025-07-16 09:00:58,885 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37732, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_747897982_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751321_10497, duration(ns): 17040931
2025-07-16 09:00:58,885 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751321_10497, type=LAST_IN_PIPELINE terminating
2025-07-16 09:01:04,752 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751321_10497 replica FinalizedReplica, blk_1073751321_10497, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751321 for deletion
2025-07-16 09:01:04,753 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751321_10497 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751321
2025-07-16 09:07:03,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751327_10503 src: /192.168.158.7:34626 dest: /192.168.158.4:9866
2025-07-16 09:07:03,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34626, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1006861373_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751327_10503, duration(ns): 18179814
2025-07-16 09:07:03,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751327_10503, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 09:07:04,769 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751327_10503 replica FinalizedReplica, blk_1073751327_10503, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751327 for deletion
2025-07-16 09:07:04,770 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751327_10503 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751327
2025-07-16 09:08:08,870 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751328_10504 src: /192.168.158.5:38394 dest: /192.168.158.4:9866
2025-07-16 09:08:08,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38394, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1564464679_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751328_10504, duration(ns): 19748066
2025-07-16 09:08:08,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751328_10504, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-16 09:08:13,772 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751328_10504 replica FinalizedReplica, blk_1073751328_10504, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751328 for deletion
2025-07-16 09:08:13,773 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751328_10504 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751328
2025-07-16 09:11:13,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751331_10507 src: /192.168.158.6:36514 dest: /192.168.158.4:9866
2025-07-16 09:11:13,897 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36514, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-688548452_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751331_10507, duration(ns): 18007078
2025-07-16 09:11:13,897 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751331_10507, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-16 09:11:16,780 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751331_10507 replica FinalizedReplica, blk_1073751331_10507, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751331 for deletion
2025-07-16 09:11:16,781 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751331_10507 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751331
2025-07-16 09:12:13,875 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751332_10508 src: /192.168.158.1:39750 dest: /192.168.158.4:9866
2025-07-16 09:12:13,911 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39750, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_25525400_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751332_10508, duration(ns): 26523461
2025-07-16 09:12:13,911 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751332_10508, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-16 09:12:19,782 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751332_10508 replica FinalizedReplica, blk_1073751332_10508, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751332 for deletion
2025-07-16 09:12:19,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751332_10508 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751332
2025-07-16 09:13:13,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751333_10509 src: /192.168.158.9:59516 dest: /192.168.158.4:9866
2025-07-16 09:13:13,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59516, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_812734555_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751333_10509, duration(ns): 20747465
2025-07-16 09:13:13,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751333_10509, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 09:13:16,785 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751333_10509 replica FinalizedReplica, blk_1073751333_10509, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751333 for deletion
2025-07-16 09:13:16,786 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751333_10509 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751333
2025-07-16 09:14:13,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751334_10510 src: /192.168.158.1:51696 dest: /192.168.158.4:9866
2025-07-16 09:14:13,915 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51696, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1032397603_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751334_10510, duration(ns): 22827849
2025-07-16 09:14:13,915 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751334_10510, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-16 09:14:16,788 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751334_10510 replica FinalizedReplica, blk_1073751334_10510, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751334 for deletion
2025-07-16 09:14:16,789 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751334_10510 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751334
2025-07-16 09:15:13,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751335_10511 src: /192.168.158.9:55946 dest: /192.168.158.4:9866
2025-07-16 09:15:13,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55946, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_945925085_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751335_10511, duration(ns): 15389531
2025-07-16 09:15:13,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751335_10511, type=LAST_IN_PIPELINE terminating
2025-07-16 09:15:16,789 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751335_10511 replica FinalizedReplica, blk_1073751335_10511, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751335 for deletion
2025-07-16 09:15:16,792 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751335_10511 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751335
2025-07-16 09:16:13,913 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751336_10512 src: /192.168.158.7:49358 dest: /192.168.158.4:9866
2025-07-16 09:16:13,935 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49358, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_832753625_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751336_10512, duration(ns): 20022238
2025-07-16 09:16:13,935 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751336_10512, type=LAST_IN_PIPELINE terminating
2025-07-16 09:16:19,789 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751336_10512 replica FinalizedReplica, blk_1073751336_10512, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751336 for deletion
2025-07-16 09:16:19,790 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751336_10512 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751336
2025-07-16 09:22:13,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751342_10518 src: /192.168.158.1:43768 dest: /192.168.158.4:9866
2025-07-16 09:22:13,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43768, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1022677993_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751342_10518, duration(ns): 22225154
2025-07-16 09:22:13,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751342_10518, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-16 09:22:16,801 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751342_10518 replica FinalizedReplica, blk_1073751342_10518, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751342 for deletion
2025-07-16 09:22:16,802 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751342_10518 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751342
2025-07-16 09:24:13,906 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751344_10520 src: /192.168.158.6:58752 dest: /192.168.158.4:9866
2025-07-16 09:24:13,924 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58752, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1031882975_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751344_10520, duration(ns): 16367625
2025-07-16 09:24:13,924 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751344_10520, type=LAST_IN_PIPELINE terminating
2025-07-16 09:24:19,803 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751344_10520 replica FinalizedReplica, blk_1073751344_10520, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751344 for deletion
2025-07-16 09:24:19,804 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751344_10520 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751344
2025-07-16 09:25:13,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751345_10521 src: /192.168.158.9:55968 dest: /192.168.158.4:9866
2025-07-16 09:25:13,928 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55968, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-385969314_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751345_10521, duration(ns): 16641273
2025-07-16 09:25:13,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751345_10521, type=LAST_IN_PIPELINE terminating
2025-07-16 09:25:19,803 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751345_10521 replica FinalizedReplica, blk_1073751345_10521, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751345 for deletion
2025-07-16 09:25:19,804 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751345_10521 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751345
2025-07-16 09:26:13,908 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751346_10522 src: /192.168.158.7:57070 dest: /192.168.158.4:9866
2025-07-16 09:26:13,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57070, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-596188930_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751346_10522, duration(ns): 18778169
2025-07-16 09:26:13,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751346_10522, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-16 09:26:16,805 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751346_10522 replica FinalizedReplica, blk_1073751346_10522, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751346 for deletion
2025-07-16 09:26:16,807 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751346_10522 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751346
2025-07-16 09:27:13,909 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751347_10523 src: /192.168.158.1:34292 dest: /192.168.158.4:9866
2025-07-16 09:27:13,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34292, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1957474943_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751347_10523, duration(ns): 21682845
2025-07-16 09:27:13,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751347_10523, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-16 09:27:19,811 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751347_10523 replica FinalizedReplica, blk_1073751347_10523, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751347 for deletion
2025-07-16 09:27:19,812 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751347_10523 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751347
2025-07-16 09:29:13,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751349_10525 src: /192.168.158.1:52932 dest: /192.168.158.4:9866
2025-07-16 09:29:13,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52932, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-349289625_107,
offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751349_10525, duration(ns): 21534432 2025-07-16 09:29:13,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751349_10525, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-16 09:29:16,814 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751349_10525 replica FinalizedReplica, blk_1073751349_10525, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751349 for deletion 2025-07-16 09:29:16,815 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751349_10525 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751349 2025-07-16 09:30:18,919 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751350_10526 src: /192.168.158.1:48940 dest: /192.168.158.4:9866 2025-07-16 09:30:18,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48940, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_673892982_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751350_10526, duration(ns): 26201017 2025-07-16 09:30:18,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751350_10526, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-16 
09:30:22,813 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751350_10526 replica FinalizedReplica, blk_1073751350_10526, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751350 for deletion 2025-07-16 09:30:22,815 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751350_10526 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751350 2025-07-16 09:31:23,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751351_10527 src: /192.168.158.1:36622 dest: /192.168.158.4:9866 2025-07-16 09:31:23,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36622, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_631510394_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751351_10527, duration(ns): 21123282 2025-07-16 09:31:23,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751351_10527, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-16 09:31:28,816 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751351_10527 replica FinalizedReplica, blk_1073751351_10527, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751351 
for deletion 2025-07-16 09:31:28,817 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751351_10527 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751351 2025-07-16 09:32:23,923 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751352_10528 src: /192.168.158.9:40418 dest: /192.168.158.4:9866 2025-07-16 09:32:23,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40418, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-890499486_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751352_10528, duration(ns): 18846799 2025-07-16 09:32:23,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751352_10528, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-16 09:32:28,817 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751352_10528 replica FinalizedReplica, blk_1073751352_10528, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751352 for deletion 2025-07-16 09:32:28,819 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751352_10528 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751352 2025-07-16 09:33:23,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073751353_10529 src: /192.168.158.9:43550 dest: /192.168.158.4:9866 2025-07-16 09:33:23,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43550, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1374831775_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751353_10529, duration(ns): 20228092 2025-07-16 09:33:23,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751353_10529, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-16 09:33:25,818 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751353_10529 replica FinalizedReplica, blk_1073751353_10529, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751353 for deletion 2025-07-16 09:33:25,819 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751353_10529 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751353 2025-07-16 09:34:23,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751354_10530 src: /192.168.158.8:48290 dest: /192.168.158.4:9866 2025-07-16 09:34:23,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48290, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1019687125_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073751354_10530, duration(ns): 22947345 2025-07-16 09:34:23,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751354_10530, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-16 09:34:28,823 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751354_10530 replica FinalizedReplica, blk_1073751354_10530, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751354 for deletion 2025-07-16 09:34:28,824 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751354_10530 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751354 2025-07-16 09:36:23,928 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751356_10532 src: /192.168.158.8:42178 dest: /192.168.158.4:9866 2025-07-16 09:36:23,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42178, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1760715148_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751356_10532, duration(ns): 20116862 2025-07-16 09:36:23,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751356_10532, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-16 09:36:25,824 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073751356_10532 replica FinalizedReplica, blk_1073751356_10532, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751356 for deletion 2025-07-16 09:36:25,825 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751356_10532 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751356 2025-07-16 09:38:23,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751358_10534 src: /192.168.158.1:54410 dest: /192.168.158.4:9866 2025-07-16 09:38:23,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54410, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_170940198_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751358_10534, duration(ns): 21407604 2025-07-16 09:38:23,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751358_10534, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-16 09:38:25,827 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751358_10534 replica FinalizedReplica, blk_1073751358_10534, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751358 for deletion 2025-07-16 09:38:25,828 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751358_10534 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751358 2025-07-16 09:39:23,931 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751359_10535 src: /192.168.158.1:51982 dest: /192.168.158.4:9866 2025-07-16 09:39:23,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51982, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_649428524_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751359_10535, duration(ns): 27282398 2025-07-16 09:39:23,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751359_10535, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-16 09:39:28,828 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751359_10535 replica FinalizedReplica, blk_1073751359_10535, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751359 for deletion 2025-07-16 09:39:28,829 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751359_10535 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751359 2025-07-16 09:43:23,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073751363_10539 src: /192.168.158.8:40572 dest: /192.168.158.4:9866 2025-07-16 09:43:23,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40572, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1310764109_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751363_10539, duration(ns): 17693426 2025-07-16 09:43:23,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751363_10539, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-16 09:43:28,840 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751363_10539 replica FinalizedReplica, blk_1073751363_10539, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751363 for deletion 2025-07-16 09:43:28,841 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751363_10539 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751363 2025-07-16 09:44:28,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751364_10540 src: /192.168.158.1:35484 dest: /192.168.158.4:9866 2025-07-16 09:44:28,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35484, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1205022986_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073751364_10540, duration(ns): 21281273 2025-07-16 09:44:28,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751364_10540, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-16 09:44:31,841 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751364_10540 replica FinalizedReplica, blk_1073751364_10540, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751364 for deletion 2025-07-16 09:44:31,842 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751364_10540 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751364 2025-07-16 09:47:28,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751367_10543 src: /192.168.158.7:48574 dest: /192.168.158.4:9866 2025-07-16 09:47:28,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48574, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1774299810_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751367_10543, duration(ns): 18760413 2025-07-16 09:47:28,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751367_10543, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-16 09:47:31,844 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751367_10543 replica FinalizedReplica, blk_1073751367_10543, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751367 for deletion 2025-07-16 09:47:31,845 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751367_10543 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751367 2025-07-16 09:49:33,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751369_10545 src: /192.168.158.5:51340 dest: /192.168.158.4:9866 2025-07-16 09:49:33,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51340, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1778717589_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751369_10545, duration(ns): 15254482 2025-07-16 09:49:33,981 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751369_10545, type=LAST_IN_PIPELINE terminating 2025-07-16 09:49:34,846 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751369_10545 replica FinalizedReplica, blk_1073751369_10545, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751369 for deletion 2025-07-16 09:49:34,847 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751369_10545 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751369 2025-07-16 09:51:33,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751371_10547 src: /192.168.158.6:34890 dest: /192.168.158.4:9866 2025-07-16 09:51:33,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34890, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1648582245_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751371_10547, duration(ns): 19399567 2025-07-16 09:51:33,996 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751371_10547, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-16 09:51:37,851 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751371_10547 replica FinalizedReplica, blk_1073751371_10547, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751371 for deletion 2025-07-16 09:51:37,853 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751371_10547 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751371 2025-07-16 09:54:33,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751374_10550 src: 
/192.168.158.1:51126 dest: /192.168.158.4:9866 2025-07-16 09:54:34,017 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51126, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_566626474_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751374_10550, duration(ns): 24647644 2025-07-16 09:54:34,017 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751374_10550, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-16 09:54:34,862 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751374_10550 replica FinalizedReplica, blk_1073751374_10550, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751374 for deletion 2025-07-16 09:54:34,863 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751374_10550 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751374 2025-07-16 09:56:38,996 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751376_10552 src: /192.168.158.8:57438 dest: /192.168.158.4:9866 2025-07-16 09:56:39,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57438, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-139805229_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751376_10552, duration(ns): 23579371 
2025-07-16 09:56:39,026 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751376_10552, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-16 09:56:43,865 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751376_10552 replica FinalizedReplica, blk_1073751376_10552, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751376 for deletion 2025-07-16 09:56:43,866 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751376_10552 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751376 2025-07-16 09:58:38,998 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751378_10554 src: /192.168.158.6:43680 dest: /192.168.158.4:9866 2025-07-16 09:58:39,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43680, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1098623734_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751378_10554, duration(ns): 19978820 2025-07-16 09:58:39,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751378_10554, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-16 09:58:40,871 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751378_10554 replica FinalizedReplica, blk_1073751378_10554, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751378 for deletion 2025-07-16 09:58:40,873 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751378_10554 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751378 2025-07-16 10:03:39,009 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751383_10559 src: /192.168.158.9:43532 dest: /192.168.158.4:9866 2025-07-16 10:03:39,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43532, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1461941809_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751383_10559, duration(ns): 21525229 2025-07-16 10:03:39,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751383_10559, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-16 10:03:43,886 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751383_10559 replica FinalizedReplica, blk_1073751383_10559, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751383 for deletion 2025-07-16 10:03:43,887 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751383_10559 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751383 2025-07-16 10:05:44,009 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751385_10561 src: /192.168.158.9:38608 dest: /192.168.158.4:9866 2025-07-16 10:05:44,027 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38608, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679293793_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751385_10561, duration(ns): 15175911 2025-07-16 10:05:44,027 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751385_10561, type=LAST_IN_PIPELINE terminating 2025-07-16 10:05:49,886 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751385_10561 replica FinalizedReplica, blk_1073751385_10561, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751385 for deletion 2025-07-16 10:05:49,887 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751385_10561 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751385 2025-07-16 10:08:49,005 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751388_10564 src: /192.168.158.1:42976 dest: /192.168.158.4:9866 2025-07-16 10:08:49,039 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42976, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_860121005_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751388_10564, duration(ns): 26218203 2025-07-16 10:08:49,039 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751388_10564, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-16 10:08:52,893 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751388_10564 replica FinalizedReplica, blk_1073751388_10564, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751388 for deletion 2025-07-16 10:08:52,894 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751388_10564 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751388 2025-07-16 10:11:54,013 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751391_10567 src: /192.168.158.1:41958 dest: /192.168.158.4:9866 2025-07-16 10:11:54,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41958, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1949561585_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751391_10567, duration(ns): 24718683 2025-07-16 10:11:54,047 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751391_10567, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-16 10:11:58,898 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751391_10567 replica FinalizedReplica, blk_1073751391_10567, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751391 for deletion 2025-07-16 10:11:58,899 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751391_10567 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751391 2025-07-16 10:12:54,017 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751392_10568 src: /192.168.158.1:59038 dest: /192.168.158.4:9866 2025-07-16 10:12:54,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59038, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_988618519_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751392_10568, duration(ns): 24049095 2025-07-16 10:12:54,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751392_10568, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-16 10:12:55,900 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751392_10568 replica FinalizedReplica, blk_1073751392_10568, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751392 for deletion 2025-07-16 10:12:55,901 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751392_10568 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751392 2025-07-16 10:15:54,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751395_10571 src: /192.168.158.1:48066 dest: /192.168.158.4:9866 2025-07-16 10:15:54,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48066, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_795142714_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751395_10571, duration(ns): 22871917 2025-07-16 10:15:54,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751395_10571, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-16 10:15:55,902 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751395_10571 replica FinalizedReplica, blk_1073751395_10571, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751395 for deletion 2025-07-16 10:15:55,903 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751395_10571 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751395 2025-07-16 10:16:59,027 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751396_10572 src: /192.168.158.1:48004 dest: /192.168.158.4:9866 2025-07-16 10:16:59,059 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48004, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_16003747_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751396_10572, duration(ns): 22666058 2025-07-16 10:16:59,059 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751396_10572, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-16 10:17:01,903 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751396_10572 replica FinalizedReplica, blk_1073751396_10572, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751396 for deletion 2025-07-16 10:17:01,904 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751396_10572 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751396 2025-07-16 10:17:59,039 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751397_10573 src: /192.168.158.5:36216 dest: /192.168.158.4:9866 2025-07-16 10:17:59,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.5:36216, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2049388049_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751397_10573, duration(ns): 21328249 2025-07-16 10:17:59,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751397_10573, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-16 10:18:01,906 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751397_10573 replica FinalizedReplica, blk_1073751397_10573, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751397 for deletion 2025-07-16 10:18:01,907 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751397_10573 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751397 2025-07-16 10:18:59,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751398_10574 src: /192.168.158.9:50556 dest: /192.168.158.4:9866 2025-07-16 10:18:59,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50556, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_702548519_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751398_10574, duration(ns): 14718090 2025-07-16 10:18:59,053 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751398_10574, 
type=LAST_IN_PIPELINE terminating 2025-07-16 10:19:01,908 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751398_10574 replica FinalizedReplica, blk_1073751398_10574, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751398 for deletion 2025-07-16 10:19:01,909 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751398_10574 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751398 2025-07-16 10:19:59,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751399_10575 src: /192.168.158.8:54052 dest: /192.168.158.4:9866 2025-07-16 10:19:59,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54052, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1243025975_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751399_10575, duration(ns): 19587324 2025-07-16 10:19:59,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751399_10575, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-16 10:20:01,911 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751399_10575 replica FinalizedReplica, blk_1073751399_10575, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751399 for deletion 2025-07-16 10:20:01,912 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751399_10575 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751399 2025-07-16 10:22:59,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751402_10578 src: /192.168.158.9:55804 dest: /192.168.158.4:9866 2025-07-16 10:22:59,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55804, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1488961417_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751402_10578, duration(ns): 18668367 2025-07-16 10:22:59,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751402_10578, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-16 10:23:01,917 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751402_10578 replica FinalizedReplica, blk_1073751402_10578, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751402 for deletion 2025-07-16 10:23:01,918 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751402_10578 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751402 
2025-07-16 10:23:59,009 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751403_10579 src: /192.168.158.1:56378 dest: /192.168.158.4:9866 2025-07-16 10:23:59,040 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56378, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_336397402_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751403_10579, duration(ns): 22119242 2025-07-16 10:23:59,040 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751403_10579, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-16 10:24:04,924 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751403_10579 replica FinalizedReplica, blk_1073751403_10579, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751403 for deletion 2025-07-16 10:24:04,925 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751403_10579 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751403 2025-07-16 10:27:59,017 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751407_10583 src: /192.168.158.1:49428 dest: /192.168.158.4:9866 2025-07-16 10:27:59,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49428, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1976380689_107, 
offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751407_10583, duration(ns): 22582254 2025-07-16 10:27:59,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751407_10583, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-16 10:28:01,930 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751407_10583 replica FinalizedReplica, blk_1073751407_10583, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751407 for deletion 2025-07-16 10:28:01,931 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751407_10583 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751407 2025-07-16 10:28:59,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751408_10584 src: /192.168.158.1:34582 dest: /192.168.158.4:9866 2025-07-16 10:28:59,053 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34582, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_390188946_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751408_10584, duration(ns): 21254948 2025-07-16 10:28:59,053 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751408_10584, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-16 
10:29:01,933 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751408_10584 replica FinalizedReplica, blk_1073751408_10584, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751408 for deletion 2025-07-16 10:29:01,934 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751408_10584 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751408 2025-07-16 10:29:59,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751409_10585 src: /192.168.158.1:50450 dest: /192.168.158.4:9866 2025-07-16 10:29:59,053 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50450, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_999846228_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751409_10585, duration(ns): 20777100 2025-07-16 10:29:59,053 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751409_10585, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-16 10:30:01,933 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751409_10585 replica FinalizedReplica, blk_1073751409_10585, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751409 
for deletion 2025-07-16 10:30:01,934 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751409_10585 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751409 2025-07-16 10:30:59,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751410_10586 src: /192.168.158.5:43596 dest: /192.168.158.4:9866 2025-07-16 10:30:59,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:43596, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1880817059_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751410_10586, duration(ns): 18241571 2025-07-16 10:30:59,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751410_10586, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-16 10:31:04,935 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751410_10586 replica FinalizedReplica, blk_1073751410_10586, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751410 for deletion 2025-07-16 10:31:04,936 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751410_10586 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751410 2025-07-16 10:35:04,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073751414_10590 src: /192.168.158.5:57258 dest: /192.168.158.4:9866 2025-07-16 10:35:04,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57258, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1696192644_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751414_10590, duration(ns): 16737774 2025-07-16 10:35:04,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751414_10590, type=LAST_IN_PIPELINE terminating 2025-07-16 10:35:07,942 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751414_10590 replica FinalizedReplica, blk_1073751414_10590, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751414 for deletion 2025-07-16 10:35:07,943 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751414_10590 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751414 2025-07-16 10:36:04,043 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751415_10591 src: /192.168.158.6:60442 dest: /192.168.158.4:9866 2025-07-16 10:36:04,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60442, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1904770135_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751415_10591, duration(ns): 15679701 
2025-07-16 10:36:04,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751415_10591, type=LAST_IN_PIPELINE terminating 2025-07-16 10:36:07,945 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751415_10591 replica FinalizedReplica, blk_1073751415_10591, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751415 for deletion 2025-07-16 10:36:07,946 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751415_10591 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751415 2025-07-16 10:37:04,038 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751416_10592 src: /192.168.158.1:52956 dest: /192.168.158.4:9866 2025-07-16 10:37:04,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52956, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1950423054_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751416_10592, duration(ns): 23124468 2025-07-16 10:37:04,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751416_10592, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-16 10:37:07,948 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751416_10592 replica FinalizedReplica, blk_1073751416_10592, FINALIZED getNumBytes() = 56 
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751416 for deletion 2025-07-16 10:37:07,949 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751416_10592 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751416 2025-07-16 10:38:04,040 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751417_10593 src: /192.168.158.5:42524 dest: /192.168.158.4:9866 2025-07-16 10:38:04,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42524, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-785416180_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751417_10593, duration(ns): 20502939 2025-07-16 10:38:04,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751417_10593, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-16 10:38:10,950 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751417_10593 replica FinalizedReplica, blk_1073751417_10593, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751417 for deletion 2025-07-16 10:38:10,951 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751417_10593 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751417 2025-07-16 10:39:09,043 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751418_10594 src: /192.168.158.9:55220 dest: /192.168.158.4:9866 2025-07-16 10:39:09,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55220, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-727699634_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751418_10594, duration(ns): 18735585 2025-07-16 10:39:09,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751418_10594, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-16 10:39:13,953 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751418_10594 replica FinalizedReplica, blk_1073751418_10594, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751418 for deletion 2025-07-16 10:39:13,954 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751418_10594 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751418 2025-07-16 10:43:09,047 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751422_10598 src: /192.168.158.7:35134 dest: /192.168.158.4:9866 2025-07-16 10:43:09,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35134, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_692568505_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751422_10598, duration(ns): 17829641
2025-07-16 10:43:09,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751422_10598, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 10:43:16,958 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751422_10598 replica FinalizedReplica, blk_1073751422_10598, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751422 for deletion
2025-07-16 10:43:16,960 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751422_10598 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751422
2025-07-16 10:45:09,056 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751424_10600 src: /192.168.158.1:33386 dest: /192.168.158.4:9866
2025-07-16 10:45:09,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33386, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1852754919_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751424_10600, duration(ns): 23513606
2025-07-16 10:45:09,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751424_10600, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-16 10:45:13,962 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751424_10600 replica FinalizedReplica, blk_1073751424_10600, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751424 for deletion
2025-07-16 10:45:13,963 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751424_10600 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751424
2025-07-16 10:46:09,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751425_10601 src: /192.168.158.7:37004 dest: /192.168.158.4:9866
2025-07-16 10:46:09,082 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37004, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1909026048_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751425_10601, duration(ns): 15106562
2025-07-16 10:46:09,082 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751425_10601, type=LAST_IN_PIPELINE terminating
2025-07-16 10:46:13,965 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751425_10601 replica FinalizedReplica, blk_1073751425_10601, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751425 for deletion
2025-07-16 10:46:13,966 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751425_10601 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751425
2025-07-16 10:47:14,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751426_10602 src: /192.168.158.6:40246 dest: /192.168.158.4:9866
2025-07-16 10:47:14,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40246, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1646837165_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751426_10602, duration(ns): 21037962
2025-07-16 10:47:14,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751426_10602, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 10:47:19,967 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751426_10602 replica FinalizedReplica, blk_1073751426_10602, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751426 for deletion
2025-07-16 10:47:19,968 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751426_10602 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751426
2025-07-16 10:49:14,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751428_10604 src: /192.168.158.9:57440 dest: /192.168.158.4:9866
2025-07-16 10:49:14,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57440, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-267185845_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751428_10604, duration(ns): 17875090
2025-07-16 10:49:14,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751428_10604, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 10:49:22,970 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751428_10604 replica FinalizedReplica, blk_1073751428_10604, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751428 for deletion
2025-07-16 10:49:22,971 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751428_10604 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751428
2025-07-16 10:50:14,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751429_10605 src: /192.168.158.8:53574 dest: /192.168.158.4:9866
2025-07-16 10:50:14,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53574, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1856429382_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751429_10605, duration(ns): 19797985
2025-07-16 10:50:14,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751429_10605, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-16 10:50:19,971 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751429_10605 replica FinalizedReplica, blk_1073751429_10605, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751429 for deletion
2025-07-16 10:50:19,972 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751429_10605 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751429
2025-07-16 10:55:24,086 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751434_10610 src: /192.168.158.1:50704 dest: /192.168.158.4:9866
2025-07-16 10:55:24,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50704, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1721320472_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751434_10610, duration(ns): 27242873
2025-07-16 10:55:24,123 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751434_10610, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-16 10:55:28,978 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751434_10610 replica FinalizedReplica, blk_1073751434_10610, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751434 for deletion
2025-07-16 10:55:28,980 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751434_10610 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751434
2025-07-16 10:58:24,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751437_10613 src: /192.168.158.1:36906 dest: /192.168.158.4:9866
2025-07-16 10:58:24,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36906, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2076361272_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751437_10613, duration(ns): 21992193
2025-07-16 10:58:24,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751437_10613, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-16 10:58:28,990 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751437_10613 replica FinalizedReplica, blk_1073751437_10613, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751437 for deletion
2025-07-16 10:58:28,991 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751437_10613 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751437
2025-07-16 10:59:24,102 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751438_10614 src: /192.168.158.7:53872 dest: /192.168.158.4:9866
2025-07-16 10:59:24,119 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53872, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-52139950_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751438_10614, duration(ns): 15527450
2025-07-16 10:59:24,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751438_10614, type=LAST_IN_PIPELINE terminating
2025-07-16 10:59:31,991 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751438_10614 replica FinalizedReplica, blk_1073751438_10614, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751438 for deletion
2025-07-16 10:59:31,992 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751438_10614 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751438
2025-07-16 11:01:24,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751440_10616 src: /192.168.158.8:46494 dest: /192.168.158.4:9866
2025-07-16 11:01:24,123 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46494, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-295334371_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751440_10616, duration(ns): 15769515
2025-07-16 11:01:24,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751440_10616, type=LAST_IN_PIPELINE terminating
2025-07-16 11:01:28,997 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751440_10616 replica FinalizedReplica, blk_1073751440_10616, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751440 for deletion
2025-07-16 11:01:28,998 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751440_10616 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751440
2025-07-16 11:05:24,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751444_10620 src: /192.168.158.6:33116 dest: /192.168.158.4:9866
2025-07-16 11:05:24,134 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33116, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_19675624_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751444_10620, duration(ns): 16276539
2025-07-16 11:05:24,135 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751444_10620, type=LAST_IN_PIPELINE terminating
2025-07-16 11:05:29,009 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751444_10620 replica FinalizedReplica, blk_1073751444_10620, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751444 for deletion
2025-07-16 11:05:29,010 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751444_10620 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751444
2025-07-16 11:08:29,112 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751447_10623 src: /192.168.158.1:39916 dest: /192.168.158.4:9866
2025-07-16 11:08:29,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39916, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_123627071_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751447_10623, duration(ns): 25044848
2025-07-16 11:08:29,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751447_10623, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-16 11:08:32,014 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751447_10623 replica FinalizedReplica, blk_1073751447_10623, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751447 for deletion
2025-07-16 11:08:32,015 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751447_10623 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751447
2025-07-16 11:09:29,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751448_10624 src: /192.168.158.1:58480 dest: /192.168.158.4:9866
2025-07-16 11:09:29,150 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58480, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1764945486_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751448_10624, duration(ns): 25127680
2025-07-16 11:09:29,150 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751448_10624, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-16 11:09:32,016 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751448_10624 replica FinalizedReplica, blk_1073751448_10624, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751448 for deletion
2025-07-16 11:09:32,017 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751448_10624 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751448
2025-07-16 11:10:34,102 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751449_10625 src: /192.168.158.1:52660 dest: /192.168.158.4:9866
2025-07-16 11:10:34,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52660, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1059380369_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751449_10625, duration(ns): 21004173
2025-07-16 11:10:34,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751449_10625, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-16 11:10:41,020 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751449_10625 replica FinalizedReplica, blk_1073751449_10625, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751449 for deletion
2025-07-16 11:10:41,021 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751449_10625 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751449
2025-07-16 11:12:39,102 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751451_10627 src: /192.168.158.1:41498 dest: /192.168.158.4:9866
2025-07-16 11:12:39,132 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41498, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2067058806_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751451_10627, duration(ns): 22183547
2025-07-16 11:12:39,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751451_10627, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-16 11:12:44,027 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751451_10627 replica FinalizedReplica, blk_1073751451_10627, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751451 for deletion
2025-07-16 11:12:44,028 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751451_10627 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751451
2025-07-16 11:14:39,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751453_10629 src: /192.168.158.1:37872 dest: /192.168.158.4:9866
2025-07-16 11:14:39,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37872, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1012729135_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751453_10629, duration(ns): 24014084
2025-07-16 11:14:39,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751453_10629, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-16 11:14:47,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751453_10629 replica FinalizedReplica, blk_1073751453_10629, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751453 for deletion
2025-07-16 11:14:47,032 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751453_10629 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751453
2025-07-16 11:17:39,114 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751456_10632 src: /192.168.158.6:43594 dest: /192.168.158.4:9866
2025-07-16 11:17:39,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43594, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1613726865_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751456_10632, duration(ns): 18762038
2025-07-16 11:17:39,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751456_10632, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-16 11:17:44,039 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751456_10632 replica FinalizedReplica, blk_1073751456_10632, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751456 for deletion
2025-07-16 11:17:44,040 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751456_10632 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751456
2025-07-16 11:21:39,135 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751460_10636 src: /192.168.158.6:38214 dest: /192.168.158.4:9866
2025-07-16 11:21:39,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38214, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_696280485_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751460_10636, duration(ns): 19312265
2025-07-16 11:21:39,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751460_10636, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 11:21:44,053 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751460_10636 replica FinalizedReplica, blk_1073751460_10636, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751460 for deletion
2025-07-16 11:21:44,054 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751460_10636 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751460
2025-07-16 11:22:39,136 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751461_10637 src: /192.168.158.5:44492 dest: /192.168.158.4:9866
2025-07-16 11:22:39,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44492, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1043094023_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751461_10637, duration(ns): 18084754
2025-07-16 11:22:39,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751461_10637, type=LAST_IN_PIPELINE terminating
2025-07-16 11:22:44,054 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751461_10637 replica FinalizedReplica, blk_1073751461_10637, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751461 for deletion
2025-07-16 11:22:44,056 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751461_10637 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751461
2025-07-16 11:28:44,143 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751467_10643 src: /192.168.158.1:49532 dest: /192.168.158.4:9866
2025-07-16 11:28:44,173 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49532, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-755935182_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751467_10643, duration(ns): 21319723
2025-07-16 11:28:44,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751467_10643, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-16 11:28:47,069 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751467_10643 replica FinalizedReplica, blk_1073751467_10643, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751467 for deletion
2025-07-16 11:28:47,070 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751467_10643 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751467
2025-07-16 11:30:44,150 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751469_10645 src: /192.168.158.9:47212 dest: /192.168.158.4:9866
2025-07-16 11:30:44,168 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47212, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1828113943_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751469_10645, duration(ns): 15642862
2025-07-16 11:30:44,168 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751469_10645, type=LAST_IN_PIPELINE terminating
2025-07-16 11:30:47,074 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751469_10645 replica FinalizedReplica, blk_1073751469_10645, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751469 for deletion
2025-07-16 11:30:47,075 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751469_10645 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751469
2025-07-16 11:31:44,145 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751470_10646 src: /192.168.158.1:47306 dest: /192.168.158.4:9866
2025-07-16 11:31:44,179 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47306, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1237966154_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751470_10646, duration(ns): 25455476
2025-07-16 11:31:44,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751470_10646, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-16 11:31:47,077 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751470_10646 replica FinalizedReplica, blk_1073751470_10646, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751470 for deletion
2025-07-16 11:31:47,078 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751470_10646 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751470
2025-07-16 11:32:44,150 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751471_10647 src: /192.168.158.8:43044 dest: /192.168.158.4:9866
2025-07-16 11:32:44,168 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43044, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-364702524_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751471_10647, duration(ns): 14908324
2025-07-16 11:32:44,168 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751471_10647, type=LAST_IN_PIPELINE terminating
2025-07-16 11:32:50,078 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751471_10647 replica FinalizedReplica, blk_1073751471_10647, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751471 for deletion
2025-07-16 11:32:50,079 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751471_10647 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751471
2025-07-16 11:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-16 11:38:49,163 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751477_10653 src: /192.168.158.9:43086 dest: /192.168.158.4:9866
2025-07-16 11:38:49,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43086, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_376284283_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751477_10653, duration(ns): 17822110
2025-07-16 11:38:49,183 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751477_10653, type=LAST_IN_PIPELINE terminating
2025-07-16 11:38:56,092 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751477_10653 replica FinalizedReplica, blk_1073751477_10653, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751477 for deletion
2025-07-16 11:38:56,093 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751477_10653 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751477
2025-07-16 11:40:49,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751479_10655 src: /192.168.158.8:34984 dest: /192.168.158.4:9866
2025-07-16 11:40:49,185 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34984, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1929165071_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751479_10655, duration(ns): 19134047
2025-07-16 11:40:49,185 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751479_10655, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 11:40:53,093 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751479_10655 replica FinalizedReplica, blk_1073751479_10655, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751479 for deletion
2025-07-16 11:40:53,094 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751479_10655 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751479
2025-07-16 11:43:49,186 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751482_10658 src: /192.168.158.1:59154 dest: /192.168.158.4:9866
2025-07-16 11:43:49,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59154, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_517452036_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751482_10658, duration(ns): 23024057
2025-07-16 11:43:49,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751482_10658, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-16 11:43:56,099 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751482_10658 replica FinalizedReplica, blk_1073751482_10658, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751482 for deletion
2025-07-16 11:43:56,100 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751482_10658 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751482
2025-07-16 11:46:54,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751485_10661 src: /192.168.158.5:46208 dest: /192.168.158.4:9866
2025-07-16 11:46:54,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46208, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-491717693_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751485_10661, duration(ns): 21807132
2025-07-16 11:46:54,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751485_10661, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 11:46:59,110 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751485_10661 replica FinalizedReplica, blk_1073751485_10661, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751485 for deletion 2025-07-16 11:46:59,111 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751485_10661 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751485 2025-07-16 11:48:54,203 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751487_10663 src: /192.168.158.9:42478 dest: /192.168.158.4:9866 2025-07-16 11:48:54,223 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42478, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-658983861_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751487_10663, duration(ns): 17277746 2025-07-16 11:48:54,223 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751487_10663, type=LAST_IN_PIPELINE terminating 2025-07-16 11:48:59,115 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751487_10663 replica FinalizedReplica, blk_1073751487_10663, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751487 for deletion 2025-07-16 11:48:59,116 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751487_10663 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751487 2025-07-16 11:52:59,208 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751491_10667 src: /192.168.158.8:37182 dest: /192.168.158.4:9866 2025-07-16 11:52:59,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37182, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1341804492_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751491_10667, duration(ns): 18392794 2025-07-16 11:52:59,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751491_10667, type=LAST_IN_PIPELINE terminating 2025-07-16 11:53:02,125 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751491_10667 replica FinalizedReplica, blk_1073751491_10667, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751491 for deletion 2025-07-16 11:53:02,126 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751491_10667 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751491 2025-07-16 11:54:04,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751492_10668 src: /192.168.158.6:60570 dest: /192.168.158.4:9866 2025-07-16 11:54:04,244 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60570, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-839452068_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073751492_10668, duration(ns): 18414736 2025-07-16 11:54:04,244 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751492_10668, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-16 11:54:08,128 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751492_10668 replica FinalizedReplica, blk_1073751492_10668, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751492 for deletion 2025-07-16 11:54:08,129 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751492_10668 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751492 2025-07-16 11:55:04,212 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751493_10669 src: /192.168.158.7:41306 dest: /192.168.158.4:9866 2025-07-16 11:55:04,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41306, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_461196255_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751493_10669, duration(ns): 46447503 2025-07-16 11:55:04,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751493_10669, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-16 11:55:08,131 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073751493_10669 replica FinalizedReplica, blk_1073751493_10669, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751493 for deletion 2025-07-16 11:55:08,132 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751493_10669 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751493 2025-07-16 11:57:04,194 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751495_10671 src: /192.168.158.9:42992 dest: /192.168.158.4:9866 2025-07-16 11:57:04,217 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42992, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2009530359_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751495_10671, duration(ns): 18150466 2025-07-16 11:57:04,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751495_10671, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-16 11:57:11,132 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751495_10671 replica FinalizedReplica, blk_1073751495_10671, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751495 for deletion 2025-07-16 11:57:11,133 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751495_10671 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751495 2025-07-16 11:59:04,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751497_10673 src: /192.168.158.9:45686 dest: /192.168.158.4:9866 2025-07-16 11:59:04,224 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45686, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-531935085_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751497_10673, duration(ns): 18475018 2025-07-16 11:59:04,224 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751497_10673, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-16 11:59:08,138 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751497_10673 replica FinalizedReplica, blk_1073751497_10673, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751497 for deletion 2025-07-16 11:59:08,139 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751497_10673 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751497 2025-07-16 12:00:04,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751498_10674 src: 
/192.168.158.8:38726 dest: /192.168.158.4:9866 2025-07-16 12:00:04,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38726, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1743314837_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751498_10674, duration(ns): 15661153 2025-07-16 12:00:04,228 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751498_10674, type=LAST_IN_PIPELINE terminating 2025-07-16 12:00:11,140 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751498_10674 replica FinalizedReplica, blk_1073751498_10674, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751498 for deletion 2025-07-16 12:00:11,141 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751498_10674 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751498 2025-07-16 12:06:14,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751504_10680 src: /192.168.158.7:40368 dest: /192.168.158.4:9866 2025-07-16 12:06:14,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40368, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_97104623_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751504_10680, duration(ns): 20653262 2025-07-16 12:06:14,236 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751504_10680, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-16 12:06:17,157 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751504_10680 replica FinalizedReplica, blk_1073751504_10680, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751504 for deletion 2025-07-16 12:06:17,158 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751504_10680 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751504 2025-07-16 12:09:14,225 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751507_10683 src: /192.168.158.5:39404 dest: /192.168.158.4:9866 2025-07-16 12:09:14,249 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39404, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1453285830_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751507_10683, duration(ns): 19277789 2025-07-16 12:09:14,249 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751507_10683, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-16 12:09:17,159 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751507_10683 replica FinalizedReplica, blk_1073751507_10683, FINALIZED getNumBytes() = 56 
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751507 for deletion 2025-07-16 12:09:17,160 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751507_10683 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751507 2025-07-16 12:10:14,225 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751508_10684 src: /192.168.158.1:36262 dest: /192.168.158.4:9866 2025-07-16 12:10:14,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36262, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1523318038_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751508_10684, duration(ns): 23538144 2025-07-16 12:10:14,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751508_10684, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-16 12:10:17,162 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751508_10684 replica FinalizedReplica, blk_1073751508_10684, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751508 for deletion 2025-07-16 12:10:17,163 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751508_10684 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751508 2025-07-16 12:11:19,232 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751509_10685 src: /192.168.158.9:60842 dest: /192.168.158.4:9866 2025-07-16 12:11:19,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60842, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1928192834_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751509_10685, duration(ns): 16372305 2025-07-16 12:11:19,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751509_10685, type=LAST_IN_PIPELINE terminating 2025-07-16 12:11:26,161 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751509_10685 replica FinalizedReplica, blk_1073751509_10685, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751509 for deletion 2025-07-16 12:11:26,162 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751509_10685 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751509 2025-07-16 12:12:24,235 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751510_10686 src: /192.168.158.5:34088 dest: /192.168.158.4:9866 2025-07-16 12:12:24,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34088, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_290869252_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751510_10686, duration(ns): 18022724 2025-07-16 12:12:24,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751510_10686, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-16 12:12:29,166 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751510_10686 replica FinalizedReplica, blk_1073751510_10686, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751510 for deletion 2025-07-16 12:12:29,167 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751510_10686 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751510 2025-07-16 12:13:29,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751511_10687 src: /192.168.158.9:39516 dest: /192.168.158.4:9866 2025-07-16 12:13:29,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39516, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-238919883_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751511_10687, duration(ns): 15603745 2025-07-16 12:13:29,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751511_10687, type=LAST_IN_PIPELINE terminating 2025-07-16 12:13:32,167 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751511_10687 replica FinalizedReplica, blk_1073751511_10687, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751511 for deletion 2025-07-16 12:13:32,168 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751511_10687 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751511 2025-07-16 12:15:29,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751513_10689 src: /192.168.158.6:55672 dest: /192.168.158.4:9866 2025-07-16 12:15:29,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-730312748_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751513_10689, duration(ns): 18715074 2025-07-16 12:15:29,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751513_10689, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-16 12:15:35,171 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751513_10689 replica FinalizedReplica, blk_1073751513_10689, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751513 for deletion 2025-07-16 12:15:35,172 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751513_10689 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751513 2025-07-16 12:16:29,240 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751514_10690 src: /192.168.158.5:51568 dest: /192.168.158.4:9866 2025-07-16 12:16:29,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51568, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_358458191_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751514_10690, duration(ns): 19352787 2025-07-16 12:16:29,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751514_10690, type=LAST_IN_PIPELINE terminating 2025-07-16 12:16:35,171 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751514_10690 replica FinalizedReplica, blk_1073751514_10690, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751514 for deletion 2025-07-16 12:16:35,172 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751514_10690 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751514 2025-07-16 12:18:29,240 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751516_10692 src: /192.168.158.7:49710 dest: /192.168.158.4:9866 
2025-07-16 12:18:29,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49710, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1409651132_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751516_10692, duration(ns): 15299543 2025-07-16 12:18:29,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751516_10692, type=LAST_IN_PIPELINE terminating 2025-07-16 12:18:32,173 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751516_10692 replica FinalizedReplica, blk_1073751516_10692, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751516 for deletion 2025-07-16 12:18:32,174 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751516_10692 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751516 2025-07-16 12:19:29,230 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751517_10693 src: /192.168.158.6:60462 dest: /192.168.158.4:9866 2025-07-16 12:19:29,248 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60462, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-210500144_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751517_10693, duration(ns): 16545624 2025-07-16 12:19:29,249 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073751517_10693, type=LAST_IN_PIPELINE terminating
2025-07-16 12:19:32,177 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751517_10693 replica FinalizedReplica, blk_1073751517_10693, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751517 for deletion
2025-07-16 12:19:32,178 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751517_10693 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751517
2025-07-16 12:20:34,228 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751518_10694 src: /192.168.158.1:38514 dest: /192.168.158.4:9866
2025-07-16 12:20:34,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38514, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2140150373_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751518_10694, duration(ns): 20163736
2025-07-16 12:20:34,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751518_10694, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-16 12:20:41,178 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751518_10694 replica FinalizedReplica, blk_1073751518_10694, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751518 for deletion
2025-07-16 12:20:41,179 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751518_10694 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751518
2025-07-16 12:21:34,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751519_10695 src: /192.168.158.7:42942 dest: /192.168.158.4:9866
2025-07-16 12:21:34,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42942, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2055000945_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751519_10695, duration(ns): 19442925
2025-07-16 12:21:34,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751519_10695, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 12:21:41,182 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751519_10695 replica FinalizedReplica, blk_1073751519_10695, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751519 for deletion
2025-07-16 12:21:41,183 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751519_10695 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751519
2025-07-16 12:22:39,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751520_10696 src: /192.168.158.1:40784 dest: /192.168.158.4:9866
2025-07-16 12:22:39,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40784, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2024578162_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751520_10696, duration(ns): 24084708
2025-07-16 12:22:39,288 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751520_10696, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-16 12:22:44,183 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751520_10696 replica FinalizedReplica, blk_1073751520_10696, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751520 for deletion
2025-07-16 12:22:44,185 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751520_10696 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751520
2025-07-16 12:23:39,232 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751521_10697 src: /192.168.158.1:50586 dest: /192.168.158.4:9866
2025-07-16 12:23:39,265 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50586, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1985072785_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751521_10697, duration(ns): 24315402
2025-07-16 12:23:39,265 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751521_10697, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-16 12:23:44,188 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751521_10697 replica FinalizedReplica, blk_1073751521_10697, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751521 for deletion
2025-07-16 12:23:44,189 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751521_10697 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751521
2025-07-16 12:24:44,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751522_10698 src: /192.168.158.6:48620 dest: /192.168.158.4:9866
2025-07-16 12:24:44,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48620, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_355317876_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751522_10698, duration(ns): 19915320
2025-07-16 12:24:44,263 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751522_10698, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 12:24:50,190 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751522_10698 replica FinalizedReplica, blk_1073751522_10698, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751522 for deletion
2025-07-16 12:24:50,192 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751522_10698 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751522
2025-07-16 12:26:44,246 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751524_10700 src: /192.168.158.7:44934 dest: /192.168.158.4:9866
2025-07-16 12:26:44,263 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44934, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2094216408_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751524_10700, duration(ns): 15150494
2025-07-16 12:26:44,263 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751524_10700, type=LAST_IN_PIPELINE terminating
2025-07-16 12:26:47,199 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751524_10700 replica FinalizedReplica, blk_1073751524_10700, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751524 for deletion
2025-07-16 12:26:47,200 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751524_10700 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751524
2025-07-16 12:28:49,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751526_10702 src: /192.168.158.8:43942 dest: /192.168.158.4:9866
2025-07-16 12:28:49,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43942, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1377240539_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751526_10702, duration(ns): 19112459
2025-07-16 12:28:49,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751526_10702, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-16 12:28:53,204 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751526_10702 replica FinalizedReplica, blk_1073751526_10702, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751526 for deletion
2025-07-16 12:28:53,205 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751526_10702 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751526
2025-07-16 12:30:54,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751528_10704 src: /192.168.158.5:34072 dest: /192.168.158.4:9866
2025-07-16 12:30:54,268 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34072, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1379325227_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751528_10704, duration(ns): 18243282
2025-07-16 12:30:54,268 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751528_10704, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-16 12:30:59,207 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751528_10704 replica FinalizedReplica, blk_1073751528_10704, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751528 for deletion
2025-07-16 12:30:59,208 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751528_10704 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751528
2025-07-16 12:33:54,240 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751531_10707 src: /192.168.158.8:42158 dest: /192.168.158.4:9866
2025-07-16 12:33:54,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42158, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_13846092_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751531_10707, duration(ns): 16713250
2025-07-16 12:33:54,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751531_10707, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 12:33:59,212 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751531_10707 replica FinalizedReplica, blk_1073751531_10707, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751531 for deletion
2025-07-16 12:33:59,212 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751531_10707 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751531
2025-07-16 12:34:59,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751532_10708 src: /192.168.158.9:48216 dest: /192.168.158.4:9866
2025-07-16 12:34:59,288 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48216, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1644469833_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751532_10708, duration(ns): 18231215
2025-07-16 12:34:59,288 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751532_10708, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 12:35:05,215 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751532_10708 replica FinalizedReplica, blk_1073751532_10708, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751532 for deletion
2025-07-16 12:35:05,216 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751532_10708 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751532
2025-07-16 12:35:59,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751533_10709 src: /192.168.158.1:56976 dest: /192.168.158.4:9866
2025-07-16 12:35:59,270 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56976, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1661442563_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751533_10709, duration(ns): 22652111
2025-07-16 12:35:59,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751533_10709, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-16 12:36:02,219 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751533_10709 replica FinalizedReplica, blk_1073751533_10709, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751533 for deletion
2025-07-16 12:36:02,220 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751533_10709 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751533
2025-07-16 12:39:04,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751536_10712 src: /192.168.158.8:36942 dest: /192.168.158.4:9866
2025-07-16 12:39:04,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36942, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_647789754_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751536_10712, duration(ns): 19948100
2025-07-16 12:39:04,286 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751536_10712, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 12:39:08,226 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751536_10712 replica FinalizedReplica, blk_1073751536_10712, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751536 for deletion
2025-07-16 12:39:08,227 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751536_10712 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751536
2025-07-16 12:43:04,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751540_10716 src: /192.168.158.6:33088 dest: /192.168.158.4:9866
2025-07-16 12:43:04,280 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33088, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1194199350_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751540_10716, duration(ns): 17835972
2025-07-16 12:43:04,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751540_10716, type=LAST_IN_PIPELINE terminating
2025-07-16 12:43:08,236 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751540_10716 replica FinalizedReplica, blk_1073751540_10716, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751540 for deletion
2025-07-16 12:43:08,238 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751540_10716 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751540
2025-07-16 12:48:14,259 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751545_10721 src: /192.168.158.1:49168 dest: /192.168.158.4:9866
2025-07-16 12:48:14,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49168, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1665503505_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751545_10721, duration(ns): 24510720
2025-07-16 12:48:14,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751545_10721, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-16 12:48:20,245 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751545_10721 replica FinalizedReplica, blk_1073751545_10721, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751545 for deletion
2025-07-16 12:48:20,246 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751545_10721 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751545
2025-07-16 12:49:19,276 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751546_10722 src: /192.168.158.7:51852 dest: /192.168.158.4:9866
2025-07-16 12:49:19,304 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51852, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1724499581_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751546_10722, duration(ns): 22489970
2025-07-16 12:49:19,304 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751546_10722, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 12:49:23,248 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751546_10722 replica FinalizedReplica, blk_1073751546_10722, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751546 for deletion
2025-07-16 12:49:23,249 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751546_10722 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751546
2025-07-16 12:50:19,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751547_10723 src: /192.168.158.1:37266 dest: /192.168.158.4:9866
2025-07-16 12:50:19,295 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37266, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2002852033_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751547_10723, duration(ns): 21936498
2025-07-16 12:50:19,295 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751547_10723, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-16 12:50:23,251 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751547_10723 replica FinalizedReplica, blk_1073751547_10723, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751547 for deletion
2025-07-16 12:50:23,252 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751547_10723 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751547
2025-07-16 12:51:19,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751548_10724 src: /192.168.158.1:37506 dest: /192.168.158.4:9866
2025-07-16 12:51:19,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37506, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1131596948_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751548_10724, duration(ns): 23365889
2025-07-16 12:51:19,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751548_10724, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-16 12:51:23,251 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751548_10724 replica FinalizedReplica, blk_1073751548_10724, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751548 for deletion
2025-07-16 12:51:23,253 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751548_10724 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751548
2025-07-16 12:52:19,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751549_10725 src: /192.168.158.1:49292 dest: /192.168.158.4:9866
2025-07-16 12:52:19,290 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49292, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_700182563_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751549_10725, duration(ns): 21928229
2025-07-16 12:52:19,290 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751549_10725, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-16 12:52:26,257 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751549_10725 replica FinalizedReplica, blk_1073751549_10725, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751549 for deletion
2025-07-16 12:52:26,258 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751549_10725 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073751549
2025-07-16 12:55:24,284 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751552_10728 src: /192.168.158.5:44152 dest: /192.168.158.4:9866
2025-07-16 12:55:24,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44152, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2035949531_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751552_10728, duration(ns): 19640595
2025-07-16 12:55:24,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751552_10728, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 12:55:29,262 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751552_10728 replica FinalizedReplica, blk_1073751552_10728, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751552 for deletion
2025-07-16 12:55:29,263 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751552_10728 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751552
2025-07-16 12:58:29,279 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751555_10731 src: /192.168.158.1:45684 dest: /192.168.158.4:9866
2025-07-16 12:58:29,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45684, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1870530155_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751555_10731, duration(ns): 22254031
2025-07-16 12:58:29,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751555_10731, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-16 12:58:35,269 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751555_10731 replica FinalizedReplica, blk_1073751555_10731, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751555 for deletion
2025-07-16 12:58:35,270 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751555_10731 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751555
2025-07-16 12:59:29,280 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751556_10732 src: /192.168.158.1:53664 dest: /192.168.158.4:9866
2025-07-16 12:59:29,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53664, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1037095784_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751556_10732, duration(ns): 30372479
2025-07-16 12:59:29,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751556_10732, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-16 12:59:32,275 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751556_10732 replica FinalizedReplica, blk_1073751556_10732, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751556 for deletion
2025-07-16 12:59:32,276 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751556_10732 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751556
2025-07-16 13:00:29,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751557_10733 src: /192.168.158.6:55508 dest: /192.168.158.4:9866
2025-07-16 13:00:29,300 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55508, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-693133408_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751557_10733, duration(ns): 15478124
2025-07-16 13:00:29,300 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751557_10733, type=LAST_IN_PIPELINE terminating
2025-07-16 13:00:35,278 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751557_10733 replica FinalizedReplica, blk_1073751557_10733, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751557 for deletion
2025-07-16 13:00:35,279 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751557_10733 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751557
2025-07-16 13:02:29,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751559_10735 src: /192.168.158.7:55502 dest: /192.168.158.4:9866
2025-07-16 13:02:29,330 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55502, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1354038898_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751559_10735, duration(ns): 17470784
2025-07-16 13:02:29,330 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751559_10735, type=LAST_IN_PIPELINE terminating
2025-07-16 13:02:32,283 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751559_10735 replica FinalizedReplica, blk_1073751559_10735, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751559 for deletion
2025-07-16 13:02:32,284 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751559_10735 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751559
2025-07-16 13:04:29,300 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751561_10737 src: /192.168.158.1:52040 dest: /192.168.158.4:9866
2025-07-16 13:04:29,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52040, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_613123194_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751561_10737, duration(ns): 25209384
2025-07-16 13:04:29,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751561_10737, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-16 13:04:32,286 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751561_10737 replica FinalizedReplica, blk_1073751561_10737, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751561 for deletion
2025-07-16 13:04:32,287 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751561_10737 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751561
2025-07-16 13:06:34,298 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751563_10739 src: /192.168.158.8:53544 dest: /192.168.158.4:9866
2025-07-16 13:06:34,315 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53544, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1300841798_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751563_10739, duration(ns): 14767598
2025-07-16 13:06:34,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751563_10739, type=LAST_IN_PIPELINE terminating
2025-07-16 13:06:41,290 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751563_10739 replica FinalizedReplica, blk_1073751563_10739, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751563 for deletion
2025-07-16 13:06:41,291 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751563_10739 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751563
2025-07-16 13:07:34,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751564_10740 src: /192.168.158.7:41522 dest: /192.168.158.4:9866
2025-07-16 13:07:34,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41522, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-221142444_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751564_10740, duration(ns): 19112004
2025-07-16 13:07:34,330 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751564_10740, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 13:07:41,291 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751564_10740 replica FinalizedReplica, blk_1073751564_10740, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751564 for deletion
2025-07-16 13:07:41,292 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751564_10740 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751564
2025-07-16 13:10:39,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751567_10743 src:
/192.168.158.8:59402 dest: /192.168.158.4:9866 2025-07-16 13:10:39,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59402, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-589040922_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751567_10743, duration(ns): 15271870 2025-07-16 13:10:39,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751567_10743, type=LAST_IN_PIPELINE terminating 2025-07-16 13:10:47,302 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751567_10743 replica FinalizedReplica, blk_1073751567_10743, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751567 for deletion 2025-07-16 13:10:47,303 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751567_10743 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751567 2025-07-16 13:14:44,312 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751571_10747 src: /192.168.158.9:46394 dest: /192.168.158.4:9866 2025-07-16 13:14:44,337 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46394, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1582380231_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751571_10747, duration(ns): 19070813 2025-07-16 13:14:44,337 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751571_10747, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-16 13:14:47,310 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751571_10747 replica FinalizedReplica, blk_1073751571_10747, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751571 for deletion 2025-07-16 13:14:47,311 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751571_10747 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751571 2025-07-16 13:18:44,317 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751575_10751 src: /192.168.158.1:50732 dest: /192.168.158.4:9866 2025-07-16 13:18:44,348 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50732, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_698850544_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751575_10751, duration(ns): 21792820 2025-07-16 13:18:44,348 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751575_10751, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-16 13:18:47,314 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751575_10751 replica FinalizedReplica, blk_1073751575_10751, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751575 for deletion 2025-07-16 13:18:47,315 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751575_10751 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751575 2025-07-16 13:20:44,326 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751577_10753 src: /192.168.158.6:41430 dest: /192.168.158.4:9866 2025-07-16 13:20:44,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41430, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1690184382_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751577_10753, duration(ns): 15134894 2025-07-16 13:20:44,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751577_10753, type=LAST_IN_PIPELINE terminating 2025-07-16 13:20:47,316 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751577_10753 replica FinalizedReplica, blk_1073751577_10753, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751577 for deletion 2025-07-16 13:20:47,317 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751577_10753 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751577 2025-07-16 13:21:44,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751578_10754 src: /192.168.158.5:52932 dest: /192.168.158.4:9866 2025-07-16 13:21:44,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52932, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_925186190_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751578_10754, duration(ns): 14777075 2025-07-16 13:21:44,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751578_10754, type=LAST_IN_PIPELINE terminating 2025-07-16 13:21:47,319 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751578_10754 replica FinalizedReplica, blk_1073751578_10754, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751578 for deletion 2025-07-16 13:21:47,320 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751578_10754 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751578 2025-07-16 13:22:44,320 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751579_10755 src: /192.168.158.8:52866 dest: /192.168.158.4:9866 2025-07-16 13:22:44,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52866, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-579131405_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751579_10755, duration(ns): 18914731 2025-07-16 13:22:44,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751579_10755, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-16 13:22:47,320 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751579_10755 replica FinalizedReplica, blk_1073751579_10755, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751579 for deletion 2025-07-16 13:22:47,322 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751579_10755 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751579 2025-07-16 13:23:49,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751580_10756 src: /192.168.158.5:33394 dest: /192.168.158.4:9866 2025-07-16 13:23:49,338 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33394, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1069649614_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751580_10756, duration(ns): 13807525 2025-07-16 13:23:49,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751580_10756, type=LAST_IN_PIPELINE terminating 2025-07-16 13:23:53,321 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751580_10756 replica FinalizedReplica, blk_1073751580_10756, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751580 for deletion 2025-07-16 13:23:53,323 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751580_10756 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751580 2025-07-16 13:24:49,324 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751581_10757 src: /192.168.158.5:58362 dest: /192.168.158.4:9866 2025-07-16 13:24:49,350 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58362, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-827507073_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751581_10757, duration(ns): 20701606 2025-07-16 13:24:49,350 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751581_10757, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-16 13:24:53,323 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751581_10757 replica FinalizedReplica, blk_1073751581_10757, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751581 for deletion 2025-07-16 13:24:53,325 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751581_10757 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751581 2025-07-16 13:26:49,332 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751583_10759 src: /192.168.158.7:58076 dest: /192.168.158.4:9866 2025-07-16 13:26:49,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58076, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_632045444_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751583_10759, duration(ns): 14529360 2025-07-16 13:26:49,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751583_10759, type=LAST_IN_PIPELINE terminating 2025-07-16 13:26:53,332 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751583_10759 replica FinalizedReplica, blk_1073751583_10759, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751583 for deletion 2025-07-16 13:26:53,333 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751583_10759 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751583 2025-07-16 13:27:49,332 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751584_10760 src: /192.168.158.7:41232 dest: /192.168.158.4:9866 
2025-07-16 13:27:49,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41232, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1432357940_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751584_10760, duration(ns): 14984295 2025-07-16 13:27:49,350 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751584_10760, type=LAST_IN_PIPELINE terminating 2025-07-16 13:27:53,333 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751584_10760 replica FinalizedReplica, blk_1073751584_10760, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751584 for deletion 2025-07-16 13:27:53,335 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751584_10760 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751584 2025-07-16 13:28:49,326 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751585_10761 src: /192.168.158.1:58202 dest: /192.168.158.4:9866 2025-07-16 13:28:49,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58202, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2113937749_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751585_10761, duration(ns): 23869759 2025-07-16 13:28:49,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073751585_10761, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-16 13:28:53,335 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751585_10761 replica FinalizedReplica, blk_1073751585_10761, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751585 for deletion 2025-07-16 13:28:53,337 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751585_10761 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751585 2025-07-16 13:32:54,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751589_10765 src: /192.168.158.1:54448 dest: /192.168.158.4:9866 2025-07-16 13:32:54,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54448, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_721363688_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751589_10765, duration(ns): 21940290 2025-07-16 13:32:54,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751589_10765, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-16 13:32:59,346 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751589_10765 replica FinalizedReplica, blk_1073751589_10765, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751589 for deletion 2025-07-16 13:32:59,347 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751589_10765 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751589 2025-07-16 13:33:54,341 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751590_10766 src: /192.168.158.1:33132 dest: /192.168.158.4:9866 2025-07-16 13:33:54,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33132, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1013971574_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751590_10766, duration(ns): 22827812 2025-07-16 13:33:54,373 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751590_10766, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-16 13:33:59,350 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751590_10766 replica FinalizedReplica, blk_1073751590_10766, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751590 for deletion 2025-07-16 13:33:59,351 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751590_10766 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751590 2025-07-16 13:34:54,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751591_10767 src: /192.168.158.5:51682 dest: /192.168.158.4:9866 2025-07-16 13:34:54,369 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51682, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1985286487_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751591_10767, duration(ns): 19862455 2025-07-16 13:34:54,369 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751591_10767, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-16 13:34:59,353 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751591_10767 replica FinalizedReplica, blk_1073751591_10767, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751591 for deletion 2025-07-16 13:34:59,354 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751591_10767 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751591 2025-07-16 13:35:54,341 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751592_10768 src: /192.168.158.6:43884 dest: /192.168.158.4:9866 2025-07-16 13:35:54,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43884, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1272544654_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751592_10768, duration(ns): 16277693 2025-07-16 13:35:54,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751592_10768, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-16 13:35:59,354 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751592_10768 replica FinalizedReplica, blk_1073751592_10768, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751592 for deletion 2025-07-16 13:35:59,356 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751592_10768 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751592 2025-07-16 13:36:54,348 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751593_10769 src: /192.168.158.9:46888 dest: /192.168.158.4:9866 2025-07-16 13:36:54,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46888, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1043479006_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751593_10769, duration(ns): 18041864 2025-07-16 13:36:54,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751593_10769, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-16 13:36:59,354 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751593_10769 replica FinalizedReplica, blk_1073751593_10769, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751593 for deletion 2025-07-16 13:36:59,356 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751593_10769 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751593 2025-07-16 13:39:54,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751596_10772 src: /192.168.158.1:34056 dest: /192.168.158.4:9866 2025-07-16 13:39:54,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34056, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1286382860_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751596_10772, duration(ns): 24293942 2025-07-16 13:39:54,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751596_10772, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-16 13:40:02,358 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751596_10772 replica FinalizedReplica, blk_1073751596_10772, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751596 for deletion
2025-07-16 13:40:02,359 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751596_10772 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751596
2025-07-16 13:40:54,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751597_10773 src: /192.168.158.1:55728 dest: /192.168.158.4:9866
2025-07-16 13:40:54,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55728, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1189542237_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751597_10773, duration(ns): 26637623
2025-07-16 13:40:54,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751597_10773, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-16 13:40:59,359 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751597_10773 replica FinalizedReplica, blk_1073751597_10773, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751597 for deletion
2025-07-16 13:40:59,361 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751597_10773 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751597
2025-07-16 13:41:54,367 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751598_10774 src: /192.168.158.6:54234 dest: /192.168.158.4:9866
2025-07-16 13:41:54,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54234, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-977203086_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751598_10774, duration(ns): 15965661
2025-07-16 13:41:54,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751598_10774, type=LAST_IN_PIPELINE terminating
2025-07-16 13:42:02,362 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751598_10774 replica FinalizedReplica, blk_1073751598_10774, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751598 for deletion
2025-07-16 13:42:02,363 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751598_10774 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751598
2025-07-16 13:42:59,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751599_10775 src: /192.168.158.8:48584 dest: /192.168.158.4:9866
2025-07-16 13:42:59,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48584, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_592690233_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751599_10775, duration(ns): 15645585
2025-07-16 13:42:59,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751599_10775, type=LAST_IN_PIPELINE terminating
2025-07-16 13:43:02,363 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751599_10775 replica FinalizedReplica, blk_1073751599_10775, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751599 for deletion
2025-07-16 13:43:02,366 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751599_10775 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751599
2025-07-16 13:43:59,362 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751600_10776 src: /192.168.158.6:59354 dest: /192.168.158.4:9866
2025-07-16 13:43:59,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59354, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1962456800_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751600_10776, duration(ns): 14805907
2025-07-16 13:43:59,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751600_10776, type=LAST_IN_PIPELINE terminating
2025-07-16 13:44:02,368 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751600_10776 replica FinalizedReplica, blk_1073751600_10776, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751600 for deletion
2025-07-16 13:44:02,369 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751600_10776 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751600
2025-07-16 13:44:59,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751601_10777 src: /192.168.158.1:36986 dest: /192.168.158.4:9866
2025-07-16 13:44:59,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36986, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_879684458_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751601_10777, duration(ns): 22609863
2025-07-16 13:44:59,419 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751601_10777, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-16 13:45:02,370 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751601_10777 replica FinalizedReplica, blk_1073751601_10777, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751601 for deletion
2025-07-16 13:45:02,372 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751601_10777 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751601
2025-07-16 13:47:04,380 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751603_10779 src: /192.168.158.6:57956 dest: /192.168.158.4:9866
2025-07-16 13:47:04,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57956, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1751195128_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751603_10779, duration(ns): 18114873
2025-07-16 13:47:04,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751603_10779, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 13:47:08,374 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751603_10779 replica FinalizedReplica, blk_1073751603_10779, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751603 for deletion
2025-07-16 13:47:08,375 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751603_10779 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751603
2025-07-16 13:48:04,376 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751604_10780 src: /192.168.158.8:35448 dest: /192.168.158.4:9866
2025-07-16 13:48:04,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35448, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1251430853_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751604_10780, duration(ns): 19774558
2025-07-16 13:48:04,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751604_10780, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-16 13:48:08,374 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751604_10780 replica FinalizedReplica, blk_1073751604_10780, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751604 for deletion
2025-07-16 13:48:08,376 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751604_10780 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751604
2025-07-16 13:53:09,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751609_10785 src: /192.168.158.1:52216 dest: /192.168.158.4:9866
2025-07-16 13:53:09,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52216, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-161833655_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751609_10785, duration(ns): 21605573
2025-07-16 13:53:09,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751609_10785, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-16 13:53:14,386 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751609_10785 replica FinalizedReplica, blk_1073751609_10785, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751609 for deletion
2025-07-16 13:53:14,387 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751609_10785 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751609
2025-07-16 13:55:09,404 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751611_10787 src: /192.168.158.9:58594 dest: /192.168.158.4:9866
2025-07-16 13:55:09,428 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58594, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1232957843_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751611_10787, duration(ns): 18169980
2025-07-16 13:55:09,428 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751611_10787, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-16 13:55:17,391 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751611_10787 replica FinalizedReplica, blk_1073751611_10787, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751611 for deletion
2025-07-16 13:55:17,392 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751611_10787 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751611
2025-07-16 13:58:09,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751614_10790 src: /192.168.158.1:53258 dest: /192.168.158.4:9866
2025-07-16 13:58:09,427 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53258, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-593895085_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751614_10790, duration(ns): 24194270
2025-07-16 13:58:09,427 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751614_10790, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-16 13:58:17,397 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751614_10790 replica FinalizedReplica, blk_1073751614_10790, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751614 for deletion
2025-07-16 13:58:17,399 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751614_10790 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751614
2025-07-16 14:00:09,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751616_10792 src: /192.168.158.1:53430 dest: /192.168.158.4:9866
2025-07-16 14:00:09,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53430, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-64692606_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751616_10792, duration(ns): 24983273
2025-07-16 14:00:09,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751616_10792, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-16 14:00:17,402 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751616_10792 replica FinalizedReplica, blk_1073751616_10792, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751616 for deletion
2025-07-16 14:00:17,403 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751616_10792 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751616
2025-07-16 14:00:56,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f41, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 5 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-16 14:00:56,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-16 14:01:14,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751617_10793 src: /192.168.158.7:49680 dest: /192.168.158.4:9866
2025-07-16 14:01:14,428 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49680, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_909655922_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751617_10793, duration(ns): 16751154
2025-07-16 14:01:14,428 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751617_10793, type=LAST_IN_PIPELINE terminating
2025-07-16 14:01:17,405 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751617_10793 replica FinalizedReplica, blk_1073751617_10793, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751617 for deletion
2025-07-16 14:01:17,406 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751617_10793 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751617
2025-07-16 14:04:14,433 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751620_10796 src: /192.168.158.7:33770 dest: /192.168.158.4:9866
2025-07-16 14:04:14,456 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33770, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_457737320_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751620_10796, duration(ns): 17993733
2025-07-16 14:04:14,457 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751620_10796, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 14:04:17,410 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751620_10796 replica FinalizedReplica, blk_1073751620_10796, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751620 for deletion
2025-07-16 14:04:17,411 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751620_10796 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751620
2025-07-16 14:05:19,433 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751621_10797 src: /192.168.158.8:57282 dest: /192.168.158.4:9866
2025-07-16 14:05:19,453 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57282, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1973260417_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751621_10797, duration(ns): 17304312
2025-07-16 14:05:19,453 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751621_10797, type=LAST_IN_PIPELINE terminating
2025-07-16 14:05:26,414 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751621_10797 replica FinalizedReplica, blk_1073751621_10797, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751621 for deletion
2025-07-16 14:05:26,416 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751621_10797 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751621
2025-07-16 14:07:19,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751623_10799 src: /192.168.158.1:42168 dest: /192.168.158.4:9866
2025-07-16 14:07:19,448 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42168, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-21681668_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751623_10799, duration(ns): 25064785
2025-07-16 14:07:19,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751623_10799, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-16 14:07:26,420 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751623_10799 replica FinalizedReplica, blk_1073751623_10799, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751623 for deletion
2025-07-16 14:07:26,422 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751623_10799 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751623
2025-07-16 14:08:19,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751624_10800 src: /192.168.158.1:34976 dest: /192.168.158.4:9866
2025-07-16 14:08:19,450 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34976, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2052389978_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751624_10800, duration(ns): 23244304
2025-07-16 14:08:19,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751624_10800, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-16 14:08:23,422 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751624_10800 replica FinalizedReplica, blk_1073751624_10800, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751624 for deletion
2025-07-16 14:08:23,423 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751624_10800 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751624
2025-07-16 14:09:24,437 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751625_10801 src: /192.168.158.5:40076 dest: /192.168.158.4:9866
2025-07-16 14:09:24,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40076, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-41884954_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751625_10801, duration(ns): 21369306
2025-07-16 14:09:24,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751625_10801, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-16 14:09:32,422 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751625_10801 replica FinalizedReplica, blk_1073751625_10801, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751625 for deletion
2025-07-16 14:09:32,423 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751625_10801 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751625
2025-07-16 14:15:24,437 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751631_10807 src: /192.168.158.6:49708 dest: /192.168.158.4:9866
2025-07-16 14:15:24,454 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49708, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-787041487_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751631_10807, duration(ns): 14398508
2025-07-16 14:15:24,454 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751631_10807, type=LAST_IN_PIPELINE terminating
2025-07-16 14:15:29,439 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751631_10807 replica FinalizedReplica, blk_1073751631_10807, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751631 for deletion
2025-07-16 14:15:29,440 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751631_10807 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751631
2025-07-16 14:20:24,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751636_10812 src: /192.168.158.7:47488 dest: /192.168.158.4:9866
2025-07-16 14:20:24,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47488, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_325919381_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751636_10812, duration(ns): 18071588
2025-07-16 14:20:24,466 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751636_10812, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 14:20:29,449 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751636_10812 replica FinalizedReplica, blk_1073751636_10812, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751636 for deletion
2025-07-16 14:20:29,450 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751636_10812 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751636
2025-07-16 14:22:24,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751638_10814 src: /192.168.158.1:34576 dest: /192.168.158.4:9866
2025-07-16 14:22:24,471 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34576, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1139177786_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751638_10814, duration(ns): 24142975
2025-07-16 14:22:24,471 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751638_10814, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-16 14:22:29,450 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751638_10814 replica FinalizedReplica, blk_1073751638_10814, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751638 for deletion
2025-07-16 14:22:29,451 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751638_10814 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751638
2025-07-16 14:25:24,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751641_10817 src: /192.168.158.8:36170 dest: /192.168.158.4:9866
2025-07-16 14:25:24,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36170, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_950399992_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751641_10817, duration(ns): 21831634
2025-07-16 14:25:24,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751641_10817, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-16 14:25:29,454 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751641_10817 replica FinalizedReplica, blk_1073751641_10817, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751641 for deletion
2025-07-16 14:25:29,455 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751641_10817 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751641
2025-07-16 14:28:29,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751644_10820 src: /192.168.158.1:33302 dest: /192.168.158.4:9866
2025-07-16 14:28:29,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33302, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_617553491_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751644_10820, duration(ns): 23885971
2025-07-16 14:28:29,463 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751644_10820, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-16 14:28:32,462 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751644_10820 replica FinalizedReplica, blk_1073751644_10820, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751644 for deletion
2025-07-16 14:28:32,463 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751644_10820 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751644
2025-07-16 14:29:29,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751645_10821 src: /192.168.158.1:39336 dest: /192.168.158.4:9866
2025-07-16 14:29:29,469 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-191383320_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751645_10821, duration(ns): 22315933
2025-07-16 14:29:29,471 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751645_10821, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-16 14:29:32,462 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751645_10821 replica FinalizedReplica, blk_1073751645_10821, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751645 for deletion
2025-07-16 14:29:32,463 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751645_10821 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751645
2025-07-16 14:32:34,454 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751648_10824 src: /192.168.158.8:50290 dest: /192.168.158.4:9866
2025-07-16 14:32:34,478 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50290, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1280770058_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751648_10824, duration(ns): 18578983
2025-07-16 14:32:34,480 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751648_10824, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 14:32:38,465 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751648_10824 replica FinalizedReplica, blk_1073751648_10824, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751648 for deletion
2025-07-16 14:32:38,466 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751648_10824 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751648
2025-07-16 14:35:34,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751651_10827 src: /192.168.158.1:52160 dest: /192.168.158.4:9866
2025-07-16 14:35:34,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52160, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-253105017_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751651_10827, duration(ns): 34678614
2025-07-16 14:35:34,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751651_10827, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-16 14:35:38,469 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751651_10827 replica FinalizedReplica, blk_1073751651_10827, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume()
= /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751651 for deletion 2025-07-16 14:35:38,470 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751651_10827 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751651 2025-07-16 14:36:34,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751652_10828 src: /192.168.158.7:51952 dest: /192.168.158.4:9866 2025-07-16 14:36:34,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51952, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-22150920_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751652_10828, duration(ns): 13711479 2025-07-16 14:36:34,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751652_10828, type=LAST_IN_PIPELINE terminating 2025-07-16 14:36:38,470 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751652_10828 replica FinalizedReplica, blk_1073751652_10828, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751652 for deletion 2025-07-16 14:36:38,472 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751652_10828 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751652 2025-07-16 
14:37:34,461 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751653_10829 src: /192.168.158.8:40110 dest: /192.168.158.4:9866 2025-07-16 14:37:34,479 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40110, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1739351255_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751653_10829, duration(ns): 16170488 2025-07-16 14:37:34,479 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751653_10829, type=LAST_IN_PIPELINE terminating 2025-07-16 14:37:41,474 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751653_10829 replica FinalizedReplica, blk_1073751653_10829, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751653 for deletion 2025-07-16 14:37:41,475 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751653_10829 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751653 2025-07-16 14:38:34,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751654_10830 src: /192.168.158.7:56660 dest: /192.168.158.4:9866 2025-07-16 14:38:34,469 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56660, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2107545062_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073751654_10830, duration(ns): 17645490 2025-07-16 14:38:34,469 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751654_10830, type=LAST_IN_PIPELINE terminating 2025-07-16 14:38:38,478 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751654_10830 replica FinalizedReplica, blk_1073751654_10830, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751654 for deletion 2025-07-16 14:38:38,479 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751654_10830 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751654 2025-07-16 14:41:34,453 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751657_10833 src: /192.168.158.8:40132 dest: /192.168.158.4:9866 2025-07-16 14:41:34,472 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40132, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1476009670_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751657_10833, duration(ns): 16511493 2025-07-16 14:41:34,472 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751657_10833, type=LAST_IN_PIPELINE terminating 2025-07-16 14:41:38,487 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751657_10833 replica FinalizedReplica, blk_1073751657_10833, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751657 for deletion 2025-07-16 14:41:38,488 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751657_10833 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751657 2025-07-16 14:42:34,469 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751658_10834 src: /192.168.158.9:46670 dest: /192.168.158.4:9866 2025-07-16 14:42:34,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46670, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2100881562_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751658_10834, duration(ns): 18168423 2025-07-16 14:42:34,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751658_10834, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-16 14:42:41,492 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751658_10834 replica FinalizedReplica, blk_1073751658_10834, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751658 for deletion 2025-07-16 14:42:41,493 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751658_10834 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751658 2025-07-16 14:45:34,457 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751661_10837 src: /192.168.158.1:46168 dest: /192.168.158.4:9866 2025-07-16 14:45:34,490 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46168, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_628176354_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751661_10837, duration(ns): 22947257 2025-07-16 14:45:34,490 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751661_10837, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-16 14:45:41,498 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751661_10837 replica FinalizedReplica, blk_1073751661_10837, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751661 for deletion 2025-07-16 14:45:41,499 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751661_10837 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751661 2025-07-16 14:48:34,496 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751664_10840 src: /192.168.158.8:32864 dest: /192.168.158.4:9866 2025-07-16 14:48:34,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.8:32864, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_662172326_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751664_10840, duration(ns): 15765390 2025-07-16 14:48:34,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751664_10840, type=LAST_IN_PIPELINE terminating 2025-07-16 14:48:38,505 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751664_10840 replica FinalizedReplica, blk_1073751664_10840, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751664 for deletion 2025-07-16 14:48:38,506 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751664_10840 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751664 2025-07-16 14:49:34,468 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751665_10841 src: /192.168.158.1:39464 dest: /192.168.158.4:9866 2025-07-16 14:49:34,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39464, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-182049607_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751665_10841, duration(ns): 22162406 2025-07-16 14:49:34,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751665_10841, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-16 14:49:38,506 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751665_10841 replica FinalizedReplica, blk_1073751665_10841, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751665 for deletion 2025-07-16 14:49:38,508 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751665_10841 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751665 2025-07-16 14:50:34,474 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751666_10842 src: /192.168.158.5:60370 dest: /192.168.158.4:9866 2025-07-16 14:50:34,492 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60370, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1409900989_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751666_10842, duration(ns): 15544459 2025-07-16 14:50:34,492 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751666_10842, type=LAST_IN_PIPELINE terminating 2025-07-16 14:50:38,507 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751666_10842 replica FinalizedReplica, blk_1073751666_10842, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751666 for deletion 2025-07-16 14:50:38,510 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751666_10842 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751666 2025-07-16 14:53:39,468 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751669_10845 src: /192.168.158.1:45018 dest: /192.168.158.4:9866 2025-07-16 14:53:39,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45018, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-273662437_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751669_10845, duration(ns): 20715077 2025-07-16 14:53:39,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751669_10845, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-16 14:53:41,516 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751669_10845 replica FinalizedReplica, blk_1073751669_10845, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751669 for deletion 2025-07-16 14:53:41,517 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751669_10845 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751669 2025-07-16 14:54:39,474 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751670_10846 src: /192.168.158.9:50632 dest: /192.168.158.4:9866 2025-07-16 14:54:39,496 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50632, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1602368267_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751670_10846, duration(ns): 17277233 2025-07-16 14:54:39,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751670_10846, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-16 14:54:41,519 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751670_10846 replica FinalizedReplica, blk_1073751670_10846, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751670 for deletion 2025-07-16 14:54:41,520 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751670_10846 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751670 2025-07-16 14:56:39,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751672_10848 src: /192.168.158.6:53368 dest: /192.168.158.4:9866 2025-07-16 14:56:39,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53368, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_213979444_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751672_10848, duration(ns): 15870951 2025-07-16 14:56:39,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751672_10848, type=LAST_IN_PIPELINE terminating 2025-07-16 14:56:41,526 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751672_10848 replica FinalizedReplica, blk_1073751672_10848, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751672 for deletion 2025-07-16 14:56:41,527 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751672_10848 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751672 2025-07-16 14:59:49,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751675_10851 src: /192.168.158.6:36076 dest: /192.168.158.4:9866 2025-07-16 14:59:49,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36076, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1503875508_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751675_10851, duration(ns): 18568854 2025-07-16 14:59:49,506 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751675_10851, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] 
terminating 2025-07-16 14:59:53,539 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751675_10851 replica FinalizedReplica, blk_1073751675_10851, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751675 for deletion 2025-07-16 14:59:53,540 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751675_10851 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751675 2025-07-16 15:00:49,484 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751676_10852 src: /192.168.158.7:49092 dest: /192.168.158.4:9866 2025-07-16 15:00:49,508 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49092, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-23771918_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751676_10852, duration(ns): 19096698 2025-07-16 15:00:49,509 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751676_10852, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-16 15:00:56,540 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751676_10852 replica FinalizedReplica, blk_1073751676_10852, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751676 
for deletion 2025-07-16 15:00:56,541 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751676_10852 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751676 2025-07-16 15:01:54,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751677_10853 src: /192.168.158.6:34276 dest: /192.168.158.4:9866 2025-07-16 15:01:54,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34276, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_407870893_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751677_10853, duration(ns): 17135783 2025-07-16 15:01:54,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751677_10853, type=LAST_IN_PIPELINE terminating 2025-07-16 15:01:56,540 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751677_10853 replica FinalizedReplica, blk_1073751677_10853, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751677 for deletion 2025-07-16 15:01:56,541 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751677_10853 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751677 2025-07-16 15:02:59,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751678_10854 src: 
/192.168.158.5:37534 dest: /192.168.158.4:9866 2025-07-16 15:02:59,509 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37534, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1157863107_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751678_10854, duration(ns): 16433325 2025-07-16 15:02:59,510 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751678_10854, type=LAST_IN_PIPELINE terminating 2025-07-16 15:03:02,542 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751678_10854 replica FinalizedReplica, blk_1073751678_10854, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751678 for deletion 2025-07-16 15:03:02,543 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751678_10854 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751678 2025-07-16 15:03:59,492 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751679_10855 src: /192.168.158.1:37262 dest: /192.168.158.4:9866 2025-07-16 15:03:59,524 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37262, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_47053486_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751679_10855, duration(ns): 23338005 2025-07-16 15:03:59,525 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751679_10855, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-16 15:04:02,543 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751679_10855 replica FinalizedReplica, blk_1073751679_10855, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751679 for deletion 2025-07-16 15:04:02,545 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751679_10855 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751679 2025-07-16 15:04:59,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751680_10856 src: /192.168.158.1:52744 dest: /192.168.158.4:9866 2025-07-16 15:04:59,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52744, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-557015127_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751680_10856, duration(ns): 23450766 2025-07-16 15:04:59,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751680_10856, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-16 15:05:02,542 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751680_10856 replica FinalizedReplica, blk_1073751680_10856, 
FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751680 for deletion 2025-07-16 15:05:02,543 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751680_10856 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751680 2025-07-16 15:07:04,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751682_10858 src: /192.168.158.7:41268 dest: /192.168.158.4:9866 2025-07-16 15:07:04,522 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41268, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-597823854_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751682_10858, duration(ns): 15370011 2025-07-16 15:07:04,522 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751682_10858, type=LAST_IN_PIPELINE terminating 2025-07-16 15:07:08,546 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751682_10858 replica FinalizedReplica, blk_1073751682_10858, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751682 for deletion 2025-07-16 15:07:08,547 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751682_10858 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751682
2025-07-16 15:08:04,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751683_10859 src: /192.168.158.7:37906 dest: /192.168.158.4:9866
2025-07-16 15:08:04,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37906, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1121098068_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751683_10859, duration(ns): 18819660
2025-07-16 15:08:04,522 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751683_10859, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 15:08:08,548 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751683_10859 replica FinalizedReplica, blk_1073751683_10859, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751683 for deletion
2025-07-16 15:08:08,550 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751683_10859 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751683
2025-07-16 15:10:09,506 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751685_10861 src: /192.168.158.6:43778 dest: /192.168.158.4:9866
2025-07-16 15:10:09,530 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43778, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1345201779_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751685_10861, duration(ns): 18600029
2025-07-16 15:10:09,530 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751685_10861, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-16 15:10:11,553 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751685_10861 replica FinalizedReplica, blk_1073751685_10861, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751685 for deletion
2025-07-16 15:10:11,554 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751685_10861 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751685
2025-07-16 15:11:14,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751686_10862 src: /192.168.158.1:46768 dest: /192.168.158.4:9866
2025-07-16 15:11:14,536 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46768, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1451945604_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751686_10862, duration(ns): 23107293
2025-07-16 15:11:14,536 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751686_10862, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-16 15:11:20,554 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751686_10862 replica FinalizedReplica, blk_1073751686_10862, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751686 for deletion
2025-07-16 15:11:20,556 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751686_10862 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751686
2025-07-16 15:14:14,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751689_10865 src: /192.168.158.1:39566 dest: /192.168.158.4:9866
2025-07-16 15:14:14,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39566, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1216709613_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751689_10865, duration(ns): 23065800
2025-07-16 15:14:14,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751689_10865, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-16 15:14:17,561 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751689_10865 replica FinalizedReplica, blk_1073751689_10865, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751689 for deletion
2025-07-16 15:14:17,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751689_10865 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751689
2025-07-16 15:16:24,510 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751691_10867 src: /192.168.158.6:34906 dest: /192.168.158.4:9866
2025-07-16 15:16:24,528 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34906, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1165143066_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751691_10867, duration(ns): 14339716
2025-07-16 15:16:24,528 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751691_10867, type=LAST_IN_PIPELINE terminating
2025-07-16 15:16:26,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751691_10867 replica FinalizedReplica, blk_1073751691_10867, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751691 for deletion
2025-07-16 15:16:26,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751691_10867 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751691
2025-07-16 15:17:24,508 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751692_10868 src: /192.168.158.9:57964 dest: /192.168.158.4:9866
2025-07-16 15:17:24,535 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57964, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1471539774_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751692_10868, duration(ns): 21907829
2025-07-16 15:17:24,536 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751692_10868, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 15:17:26,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751692_10868 replica FinalizedReplica, blk_1073751692_10868, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751692 for deletion
2025-07-16 15:17:26,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751692_10868 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751692
2025-07-16 15:21:24,513 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751696_10872 src: /192.168.158.5:44592 dest: /192.168.158.4:9866
2025-07-16 15:21:24,538 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44592, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2060605416_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751696_10872, duration(ns): 19746420
2025-07-16 15:21:24,538 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751696_10872, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 15:21:26,567 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751696_10872 replica FinalizedReplica, blk_1073751696_10872, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751696 for deletion
2025-07-16 15:21:26,568 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751696_10872 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751696
2025-07-16 15:22:29,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751697_10873 src: /192.168.158.1:46256 dest: /192.168.158.4:9866
2025-07-16 15:22:29,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46256, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2141450992_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751697_10873, duration(ns): 22681899
2025-07-16 15:22:29,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751697_10873, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-16 15:22:32,570 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751697_10873 replica FinalizedReplica, blk_1073751697_10873, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751697 for deletion
2025-07-16 15:22:32,571 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751697_10873 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751697
2025-07-16 15:25:29,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751700_10876 src: /192.168.158.7:41812 dest: /192.168.158.4:9866
2025-07-16 15:25:29,540 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41812, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1599784288_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751700_10876, duration(ns): 17489588
2025-07-16 15:25:29,540 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751700_10876, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 15:25:35,573 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751700_10876 replica FinalizedReplica, blk_1073751700_10876, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751700 for deletion
2025-07-16 15:25:35,574 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751700_10876 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751700
2025-07-16 15:26:29,523 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751701_10877 src: /192.168.158.6:39780 dest: /192.168.158.4:9866
2025-07-16 15:26:29,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1812063972_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751701_10877, duration(ns): 17177021
2025-07-16 15:26:29,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751701_10877, type=LAST_IN_PIPELINE terminating
2025-07-16 15:26:32,576 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751701_10877 replica FinalizedReplica, blk_1073751701_10877, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751701 for deletion
2025-07-16 15:26:32,577 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751701_10877 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751701
2025-07-16 15:27:29,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751702_10878 src: /192.168.158.1:51582 dest: /192.168.158.4:9866
2025-07-16 15:27:29,552 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51582, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2029847981_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751702_10878, duration(ns): 23081380
2025-07-16 15:27:29,552 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751702_10878, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-16 15:27:32,581 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751702_10878 replica FinalizedReplica, blk_1073751702_10878, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751702 for deletion
2025-07-16 15:27:32,582 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751702_10878 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751702
2025-07-16 15:28:29,534 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751703_10879 src: /192.168.158.8:58264 dest: /192.168.158.4:9866
2025-07-16 15:28:29,552 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58264, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1962038399_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751703_10879, duration(ns): 15183418
2025-07-16 15:28:29,552 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751703_10879, type=LAST_IN_PIPELINE terminating
2025-07-16 15:28:32,584 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751703_10879 replica FinalizedReplica, blk_1073751703_10879, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751703 for deletion
2025-07-16 15:28:32,585 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751703_10879 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751703
2025-07-16 15:29:29,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751704_10880 src: /192.168.158.9:53708 dest: /192.168.158.4:9866
2025-07-16 15:29:29,546 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53708, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1436136146_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751704_10880, duration(ns): 19278310
2025-07-16 15:29:29,546 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751704_10880, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-16 15:29:32,585 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751704_10880 replica FinalizedReplica, blk_1073751704_10880, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751704 for deletion
2025-07-16 15:29:32,588 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751704_10880 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751704
2025-07-16 15:36:44,539 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751711_10887 src: /192.168.158.7:37604 dest: /192.168.158.4:9866
2025-07-16 15:36:44,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37604, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1314078747_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751711_10887, duration(ns): 18726118
2025-07-16 15:36:44,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751711_10887, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 15:36:47,595 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751711_10887 replica FinalizedReplica, blk_1073751711_10887, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751711 for deletion
2025-07-16 15:36:47,597 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751711_10887 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751711
2025-07-16 15:40:49,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751715_10891 src: /192.168.158.1:52722 dest: /192.168.158.4:9866
2025-07-16 15:40:49,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52722, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1521175259_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751715_10891, duration(ns): 21535121
2025-07-16 15:40:49,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751715_10891, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-16 15:40:53,607 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751715_10891 replica FinalizedReplica, blk_1073751715_10891, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751715 for deletion
2025-07-16 15:40:53,608 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751715_10891 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751715
2025-07-16 15:41:49,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751716_10892 src: /192.168.158.1:55106 dest: /192.168.158.4:9866
2025-07-16 15:41:49,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55106, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1520143760_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751716_10892, duration(ns): 24761662
2025-07-16 15:41:49,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751716_10892, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-16 15:41:53,607 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751716_10892 replica FinalizedReplica, blk_1073751716_10892, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751716 for deletion
2025-07-16 15:41:53,608 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751716_10892 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751716
2025-07-16 15:44:54,557 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751719_10895 src: /192.168.158.7:51366 dest: /192.168.158.4:9866
2025-07-16 15:44:54,575 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51366, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1581959487_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751719_10895, duration(ns): 15932074
2025-07-16 15:44:54,575 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751719_10895, type=LAST_IN_PIPELINE terminating
2025-07-16 15:44:59,613 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751719_10895 replica FinalizedReplica, blk_1073751719_10895, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751719 for deletion
2025-07-16 15:44:59,614 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751719_10895 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751719
2025-07-16 15:46:59,546 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751721_10897 src: /192.168.158.1:34264 dest: /192.168.158.4:9866
2025-07-16 15:46:59,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34264, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1947709974_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751721_10897, duration(ns): 23304703
2025-07-16 15:46:59,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751721_10897, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-16 15:47:02,614 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751721_10897 replica FinalizedReplica, blk_1073751721_10897, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751721 for deletion
2025-07-16 15:47:02,615 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751721_10897 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751721
2025-07-16 15:50:59,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751725_10901 src: /192.168.158.1:40164 dest: /192.168.158.4:9866
2025-07-16 15:50:59,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40164, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-916608675_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751725_10901, duration(ns): 21988712
2025-07-16 15:50:59,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751725_10901, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-16 15:51:02,620 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751725_10901 replica FinalizedReplica, blk_1073751725_10901, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751725 for deletion
2025-07-16 15:51:02,621 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751725_10901 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751725
2025-07-16 15:52:59,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751727_10903 src: /192.168.158.5:33062 dest: /192.168.158.4:9866
2025-07-16 15:52:59,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33062, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-424634929_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751727_10903, duration(ns): 18766447
2025-07-16 15:52:59,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751727_10903, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 15:53:02,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751727_10903 replica FinalizedReplica, blk_1073751727_10903, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751727 for deletion
2025-07-16 15:53:02,626 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751727_10903 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751727
2025-07-16 15:54:04,570 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751728_10904 src: /192.168.158.8:42868 dest: /192.168.158.4:9866
2025-07-16 15:54:04,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42868, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-749175177_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751728_10904, duration(ns): 20580414
2025-07-16 15:54:04,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751728_10904, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 15:54:11,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751728_10904 replica FinalizedReplica, blk_1073751728_10904, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751728 for deletion
2025-07-16 15:54:11,626 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751728_10904 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751728
2025-07-16 15:56:09,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751730_10906 src: /192.168.158.1:58862 dest: /192.168.158.4:9866
2025-07-16 15:56:09,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58862, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1220106956_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751730_10906, duration(ns): 20680316
2025-07-16 15:56:09,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751730_10906, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-16 15:56:14,632 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751730_10906 replica FinalizedReplica, blk_1073751730_10906, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751730 for deletion
2025-07-16 15:56:14,633 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751730_10906 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751730
2025-07-16 16:00:24,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751734_10910 src: /192.168.158.5:43124 dest: /192.168.158.4:9866
2025-07-16 16:00:24,604 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:43124, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2146975739_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751734_10910, duration(ns): 18695924
2025-07-16 16:00:24,604 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751734_10910, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 16:00:26,641 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751734_10910 replica FinalizedReplica, blk_1073751734_10910, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751734 for deletion
2025-07-16 16:00:26,642 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751734_10910 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751734
2025-07-16 16:01:24,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751735_10911 src: /192.168.158.8:56022 dest: /192.168.158.4:9866
2025-07-16 16:01:24,587 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56022, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_275498845_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751735_10911, duration(ns): 19494874
2025-07-16 16:01:24,587 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751735_10911, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 16:01:26,644 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751735_10911 replica FinalizedReplica, blk_1073751735_10911, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751735 for deletion
2025-07-16 16:01:26,645 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751735_10911 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751735
2025-07-16 16:02:24,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751736_10912 src: /192.168.158.5:53362 dest: /192.168.158.4:9866 2025-07-16 16:02:24,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53362, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-627799157_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751736_10912, duration(ns): 20966724 2025-07-16 16:02:24,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751736_10912, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-16 16:02:26,646 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751736_10912 replica FinalizedReplica, blk_1073751736_10912, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751736 for deletion 2025-07-16 16:02:26,648 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751736_10912 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751736 2025-07-16 16:05:29,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751739_10915 src: /192.168.158.8:37388 dest: /192.168.158.4:9866 2025-07-16 16:05:29,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37388, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_671415072_107, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751739_10915, duration(ns): 16247542 2025-07-16 16:05:29,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751739_10915, type=LAST_IN_PIPELINE terminating 2025-07-16 16:05:35,652 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751739_10915 replica FinalizedReplica, blk_1073751739_10915, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751739 for deletion 2025-07-16 16:05:35,653 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751739_10915 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751739 2025-07-16 16:06:29,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751740_10916 src: /192.168.158.1:47134 dest: /192.168.158.4:9866 2025-07-16 16:06:29,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47134, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-932590457_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751740_10916, duration(ns): 23872636 2025-07-16 16:06:29,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751740_10916, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-16 16:06:32,653 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751740_10916 replica FinalizedReplica, blk_1073751740_10916, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751740 for deletion 2025-07-16 16:06:32,654 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751740_10916 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751740 2025-07-16 16:09:29,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751743_10919 src: /192.168.158.5:36084 dest: /192.168.158.4:9866 2025-07-16 16:09:29,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36084, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1375058739_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751743_10919, duration(ns): 17294875 2025-07-16 16:09:29,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751743_10919, type=LAST_IN_PIPELINE terminating 2025-07-16 16:09:32,654 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751743_10919 replica FinalizedReplica, blk_1073751743_10919, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751743 for deletion 2025-07-16 16:09:32,656 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751743_10919 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751743 2025-07-16 16:11:34,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751745_10921 src: /192.168.158.8:43668 dest: /192.168.158.4:9866 2025-07-16 16:11:34,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43668, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1836848194_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751745_10921, duration(ns): 20281880 2025-07-16 16:11:34,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751745_10921, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-16 16:11:38,656 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751745_10921 replica FinalizedReplica, blk_1073751745_10921, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751745 for deletion 2025-07-16 16:11:38,658 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751745_10921 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751745 2025-07-16 16:12:34,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751746_10922 src: 
/192.168.158.1:60678 dest: /192.168.158.4:9866 2025-07-16 16:12:34,640 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60678, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1144594113_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751746_10922, duration(ns): 23495132 2025-07-16 16:12:34,640 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751746_10922, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-16 16:12:38,658 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751746_10922 replica FinalizedReplica, blk_1073751746_10922, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751746 for deletion 2025-07-16 16:12:38,659 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751746_10922 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751746 2025-07-16 16:15:34,604 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751749_10925 src: /192.168.158.9:53690 dest: /192.168.158.4:9866 2025-07-16 16:15:34,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53690, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_690115842_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751749_10925, duration(ns): 18176608 
2025-07-16 16:15:34,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751749_10925, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-16 16:15:38,661 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751749_10925 replica FinalizedReplica, blk_1073751749_10925, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751749 for deletion 2025-07-16 16:15:38,662 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751749_10925 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751749 2025-07-16 16:16:34,599 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751750_10926 src: /192.168.158.1:33590 dest: /192.168.158.4:9866 2025-07-16 16:16:34,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33590, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2060273345_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751750_10926, duration(ns): 21203242 2025-07-16 16:16:34,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751750_10926, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-16 16:16:38,661 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751750_10926 replica FinalizedReplica, 
blk_1073751750_10926, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751750 for deletion 2025-07-16 16:16:38,662 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751750_10926 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751750 2025-07-16 16:17:39,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751751_10927 src: /192.168.158.6:44042 dest: /192.168.158.4:9866 2025-07-16 16:17:39,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44042, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-561304238_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751751_10927, duration(ns): 18386745 2025-07-16 16:17:39,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751751_10927, type=LAST_IN_PIPELINE terminating 2025-07-16 16:17:44,661 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751751_10927 replica FinalizedReplica, blk_1073751751_10927, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751751 for deletion 2025-07-16 16:17:44,663 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751751_10927 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751751 2025-07-16 16:19:44,610 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751753_10929 src: /192.168.158.8:51014 dest: /192.168.158.4:9866 2025-07-16 16:19:44,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51014, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-171364053_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751753_10929, duration(ns): 15457741 2025-07-16 16:19:44,628 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751753_10929, type=LAST_IN_PIPELINE terminating 2025-07-16 16:19:50,668 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751753_10929 replica FinalizedReplica, blk_1073751753_10929, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751753 for deletion 2025-07-16 16:19:50,670 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751753_10929 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751753 2025-07-16 16:22:44,610 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751756_10932 src: /192.168.158.8:42846 dest: /192.168.158.4:9866 2025-07-16 16:22:44,628 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42846, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1428726840_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751756_10932, duration(ns): 16308658 2025-07-16 16:22:44,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751756_10932, type=LAST_IN_PIPELINE terminating 2025-07-16 16:22:47,675 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751756_10932 replica FinalizedReplica, blk_1073751756_10932, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751756 for deletion 2025-07-16 16:22:47,676 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751756_10932 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751756 2025-07-16 16:23:44,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751757_10933 src: /192.168.158.1:47262 dest: /192.168.158.4:9866 2025-07-16 16:23:44,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47262, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1252119915_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751757_10933, duration(ns): 24987809 2025-07-16 16:23:44,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751757_10933, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-16 
16:23:47,679 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751757_10933 replica FinalizedReplica, blk_1073751757_10933, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751757 for deletion 2025-07-16 16:23:47,680 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751757_10933 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751757 2025-07-16 16:24:44,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751758_10934 src: /192.168.158.1:49102 dest: /192.168.158.4:9866 2025-07-16 16:24:44,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49102, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-610875264_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751758_10934, duration(ns): 23842987 2025-07-16 16:24:44,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751758_10934, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-16 16:24:47,680 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751758_10934 replica FinalizedReplica, blk_1073751758_10934, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751758 
for deletion 2025-07-16 16:24:47,681 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751758_10934 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751758 2025-07-16 16:25:44,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751759_10935 src: /192.168.158.1:51594 dest: /192.168.158.4:9866 2025-07-16 16:25:44,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51594, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1586007706_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751759_10935, duration(ns): 24383349 2025-07-16 16:25:44,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751759_10935, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-16 16:25:47,684 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751759_10935 replica FinalizedReplica, blk_1073751759_10935, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751759 for deletion 2025-07-16 16:25:47,685 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751759_10935 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751759 2025-07-16 16:28:49,610 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073751762_10938 src: /192.168.158.1:54064 dest: /192.168.158.4:9866 2025-07-16 16:28:49,661 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54064, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1922024824_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751762_10938, duration(ns): 43007075 2025-07-16 16:28:49,661 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751762_10938, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-16 16:28:56,689 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751762_10938 replica FinalizedReplica, blk_1073751762_10938, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751762 for deletion 2025-07-16 16:28:56,691 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751762_10938 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751762 2025-07-16 16:30:49,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751764_10940 src: /192.168.158.6:43332 dest: /192.168.158.4:9866 2025-07-16 16:30:49,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43332, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2087092103_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073751764_10940, duration(ns): 19429579 2025-07-16 16:30:49,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751764_10940, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-16 16:30:53,692 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751764_10940 replica FinalizedReplica, blk_1073751764_10940, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751764 for deletion 2025-07-16 16:30:53,693 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751764_10940 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751764 2025-07-16 16:32:49,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751766_10942 src: /192.168.158.8:36796 dest: /192.168.158.4:9866 2025-07-16 16:32:49,654 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36796, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1389155042_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751766_10942, duration(ns): 22015004 2025-07-16 16:32:49,654 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751766_10942, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-16 16:32:53,693 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073751766_10942 replica FinalizedReplica, blk_1073751766_10942, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751766 for deletion 2025-07-16 16:32:53,695 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751766_10942 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751766 2025-07-16 16:33:49,623 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751767_10943 src: /192.168.158.1:38294 dest: /192.168.158.4:9866 2025-07-16 16:33:49,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38294, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1732887920_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751767_10943, duration(ns): 21053975 2025-07-16 16:33:49,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751767_10943, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-16 16:33:56,697 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751767_10943 replica FinalizedReplica, blk_1073751767_10943, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751767 for deletion 2025-07-16 16:33:56,698 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751767_10943 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751767
2025-07-16 16:34:54,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751768_10944 src: /192.168.158.5:57538 dest: /192.168.158.4:9866
2025-07-16 16:34:54,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57538, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-458005664_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751768_10944, duration(ns): 15219072
2025-07-16 16:34:54,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751768_10944, type=LAST_IN_PIPELINE terminating
2025-07-16 16:34:59,698 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751768_10944 replica FinalizedReplica, blk_1073751768_10944, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751768 for deletion
2025-07-16 16:34:59,699 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751768_10944 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751768
2025-07-16 16:35:54,628 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751769_10945 src: /192.168.158.8:43198 dest: /192.168.158.4:9866
2025-07-16 16:35:54,654 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43198, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1227766517_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751769_10945, duration(ns): 20737746
2025-07-16 16:35:54,654 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751769_10945, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 16:35:59,701 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751769_10945 replica FinalizedReplica, blk_1073751769_10945, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751769 for deletion
2025-07-16 16:35:59,702 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751769_10945 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751769
2025-07-16 16:36:54,632 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751770_10946 src: /192.168.158.8:56722 dest: /192.168.158.4:9866
2025-07-16 16:36:54,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56722, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-552358812_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751770_10946, duration(ns): 15030732
2025-07-16 16:36:54,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751770_10946, type=LAST_IN_PIPELINE terminating
2025-07-16 16:36:56,702 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751770_10946 replica FinalizedReplica, blk_1073751770_10946, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751770 for deletion
2025-07-16 16:36:56,703 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751770_10946 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751770
2025-07-16 16:37:54,658 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751771_10947 src: /192.168.158.6:50418 dest: /192.168.158.4:9866
2025-07-16 16:37:54,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50418, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2104657464_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751771_10947, duration(ns): 20248688
2025-07-16 16:37:54,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751771_10947, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 16:37:56,705 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751771_10947 replica FinalizedReplica, blk_1073751771_10947, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751771 for deletion
2025-07-16 16:37:56,706 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751771_10947 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751771
2025-07-16 16:45:59,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751779_10955 src: /192.168.158.1:43084 dest: /192.168.158.4:9866
2025-07-16 16:45:59,665 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43084, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1635907179_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751779_10955, duration(ns): 22846011
2025-07-16 16:45:59,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751779_10955, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-16 16:46:02,721 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751779_10955 replica FinalizedReplica, blk_1073751779_10955, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751779 for deletion
2025-07-16 16:46:02,722 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751779_10955 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751779
2025-07-16 16:49:04,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751782_10958 src: /192.168.158.6:46140 dest: /192.168.158.4:9866
2025-07-16 16:49:04,660 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46140, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2012039168_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751782_10958, duration(ns): 18830990
2025-07-16 16:49:04,661 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751782_10958, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 16:49:08,728 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751782_10958 replica FinalizedReplica, blk_1073751782_10958, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751782 for deletion
2025-07-16 16:49:08,729 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751782_10958 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751782
2025-07-16 16:53:09,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751786_10962 src: /192.168.158.8:51144 dest: /192.168.158.4:9866
2025-07-16 16:53:09,676 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51144, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-80589579_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751786_10962, duration(ns): 19684434
2025-07-16 16:53:09,676 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751786_10962, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 16:53:11,741 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751786_10962 replica FinalizedReplica, blk_1073751786_10962, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751786 for deletion
2025-07-16 16:53:11,742 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751786_10962 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751786
2025-07-16 16:54:09,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751787_10963 src: /192.168.158.7:51016 dest: /192.168.158.4:9866
2025-07-16 16:54:09,672 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51016, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1550817451_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751787_10963, duration(ns): 18616401
2025-07-16 16:54:09,673 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751787_10963, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 16:54:11,745 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751787_10963 replica FinalizedReplica, blk_1073751787_10963, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751787 for deletion
2025-07-16 16:54:11,747 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751787_10963 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751787
2025-07-16 16:56:14,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751789_10965 src: /192.168.158.8:49824 dest: /192.168.158.4:9866
2025-07-16 16:56:14,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49824, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-613726612_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751789_10965, duration(ns): 19477443
2025-07-16 16:56:14,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751789_10965, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 16:56:20,750 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751789_10965 replica FinalizedReplica, blk_1073751789_10965, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751789 for deletion
2025-07-16 16:56:20,751 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751789_10965 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751789
2025-07-16 16:57:14,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751790_10966 src: /192.168.158.7:46172 dest: /192.168.158.4:9866
2025-07-16 16:57:14,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46172, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1845187781_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751790_10966, duration(ns): 16223752
2025-07-16 16:57:14,688 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751790_10966, type=LAST_IN_PIPELINE terminating
2025-07-16 16:57:17,752 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751790_10966 replica FinalizedReplica, blk_1073751790_10966, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751790 for deletion
2025-07-16 16:57:17,753 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751790_10966 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751790
2025-07-16 16:58:14,671 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751791_10967 src: /192.168.158.5:38552 dest: /192.168.158.4:9866
2025-07-16 16:58:14,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38552, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1012388416_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751791_10967, duration(ns): 15671709
2025-07-16 16:58:14,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751791_10967, type=LAST_IN_PIPELINE terminating
2025-07-16 16:58:17,752 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751791_10967 replica FinalizedReplica, blk_1073751791_10967, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751791 for deletion
2025-07-16 16:58:17,753 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751791_10967 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751791
2025-07-16 17:00:14,656 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751793_10969 src: /192.168.158.5:60126 dest: /192.168.158.4:9866
2025-07-16 17:00:14,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60126, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_956123565_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751793_10969, duration(ns): 17104517
2025-07-16 17:00:14,678 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751793_10969, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-16 17:00:17,758 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751793_10969 replica FinalizedReplica, blk_1073751793_10969, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751793 for deletion
2025-07-16 17:00:17,759 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751793_10969 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751793
2025-07-16 17:01:19,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751794_10970 src: /192.168.158.1:40924 dest: /192.168.158.4:9866
2025-07-16 17:01:19,698 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40924, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1500113696_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751794_10970, duration(ns): 22876350
2025-07-16 17:01:19,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751794_10970, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-16 17:01:23,761 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751794_10970 replica FinalizedReplica, blk_1073751794_10970, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751794 for deletion
2025-07-16 17:01:23,762 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751794_10970 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751794
2025-07-16 17:03:19,672 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751796_10972 src: /192.168.158.8:40858 dest: /192.168.158.4:9866
2025-07-16 17:03:19,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40858, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-45391719_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751796_10972, duration(ns): 18372125
2025-07-16 17:03:19,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751796_10972, type=LAST_IN_PIPELINE terminating
2025-07-16 17:03:26,765 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751796_10972 replica FinalizedReplica, blk_1073751796_10972, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751796 for deletion
2025-07-16 17:03:26,766 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751796_10972 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751796
2025-07-16 17:04:24,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751797_10973 src: /192.168.158.8:39120 dest: /192.168.158.4:9866
2025-07-16 17:04:24,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39120, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_239316271_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751797_10973, duration(ns): 21318388
2025-07-16 17:04:24,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751797_10973, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-16 17:04:26,765 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751797_10973 replica FinalizedReplica, blk_1073751797_10973, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751797 for deletion
2025-07-16 17:04:26,766 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751797_10973 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751797
2025-07-16 17:05:29,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751798_10974 src: /192.168.158.7:55392 dest: /192.168.158.4:9866
2025-07-16 17:05:29,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55392, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-518531955_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751798_10974, duration(ns): 14127640
2025-07-16 17:05:29,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751798_10974, type=LAST_IN_PIPELINE terminating
2025-07-16 17:05:32,767 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751798_10974 replica FinalizedReplica, blk_1073751798_10974, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751798 for deletion
2025-07-16 17:05:32,768 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751798_10974 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751798
2025-07-16 17:08:29,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751801_10977 src: /192.168.158.8:33834 dest: /192.168.158.4:9866
2025-07-16 17:08:29,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33834, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-695837764_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751801_10977, duration(ns): 16553291
2025-07-16 17:08:29,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751801_10977, type=LAST_IN_PIPELINE terminating
2025-07-16 17:08:32,771 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751801_10977 replica FinalizedReplica, blk_1073751801_10977, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751801 for deletion
2025-07-16 17:08:32,772 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751801_10977 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751801
2025-07-16 17:09:29,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751802_10978 src: /192.168.158.7:33832 dest: /192.168.158.4:9866
2025-07-16 17:09:29,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33832, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_190042458_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751802_10978, duration(ns): 19130510
2025-07-16 17:09:29,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751802_10978, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-16 17:09:32,774 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751802_10978 replica FinalizedReplica, blk_1073751802_10978, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751802 for deletion
2025-07-16 17:09:32,776 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751802_10978 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751802
2025-07-16 17:11:39,691 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751804_10980 src: /192.168.158.8:35614 dest: /192.168.158.4:9866
2025-07-16 17:11:39,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35614, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1562020783_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751804_10980, duration(ns): 16015328
2025-07-16 17:11:39,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751804_10980, type=LAST_IN_PIPELINE terminating
2025-07-16 17:11:44,776 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751804_10980 replica FinalizedReplica, blk_1073751804_10980, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751804 for deletion
2025-07-16 17:11:44,777 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751804_10980 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751804
2025-07-16 17:13:44,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751806_10982 src: /192.168.158.1:51192 dest: /192.168.158.4:9866
2025-07-16 17:13:44,720 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51192, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_235029835_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751806_10982, duration(ns): 22776892
2025-07-16 17:13:44,720 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751806_10982, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-16 17:13:50,775 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751806_10982 replica FinalizedReplica, blk_1073751806_10982, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751806 for deletion
2025-07-16 17:13:50,777 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751806_10982 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751806
2025-07-16 17:14:44,691 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751807_10983 src: /192.168.158.6:42610 dest: /192.168.158.4:9866
2025-07-16 17:14:44,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42610, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2003858994_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751807_10983, duration(ns): 18655823
2025-07-16 17:14:44,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751807_10983, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 17:14:47,776 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751807_10983 replica FinalizedReplica, blk_1073751807_10983, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751807 for deletion
2025-07-16 17:14:47,777 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751807_10983 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073751807
2025-07-16 17:15:44,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751808_10984 src: /192.168.158.7:60866 dest: /192.168.158.4:9866
2025-07-16 17:15:44,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60866, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-960011296_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751808_10984, duration(ns): 15368162
2025-07-16 17:15:44,718 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751808_10984, type=LAST_IN_PIPELINE terminating
2025-07-16 17:15:47,777 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751808_10984 replica FinalizedReplica, blk_1073751808_10984, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751808 for deletion
2025-07-16 17:15:47,779 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751808_10984 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751808
2025-07-16 17:18:49,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751811_10987 src: /192.168.158.6:49006 dest: /192.168.158.4:9866
2025-07-16 17:18:49,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49006, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2112393571_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751811_10987, duration(ns): 16341705
2025-07-16 17:18:49,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751811_10987, type=LAST_IN_PIPELINE terminating
2025-07-16 17:18:56,781 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751811_10987 replica FinalizedReplica, blk_1073751811_10987, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751811 for deletion
2025-07-16 17:18:56,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751811_10987 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751811
2025-07-16 17:20:49,704 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751813_10989 src: /192.168.158.1:54008 dest: /192.168.158.4:9866
2025-07-16 17:20:49,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54008, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2141514087_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751813_10989, duration(ns): 25391992
2025-07-16 17:20:49,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751813_10989, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-16 17:20:53,784 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751813_10989 replica FinalizedReplica, blk_1073751813_10989, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751813 for deletion
2025-07-16 17:20:53,785 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751813_10989 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751813
2025-07-16 17:21:49,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751814_10990 src: /192.168.158.1:48764 dest: /192.168.158.4:9866
2025-07-16 17:21:49,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48764, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1711356424_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751814_10990, duration(ns): 23716299
2025-07-16 17:21:49,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751814_10990, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-16 17:21:53,784 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751814_10990 replica FinalizedReplica, blk_1073751814_10990, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751814 for deletion
2025-07-16 17:21:53,785 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751814_10990 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751814
2025-07-16 17:23:49,699 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751816_10992 src: /192.168.158.7:36166 dest: /192.168.158.4:9866
2025-07-16 17:23:49,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36166, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_254706876_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751816_10992, duration(ns): 18964558
2025-07-16 17:23:49,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751816_10992, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 17:23:56,790 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751816_10992 replica FinalizedReplica, blk_1073751816_10992, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751816 for deletion
2025-07-16 17:23:56,791 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751816_10992 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751816
2025-07-16 17:26:49,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751819_10995 src: /192.168.158.1:35118 dest: /192.168.158.4:9866
2025-07-16 17:26:49,730 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35118, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_767822790_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751819_10995, duration(ns): 21309397
2025-07-16 17:26:49,730 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751819_10995, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-16 17:26:53,797 INFO
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751819_10995 replica FinalizedReplica, blk_1073751819_10995, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751819 for deletion 2025-07-16 17:26:53,798 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751819_10995 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751819 2025-07-16 17:31:54,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751824_11000 src: /192.168.158.9:39078 dest: /192.168.158.4:9866 2025-07-16 17:31:54,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39078, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_520915450_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751824_11000, duration(ns): 18095265 2025-07-16 17:31:54,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751824_11000, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-16 17:31:59,808 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751824_11000 replica FinalizedReplica, blk_1073751824_11000, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751824 for deletion 2025-07-16 17:31:59,809 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751824_11000 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751824 2025-07-16 17:33:59,712 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751826_11002 src: /192.168.158.9:53938 dest: /192.168.158.4:9866 2025-07-16 17:33:59,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53938, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_331647451_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751826_11002, duration(ns): 14151672 2025-07-16 17:33:59,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751826_11002, type=LAST_IN_PIPELINE terminating 2025-07-16 17:34:05,812 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751826_11002 replica FinalizedReplica, blk_1073751826_11002, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751826 for deletion 2025-07-16 17:34:05,813 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751826_11002 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751826 2025-07-16 17:34:59,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751827_11003 src: /192.168.158.5:35996 dest: /192.168.158.4:9866 
2025-07-16 17:34:59,734 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35996, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_573806214_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751827_11003, duration(ns): 15276055
2025-07-16 17:34:59,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751827_11003, type=LAST_IN_PIPELINE terminating
2025-07-16 17:35:05,815 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751827_11003 replica FinalizedReplica, blk_1073751827_11003, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751827 for deletion
2025-07-16 17:35:05,816 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751827_11003 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751827
2025-07-16 17:36:13,270 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-16 17:36:59,713 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751829_11005 src: /192.168.158.5:45498 dest: /192.168.158.4:9866
2025-07-16 17:36:59,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45498, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1573826216_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751829_11005, duration(ns): 20073821
2025-07-16 17:36:59,739 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751829_11005, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 17:37:02,818 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751829_11005 replica FinalizedReplica, blk_1073751829_11005, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751829 for deletion
2025-07-16 17:37:02,819 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751829_11005 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751829
2025-07-16 17:37:59,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751830_11006 src: /192.168.158.9:60614 dest: /192.168.158.4:9866
2025-07-16 17:37:59,739 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60614, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1988410292_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751830_11006, duration(ns): 14270033
2025-07-16 17:37:59,739 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751830_11006, type=LAST_IN_PIPELINE terminating
2025-07-16 17:38:02,820 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751830_11006 replica FinalizedReplica, blk_1073751830_11006, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751830 for deletion
2025-07-16 17:38:02,822 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751830_11006 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751830
2025-07-16 17:39:59,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751832_11008 src: /192.168.158.5:40998 dest: /192.168.158.4:9866
2025-07-16 17:39:59,739 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40998, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_287251269_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751832_11008, duration(ns): 15326495
2025-07-16 17:39:59,739 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751832_11008, type=LAST_IN_PIPELINE terminating
2025-07-16 17:40:02,827 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751832_11008 replica FinalizedReplica, blk_1073751832_11008, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751832 for deletion
2025-07-16 17:40:02,828 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751832_11008 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751832
2025-07-16 17:43:04,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751835_11011 src: /192.168.158.9:46318 dest: /192.168.158.4:9866
2025-07-16 17:43:04,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46318, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1518239800_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751835_11011, duration(ns): 17316265
2025-07-16 17:43:04,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751835_11011, type=LAST_IN_PIPELINE terminating
2025-07-16 17:43:08,832 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751835_11011 replica FinalizedReplica, blk_1073751835_11011, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751835 for deletion
2025-07-16 17:43:08,833 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751835_11011 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751835
2025-07-16 17:45:04,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751837_11013 src: /192.168.158.7:47712 dest: /192.168.158.4:9866
2025-07-16 17:45:04,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1102832449_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751837_11013, duration(ns): 19316035
2025-07-16 17:45:04,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751837_11013, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 17:45:08,838 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751837_11013 replica FinalizedReplica, blk_1073751837_11013, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751837 for deletion
2025-07-16 17:45:08,839 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751837_11013 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751837
2025-07-16 17:48:04,734 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751840_11016 src: /192.168.158.1:59126 dest: /192.168.158.4:9866
2025-07-16 17:48:04,764 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59126, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-621056127_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751840_11016, duration(ns): 20854684
2025-07-16 17:48:04,764 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751840_11016, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-16 17:48:08,843 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751840_11016 replica FinalizedReplica, blk_1073751840_11016, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751840 for deletion
2025-07-16 17:48:08,844 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751840_11016 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751840
2025-07-16 17:50:04,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751842_11018 src: /192.168.158.5:40952 dest: /192.168.158.4:9866
2025-07-16 17:50:04,760 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40952, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1252049148_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751842_11018, duration(ns): 19922010
2025-07-16 17:50:04,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751842_11018, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 17:50:11,847 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751842_11018 replica FinalizedReplica, blk_1073751842_11018, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751842 for deletion
2025-07-16 17:50:11,848 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751842_11018 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751842
2025-07-16 17:53:04,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751845_11021 src: /192.168.158.5:44780 dest: /192.168.158.4:9866
2025-07-16 17:53:04,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-134784464_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751845_11021, duration(ns): 19188336
2025-07-16 17:53:04,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751845_11021, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 17:53:11,853 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751845_11021 replica FinalizedReplica, blk_1073751845_11021, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751845 for deletion
2025-07-16 17:53:11,855 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751845_11021 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751845
2025-07-16 17:56:19,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751848_11024 src: /192.168.158.9:55234 dest: /192.168.158.4:9866
2025-07-16 17:56:19,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55234, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1432700087_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751848_11024, duration(ns): 19830144
2025-07-16 17:56:19,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751848_11024, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 17:56:23,863 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751848_11024 replica FinalizedReplica, blk_1073751848_11024, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751848 for deletion
2025-07-16 17:56:23,864 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751848_11024 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751848
2025-07-16 17:58:24,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751850_11026 src: /192.168.158.9:56882 dest: /192.168.158.4:9866
2025-07-16 17:58:24,774 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56882, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_835076195_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751850_11026, duration(ns): 19021060
2025-07-16 17:58:24,774 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751850_11026, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-16 17:58:26,867 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751850_11026 replica FinalizedReplica, blk_1073751850_11026, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751850 for deletion
2025-07-16 17:58:26,868 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751850_11026 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751850
2025-07-16 17:59:24,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751851_11027 src: /192.168.158.1:33286 dest: /192.168.158.4:9866
2025-07-16 17:59:24,810 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33286, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1264855266_107, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751851_11027, duration(ns): 22474883
2025-07-16 17:59:24,810 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751851_11027, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-16 17:59:29,869 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751851_11027 replica FinalizedReplica, blk_1073751851_11027, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751851 for deletion
2025-07-16 17:59:29,871 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751851_11027 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751851
2025-07-16 18:00:26,870 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService
java.io.EOFException: End of File Exception between local host is: "dmidlkprdls04.svr.luc.edu/192.168.158.4"; destination host is: "dmidlkprdls01.svr.luc.edu":8022; : java.io.EOFException; For more details see: http://wiki.apache.org/hadoop/EOFException
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:892)
	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:846)
	at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1620)
	at org.apache.hadoop.ipc.Client.call(Client.java:1562)
	at org.apache.hadoop.ipc.Client.call(Client.java:1459)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)
	at com.sun.proxy.$Proxy24.sendHeartbeat(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.sendHeartbeat(DatanodeProtocolClientSideTranslatorPB.java:166)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.sendHeartBeat(BPServiceActor.java:553)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:694)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:894)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.io.EOFException
	at java.io.DataInputStream.readInt(DataInputStream.java:392)
	at org.apache.hadoop.ipc.Client$IpcStreams.readResponse(Client.java:1950)
	at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1245)
	at org.apache.hadoop.ipc.Client$Connection.run(Client.java:1141)
2025-07-16 18:00:28,873 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:00:29,874 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:00:30,876 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:00:31,877 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:00:32,878 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:00:33,879 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:00:34,880 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:00:35,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:00:36,882 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:00:37,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:00:38,885 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:00:39,886 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:00:40,887 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:00:41,888 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:00:42,889 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:00:43,891 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:00:44,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:00:45,893 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:00:46,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:00:47,895 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:01:06,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:01:10,041 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:01:16,186 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:01:19,257 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:01:22,330 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:01:25,402 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:01:28,473 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:01:31,545 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:01:34,617 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:01:37,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:01:40,761 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:01:43,833 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:01:46,905 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:01:49,977 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:01:53,049 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:01:56,121 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:01:59,193 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:02:02,265 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:02:05,337 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:02:08,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:02:11,481 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:02:14,553 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:02:17,625 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:02:20,697 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:02:23,770 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:02:26,118 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:02:42,714 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:02:43,715 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:02:44,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:02:45,717 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:02:45,719 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService
java.net.NoRouteToHostException: No Route to Host from dmidlkprdls04.svr.luc.edu/192.168.158.4 to dmidlkprdls01.svr.luc.edu:8022 failed on socket timeout exception: java.net.NoRouteToHostException: No route to host; For more details see: http://wiki.apache.org/hadoop/NoRouteToHost
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:892)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:839)
    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1620)
    at org.apache.hadoop.ipc.Client.call(Client.java:1562)
    at org.apache.hadoop.ipc.Client.call(Client.java:1459)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)
    at com.sun.proxy.$Proxy24.cacheReport(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.cacheReport(DatanodeProtocolClientSideTranslatorPB.java:236)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.cacheReport(BPServiceActor.java:499)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:740)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:894)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.net.NoRouteToHostException: No route to host
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:205)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:586)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:730)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:843)
    at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:430)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1681)
    at org.apache.hadoop.ipc.Client.call(Client.java:1506)
    ... 9 more
2025-07-16 18:02:47,721 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:02:48,722 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:02:49,723 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:02:50,725 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:02:51,726 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:02:52,727 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:02:53,728 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:02:54,729 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:02:55,730 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:02:56,731 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:02:57,732 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:02:58,733 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:02:59,734 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:00,735 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:01,736 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:02,737 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:03,738 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:04,739 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:05,741 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:06,742 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:07,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:08,744 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:09,745 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:10,746 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:11,747 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:12,748 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:13,750 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:14,751 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:15,752 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:16,753 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:17,754 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:18,755 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:19,756 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:20,757 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:21,758 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:22,760 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:23,761 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:24,763 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:25,764 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:26,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:27,766 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:28,767 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:29,768 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:30,769 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:31,770 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:32,771 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:33,772 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:34,774 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:35,775 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:36,776 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:36,777 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService
java.net.NoRouteToHostException: No Route to Host from dmidlkprdls04.svr.luc.edu/192.168.158.4 to dmidlkprdls01.svr.luc.edu:8022 failed on socket timeout exception: java.net.NoRouteToHostException: No route to host; For more details see: http://wiki.apache.org/hadoop/NoRouteToHost
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:892)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:839)
    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1620)
    at org.apache.hadoop.ipc.Client.call(Client.java:1562)
    at org.apache.hadoop.ipc.Client.call(Client.java:1459)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)
    at com.sun.proxy.$Proxy24.sendHeartbeat(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.sendHeartbeat(DatanodeProtocolClientSideTranslatorPB.java:166)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.sendHeartBeat(BPServiceActor.java:553)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:694)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:894)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.net.NoRouteToHostException: No route to host
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:205)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:586)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:730)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:843)
    at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:430)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1681)
    at org.apache.hadoop.ipc.Client.call(Client.java:1506)
    ... 9 more
2025-07-16 18:03:38,780 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:39,781 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:40,782 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:41,783 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:42,784 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:43,785 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:44,786 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:45,732 WARN com.cloudera.cmf.event.publish.EventStorePublisherWithRetry: Failed to publish event: SimpleEvent{attributes={ROLE_TYPE=[DATANODE], EXCEPTION_TYPES=[java.net.NoRouteToHostException, java.net.NoRouteToHostException], HOST_IDS=[5c33df90-d247-4c6d-b9e0-5908a423580a], STACKTRACE=[java.net.NoRouteToHostException: No Route to Host from dmidlkprdls04.svr.luc.edu/192.168.158.4 to dmidlkprdls01.svr.luc.edu:8022 failed on socket timeout exception: java.net.NoRouteToHostException: No route to host; For more details see: http://wiki.apache.org/hadoop/NoRouteToHost at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:892) at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:839) at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1620) at org.apache.hadoop.ipc.Client.call(Client.java:1562) at org.apache.hadoop.ipc.Client.call(Client.java:1459) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118) at com.sun.proxy.$Proxy24.cacheReport(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.cacheReport(DatanodeProtocolClientSideTranslatorPB.java:236) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.cacheReport(BPServiceActor.java:499) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:740) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:894) at java.lang.Thread.run(Thread.java:750) Caused by: java.net.NoRouteToHostException: No route to host at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716) at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:205) at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:586) at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:730) at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:843) at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:430) at org.apache.hadoop.ipc.Client.getConnection(Client.java:1681) at org.apache.hadoop.ipc.Client.call(Client.java:1506) ... 9 more ], ROLE=[hdfs-DATANODE-b30a464b10a57fdd49ea734cd52a8291], HOSTS=[dmidlkprdls04.svr.luc.edu], CATEGORY=[LOG_MESSAGE], EVENTCODE=[EV_LOG_EVENT], SERVICE=[hdfs], SERVICE_TYPE=[HDFS], LOG_LEVEL=[WARN], SEVERITY=[IMPORTANT]}, content=IOException in offerService, timestamp=1752706965719}
com.cloudera.cmf.event.shaded.org.apache.avro.AvroRuntimeException: java.io.IOException: Error connecting to dmidlkprdls01.svr.luc.edu/192.168.158.1:7184
    at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.specific.SpecificRequestor.invoke(SpecificRequestor.java:124)
    at com.sun.proxy.$Proxy25.reportEvent(Unknown Source)
    at com.cloudera.cmf.event.publish.AvroEventStorePublishProxy.publishEvent(AvroEventStorePublishProxy.java:207)
    at com.cloudera.cmf.event.publish.EventStorePublisherWithRetry$PublishEventTask.run(EventStorePublisherWithRetry.java:242)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.io.IOException: Error connecting to dmidlkprdls01.svr.luc.edu/192.168.158.1:7184
    at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.getChannel(NettyTransceiver.java:269)
    at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.netty.NettyTransceiver.getRemoteName(NettyTransceiver.java:412)
    at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.Requestor.writeHandshake(Requestor.java:213)
    at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.Requestor.access$300(Requestor.java:52)
    at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.Requestor$Request.getBytes(Requestor.java:489)
    at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.Requestor.request(Requestor.java:152)
    at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.Requestor.request(Requestor.java:101)
    at com.cloudera.cmf.event.shaded.org.apache.avro.ipc.specific.SpecificRequestor.invoke(SpecificRequestor.java:108)
    ... 8 more
2025-07-16 18:03:45,787 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:46,788 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:03:47,790 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:03:48,791 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:03:49,792 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:03:50,793 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:03:51,794 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:03:52,795 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:03:53,796 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:03:54,797 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:03:55,798 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:03:56,800 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:03:57,801 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:03:58,802 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:03:59,803 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:00,804 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:01,805 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:02,806 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:03,807 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:04,808 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:05,809 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:06,810 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:07,811 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:08,812 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:09,813 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:10,814 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:11,815 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:12,817 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:13,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:14,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:15,820 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:16,821 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:17,822 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:18,823 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:19,824 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:20,825 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:21,826 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:22,827 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:26,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:27,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:28,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:29,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:30,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:30,912 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService java.net.NoRouteToHostException: No Route to Host from dmidlkprdls04.svr.luc.edu/192.168.158.4 to dmidlkprdls01.svr.luc.edu:8022 failed on socket timeout exception: java.net.NoRouteToHostException: No route to host; For more details see: http://wiki.apache.org/hadoop/NoRouteToHost at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:892) at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:839) at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1620) at org.apache.hadoop.ipc.Client.call(Client.java:1562) at org.apache.hadoop.ipc.Client.call(Client.java:1459) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118) at com.sun.proxy.$Proxy24.sendHeartbeat(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.sendHeartbeat(DatanodeProtocolClientSideTranslatorPB.java:166) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.sendHeartBeat(BPServiceActor.java:553) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:694) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:894) at java.lang.Thread.run(Thread.java:750) Caused by: java.net.NoRouteToHostException: No route to host at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at 
sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716) at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:205) at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:586) at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:730) at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:843) at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:430) at org.apache.hadoop.ipc.Client.getConnection(Client.java:1681) at org.apache.hadoop.ipc.Client.call(Client.java:1506) ... 9 more 2025-07-16 18:04:32,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:33,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:34,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:35,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:36,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:37,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:38,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:39,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:40,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:41,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:42,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:43,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:44,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:45,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:46,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:47,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:49,946 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:50,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:51,948 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:52,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:53,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:54,951 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:55,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:56,953 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:57,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:58,956 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:04:59,957 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:05:00,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:05:01,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:05:02,960 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:05:03,961 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:05:04,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:05:05,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:05:06,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:05:07,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:05:12,026 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:05:13,027 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:05:14,028 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:05:15,029 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:05:16,030 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:05:17,031 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:05:18,032 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:05:19,033 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:05:20,034 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:05:21,035 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:05:22,036 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:05:23,037 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:05:24,040 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:05:25,041 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:05:26,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:05:26,043 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService java.net.NoRouteToHostException: No Route to Host from dmidlkprdls04.svr.luc.edu/192.168.158.4 to dmidlkprdls01.svr.luc.edu:8022 failed on socket timeout exception: java.net.NoRouteToHostException: No route to host; For more details see: http://wiki.apache.org/hadoop/NoRouteToHost at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:892) at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:839) at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1620) at org.apache.hadoop.ipc.Client.call(Client.java:1562) at org.apache.hadoop.ipc.Client.call(Client.java:1459) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118) at com.sun.proxy.$Proxy24.sendHeartbeat(Unknown Source) at 
org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.sendHeartbeat(DatanodeProtocolClientSideTranslatorPB.java:166) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.sendHeartBeat(BPServiceActor.java:553) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:694) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:894) at java.lang.Thread.run(Thread.java:750) Caused by: java.net.NoRouteToHostException: No route to host at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716) at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:205) at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:586) at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:730) at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:843) at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:430) at org.apache.hadoop.ipc.Client.getConnection(Client.java:1681) at org.apache.hadoop.ipc.Client.call(Client.java:1506) ... 9 more 2025-07-16 18:05:28,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:05:29,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:05:30,048 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:05:38,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:05:39,139 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:05:40,140 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:05:41,141 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:05:42,142 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:05:43,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:05:44,144 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:05:45,145 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:05:46,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:05:47,147 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:05:48,149 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:05:49,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:05:50,151 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:05:52,154 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:05:53,155 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:05:54,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:05:55,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:05:56,158 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:05:57,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:05:58,160 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:05:59,161 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:06:00,162 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:06:01,163 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:06:02,164 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:06:03,165 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:06:04,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:06:05,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:06:06,169 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:06:26,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); maxRetries=45
2025-07-16 18:06:46,211 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); maxRetries=45
2025-07-16 18:07:02,810 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:03,811 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:04,812 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:05,813 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:06,815 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:07,816 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:08,817 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:09,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:10,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:11,820 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:12,822 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:13,823 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:14,824 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:15,825 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:16,826 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:17,827 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:18,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:19,830 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:20,831 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:20,833 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService
java.net.NoRouteToHostException: No Route to Host from dmidlkprdls04.svr.luc.edu/192.168.158.4 to dmidlkprdls01.svr.luc.edu:8022 failed on socket timeout exception: java.net.NoRouteToHostException: No route to host; For more details see: http://wiki.apache.org/hadoop/NoRouteToHost
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:892)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:839)
    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1620)
    at org.apache.hadoop.ipc.Client.call(Client.java:1562)
    at org.apache.hadoop.ipc.Client.call(Client.java:1459)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)
    at com.sun.proxy.$Proxy24.sendHeartbeat(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.sendHeartbeat(DatanodeProtocolClientSideTranslatorPB.java:166)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.sendHeartBeat(BPServiceActor.java:553)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:694)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:894)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.net.NoRouteToHostException: No route to host
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:205)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:586)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:730)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:843)
    at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:430)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1681)
    at org.apache.hadoop.ipc.Client.call(Client.java:1506)
    ... 9 more
2025-07-16 18:07:22,835 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:23,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:24,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:25,838 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:26,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:27,840 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:28,841 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:29,842 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:30,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:31,844 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:32,845 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:33,846 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:34,847 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:07:54,868 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); maxRetries=45
2025-07-16 18:08:14,888 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); maxRetries=45
2025-07-16 18:08:31,386 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:08:32,387 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:08:33,389 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:08:34,390 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:08:35,391 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:08:36,392 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:08:37,393 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:08:38,394 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:08:39,396 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:08:40,397 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:08:41,398 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:08:42,399 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:08:43,400 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:08:44,401 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:08:45,402 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:08:46,403 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:08:47,404 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:08:48,406 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:08:49,407 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:08:50,408 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:08:51,409 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:08:52,410 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:08:53,411 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:08:54,412 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:08:55,413 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:11,834 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:12,835 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:13,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:14,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:15,838 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:16,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:17,841 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:18,842 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:19,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:20,844 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:21,845 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:22,846 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:22,848 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService
java.net.NoRouteToHostException: No Route to Host from dmidlkprdls04.svr.luc.edu/192.168.158.4 to dmidlkprdls01.svr.luc.edu:8022 failed on socket timeout exception: java.net.NoRouteToHostException: No route to host; For more details see: http://wiki.apache.org/hadoop/NoRouteToHost
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:892)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:839)
    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1620)
    at org.apache.hadoop.ipc.Client.call(Client.java:1562)
    at org.apache.hadoop.ipc.Client.call(Client.java:1459)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)
    at com.sun.proxy.$Proxy24.sendHeartbeat(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.sendHeartbeat(DatanodeProtocolClientSideTranslatorPB.java:166)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.sendHeartBeat(BPServiceActor.java:553)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:694)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:894)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.net.NoRouteToHostException: No route to host
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:205)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:586)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:730)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:843)
    at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:430)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1681)
    at org.apache.hadoop.ipc.Client.call(Client.java:1506)
    ... 9 more
2025-07-16 18:09:24,850 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:25,851 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:26,853 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:27,854 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:28,855 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:29,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:30,857 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:31,858 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:32,859 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:33,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:34,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:35,863 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:36,864 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:37,865 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:38,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:39,867 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:40,868 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:41,869 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:42,870 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:43,871 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:44,872 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:45,879 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:46,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:47,882 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:48,883 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:49,890 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:50,891 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:51,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:52,893 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:53,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:54,895 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:55,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:56,897 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:57,898 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:58,899 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:09:59,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:10:00,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:01,903 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:02,904 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:03,905 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:04,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:05,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:06,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:07,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:08,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:09,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:10,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:11,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:12,915 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:13,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:13,918 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService java.net.NoRouteToHostException: No Route to Host from dmidlkprdls04.svr.luc.edu/192.168.158.4 to dmidlkprdls01.svr.luc.edu:8022 failed on socket timeout exception: java.net.NoRouteToHostException: No route to host; For more details see: http://wiki.apache.org/hadoop/NoRouteToHost at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:892) at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:839) at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1620) at org.apache.hadoop.ipc.Client.call(Client.java:1562) at org.apache.hadoop.ipc.Client.call(Client.java:1459) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118) at com.sun.proxy.$Proxy24.sendHeartbeat(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.sendHeartbeat(DatanodeProtocolClientSideTranslatorPB.java:166) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.sendHeartBeat(BPServiceActor.java:553) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:694) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:894) at java.lang.Thread.run(Thread.java:750) Caused by: java.net.NoRouteToHostException: No route to host at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at 
sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716) at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:205) at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:586) at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:730) at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:843) at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:430) at org.apache.hadoop.ipc.Client.getConnection(Client.java:1681) at org.apache.hadoop.ipc.Client.call(Client.java:1506) ... 9 more 2025-07-16 18:10:15,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:16,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:17,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:18,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:19,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:20,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:21,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:22,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:23,936 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:24,938 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:25,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:26,940 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:27,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:28,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:29,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:30,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:31,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:32,946 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:33,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:34,948 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:35,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:36,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:37,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:38,953 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:39,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:40,955 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:41,956 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:42,957 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:43,958 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:44,959 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:45,960 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:46,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:47,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:48,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:49,971 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:50,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:51,973 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:52,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:53,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:54,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:55,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:56,979 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:57,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:58,981 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:10:59,982 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:11:02,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:11:03,043 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:11:04,044 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:11:05,045 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:11:06,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:11:06,047 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService java.net.NoRouteToHostException: No Route to Host from dmidlkprdls04.svr.luc.edu/192.168.158.4 to dmidlkprdls01.svr.luc.edu:8022 failed on socket timeout exception: java.net.NoRouteToHostException: No route to host; For more details see: http://wiki.apache.org/hadoop/NoRouteToHost at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:892) at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:839) at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1620) at org.apache.hadoop.ipc.Client.call(Client.java:1562) at org.apache.hadoop.ipc.Client.call(Client.java:1459) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118) at com.sun.proxy.$Proxy24.sendHeartbeat(Unknown Source) at 
org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.sendHeartbeat(DatanodeProtocolClientSideTranslatorPB.java:166) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.sendHeartBeat(BPServiceActor.java:553) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:694) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:894) at java.lang.Thread.run(Thread.java:750) Caused by: java.net.NoRouteToHostException: No route to host at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716) at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:205) at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:586) at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:730) at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:843) at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:430) at org.apache.hadoop.ipc.Client.getConnection(Client.java:1681) at org.apache.hadoop.ipc.Client.call(Client.java:1506) ... 9 more 2025-07-16 18:11:08,049 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:11:09,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:11:10,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:11:11,053 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:11:31,063 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); maxRetries=45 2025-07-16 18:11:33,082 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:11:35,130 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:11:36,131 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:11:37,132 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:11:38,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:11:39,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:11:40,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:11:41,136 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:11:42,137 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:11:43,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:11:44,139 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:11:45,140 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:11:46,141 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:11:47,142 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:11:48,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:11:49,144 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:11:50,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:12:10,166 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); maxRetries=45
2025-07-16 18:12:11,167 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:12:12,168 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:12:13,169 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:12:14,170 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:12:15,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:12:16,172 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:12:17,173 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:12:18,174 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:12:19,176 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:12:20,177 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:12:21,178 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:12:22,179 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:12:23,180 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:12:24,182 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:12:25,183 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:12:26,184 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:12:27,185 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:12:28,186 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:12:29,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:12:30,188 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:12:31,190 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:12:32,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:12:33,192 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:12:34,193 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:12:35,194 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:12:36,195 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:12:56,216 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); maxRetries=45
2025-07-16 18:13:16,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); maxRetries=45
2025-07-16 18:13:32,443 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:13:33,444 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:13:34,445 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:13:34,446 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService
java.net.NoRouteToHostException: No Route to Host from dmidlkprdls04.svr.luc.edu/192.168.158.4 to dmidlkprdls01.svr.luc.edu:8022 failed on socket timeout exception: java.net.NoRouteToHostException: No route to host; For more details see: http://wiki.apache.org/hadoop/NoRouteToHost
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:892)
	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:839)
	at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1620)
	at org.apache.hadoop.ipc.Client.call(Client.java:1562)
	at org.apache.hadoop.ipc.Client.call(Client.java:1459)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)
	at com.sun.proxy.$Proxy24.sendHeartbeat(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.sendHeartbeat(DatanodeProtocolClientSideTranslatorPB.java:166)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.sendHeartBeat(BPServiceActor.java:553)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:694)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:894)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.net.NoRouteToHostException: No route to host
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:205)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:586)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:730)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:843)
	at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:430)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1681)
	at org.apache.hadoop.ipc.Client.call(Client.java:1506)
	... 9 more
2025-07-16 18:13:36,450 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:13:37,451 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:13:38,453 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:13:39,454 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:13:40,455 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:13:41,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:13:42,457 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:13:43,458 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:13:44,459 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:13:45,460 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:13:46,461 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:13:47,462 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:13:48,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:13:49,465 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:13:50,466 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:13:51,467 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:13:52,468 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:13:56,574 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:13:57,575 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:13:58,577 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:13:59,578 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:14:00,579 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:14:16,986 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:14:17,987 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:14:18,988 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:14:19,990 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:14:20,991 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:14:21,992 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:14:22,993 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:14:23,995 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:14:24,996 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:14:25,997 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:14:26,998 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:14:27,999 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:14:29,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:14:30,001 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:14:31,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:14:32,003 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:14:52,024 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); maxRetries=45
2025-07-16 18:15:08,186 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:15:09,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:15:10,188 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:15:11,189 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:15:12,190 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:15:13,191 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:15:15,226 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:15:16,227 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:15:17,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:15:18,229 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:15:19,230 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:15:20,231 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:15:20,233 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService
java.net.NoRouteToHostException: No Route to Host from dmidlkprdls04.svr.luc.edu/192.168.158.4 to dmidlkprdls01.svr.luc.edu:8022 failed on socket timeout exception: java.net.NoRouteToHostException: No route to host; For more details see: http://wiki.apache.org/hadoop/NoRouteToHost
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:892)
	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:839)
	at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1620)
	at org.apache.hadoop.ipc.Client.call(Client.java:1562)
	at org.apache.hadoop.ipc.Client.call(Client.java:1459)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)
	at com.sun.proxy.$Proxy24.sendHeartbeat(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.sendHeartbeat(DatanodeProtocolClientSideTranslatorPB.java:166)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.sendHeartBeat(BPServiceActor.java:553)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:694)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:894)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.net.NoRouteToHostException: No route to host
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:205)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:586)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:730)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:843)
	at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:430)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1681)
	at org.apache.hadoop.ipc.Client.call(Client.java:1506)
	... 9 more
2025-07-16 18:15:22,235 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:15:23,236 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:15:24,237 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:15:28,346 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:15:29,347 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:15:30,348 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:15:31,349 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:15:32,350 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:15:33,351 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:15:34,352 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:15:35,353 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:15:36,354 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:15:37,355 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:15:57,362 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); maxRetries=45
2025-07-16 18:16:17,383 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); maxRetries=45
2025-07-16 18:16:33,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:16:34,693 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:16:35,694 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:16:36,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:16:37,696 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:16:38,697 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:16:39,698 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:16:40,699 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:16:41,700 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:16:42,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:16:43,702 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:16:44,703 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:16:45,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:16:46,705 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:16:47,706 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:16:48,707 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:16:49,708 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:16:50,709 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:16:51,710 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:16:52,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:16:53,712 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:16:54,713 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:16:55,714 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:16:56,715 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:16:57,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:17:05,818 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:17:06,819 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:17:07,820 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:17:08,821 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:17:09,822 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:17:10,823 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:17:11,824 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:17:12,825 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:17:13,826 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:17:14,827 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:17:15,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:17:16,829 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:16,831 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService java.net.NoRouteToHostException: No Route to Host from dmidlkprdls04.svr.luc.edu/192.168.158.4 to dmidlkprdls01.svr.luc.edu:8022 failed on socket timeout exception: java.net.NoRouteToHostException: No route to host; For more details see: http://wiki.apache.org/hadoop/NoRouteToHost at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:892) at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:839) at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1620) at org.apache.hadoop.ipc.Client.call(Client.java:1562) at org.apache.hadoop.ipc.Client.call(Client.java:1459) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118) at com.sun.proxy.$Proxy24.sendHeartbeat(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.sendHeartbeat(DatanodeProtocolClientSideTranslatorPB.java:166) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.sendHeartBeat(BPServiceActor.java:553) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:694) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:894) at java.lang.Thread.run(Thread.java:750) Caused by: java.net.NoRouteToHostException: No route to host at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at 
sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716) at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:205) at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:586) at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:730) at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:843) at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:430) at org.apache.hadoop.ipc.Client.getConnection(Client.java:1681) at org.apache.hadoop.ipc.Client.call(Client.java:1506) ... 9 more 2025-07-16 18:17:18,834 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:19,835 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:20,836 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:21,837 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:22,838 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:23,839 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:24,840 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:25,841 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:26,842 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:27,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:28,845 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:29,846 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:30,847 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:31,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:32,849 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:33,850 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:34,851 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:35,852 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:36,853 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:37,854 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:38,855 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:39,856 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:40,858 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:41,859 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:42,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:43,861 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:44,862 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:45,863 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:46,864 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:47,865 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:48,866 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:49,867 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:50,868 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:51,869 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:52,870 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:53,871 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:54,872 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:55,873 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:56,875 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:57,876 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:58,877 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:17:59,878 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:00,879 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:01,880 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:02,881 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:03,882 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:04,883 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:05,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:06,885 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:07,886 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:07,888 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService java.net.NoRouteToHostException: No Route to Host from dmidlkprdls04.svr.luc.edu/192.168.158.4 to dmidlkprdls01.svr.luc.edu:8022 failed on socket timeout exception: java.net.NoRouteToHostException: No route to host; For more details see: http://wiki.apache.org/hadoop/NoRouteToHost at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:892) at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:839) at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1620) at org.apache.hadoop.ipc.Client.call(Client.java:1562) at org.apache.hadoop.ipc.Client.call(Client.java:1459) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118) at com.sun.proxy.$Proxy24.sendHeartbeat(Unknown Source) at 
org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.sendHeartbeat(DatanodeProtocolClientSideTranslatorPB.java:166) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.sendHeartBeat(BPServiceActor.java:553) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:694) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:894) at java.lang.Thread.run(Thread.java:750) Caused by: java.net.NoRouteToHostException: No route to host at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716) at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:205) at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:586) at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:730) at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:843) at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:430) at org.apache.hadoop.ipc.Client.getConnection(Client.java:1681) at org.apache.hadoop.ipc.Client.call(Client.java:1506) ... 9 more 2025-07-16 18:18:09,890 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:10,891 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:11,892 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:12,893 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:13,894 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:14,896 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:15,897 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:16,898 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:17,899 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:18,900 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:19,901 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:20,902 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:21,903 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:22,904 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:23,905 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:24,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:25,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:26,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:27,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:28,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:29,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:31,962 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:32,963 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:33,964 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:34,965 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:35,966 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:36,967 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:37,968 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:38,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:39,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:40,971 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:41,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:42,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:43,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:44,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:45,977 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:46,978 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:47,979 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:48,980 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:49,981 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:50,982 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:51,983 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:52,984 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:55,002 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:56,003 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:57,004 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:58,005 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:18:59,006 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:00,007 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:02,042 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:02,044 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService java.net.NoRouteToHostException: No Route to Host from dmidlkprdls04.svr.luc.edu/192.168.158.4 to dmidlkprdls01.svr.luc.edu:8022 failed on socket timeout exception: java.net.NoRouteToHostException: No route to host; For more details see: http://wiki.apache.org/hadoop/NoRouteToHost at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:892) at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:839) at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1620) at org.apache.hadoop.ipc.Client.call(Client.java:1562) at org.apache.hadoop.ipc.Client.call(Client.java:1459) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118) at com.sun.proxy.$Proxy24.sendHeartbeat(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.sendHeartbeat(DatanodeProtocolClientSideTranslatorPB.java:166) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.sendHeartBeat(BPServiceActor.java:553) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:694) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:894) at java.lang.Thread.run(Thread.java:750) Caused by: java.net.NoRouteToHostException: No route to host at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at 
sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716) at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:205) at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:586) at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:730) at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:843) at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:430) at org.apache.hadoop.ipc.Client.getConnection(Client.java:1681) at org.apache.hadoop.ipc.Client.call(Client.java:1506) ... 9 more 2025-07-16 18:19:04,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:05,047 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:06,049 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:07,050 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:08,051 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:09,052 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:10,053 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:11,054 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:12,055 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:13,056 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:14,057 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:15,058 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:16,059 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:17,061 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:18,062 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:19,063 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:20,064 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:21,065 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:22,066 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:23,067 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:24,068 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:25,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:26,070 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:27,072 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:28,073 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:29,074 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:30,075 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:31,076 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:32,077 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:33,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:34,080 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:35,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:36,082 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:37,083 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:38,084 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:39,085 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:40,086 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:41,087 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:42,088 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:43,089 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:44,091 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:45,092 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:46,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:47,094 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:48,095 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:49,096 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:50,097 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:51,098 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:52,099 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:53,100 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:53,102 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService java.net.NoRouteToHostException: No Route to Host from dmidlkprdls04.svr.luc.edu/192.168.158.4 to dmidlkprdls01.svr.luc.edu:8022 failed on socket timeout exception: java.net.NoRouteToHostException: No route to host; For more details see: http://wiki.apache.org/hadoop/NoRouteToHost at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:892) at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:839) at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1620) at org.apache.hadoop.ipc.Client.call(Client.java:1562) at org.apache.hadoop.ipc.Client.call(Client.java:1459) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118) at com.sun.proxy.$Proxy24.sendHeartbeat(Unknown Source) at 
org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.sendHeartbeat(DatanodeProtocolClientSideTranslatorPB.java:166) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.sendHeartBeat(BPServiceActor.java:553) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:694) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:894) at java.lang.Thread.run(Thread.java:750) Caused by: java.net.NoRouteToHostException: No route to host at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716) at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:205) at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:586) at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:730) at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:843) at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:430) at org.apache.hadoop.ipc.Client.getConnection(Client.java:1681) at org.apache.hadoop.ipc.Client.call(Client.java:1506) ... 9 more 2025-07-16 18:19:55,104 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:56,105 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:57,106 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:58,107 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:19:59,108 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:00,109 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:01,111 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:02,112 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:04,122 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:05,123 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:06,124 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:07,125 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:08,126 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:09,127 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:10,128 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:11,130 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:12,131 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:13,132 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:14,133 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:15,134 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:16,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:17,136 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:18,137 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:19,138 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:20,140 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:21,141 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:22,142 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:23,143 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:24,144 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:25,145 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:26,146 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:27,147 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:28,148 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:29,149 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:30,150 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:31,151 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:32,153 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:33,154 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:34,155 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:35,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:36,157 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:37,158 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:38,159 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:39,160 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-16 18:20:43,226 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:20:44,227 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:20:45,228 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:20:46,229 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-16 18:21:06,250 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); maxRetries=45
2025-07-16 18:21:06,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeCommand action : DNA_REGISTER from dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 with active state
2025-07-16 18:21:06,262 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: RemoteException in offerService
org.apache.hadoop.ipc.RemoteException(java.io.IOException): processCacheReport from dead or unregistered datanode: null
    at org.apache.hadoop.hdfs.server.namenode.CacheManager.processCacheReport(CacheManager.java:1021)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.cacheReport(NameNodeRpcServer.java:1615)
    at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.cacheReport(DatanodeProtocolServerSideTranslatorPB.java:201)
    at org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:31542)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:533)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1070)
    at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:994)
    at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:922)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1910)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2899)
    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1616)
    at org.apache.hadoop.ipc.Client.call(Client.java:1562)
    at org.apache.hadoop.ipc.Client.call(Client.java:1459)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)
    at com.sun.proxy.$Proxy24.cacheReport(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.cacheReport(DatanodeProtocolClientSideTranslatorPB.java:236)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.cacheReport(BPServiceActor.java:499)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:740)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:894)
    at java.lang.Thread.run(Thread.java:750)
2025-07-16 18:21:06,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool BP-1059995147-192.168.158.1-1752101929360 (Datanode Uuid be50c32a-aa23-4b9d-aa7f-05816b6e5f1a) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 beginning handshake with NN
2025-07-16 18:21:06,265 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool Block pool BP-1059995147-192.168.158.1-1752101929360 (Datanode Uuid be50c32a-aa23-4b9d-aa7f-05816b6e5f1a) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 successfully registered with NN
2025-07-16 18:21:07,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f42, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 1 msec to generate and 5 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-16 18:21:07,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-16 18:21:38,996 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751853_11029 src: /192.168.158.6:47718 dest: /192.168.158.4:9866
2025-07-16 18:21:39,043 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47718, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_596032903_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751853_11029, duration(ns): 44720405
2025-07-16 18:21:39,043 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751853_11029, type=LAST_IN_PIPELINE terminating
2025-07-16 18:27:43,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751859_11035 src: /192.168.158.6:55492 dest: /192.168.158.4:9866
2025-07-16 18:27:43,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55492, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_296843191_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751859_11035, duration(ns): 19564324
2025-07-16 18:27:43,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751859_11035, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 18:28:43,781 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751860_11036 src: /192.168.158.1:40680 dest: /192.168.158.4:9866
2025-07-16 18:28:43,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40680, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1245685498_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751860_11036, duration(ns): 24339517
2025-07-16 18:28:43,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751860_11036, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-16 18:32:43,780 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751864_11040 src: /192.168.158.1:55008 dest: /192.168.158.4:9866
2025-07-16 18:32:43,814 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55008, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-431277459_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751864_11040, duration(ns): 25357015
2025-07-16 18:32:43,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751864_11040, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-16 18:35:43,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751867_11043 src: /192.168.158.6:57100 dest: /192.168.158.4:9866
2025-07-16 18:35:43,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57100, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1636545561_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751867_11043, duration(ns): 19149488
2025-07-16 18:35:43,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751867_11043, type=LAST_IN_PIPELINE terminating
2025-07-16 18:36:48,810 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751868_11044 src: /192.168.158.1:34334 dest: /192.168.158.4:9866
2025-07-16 18:36:48,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34334, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-836143979_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751868_11044, duration(ns): 24952325
2025-07-16 18:36:48,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751868_11044, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-16 18:38:53,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751870_11046 src: /192.168.158.5:56580 dest: /192.168.158.4:9866
2025-07-16 18:38:53,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56580, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1986721659_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751870_11046, duration(ns): 16560123
2025-07-16 18:38:53,834 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751870_11046, type=LAST_IN_PIPELINE terminating
2025-07-16 18:39:53,817 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751871_11047 src: /192.168.158.9:50534 dest: /192.168.158.4:9866
2025-07-16 18:39:53,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50534, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2027673818_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751871_11047, duration(ns): 19536711
2025-07-16 18:39:53,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751871_11047, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-16 18:42:53,810 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751874_11050 src: /192.168.158.7:48248 dest: /192.168.158.4:9866
2025-07-16 18:42:53,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48248, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1961891540_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751874_11050, duration(ns): 20220527
2025-07-16 18:42:53,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751874_11050, type=LAST_IN_PIPELINE terminating
2025-07-16 18:45:58,834 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751877_11053 src: /192.168.158.6:48782 dest: /192.168.158.4:9866
2025-07-16 18:45:58,852 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48782, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1098759952_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751877_11053, duration(ns): 15928076
2025-07-16 18:45:58,852 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751877_11053, type=LAST_IN_PIPELINE terminating
2025-07-16 18:51:13,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751882_11058 src: /192.168.158.5:44258 dest: /192.168.158.4:9866
2025-07-16 18:51:13,852 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44258, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_862380307_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751882_11058, duration(ns): 16683427
2025-07-16 18:51:13,852 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751882_11058, type=LAST_IN_PIPELINE terminating
2025-07-16 18:53:13,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751884_11060 src: /192.168.158.6:38568 dest: /192.168.158.4:9866
2025-07-16 18:53:13,878 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38568, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1010228154_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751884_11060, duration(ns): 22561941
2025-07-16 18:53:13,880 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751884_11060, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 18:55:23,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751886_11062 src: /192.168.158.5:45614 dest: /192.168.158.4:9866
2025-07-16 18:55:23,859 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45614, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-235903081_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751886_11062, duration(ns): 23477922
2025-07-16 18:55:23,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751886_11062, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 18:57:28,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751888_11064 src: /192.168.158.1:47834 dest: /192.168.158.4:9866
2025-07-16 18:57:28,859 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47834, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2003806265_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751888_11064, duration(ns): 25035239
2025-07-16 18:57:28,859 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751888_11064, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-16 19:00:28,817 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751891_11067 src: /192.168.158.6:49696 dest: /192.168.158.4:9866
2025-07-16 19:00:28,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49696, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-534494169_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751891_11067, duration(ns): 19972618
2025-07-16 19:00:28,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751891_11067, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 19:02:28,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751893_11069 src: /192.168.158.8:43702 dest: /192.168.158.4:9866
2025-07-16 19:02:28,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43702, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2028112323_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751893_11069, duration(ns): 13469011
2025-07-16 19:02:28,846 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751893_11069, type=LAST_IN_PIPELINE terminating
2025-07-16 19:03:28,837 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751894_11070 src: /192.168.158.8:44950 dest: /192.168.158.4:9866
2025-07-16 19:03:28,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44950, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1083644616_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751894_11070, duration(ns): 19059900
2025-07-16 19:03:28,862 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751894_11070, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-16 19:04:28,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751895_11071 src: /192.168.158.6:38692 dest: /192.168.158.4:9866
2025-07-16 19:04:28,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38692, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_902635387_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751895_11071, duration(ns): 15913848
2025-07-16 19:04:28,846 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751895_11071, type=LAST_IN_PIPELINE terminating
2025-07-16 19:06:28,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751897_11073 src: /192.168.158.5:56544 dest: /192.168.158.4:9866
2025-07-16 19:06:28,858 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56544, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1360893026_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751897_11073, duration(ns): 19887161
2025-07-16 19:06:28,858 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751897_11073, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 19:07:28,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751898_11074 src: /192.168.158.1:41414 dest: /192.168.158.4:9866
2025-07-16 19:07:28,862 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41414, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1983565472_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751898_11074, duration(ns): 23041254
2025-07-16 19:07:28,862 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751898_11074, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-16 19:08:28,841 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751899_11075 src: /192.168.158.5:47762 dest: /192.168.158.4:9866
2025-07-16 19:08:28,859 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47762, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1839268682_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751899_11075, duration(ns): 15359686
2025-07-16 19:08:28,859 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751899_11075, type=LAST_IN_PIPELINE terminating
2025-07-16 19:09:28,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751900_11076 src: /192.168.158.9:42138 dest: /192.168.158.4:9866
2025-07-16 19:09:28,865 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42138, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-615054250_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751900_11076, duration(ns): 20217227
2025-07-16 19:09:28,865 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751900_11076, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-16 19:10:28,852 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751901_11077 src: /192.168.158.8:43082 dest: /192.168.158.4:9866
2025-07-16 19:10:28,879 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43082, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1339358851_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751901_11077, duration(ns): 20768045
2025-07-16 19:10:28,880 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751901_11077, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 19:12:33,858 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751903_11079 src: /192.168.158.7:42430 dest: /192.168.158.4:9866
2025-07-16 19:12:33,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42430, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-378476569_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751903_11079, duration(ns): 16339487
2025-07-16 19:12:33,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751903_11079, type=LAST_IN_PIPELINE terminating
2025-07-16 19:13:33,859 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751904_11080 src: /192.168.158.1:39760 dest: /192.168.158.4:9866
2025-07-16 19:13:33,891 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39760, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1246386039_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751904_11080, duration(ns): 22888589
2025-07-16 19:13:33,891 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751904_11080, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-16 19:16:58,424 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751874_11050 replica FinalizedReplica, blk_1073751874_11050, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751874 for deletion
2025-07-16 19:16:58,425 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751877_11053 replica FinalizedReplica, blk_1073751877_11053, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751877 for deletion
2025-07-16 19:16:58,426 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751874_11050 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751874
2025-07-16 19:16:58,426 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751882_11058 replica FinalizedReplica, blk_1073751882_11058, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751882 for deletion
2025-07-16 19:16:58,426 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751877_11053 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751877
2025-07-16 19:16:58,427 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751884_11060 replica FinalizedReplica, blk_1073751884_11060, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751884 for deletion
2025-07-16 19:16:58,427 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751882_11058 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751882
2025-07-16 19:16:58,428 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751886_11062 replica FinalizedReplica, blk_1073751886_11062, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751886 for deletion
2025-07-16 19:16:58,428 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751884_11060 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751884
2025-07-16 19:16:58,428 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751888_11064 replica FinalizedReplica, blk_1073751888_11064, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751888 for deletion
2025-07-16 19:16:58,429 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751886_11062 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751886
2025-07-16 19:16:58,429 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751891_11067 replica FinalizedReplica, blk_1073751891_11067, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751891 for deletion
2025-07-16 19:16:58,429 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751888_11064 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751888
2025-07-16 19:16:58,430 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751891_11067 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751891
2025-07-16 19:16:58,429 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751893_11069 replica FinalizedReplica, blk_1073751893_11069, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751893 for deletion
2025-07-16 19:16:58,430 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751894_11070 replica FinalizedReplica, blk_1073751894_11070, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751894 for deletion
2025-07-16 19:16:58,430 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751895_11071 replica FinalizedReplica, blk_1073751895_11071, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751895 for deletion
2025-07-16 19:16:58,430 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751894_11070 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751894
2025-07-16 19:16:58,431 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751895_11071 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751895
2025-07-16 19:16:58,430 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751893_11069 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751893
2025-07-16 19:16:58,431 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751897_11073 replica FinalizedReplica, blk_1073751897_11073, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751897 for deletion
2025-07-16 19:16:58,432 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751898_11074 replica FinalizedReplica, blk_1073751898_11074, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751898 for deletion
2025-07-16 19:16:58,432 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751897_11073 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751897
2025-07-16 19:16:58,432 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751898_11074 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751898
2025-07-16 19:16:58,432 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751899_11075 replica FinalizedReplica, blk_1073751899_11075, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751899 for deletion
2025-07-16 19:16:58,432 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751900_11076 replica FinalizedReplica, blk_1073751900_11076, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751900 for deletion
2025-07-16 19:16:58,432 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751899_11075 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751899
2025-07-16 19:16:58,433 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751900_11076 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751900
2025-07-16 19:16:58,433 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751901_11077 replica FinalizedReplica, blk_1073751901_11077, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751901 for deletion
2025-07-16 19:16:58,433 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751903_11079 replica FinalizedReplica, blk_1073751903_11079, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751903 for deletion
2025-07-16 19:16:58,433 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751901_11077 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751901
2025-07-16 19:16:58,434 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751903_11079 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751903
2025-07-16 19:16:58,434 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751904_11080 replica FinalizedReplica, blk_1073751904_11080, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751904 for deletion
2025-07-16 19:16:58,434 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751853_11029 replica FinalizedReplica, blk_1073751853_11029, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751853 for deletion
2025-07-16 19:16:58,434 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751904_11080 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751904
2025-07-16 19:16:58,434 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751859_11035 replica FinalizedReplica, blk_1073751859_11035, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751859 for deletion
2025-07-16 19:16:58,435 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751853_11029 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751853
2025-07-16 19:16:58,435 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751860_11036 replica FinalizedReplica, blk_1073751860_11036, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751860 for deletion
2025-07-16 19:16:58,435 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751864_11040 replica FinalizedReplica, blk_1073751864_11040, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751864 for deletion
2025-07-16 19:16:58,435 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751859_11035 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751859
2025-07-16 19:16:58,435 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751860_11036 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751860
2025-07-16 19:16:58,436 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751867_11043 replica FinalizedReplica, blk_1073751867_11043, FINALIZED getNumBytes() = 56
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751867 for deletion 2025-07-16 19:16:58,436 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751864_11040 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751864 2025-07-16 19:16:58,437 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751868_11044 replica FinalizedReplica, blk_1073751868_11044, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751868 for deletion 2025-07-16 19:16:58,436 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751867_11043 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751867 2025-07-16 19:16:58,437 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751868_11044 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751868 2025-07-16 19:16:58,437 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751870_11046 replica FinalizedReplica, blk_1073751870_11046, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751870 for deletion 2025-07-16 19:16:58,437 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751871_11047 replica FinalizedReplica, blk_1073751871_11047, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751871 for deletion 2025-07-16 19:16:58,437 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751870_11046 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751870 2025-07-16 19:16:58,438 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751871_11047 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751871 2025-07-16 19:17:33,856 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751908_11084 src: /192.168.158.9:48416 dest: /192.168.158.4:9866 2025-07-16 19:17:33,879 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48416, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1237651130_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751908_11084, duration(ns): 17103598 2025-07-16 19:17:33,880 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751908_11084, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 
2025-07-16 19:17:37,426 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751908_11084 replica FinalizedReplica, blk_1073751908_11084, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751908 for deletion 2025-07-16 19:17:37,426 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751908_11084 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751908 2025-07-16 19:18:33,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751909_11085 src: /192.168.158.6:34036 dest: /192.168.158.4:9866 2025-07-16 19:18:33,872 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34036, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1135956657_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751909_11085, duration(ns): 15989054 2025-07-16 19:18:33,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751909_11085, type=LAST_IN_PIPELINE terminating 2025-07-16 19:18:34,433 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751909_11085 replica FinalizedReplica, blk_1073751909_11085, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751909 for deletion 2025-07-16 19:18:34,434 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751909_11085 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751909 2025-07-16 19:22:38,875 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751913_11089 src: /192.168.158.7:41238 dest: /192.168.158.4:9866 2025-07-16 19:22:38,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41238, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_571698704_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751913_11089, duration(ns): 14500094 2025-07-16 19:22:38,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751913_11089, type=LAST_IN_PIPELINE terminating 2025-07-16 19:22:40,463 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751913_11089 replica FinalizedReplica, blk_1073751913_11089, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751913 for deletion 2025-07-16 19:22:40,464 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751913_11089 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751913 2025-07-16 19:24:43,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751915_11091 src: /192.168.158.1:33092 dest: /192.168.158.4:9866 
2025-07-16 19:24:43,913 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33092, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1685324550_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751915_11091, duration(ns): 21785630 2025-07-16 19:24:43,913 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751915_11091, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-16 19:24:46,468 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751915_11091 replica FinalizedReplica, blk_1073751915_11091, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751915 for deletion 2025-07-16 19:24:46,469 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751915_11091 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751915 2025-07-16 19:25:43,897 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751916_11092 src: /192.168.158.1:48882 dest: /192.168.158.4:9866 2025-07-16 19:25:43,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48882, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-370081650_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751916_11092, duration(ns): 20916880 2025-07-16 19:25:43,928 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751916_11092, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-16 19:25:46,475 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751916_11092 replica FinalizedReplica, blk_1073751916_11092, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751916 for deletion 2025-07-16 19:25:46,476 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751916_11092 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751916 2025-07-16 19:26:48,889 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751917_11093 src: /192.168.158.8:54528 dest: /192.168.158.4:9866 2025-07-16 19:26:48,908 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54528, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1132140390_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751917_11093, duration(ns): 16743849 2025-07-16 19:26:48,908 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751917_11093, type=LAST_IN_PIPELINE terminating 2025-07-16 19:26:49,477 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751917_11093 replica FinalizedReplica, blk_1073751917_11093, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751917 for deletion 2025-07-16 19:26:49,478 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751917_11093 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751917 2025-07-16 19:27:48,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751918_11094 src: /192.168.158.7:57372 dest: /192.168.158.4:9866 2025-07-16 19:27:48,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57372, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-529131465_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751918_11094, duration(ns): 19703264 2025-07-16 19:27:48,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751918_11094, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-16 19:27:49,478 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751918_11094 replica FinalizedReplica, blk_1073751918_11094, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751918 for deletion 2025-07-16 19:27:49,480 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751918_11094 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751918 2025-07-16 19:29:48,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751920_11096 src: /192.168.158.9:46164 dest: /192.168.158.4:9866 2025-07-16 19:29:48,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46164, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1128045767_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751920_11096, duration(ns): 15691765 2025-07-16 19:29:48,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751920_11096, type=LAST_IN_PIPELINE terminating 2025-07-16 19:29:49,484 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751920_11096 replica FinalizedReplica, blk_1073751920_11096, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751920 for deletion 2025-07-16 19:29:49,485 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751920_11096 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751920 2025-07-16 19:30:48,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751921_11097 src: /192.168.158.1:53062 dest: /192.168.158.4:9866 2025-07-16 19:30:48,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53062, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2286085_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751921_11097, duration(ns): 25968257 2025-07-16 19:30:48,931 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751921_11097, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-16 19:30:52,486 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751921_11097 replica FinalizedReplica, blk_1073751921_11097, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751921 for deletion 2025-07-16 19:30:52,488 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751921_11097 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751921 2025-07-16 19:31:48,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751922_11098 src: /192.168.158.1:42596 dest: /192.168.158.4:9866 2025-07-16 19:31:48,924 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42596, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1361900598_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751922_11098, duration(ns): 23482775 2025-07-16 19:31:48,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751922_11098, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-16 19:31:52,488 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751922_11098 replica FinalizedReplica, blk_1073751922_11098, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751922 for deletion 2025-07-16 19:31:52,490 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751922_11098 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751922 2025-07-16 19:35:53,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751926_11102 src: /192.168.158.1:55614 dest: /192.168.158.4:9866 2025-07-16 19:35:53,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55614, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1846218843_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751926_11102, duration(ns): 26124371 2025-07-16 19:35:53,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751926_11102, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-16 19:35:55,500 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751926_11102 replica FinalizedReplica, blk_1073751926_11102, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751926 for deletion 2025-07-16 19:35:55,501 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751926_11102 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751926 2025-07-16 19:37:58,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751928_11104 src: /192.168.158.1:57906 dest: /192.168.158.4:9866 2025-07-16 19:37:58,931 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57906, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-66646625_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751928_11104, duration(ns): 22056280 2025-07-16 19:37:58,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751928_11104, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-16 19:38:04,508 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751928_11104 replica FinalizedReplica, blk_1073751928_11104, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751928 for deletion 2025-07-16 19:38:04,509 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751928_11104 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751928 2025-07-16 19:38:58,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751929_11105 src: /192.168.158.5:40888 dest: /192.168.158.4:9866 2025-07-16 19:38:58,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40888, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-111647776_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751929_11105, duration(ns): 19275968 2025-07-16 19:38:58,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751929_11105, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-16 19:39:01,512 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751929_11105 replica FinalizedReplica, blk_1073751929_11105, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751929 for deletion 2025-07-16 19:39:01,513 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751929_11105 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751929 2025-07-16 19:39:58,907 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751930_11106 src: /192.168.158.1:34158 dest: /192.168.158.4:9866 2025-07-16 19:39:58,942 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34158, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-208175134_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751930_11106, duration(ns): 25196795 2025-07-16 19:39:58,942 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751930_11106, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-16 19:40:04,513 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751930_11106 replica FinalizedReplica, blk_1073751930_11106, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751930 for deletion 2025-07-16 19:40:04,514 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751930_11106 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751930 2025-07-16 19:41:58,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751932_11108 src: /192.168.158.7:55814 dest: /192.168.158.4:9866 2025-07-16 19:41:58,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55814, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864569616_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751932_11108, duration(ns): 22701828 2025-07-16 19:41:58,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751932_11108, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-16 19:42:04,523 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751932_11108 replica FinalizedReplica, blk_1073751932_11108, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751932 for deletion 2025-07-16 19:42:04,524 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751932_11108 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751932 2025-07-16 19:42:58,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751933_11109 src: /192.168.158.6:34892 dest: /192.168.158.4:9866 2025-07-16 19:42:58,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34892, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1563114802_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751933_11109, duration(ns): 16950128 2025-07-16 19:42:58,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751933_11109, type=LAST_IN_PIPELINE terminating 2025-07-16 19:43:01,524 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751933_11109 replica FinalizedReplica, blk_1073751933_11109, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751933 for deletion 2025-07-16 19:43:01,525 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751933_11109 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751933 2025-07-16 19:44:58,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751935_11111 src: /192.168.158.7:38556 dest: /192.168.158.4:9866 2025-07-16 19:44:58,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38556, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1815901346_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751935_11111, duration(ns): 50254507 2025-07-16 19:44:58,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751935_11111, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-16 19:45:01,526 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751935_11111 replica FinalizedReplica, blk_1073751935_11111, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751935 for deletion 2025-07-16 19:45:01,528 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751935_11111 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751935 
2025-07-16 19:48:58,949 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751939_11115 src: /192.168.158.1:47626 dest: /192.168.158.4:9866 2025-07-16 19:48:58,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47626, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-706335243_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751939_11115, duration(ns): 24459381 2025-07-16 19:48:58,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751939_11115, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-16 19:49:04,531 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751939_11115 replica FinalizedReplica, blk_1073751939_11115, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751939 for deletion 2025-07-16 19:49:04,532 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751939_11115 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751939 2025-07-16 19:50:58,931 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751941_11117 src: /192.168.158.8:53016 dest: /192.168.158.4:9866 2025-07-16 19:50:58,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53016, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1177437759_236, 
offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751941_11117, duration(ns): 18011928 2025-07-16 19:50:58,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751941_11117, type=LAST_IN_PIPELINE terminating 2025-07-16 19:51:01,536 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751941_11117 replica FinalizedReplica, blk_1073751941_11117, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751941 for deletion 2025-07-16 19:51:01,537 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751941_11117 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751941 2025-07-16 19:51:58,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751942_11118 src: /192.168.158.7:33482 dest: /192.168.158.4:9866 2025-07-16 19:51:58,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33482, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1457695596_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751942_11118, duration(ns): 19187020 2025-07-16 19:51:58,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751942_11118, type=LAST_IN_PIPELINE terminating 2025-07-16 19:52:04,536 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling 
blk_1073751942_11118 replica FinalizedReplica, blk_1073751942_11118, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751942 for deletion 2025-07-16 19:52:04,537 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751942_11118 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751942 2025-07-16 19:52:58,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751943_11119 src: /192.168.158.5:53488 dest: /192.168.158.4:9866 2025-07-16 19:52:58,996 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53488, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2027989165_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751943_11119, duration(ns): 18800865 2025-07-16 19:52:58,996 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751943_11119, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-16 19:53:04,540 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751943_11119 replica FinalizedReplica, blk_1073751943_11119, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751943 for deletion 2025-07-16 19:53:04,541 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073751943_11119 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751943 2025-07-16 19:54:03,952 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751944_11120 src: /192.168.158.9:46520 dest: /192.168.158.4:9866 2025-07-16 19:54:03,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46520, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1348979038_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751944_11120, duration(ns): 18755691 2025-07-16 19:54:03,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751944_11120, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-16 19:54:04,539 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751944_11120 replica FinalizedReplica, blk_1073751944_11120, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751944 for deletion 2025-07-16 19:54:04,540 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751944_11120 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751944 2025-07-16 19:55:08,935 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751945_11121 src: /192.168.158.1:58762 dest: /192.168.158.4:9866 2025-07-16 19:55:08,967 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58762, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1552008678_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751945_11121, duration(ns): 22661562 2025-07-16 19:55:08,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751945_11121, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-16 19:55:13,541 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751945_11121 replica FinalizedReplica, blk_1073751945_11121, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751945 for deletion 2025-07-16 19:55:13,542 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751945_11121 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751945 2025-07-16 19:56:08,939 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751946_11122 src: /192.168.158.1:38988 dest: /192.168.158.4:9866 2025-07-16 19:56:08,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38988, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2116876506_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751946_11122, duration(ns): 25015068 2025-07-16 19:56:08,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751946_11122, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-16 19:56:10,542 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751946_11122 replica FinalizedReplica, blk_1073751946_11122, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751946 for deletion 2025-07-16 19:56:10,543 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751946_11122 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751946 2025-07-16 19:57:08,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751947_11123 src: /192.168.158.1:58974 dest: /192.168.158.4:9866 2025-07-16 19:57:08,988 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58974, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1333447397_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751947_11123, duration(ns): 23741876 2025-07-16 19:57:08,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751947_11123, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-16 19:57:13,546 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751947_11123 replica FinalizedReplica, blk_1073751947_11123, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 
56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751947 for deletion 2025-07-16 19:57:13,548 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751947_11123 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751947 2025-07-16 19:59:13,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751949_11125 src: /192.168.158.1:41436 dest: /192.168.158.4:9866 2025-07-16 19:59:13,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41436, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_672113902_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751949_11125, duration(ns): 25175330 2025-07-16 19:59:13,981 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751949_11125, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-16 19:59:19,551 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751949_11125 replica FinalizedReplica, blk_1073751949_11125, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751949 for deletion 2025-07-16 19:59:19,552 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751949_11125 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751949 2025-07-16 20:01:18,960 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751951_11127 src: /192.168.158.9:44558 dest: /192.168.158.4:9866 2025-07-16 20:01:18,979 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44558, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1928872250_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751951_11127, duration(ns): 17153665 2025-07-16 20:01:18,979 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751951_11127, type=LAST_IN_PIPELINE terminating 2025-07-16 20:01:22,556 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751951_11127 replica FinalizedReplica, blk_1073751951_11127, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751951 for deletion 2025-07-16 20:01:22,557 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751951_11127 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751951 2025-07-16 20:02:18,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751952_11128 src: /192.168.158.5:58090 dest: /192.168.158.4:9866 2025-07-16 20:02:18,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58090, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1177599945_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751952_11128, duration(ns): 15612930 2025-07-16 20:02:18,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751952_11128, type=LAST_IN_PIPELINE terminating 2025-07-16 20:02:19,557 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751952_11128 replica FinalizedReplica, blk_1073751952_11128, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751952 for deletion 2025-07-16 20:02:19,558 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751952_11128 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751952 2025-07-16 20:07:28,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751957_11133 src: /192.168.158.5:48894 dest: /192.168.158.4:9866 2025-07-16 20:07:28,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48894, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1638343757_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751957_11133, duration(ns): 19507712 2025-07-16 20:07:28,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751957_11133, type=LAST_IN_PIPELINE terminating 2025-07-16 20:07:34,563 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751957_11133 replica FinalizedReplica, blk_1073751957_11133, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751957 for deletion 2025-07-16 20:07:34,564 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751957_11133 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751957 2025-07-16 20:09:28,966 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751959_11135 src: /192.168.158.1:34722 dest: /192.168.158.4:9866 2025-07-16 20:09:28,998 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34722, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-351821223_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751959_11135, duration(ns): 23183525 2025-07-16 20:09:28,998 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751959_11135, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-16 20:09:34,567 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751959_11135 replica FinalizedReplica, blk_1073751959_11135, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751959 for deletion 
2025-07-16 20:09:34,569 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751959_11135 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751959 2025-07-16 20:11:28,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751961_11137 src: /192.168.158.1:40244 dest: /192.168.158.4:9866 2025-07-16 20:11:29,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40244, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1298568656_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751961_11137, duration(ns): 25305495 2025-07-16 20:11:29,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751961_11137, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-16 20:11:37,573 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751961_11137 replica FinalizedReplica, blk_1073751961_11137, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751961 for deletion 2025-07-16 20:11:37,574 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751961_11137 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751961 2025-07-16 20:13:28,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073751963_11139 src: /192.168.158.7:49708 dest: /192.168.158.4:9866 2025-07-16 20:13:29,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49708, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_79943920_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751963_11139, duration(ns): 16464910 2025-07-16 20:13:29,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751963_11139, type=LAST_IN_PIPELINE terminating 2025-07-16 20:13:37,577 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751963_11139 replica FinalizedReplica, blk_1073751963_11139, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751963 for deletion 2025-07-16 20:13:37,578 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751963_11139 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751963 2025-07-16 20:16:33,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751966_11142 src: /192.168.158.1:52210 dest: /192.168.158.4:9866 2025-07-16 20:16:34,026 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52210, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1371368535_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751966_11142, duration(ns): 26405054 
2025-07-16 20:16:34,026 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751966_11142, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-16 20:16:37,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751966_11142 replica FinalizedReplica, blk_1073751966_11142, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751966 for deletion 2025-07-16 20:16:37,585 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751966_11142 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751966 2025-07-16 20:17:33,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751967_11143 src: /192.168.158.1:44072 dest: /192.168.158.4:9866 2025-07-16 20:17:34,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44072, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1977196643_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751967_11143, duration(ns): 24511231 2025-07-16 20:17:34,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751967_11143, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-16 20:17:37,584 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751967_11143 replica 
FinalizedReplica, blk_1073751967_11143, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751967 for deletion 2025-07-16 20:17:37,586 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751967_11143 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751967 2025-07-16 20:19:38,979 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751969_11145 src: /192.168.158.6:34790 dest: /192.168.158.4:9866 2025-07-16 20:19:39,004 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34790, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1219485080_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751969_11145, duration(ns): 18811407 2025-07-16 20:19:39,004 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751969_11145, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-16 20:19:43,591 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751969_11145 replica FinalizedReplica, blk_1073751969_11145, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751969 for deletion 2025-07-16 20:19:43,593 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073751969_11145 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751969 2025-07-16 20:20:38,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751970_11146 src: /192.168.158.8:43608 dest: /192.168.158.4:9866 2025-07-16 20:20:39,012 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43608, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1577366138_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751970_11146, duration(ns): 18672430 2025-07-16 20:20:39,012 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751970_11146, type=LAST_IN_PIPELINE terminating 2025-07-16 20:20:43,591 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751970_11146 replica FinalizedReplica, blk_1073751970_11146, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751970 for deletion 2025-07-16 20:20:43,592 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751970_11146 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751970 2025-07-16 20:25:38,993 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751975_11151 src: /192.168.158.1:34678 dest: /192.168.158.4:9866 2025-07-16 20:25:39,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:34678, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1698764046_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751975_11151, duration(ns): 22318840
2025-07-16 20:25:39,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751975_11151, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-16 20:25:43,596 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751975_11151 replica FinalizedReplica, blk_1073751975_11151, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751975 for deletion
2025-07-16 20:25:43,597 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751975_11151 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751975
2025-07-16 20:27:39,001 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751977_11153 src: /192.168.158.1:42988 dest: /192.168.158.4:9866
2025-07-16 20:27:39,038 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42988, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_787234813_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751977_11153, duration(ns): 27915974
2025-07-16 20:27:39,038 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751977_11153, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-16 20:27:43,598 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751977_11153 replica FinalizedReplica, blk_1073751977_11153, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751977 for deletion
2025-07-16 20:27:43,599 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751977_11153 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751977
2025-07-16 20:29:44,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751979_11155 src: /192.168.158.1:42114 dest: /192.168.158.4:9866
2025-07-16 20:29:44,038 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42114, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-16123717_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751979_11155, duration(ns): 27812213
2025-07-16 20:29:44,039 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751979_11155, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-16 20:29:49,603 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751979_11155 replica FinalizedReplica, blk_1073751979_11155, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751979 for deletion
2025-07-16 20:29:49,604 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751979_11155 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751979
2025-07-16 20:31:44,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751981_11157 src: /192.168.158.1:55412 dest: /192.168.158.4:9866
2025-07-16 20:31:44,051 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55412, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-917330678_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751981_11157, duration(ns): 26693382
2025-07-16 20:31:44,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751981_11157, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-16 20:31:49,607 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751981_11157 replica FinalizedReplica, blk_1073751981_11157, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751981 for deletion
2025-07-16 20:31:49,607 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751981_11157 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751981
2025-07-16 20:32:44,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751982_11158 src: /192.168.158.8:48176 dest: /192.168.158.4:9866
2025-07-16 20:32:44,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48176, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_144170241_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751982_11158, duration(ns): 15439755
2025-07-16 20:32:44,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751982_11158, type=LAST_IN_PIPELINE terminating
2025-07-16 20:32:52,609 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751982_11158 replica FinalizedReplica, blk_1073751982_11158, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751982 for deletion
2025-07-16 20:32:52,610 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751982_11158 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751982
2025-07-16 20:34:44,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751984_11160 src: /192.168.158.6:55750 dest: /192.168.158.4:9866
2025-07-16 20:34:44,048 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55750, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_542226023_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751984_11160, duration(ns): 19357295
2025-07-16 20:34:44,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751984_11160, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 20:34:49,612 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751984_11160 replica FinalizedReplica, blk_1073751984_11160, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751984 for deletion
2025-07-16 20:34:49,613 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751984_11160 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751984
2025-07-16 20:35:49,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751985_11161 src: /192.168.158.5:38290 dest: /192.168.158.4:9866
2025-07-16 20:35:49,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38290, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-369251406_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751985_11161, duration(ns): 16105323
2025-07-16 20:35:49,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751985_11161, type=LAST_IN_PIPELINE terminating
2025-07-16 20:35:52,614 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751985_11161 replica FinalizedReplica, blk_1073751985_11161, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751985 for deletion
2025-07-16 20:35:52,615 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751985_11161 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751985
2025-07-16 20:38:49,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751988_11164 src: /192.168.158.7:53370 dest: /192.168.158.4:9866
2025-07-16 20:38:49,047 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53370, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-939525955_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751988_11164, duration(ns): 20866378
2025-07-16 20:38:49,047 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751988_11164, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 20:38:55,621 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751988_11164 replica FinalizedReplica, blk_1073751988_11164, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751988 for deletion
2025-07-16 20:38:55,622 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751988_11164 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751988
2025-07-16 20:39:54,017 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751989_11165 src: /192.168.158.5:36816 dest: /192.168.158.4:9866
2025-07-16 20:39:54,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36816, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-370058985_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751989_11165, duration(ns): 19113070
2025-07-16 20:39:54,044 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751989_11165, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-16 20:40:01,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751989_11165 replica FinalizedReplica, blk_1073751989_11165, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751989 for deletion
2025-07-16 20:40:01,626 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751989_11165 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751989
2025-07-16 20:40:54,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751990_11166 src: /192.168.158.1:52198 dest: /192.168.158.4:9866
2025-07-16 20:40:54,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52198, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-361408445_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751990_11166, duration(ns): 25205521
2025-07-16 20:40:54,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751990_11166, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-16 20:40:58,627 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751990_11166 replica FinalizedReplica, blk_1073751990_11166, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751990 for deletion
2025-07-16 20:40:58,628 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751990_11166 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751990
2025-07-16 20:47:59,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751997_11173 src: /192.168.158.8:49966 dest: /192.168.158.4:9866
2025-07-16 20:47:59,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49966, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1408265138_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751997_11173, duration(ns): 15355042
2025-07-16 20:47:59,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751997_11173, type=LAST_IN_PIPELINE terminating
2025-07-16 20:48:07,638 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751997_11173 replica FinalizedReplica, blk_1073751997_11173, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751997 for deletion
2025-07-16 20:48:07,639 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751997_11173 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751997
2025-07-16 20:49:59,051 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073751999_11175 src: /192.168.158.5:55002 dest: /192.168.158.4:9866
2025-07-16 20:49:59,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55002, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_44562210_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073751999_11175, duration(ns): 26500601
2025-07-16 20:49:59,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073751999_11175, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 20:50:04,643 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073751999_11175 replica FinalizedReplica, blk_1073751999_11175, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751999 for deletion
2025-07-16 20:50:04,644 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073751999_11175 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073751999
2025-07-16 20:52:09,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752001_11177 src: /192.168.158.6:50474 dest: /192.168.158.4:9866
2025-07-16 20:52:09,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50474, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_588895643_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752001_11177, duration(ns): 20701621
2025-07-16 20:52:09,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752001_11177, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 20:52:13,654 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752001_11177 replica FinalizedReplica, blk_1073752001_11177, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752001 for deletion
2025-07-16 20:52:13,655 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752001_11177 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752001
2025-07-16 20:53:09,040 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752002_11178 src: /192.168.158.7:42544 dest: /192.168.158.4:9866
2025-07-16 20:53:09,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42544, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_180784406_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752002_11178, duration(ns): 18961369
2025-07-16 20:53:09,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752002_11178, type=LAST_IN_PIPELINE terminating
2025-07-16 20:53:13,657 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752002_11178 replica FinalizedReplica, blk_1073752002_11178, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752002 for deletion
2025-07-16 20:53:13,658 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752002_11178 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752002
2025-07-16 21:00:09,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752009_11185 src: /192.168.158.9:49978 dest: /192.168.158.4:9866
2025-07-16 21:00:09,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49978, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-734550479_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752009_11185, duration(ns): 20585232
2025-07-16 21:00:09,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752009_11185, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-16 21:00:13,676 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752009_11185 replica FinalizedReplica, blk_1073752009_11185, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752009 for deletion
2025-07-16 21:00:13,677 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752009_11185 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752009
2025-07-16 21:05:19,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752014_11190 src: /192.168.158.7:35716 dest: /192.168.158.4:9866
2025-07-16 21:05:19,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35716, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1013691812_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752014_11190, duration(ns): 14269169
2025-07-16 21:05:19,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752014_11190, type=LAST_IN_PIPELINE terminating
2025-07-16 21:05:22,687 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752014_11190 replica FinalizedReplica, blk_1073752014_11190, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752014 for deletion
2025-07-16 21:05:22,688 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752014_11190 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752014
2025-07-16 21:09:19,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752018_11194 src: /192.168.158.6:41004 dest: /192.168.158.4:9866
2025-07-16 21:09:19,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41004, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_418921013_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752018_11194, duration(ns): 19049356
2025-07-16 21:09:19,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752018_11194, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-16 21:09:22,702 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752018_11194 replica FinalizedReplica, blk_1073752018_11194, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752018 for deletion
2025-07-16 21:09:22,703 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752018_11194 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752018
2025-07-16 21:10:19,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752019_11195 src: /192.168.158.1:33722 dest: /192.168.158.4:9866
2025-07-16 21:10:19,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33722, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1526115291_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752019_11195, duration(ns): 23475328
2025-07-16 21:10:19,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752019_11195, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-16 21:10:22,705 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752019_11195 replica FinalizedReplica, blk_1073752019_11195, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752019 for deletion
2025-07-16 21:10:22,706 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752019_11195 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752019
2025-07-16 21:14:19,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752023_11199 src: /192.168.158.1:34142 dest: /192.168.158.4:9866
2025-07-16 21:14:19,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34142, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_930600912_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752023_11199, duration(ns): 26308911
2025-07-16 21:14:19,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752023_11199, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-16 21:14:22,713 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752023_11199 replica FinalizedReplica, blk_1073752023_11199, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752023 for deletion
2025-07-16 21:14:22,714 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752023_11199 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752023
2025-07-16 21:15:19,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752024_11200 src: /192.168.158.1:35012 dest: /192.168.158.4:9866
2025-07-16 21:15:19,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35012, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-224633719_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752024_11200, duration(ns): 50998638
2025-07-16 21:15:19,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752024_11200, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-16 21:15:22,713 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752024_11200 replica FinalizedReplica, blk_1073752024_11200, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752024 for deletion
2025-07-16 21:15:22,714 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752024_11200 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752024
2025-07-16 21:16:19,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752025_11201 src: /192.168.158.9:35308 dest: /192.168.158.4:9866
2025-07-16 21:16:19,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35308, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_524128539_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752025_11201, duration(ns): 19344594
2025-07-16 21:16:19,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752025_11201, type=LAST_IN_PIPELINE terminating
2025-07-16 21:16:22,713 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752025_11201 replica FinalizedReplica, blk_1073752025_11201, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752025 for deletion
2025-07-16 21:16:22,714 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752025_11201 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752025
2025-07-16 21:20:29,104 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752029_11205 src: /192.168.158.8:49620 dest: /192.168.158.4:9866
2025-07-16 21:20:29,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49620, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1868152635_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752029_11205, duration(ns): 15384957
2025-07-16 21:20:29,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752029_11205, type=LAST_IN_PIPELINE terminating
2025-07-16 21:20:34,723 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752029_11205 replica FinalizedReplica, blk_1073752029_11205, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752029 for deletion
2025-07-16 21:20:34,724 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752029_11205 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752029
2025-07-16 21:22:29,100 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752031_11207 src: /192.168.158.1:46132 dest: /192.168.158.4:9866
2025-07-16 21:22:29,135 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46132, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1983210297_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752031_11207, duration(ns): 26006613
2025-07-16 21:22:29,136 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752031_11207, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-16 21:22:34,726 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752031_11207 replica FinalizedReplica, blk_1073752031_11207, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752031 for deletion
2025-07-16 21:22:34,727 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752031_11207 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752031
2025-07-16 21:25:29,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752034_11210 src: /192.168.158.7:33108 dest: /192.168.158.4:9866
2025-07-16 21:25:29,132 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33108, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1424585616_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752034_11210, duration(ns): 15550879
2025-07-16 21:25:29,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752034_11210, type=LAST_IN_PIPELINE terminating
2025-07-16 21:25:34,731 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752034_11210 replica FinalizedReplica, blk_1073752034_11210, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752034 for deletion
2025-07-16 21:25:34,732 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752034_11210 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752034
2025-07-16 21:30:39,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752039_11215 src: /192.168.158.9:40610 dest: /192.168.158.4:9866
2025-07-16 21:30:39,134 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40610, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1517972267_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752039_11215, duration(ns): 20272183
2025-07-16 21:30:39,135 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752039_11215, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 21:30:43,743 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752039_11215 replica FinalizedReplica, blk_1073752039_11215, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752039 for deletion
2025-07-16 21:30:43,745 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752039_11215 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752039
2025-07-16 21:31:39,119 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752040_11216 src: /192.168.158.8:59348 dest: /192.168.158.4:9866
2025-07-16 21:31:39,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59348, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-807335595_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752040_11216, duration(ns): 20811800
2025-07-16 21:31:39,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752040_11216, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 21:31:46,745 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752040_11216 replica FinalizedReplica, blk_1073752040_11216, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752040 for deletion
2025-07-16 21:31:46,746 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752040_11216 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752040
2025-07-16 21:32:39,157 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752041_11217 src: /192.168.158.5:43956 dest: /192.168.158.4:9866
2025-07-16 21:32:39,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:43956, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_447103800_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752041_11217, duration(ns): 21001657
2025-07-16 21:32:39,185 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752041_11217, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-16 21:32:43,747 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752041_11217 replica FinalizedReplica, blk_1073752041_11217, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752041 for deletion
2025-07-16 21:32:43,749 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752041_11217 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752041
2025-07-16 21:34:39,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752043_11219 src: /192.168.158.1:60114 dest: /192.168.158.4:9866 2025-07-16 21:34:39,158 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60114, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_833252244_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752043_11219, duration(ns): 24404623 2025-07-16 21:34:39,158 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752043_11219, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-16 21:34:43,750 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752043_11219 replica FinalizedReplica, blk_1073752043_11219, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752043 for deletion 2025-07-16 21:34:43,751 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752043_11219 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752043 2025-07-16 21:35:39,129 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752044_11220 src: /192.168.158.7:34240 dest: /192.168.158.4:9866 2025-07-16 21:35:39,158 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34240, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_65531521_236, 
offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752044_11220, duration(ns): 23056110 2025-07-16 21:35:39,158 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752044_11220, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-16 21:35:43,751 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752044_11220 replica FinalizedReplica, blk_1073752044_11220, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752044 for deletion 2025-07-16 21:35:43,753 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752044_11220 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752044 2025-07-16 21:37:44,132 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752046_11222 src: /192.168.158.1:42150 dest: /192.168.158.4:9866 2025-07-16 21:37:44,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42150, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_113072180_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752046_11222, duration(ns): 24876315 2025-07-16 21:37:44,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752046_11222, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-16 21:37:49,755 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752046_11222 replica FinalizedReplica, blk_1073752046_11222, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752046 for deletion 2025-07-16 21:37:49,757 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752046_11222 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752046 2025-07-16 21:40:44,119 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752049_11225 src: /192.168.158.8:37058 dest: /192.168.158.4:9866 2025-07-16 21:40:44,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37058, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1131960905_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752049_11225, duration(ns): 19110695 2025-07-16 21:40:44,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752049_11225, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-16 21:40:46,762 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752049_11225 replica FinalizedReplica, blk_1073752049_11225, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752049 for deletion 2025-07-16 21:40:46,763 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752049_11225 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752049 2025-07-16 21:42:44,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752051_11227 src: /192.168.158.1:33152 dest: /192.168.158.4:9866 2025-07-16 21:42:44,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33152, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1994662115_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752051_11227, duration(ns): 25631994 2025-07-16 21:42:44,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752051_11227, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-16 21:42:49,769 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752051_11227 replica FinalizedReplica, blk_1073752051_11227, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752051 for deletion 2025-07-16 21:42:49,770 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752051_11227 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752051 2025-07-16 21:44:44,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073752053_11229 src: /192.168.158.7:49718 dest: /192.168.158.4:9866 2025-07-16 21:44:44,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49718, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_985037289_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752053_11229, duration(ns): 16301013 2025-07-16 21:44:44,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752053_11229, type=LAST_IN_PIPELINE terminating 2025-07-16 21:44:46,770 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752053_11229 replica FinalizedReplica, blk_1073752053_11229, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752053 for deletion 2025-07-16 21:44:46,771 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752053_11229 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752053 2025-07-16 21:50:44,145 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752059_11235 src: /192.168.158.1:51318 dest: /192.168.158.4:9866 2025-07-16 21:50:44,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51318, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1122511951_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752059_11235, duration(ns): 22343958 
2025-07-16 21:50:44,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752059_11235, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-16 21:50:46,784 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752059_11235 replica FinalizedReplica, blk_1073752059_11235, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752059 for deletion 2025-07-16 21:50:46,786 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752059_11235 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752059 2025-07-16 21:54:49,157 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752063_11239 src: /192.168.158.1:53334 dest: /192.168.158.4:9866 2025-07-16 21:54:49,192 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53334, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2080553505_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752063_11239, duration(ns): 25841897 2025-07-16 21:54:49,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752063_11239, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-16 21:54:52,793 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752063_11239 replica 
FinalizedReplica, blk_1073752063_11239, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752063 for deletion 2025-07-16 21:54:52,795 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752063_11239 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073752063 2025-07-16 21:55:54,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752064_11240 src: /192.168.158.7:57290 dest: /192.168.158.4:9866 2025-07-16 21:55:54,199 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57290, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_539993584_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752064_11240, duration(ns): 16137132 2025-07-16 21:55:54,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752064_11240, type=LAST_IN_PIPELINE terminating 2025-07-16 21:55:58,798 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752064_11240 replica FinalizedReplica, blk_1073752064_11240, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752064 for deletion 2025-07-16 21:55:58,799 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752064_11240 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752064 2025-07-16 21:59:19,814 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f43, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 1 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5. 2025-07-16 21:59:19,814 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360 2025-07-16 21:59:59,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752068_11244 src: /192.168.158.9:60784 dest: /192.168.158.4:9866 2025-07-16 21:59:59,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60784, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1964688834_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752068_11244, duration(ns): 17847523 2025-07-16 21:59:59,202 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752068_11244, type=LAST_IN_PIPELINE terminating 2025-07-16 22:00:01,812 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752068_11244 replica FinalizedReplica, blk_1073752068_11244, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752068 for deletion 2025-07-16 22:00:01,813 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 
blk_1073752068_11244 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752068 2025-07-16 22:00:59,172 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752069_11245 src: /192.168.158.1:50044 dest: /192.168.158.4:9866 2025-07-16 22:00:59,204 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50044, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-993445801_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752069_11245, duration(ns): 22469316 2025-07-16 22:00:59,204 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752069_11245, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-16 22:01:04,818 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752069_11245 replica FinalizedReplica, blk_1073752069_11245, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752069 for deletion 2025-07-16 22:01:04,819 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752069_11245 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752069 2025-07-16 22:01:59,191 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752070_11246 src: /192.168.158.8:54100 dest: /192.168.158.4:9866 2025-07-16 22:01:59,219 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54100, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1628887953_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752070_11246, duration(ns): 22741139 2025-07-16 22:01:59,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752070_11246, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-16 22:02:01,819 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752070_11246 replica FinalizedReplica, blk_1073752070_11246, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752070 for deletion 2025-07-16 22:02:01,820 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752070_11246 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752070 2025-07-16 22:05:59,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752074_11250 src: /192.168.158.1:47526 dest: /192.168.158.4:9866 2025-07-16 22:05:59,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47526, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1882140817_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752074_11250, duration(ns): 22991393 2025-07-16 22:05:59,215 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073752074_11250, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-16 22:06:01,828 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752074_11250 replica FinalizedReplica, blk_1073752074_11250, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752074 for deletion 2025-07-16 22:06:01,829 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752074_11250 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752074 2025-07-16 22:07:59,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752076_11252 src: /192.168.158.8:34678 dest: /192.168.158.4:9866 2025-07-16 22:07:59,206 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34678, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1729916679_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752076_11252, duration(ns): 16448450 2025-07-16 22:07:59,206 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752076_11252, type=LAST_IN_PIPELINE terminating 2025-07-16 22:08:04,830 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752076_11252 replica FinalizedReplica, blk_1073752076_11252, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752076 for deletion 2025-07-16 22:08:04,832 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752076_11252 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752076 2025-07-16 22:08:59,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752077_11253 src: /192.168.158.1:39770 dest: /192.168.158.4:9866 2025-07-16 22:08:59,212 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39770, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1535305223_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752077_11253, duration(ns): 22011660 2025-07-16 22:08:59,212 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752077_11253, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-16 22:09:01,833 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752077_11253 replica FinalizedReplica, blk_1073752077_11253, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752077 for deletion 2025-07-16 22:09:01,834 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752077_11253 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752077 2025-07-16 22:13:04,185 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752081_11257 src: /192.168.158.1:47928 dest: /192.168.158.4:9866 2025-07-16 22:13:04,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47928, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_826232166_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752081_11257, duration(ns): 24406579 2025-07-16 22:13:04,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752081_11257, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-16 22:13:07,842 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752081_11257 replica FinalizedReplica, blk_1073752081_11257, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752081 for deletion 2025-07-16 22:13:07,843 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752081_11257 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752081 2025-07-16 22:15:09,194 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752083_11259 src: /192.168.158.5:56874 dest: /192.168.158.4:9866 2025-07-16 22:15:09,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.5:56874, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_59770875_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752083_11259, duration(ns): 19929001
2025-07-16 22:15:09,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752083_11259, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 22:15:13,845 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752083_11259 replica FinalizedReplica, blk_1073752083_11259, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752083 for deletion
2025-07-16 22:15:13,846 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752083_11259 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752083
2025-07-16 22:17:09,194 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752085_11261 src: /192.168.158.9:38468 dest: /192.168.158.4:9866
2025-07-16 22:17:09,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38468, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1210049650_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752085_11261, duration(ns): 18503640
2025-07-16 22:17:09,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752085_11261, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 22:17:13,851 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752085_11261 replica FinalizedReplica, blk_1073752085_11261, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752085 for deletion
2025-07-16 22:17:13,852 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752085_11261 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752085
2025-07-16 22:19:14,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752087_11263 src: /192.168.158.8:54644 dest: /192.168.158.4:9866
2025-07-16 22:19:14,268 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54644, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1725880014_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752087_11263, duration(ns): 19012396
2025-07-16 22:19:14,268 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752087_11263, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-16 22:19:16,857 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752087_11263 replica FinalizedReplica, blk_1073752087_11263, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752087 for deletion
2025-07-16 22:19:16,858 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752087_11263 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752087
2025-07-16 22:21:24,205 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752089_11265 src: /192.168.158.1:49226 dest: /192.168.158.4:9866
2025-07-16 22:21:24,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49226, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-895843450_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752089_11265, duration(ns): 22691687
2025-07-16 22:21:24,238 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752089_11265, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-16 22:21:31,868 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752089_11265 replica FinalizedReplica, blk_1073752089_11265, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752089 for deletion
2025-07-16 22:21:31,869 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752089_11265 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752089
2025-07-16 22:22:24,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752090_11266 src: /192.168.158.7:40364 dest: /192.168.158.4:9866
2025-07-16 22:22:24,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40364, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1459754778_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752090_11266, duration(ns): 22716785
2025-07-16 22:22:24,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752090_11266, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 22:22:28,869 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752090_11266 replica FinalizedReplica, blk_1073752090_11266, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752090 for deletion
2025-07-16 22:22:28,870 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752090_11266 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752090
2025-07-16 22:24:24,208 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752092_11268 src: /192.168.158.1:60956 dest: /192.168.158.4:9866
2025-07-16 22:24:24,242 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60956, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_454562112_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752092_11268, duration(ns): 25800145
2025-07-16 22:24:24,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752092_11268, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-16 22:24:28,874 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752092_11268 replica FinalizedReplica, blk_1073752092_11268, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752092 for deletion
2025-07-16 22:24:28,875 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752092_11268 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752092
2025-07-16 22:26:24,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752094_11270 src: /192.168.158.1:35714 dest: /192.168.158.4:9866
2025-07-16 22:26:24,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35714, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_887198864_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752094_11270, duration(ns): 24707828
2025-07-16 22:26:24,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752094_11270, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-16 22:26:28,877 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752094_11270 replica FinalizedReplica, blk_1073752094_11270, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752094 for deletion
2025-07-16 22:26:28,878 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752094_11270 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752094
2025-07-16 22:28:24,238 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752096_11272 src: /192.168.158.1:44178 dest: /192.168.158.4:9866
2025-07-16 22:28:24,272 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44178, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1869016067_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752096_11272, duration(ns): 24898737
2025-07-16 22:28:24,272 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752096_11272, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-16 22:28:28,879 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752096_11272 replica FinalizedReplica, blk_1073752096_11272, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752096 for deletion
2025-07-16 22:28:28,880 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752096_11272 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752096
2025-07-16 22:30:29,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752098_11274 src: /192.168.158.6:40100 dest: /192.168.158.4:9866
2025-07-16 22:30:29,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40100, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_704373570_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752098_11274, duration(ns): 22774513
2025-07-16 22:30:29,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752098_11274, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 22:30:31,882 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752098_11274 replica FinalizedReplica, blk_1073752098_11274, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752098 for deletion
2025-07-16 22:30:31,883 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752098_11274 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752098
2025-07-16 22:33:34,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752101_11277 src: /192.168.158.7:38450 dest: /192.168.158.4:9866
2025-07-16 22:33:34,270 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38450, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1563575793_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752101_11277, duration(ns): 19630055
2025-07-16 22:33:34,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752101_11277, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 22:33:37,888 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752101_11277 replica FinalizedReplica, blk_1073752101_11277, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752101 for deletion
2025-07-16 22:33:37,890 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752101_11277 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752101
2025-07-16 22:39:44,275 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752107_11283 src: /192.168.158.7:41110 dest: /192.168.158.4:9866
2025-07-16 22:39:44,301 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41110, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1381330104_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752107_11283, duration(ns): 19859624
2025-07-16 22:39:44,301 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752107_11283, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 22:39:46,905 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752107_11283 replica FinalizedReplica, blk_1073752107_11283, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752107 for deletion
2025-07-16 22:39:46,906 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752107_11283 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752107
2025-07-16 22:40:49,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752108_11284 src: /192.168.158.1:47450 dest: /192.168.158.4:9866
2025-07-16 22:40:49,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47450, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1586815662_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752108_11284, duration(ns): 23216419
2025-07-16 22:40:49,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752108_11284, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-16 22:40:52,911 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752108_11284 replica FinalizedReplica, blk_1073752108_11284, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752108 for deletion
2025-07-16 22:40:52,912 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752108_11284 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752108
2025-07-16 22:43:49,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752111_11287 src: /192.168.158.6:40872 dest: /192.168.158.4:9866
2025-07-16 22:43:49,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40872, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_491938013_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752111_11287, duration(ns): 16536102
2025-07-16 22:43:49,340 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752111_11287, type=LAST_IN_PIPELINE terminating
2025-07-16 22:43:52,917 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752111_11287 replica FinalizedReplica, blk_1073752111_11287, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752111 for deletion
2025-07-16 22:43:52,919 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752111_11287 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752111
2025-07-16 22:50:59,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752118_11294 src: /192.168.158.9:54594 dest: /192.168.158.4:9866
2025-07-16 22:50:59,298 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54594, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_732812737_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752118_11294, duration(ns): 14751428
2025-07-16 22:50:59,298 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752118_11294, type=LAST_IN_PIPELINE terminating
2025-07-16 22:51:01,931 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752118_11294 replica FinalizedReplica, blk_1073752118_11294, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752118 for deletion
2025-07-16 22:51:01,932 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752118_11294 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752118
2025-07-16 22:54:59,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752122_11298 src: /192.168.158.1:49116 dest: /192.168.158.4:9866
2025-07-16 22:54:59,324 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49116, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1357801422_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752122_11298, duration(ns): 22883825
2025-07-16 22:54:59,324 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752122_11298, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-16 22:55:04,943 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752122_11298 replica FinalizedReplica, blk_1073752122_11298, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752122 for deletion
2025-07-16 22:55:04,945 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752122_11298 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752122
2025-07-16 22:55:59,293 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752123_11299 src: /192.168.158.5:55872 dest: /192.168.158.4:9866
2025-07-16 22:55:59,318 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55872, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_663095838_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752123_11299, duration(ns): 19589284
2025-07-16 22:55:59,318 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752123_11299, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 22:56:04,946 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752123_11299 replica FinalizedReplica, blk_1073752123_11299, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752123 for deletion
2025-07-16 22:56:04,947 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752123_11299 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752123
2025-07-16 22:56:59,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752124_11300 src: /192.168.158.9:49634 dest: /192.168.158.4:9866
2025-07-16 22:56:59,300 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49634, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-566487091_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752124_11300, duration(ns): 15716032
2025-07-16 22:56:59,301 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752124_11300, type=LAST_IN_PIPELINE terminating
2025-07-16 22:57:04,950 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752124_11300 replica FinalizedReplica, blk_1073752124_11300, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752124 for deletion
2025-07-16 22:57:04,951 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752124_11300 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752124
2025-07-16 22:57:59,301 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752125_11301 src: /192.168.158.5:55836 dest: /192.168.158.4:9866
2025-07-16 22:57:59,319 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55836, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_681651709_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752125_11301, duration(ns): 16062422
2025-07-16 22:57:59,320 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752125_11301, type=LAST_IN_PIPELINE terminating
2025-07-16 22:58:01,951 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752125_11301 replica FinalizedReplica, blk_1073752125_11301, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752125 for deletion
2025-07-16 22:58:01,952 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752125_11301 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752125
2025-07-16 22:59:04,293 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752126_11302 src: /192.168.158.1:36952 dest: /192.168.158.4:9866
2025-07-16 22:59:04,326 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36952, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-577205092_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752126_11302, duration(ns): 23669221
2025-07-16 22:59:04,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752126_11302, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-16 22:59:10,953 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752126_11302 replica FinalizedReplica, blk_1073752126_11302, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752126 for deletion
2025-07-16 22:59:10,954 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752126_11302 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752126
2025-07-16 23:00:04,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752127_11303 src: /192.168.158.9:35160 dest: /192.168.158.4:9866
2025-07-16 23:00:04,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35160, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1294403809_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752127_11303, duration(ns): 21770068
2025-07-16 23:00:04,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752127_11303, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-16 23:00:07,954 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752127_11303 replica FinalizedReplica, blk_1073752127_11303, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752127 for deletion
2025-07-16 23:00:07,955 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752127_11303 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752127
2025-07-16 23:01:04,289 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752128_11304 src: /192.168.158.6:32802 dest: /192.168.158.4:9866
2025-07-16 23:01:04,315 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:32802, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-32048593_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752128_11304, duration(ns): 20306935
2025-07-16 23:01:04,315 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752128_11304, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-16 23:01:07,955 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752128_11304 replica FinalizedReplica, blk_1073752128_11304, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752128 for deletion
2025-07-16 23:01:07,957 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752128_11304 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752128
2025-07-16 23:02:04,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752129_11305 src: /192.168.158.1:57112 dest: /192.168.158.4:9866
2025-07-16 23:02:04,340 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57112, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_853128128_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752129_11305, duration(ns): 24564300
2025-07-16 23:02:04,341 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752129_11305, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-16 23:02:07,959 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752129_11305 replica FinalizedReplica, blk_1073752129_11305, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752129 for deletion
2025-07-16 23:02:07,960 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752129_11305 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752129
2025-07-16 23:03:04,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752130_11306 src: /192.168.158.7:55500 dest: /192.168.158.4:9866
2025-07-16 23:03:04,324 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55500, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_926513659_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752130_11306, duration(ns): 19502798
2025-07-16 23:03:04,324 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752130_11306, type=LAST_IN_PIPELINE terminating
2025-07-16 23:03:07,960 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752130_11306 replica FinalizedReplica, blk_1073752130_11306, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752130 for deletion
2025-07-16 23:03:07,962 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752130_11306 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752130
2025-07-16 23:05:04,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752132_11308 src: /192.168.158.8:54344 dest: /192.168.158.4:9866
2025-07-16 23:05:04,331 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54344, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2042949218_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752132_11308, duration(ns): 21243156
2025-07-16 23:05:04,331 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752132_11308, type=LAST_IN_PIPELINE terminating
2025-07-16 23:05:07,968 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752132_11308 replica FinalizedReplica, blk_1073752132_11308, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752132 for deletion
2025-07-16 23:05:07,969 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752132_11308 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752132
2025-07-16 23:07:04,288 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752134_11310 src: /192.168.158.1:58664 dest: /192.168.158.4:9866
2025-07-16 23:07:04,324 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58664, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1591332205_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752134_11310, duration(ns): 26032397
2025-07-16 23:07:04,324 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752134_11310, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-16 23:07:10,974 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752134_11310 replica FinalizedReplica, blk_1073752134_11310, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752134 for deletion
2025-07-16 23:07:10,975 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752134_11310 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752134
2025-07-16 23:09:09,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752136_11312 src: /192.168.158.1:58826 dest: /192.168.158.4:9866
2025-07-16 23:09:09,324 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58826, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_251270228_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752136_11312, duration(ns): 21834589
2025-07-16 23:09:09,324 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752136_11312, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-16 23:09:16,977 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752136_11312 replica FinalizedReplica, blk_1073752136_11312, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752136 for deletion
2025-07-16 23:09:16,978 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752136_11312 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752136
2025-07-16 23:10:09,319 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752137_11313 src: /192.168.158.1:41866 dest: /192.168.158.4:9866
2025-07-16 23:10:09,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41866, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1375116364_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752137_11313, duration(ns): 23533227
2025-07-16 23:10:09,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752137_11313, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-16 23:10:16,977 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752137_11313 replica FinalizedReplica, blk_1073752137_11313, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752137 for deletion
2025-07-16 23:10:16,978 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752137_11313 URI
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752137 2025-07-16 23:12:09,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752139_11315 src: /192.168.158.6:35820 dest: /192.168.158.4:9866 2025-07-16 23:12:09,337 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35820, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1068622928_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752139_11315, duration(ns): 20059688 2025-07-16 23:12:09,337 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752139_11315, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-16 23:12:16,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752139_11315 replica FinalizedReplica, blk_1073752139_11315, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752139 for deletion 2025-07-16 23:12:16,982 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752139_11315 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752139 2025-07-16 23:14:09,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752141_11317 src: /192.168.158.6:47436 dest: /192.168.158.4:9866 2025-07-16 23:14:09,335 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47436, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_509455767_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752141_11317, duration(ns): 16017004 2025-07-16 23:14:09,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752141_11317, type=LAST_IN_PIPELINE terminating 2025-07-16 23:14:16,987 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752141_11317 replica FinalizedReplica, blk_1073752141_11317, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752141 for deletion 2025-07-16 23:14:16,988 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752141_11317 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752141 2025-07-16 23:15:09,335 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752142_11318 src: /192.168.158.9:40608 dest: /192.168.158.4:9866 2025-07-16 23:15:09,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40608, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1131204733_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752142_11318, duration(ns): 18771085 2025-07-16 23:15:09,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752142_11318, type=LAST_IN_PIPELINE terminating 2025-07-16 23:15:13,988 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752142_11318 replica FinalizedReplica, blk_1073752142_11318, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752142 for deletion 2025-07-16 23:15:13,989 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752142_11318 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752142 2025-07-16 23:17:14,317 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752144_11320 src: /192.168.158.9:42796 dest: /192.168.158.4:9866 2025-07-16 23:17:14,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42796, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1152034728_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752144_11320, duration(ns): 21501550 2025-07-16 23:17:14,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752144_11320, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-16 23:17:16,994 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752144_11320 replica FinalizedReplica, blk_1073752144_11320, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752144 for deletion 2025-07-16 23:17:16,995 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752144_11320 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752144 2025-07-16 23:19:14,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752146_11322 src: /192.168.158.8:34864 dest: /192.168.158.4:9866 2025-07-16 23:19:14,371 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34864, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_967434910_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752146_11322, duration(ns): 24038761 2025-07-16 23:19:14,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752146_11322, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-16 23:19:19,995 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752146_11322 replica FinalizedReplica, blk_1073752146_11322, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752146 for deletion 2025-07-16 23:19:19,996 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752146_11322 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752146 2025-07-16 23:21:14,319 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752148_11324 src: 
/192.168.158.9:35870 dest: /192.168.158.4:9866 2025-07-16 23:21:14,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35870, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-94665078_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752148_11324, duration(ns): 15732626 2025-07-16 23:21:14,337 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752148_11324, type=LAST_IN_PIPELINE terminating 2025-07-16 23:21:19,998 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752148_11324 replica FinalizedReplica, blk_1073752148_11324, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752148 for deletion 2025-07-16 23:21:19,999 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752148_11324 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752148 2025-07-16 23:27:14,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752154_11330 src: /192.168.158.8:38022 dest: /192.168.158.4:9866 2025-07-16 23:27:14,373 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38022, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_467184590_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752154_11330, duration(ns): 16348255 2025-07-16 23:27:14,373 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752154_11330, type=LAST_IN_PIPELINE terminating 2025-07-16 23:27:17,013 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752154_11330 replica FinalizedReplica, blk_1073752154_11330, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752154 for deletion 2025-07-16 23:27:17,015 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752154_11330 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752154 2025-07-16 23:28:14,341 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752155_11331 src: /192.168.158.7:52144 dest: /192.168.158.4:9866 2025-07-16 23:28:14,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52144, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1661727737_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752155_11331, duration(ns): 20283021 2025-07-16 23:28:14,367 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752155_11331, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-16 23:28:17,014 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752155_11331 replica FinalizedReplica, blk_1073752155_11331, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() 
= /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752155 for deletion 2025-07-16 23:28:17,015 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752155_11331 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752155 2025-07-16 23:29:14,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752156_11332 src: /192.168.158.9:43228 dest: /192.168.158.4:9866 2025-07-16 23:29:14,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43228, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_433460108_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752156_11332, duration(ns): 18253148 2025-07-16 23:29:14,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752156_11332, type=LAST_IN_PIPELINE terminating 2025-07-16 23:29:17,015 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752156_11332 replica FinalizedReplica, blk_1073752156_11332, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752156 for deletion 2025-07-16 23:29:17,017 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752156_11332 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752156 2025-07-16 
23:34:19,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752161_11337 src: /192.168.158.7:38920 dest: /192.168.158.4:9866 2025-07-16 23:34:19,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38920, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_779213653_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752161_11337, duration(ns): 15471225 2025-07-16 23:34:19,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752161_11337, type=LAST_IN_PIPELINE terminating 2025-07-16 23:34:23,026 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752161_11337 replica FinalizedReplica, blk_1073752161_11337, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752161 for deletion 2025-07-16 23:34:23,027 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752161_11337 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752161 2025-07-16 23:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0 2025-07-16 23:38:24,367 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752165_11341 src: /192.168.158.1:48990 dest: /192.168.158.4:9866 2025-07-16 23:38:24,399 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48990, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1861178914_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752165_11341, duration(ns): 23388134 2025-07-16 23:38:24,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752165_11341, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-16 23:38:29,035 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752165_11341 replica FinalizedReplica, blk_1073752165_11341, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752165 for deletion 2025-07-16 23:38:29,037 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752165_11341 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752165 2025-07-16 23:40:24,441 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752167_11343 src: /192.168.158.5:35534 dest: /192.168.158.4:9866 2025-07-16 23:40:24,466 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35534, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1296460071_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752167_11343, duration(ns): 19711375 2025-07-16 23:40:24,466 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752167_11343, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-16 23:40:29,038 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752167_11343 replica FinalizedReplica, blk_1073752167_11343, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752167 for deletion 2025-07-16 23:40:29,040 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752167_11343 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752167 2025-07-16 23:42:24,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752169_11345 src: /192.168.158.9:49944 dest: /192.168.158.4:9866 2025-07-16 23:42:24,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49944, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1296826459_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752169_11345, duration(ns): 20037114 2025-07-16 23:42:24,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752169_11345, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-16 23:42:29,043 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752169_11345 replica FinalizedReplica, blk_1073752169_11345, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752169 for deletion 2025-07-16 23:42:29,044 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752169_11345 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752169 2025-07-16 23:44:24,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752171_11347 src: /192.168.158.5:56320 dest: /192.168.158.4:9866 2025-07-16 23:44:24,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56320, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1635680883_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752171_11347, duration(ns): 19460187 2025-07-16 23:44:24,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752171_11347, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-16 23:44:29,049 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752171_11347 replica FinalizedReplica, blk_1073752171_11347, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752171 for deletion 2025-07-16 23:44:29,050 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752171_11347 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752171 2025-07-16 23:45:24,392 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752172_11348 src: /192.168.158.6:33130 dest: /192.168.158.4:9866 2025-07-16 23:45:24,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33130, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1293412695_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752172_11348, duration(ns): 20194100 2025-07-16 23:45:24,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752172_11348, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-16 23:45:32,050 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752172_11348 replica FinalizedReplica, blk_1073752172_11348, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752172 for deletion 2025-07-16 23:45:32,051 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752172_11348 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752172 2025-07-16 23:46:29,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752173_11349 src: /192.168.158.6:35426 dest: /192.168.158.4:9866 2025-07-16 23:46:29,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35426, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1777197179_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752173_11349, duration(ns): 14417121 2025-07-16 23:46:29,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752173_11349, type=LAST_IN_PIPELINE terminating 2025-07-16 23:46:35,054 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752173_11349 replica FinalizedReplica, blk_1073752173_11349, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752173 for deletion 2025-07-16 23:46:35,055 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752173_11349 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752173 2025-07-16 23:48:29,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752175_11351 src: /192.168.158.5:41138 dest: /192.168.158.4:9866 2025-07-16 23:48:29,408 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41138, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1915715294_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752175_11351, duration(ns): 18546465 2025-07-16 23:48:29,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752175_11351, type=LAST_IN_PIPELINE terminating 2025-07-16 23:48:32,063 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752175_11351 replica FinalizedReplica, blk_1073752175_11351, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752175 for deletion
2025-07-16 23:48:32,064 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752175_11351 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752175
2025-07-16 23:49:29,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752176_11352 src: /192.168.158.8:44094 dest: /192.168.158.4:9866
2025-07-16 23:49:29,422 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44094, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1177359407_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752176_11352, duration(ns): 18731069
2025-07-16 23:49:29,422 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752176_11352, type=LAST_IN_PIPELINE terminating
2025-07-16 23:49:32,067 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752176_11352 replica FinalizedReplica, blk_1073752176_11352, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752176 for deletion
2025-07-16 23:49:32,068 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752176_11352 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752176
2025-07-16 23:50:29,390 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752177_11353 src: /192.168.158.7:55918 dest: /192.168.158.4:9866
2025-07-16 23:50:29,417 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55918, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_969539469_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752177_11353, duration(ns): 21749568
2025-07-16 23:50:29,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752177_11353, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-16 23:50:32,068 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752177_11353 replica FinalizedReplica, blk_1073752177_11353, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752177 for deletion
2025-07-16 23:50:32,069 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752177_11353 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752177
2025-07-16 23:55:29,408 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752182_11358 src: /192.168.158.9:48898 dest: /192.168.158.4:9866
2025-07-16 23:55:29,427 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_554860587_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752182_11358, duration(ns): 16779930
2025-07-16 23:55:29,427 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752182_11358, type=LAST_IN_PIPELINE terminating
2025-07-16 23:55:35,082 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752182_11358 replica FinalizedReplica, blk_1073752182_11358, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752182 for deletion
2025-07-16 23:55:35,083 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752182_11358 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752182
2025-07-17 00:01:39,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752188_11364 src: /192.168.158.9:32842 dest: /192.168.158.4:9866
2025-07-17 00:01:39,440 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:32842, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1751578621_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752188_11364, duration(ns): 19478095
2025-07-17 00:01:39,441 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752188_11364, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 00:01:47,092 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752188_11364 replica FinalizedReplica, blk_1073752188_11364, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752188 for deletion
2025-07-17 00:01:47,093 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752188_11364 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752188
2025-07-17 00:04:39,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752191_11367 src: /192.168.158.5:41092 dest: /192.168.158.4:9866
2025-07-17 00:04:39,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41092, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1985094117_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752191_11367, duration(ns): 17530008
2025-07-17 00:04:39,450 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752191_11367, type=LAST_IN_PIPELINE terminating
2025-07-17 00:04:44,096 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752191_11367 replica FinalizedReplica, blk_1073752191_11367, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752191 for deletion
2025-07-17 00:04:44,097 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752191_11367 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752191
2025-07-17 00:13:54,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752200_11376 src: /192.168.158.1:50766 dest: /192.168.158.4:9866
2025-07-17 00:13:54,447 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50766, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1432591421_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752200_11376, duration(ns): 24426709
2025-07-17 00:13:54,448 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752200_11376, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-17 00:13:59,111 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752200_11376 replica FinalizedReplica, blk_1073752200_11376, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752200 for deletion
2025-07-17 00:13:59,112 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752200_11376 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752200
2025-07-17 00:15:54,419 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752202_11378 src: /192.168.158.1:47064 dest: /192.168.158.4:9866
2025-07-17 00:15:54,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47064, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2077614037_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752202_11378, duration(ns): 23546921
2025-07-17 00:15:54,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752202_11378, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-17 00:15:59,120 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752202_11378 replica FinalizedReplica, blk_1073752202_11378, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752202 for deletion
2025-07-17 00:15:59,122 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752202_11378 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752202
2025-07-17 00:16:54,419 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752203_11379 src: /192.168.158.1:36614 dest: /192.168.158.4:9866
2025-07-17 00:16:54,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36614, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-713542104_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752203_11379, duration(ns): 24762978
2025-07-17 00:16:54,453 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752203_11379, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-17 00:17:02,120 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752203_11379 replica FinalizedReplica, blk_1073752203_11379, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752203 for deletion
2025-07-17 00:17:02,121 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752203_11379 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752203
2025-07-17 00:17:59,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752204_11380 src: /192.168.158.1:43134 dest: /192.168.158.4:9866
2025-07-17 00:17:59,448 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43134, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-899823522_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752204_11380, duration(ns): 23906538
2025-07-17 00:17:59,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752204_11380, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-17 00:18:02,120 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752204_11380 replica FinalizedReplica, blk_1073752204_11380, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752204 for deletion
2025-07-17 00:18:02,122 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752204_11380 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752204
2025-07-17 00:20:59,434 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752207_11383 src: /192.168.158.8:42190 dest: /192.168.158.4:9866
2025-07-17 00:20:59,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42190, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_538161405_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752207_11383, duration(ns): 15981060
2025-07-17 00:20:59,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752207_11383, type=LAST_IN_PIPELINE terminating
2025-07-17 00:21:02,128 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752207_11383 replica FinalizedReplica, blk_1073752207_11383, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752207 for deletion
2025-07-17 00:21:02,129 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752207_11383 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752207
2025-07-17 00:21:59,435 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752208_11384 src: /192.168.158.5:37092 dest: /192.168.158.4:9866
2025-07-17 00:21:59,454 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37092, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1632594702_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752208_11384, duration(ns): 16702791
2025-07-17 00:21:59,454 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752208_11384, type=LAST_IN_PIPELINE terminating
2025-07-17 00:22:05,129 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752208_11384 replica FinalizedReplica, blk_1073752208_11384, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752208 for deletion
2025-07-17 00:22:05,130 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752208_11384 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752208
2025-07-17 00:23:59,434 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752210_11386 src: /192.168.158.7:56584 dest: /192.168.158.4:9866
2025-07-17 00:23:59,463 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56584, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-472282446_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752210_11386, duration(ns): 23312981
2025-07-17 00:23:59,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752210_11386, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 00:24:02,130 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752210_11386 replica FinalizedReplica, blk_1073752210_11386, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752210 for deletion
2025-07-17 00:24:02,131 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752210_11386 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752210
2025-07-17 00:27:59,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752214_11390 src: /192.168.158.1:56028 dest: /192.168.158.4:9866
2025-07-17 00:27:59,477 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56028, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_607210517_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752214_11390, duration(ns): 22979857
2025-07-17 00:27:59,477 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752214_11390, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-17 00:28:02,139 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752214_11390 replica FinalizedReplica, blk_1073752214_11390, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752214 for deletion
2025-07-17 00:28:02,140 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752214_11390 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752214
2025-07-17 00:28:59,454 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752215_11391 src: /192.168.158.5:56808 dest: /192.168.158.4:9866
2025-07-17 00:28:59,478 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56808, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1546885122_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752215_11391, duration(ns): 18750816
2025-07-17 00:28:59,479 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752215_11391, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-17 00:29:05,140 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752215_11391 replica FinalizedReplica, blk_1073752215_11391, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752215 for deletion
2025-07-17 00:29:05,141 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752215_11391 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752215
2025-07-17 00:30:59,454 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752217_11393 src: /192.168.158.8:37706 dest: /192.168.158.4:9866
2025-07-17 00:30:59,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37706, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2107837403_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752217_11393, duration(ns): 22560523
2025-07-17 00:30:59,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752217_11393, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 00:31:05,147 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752217_11393 replica FinalizedReplica, blk_1073752217_11393, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752217 for deletion
2025-07-17 00:31:05,148 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752217_11393 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752217
2025-07-17 00:32:59,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752219_11395 src: /192.168.158.7:34024 dest: /192.168.158.4:9866
2025-07-17 00:32:59,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34024, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-914354090_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752219_11395, duration(ns): 17056808
2025-07-17 00:32:59,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752219_11395, type=LAST_IN_PIPELINE terminating
2025-07-17 00:33:02,150 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752219_11395 replica FinalizedReplica, blk_1073752219_11395, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752219 for deletion
2025-07-17 00:33:02,151 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752219_11395 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752219
2025-07-17 00:37:09,455 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752223_11399 src: /192.168.158.1:33436 dest: /192.168.158.4:9866
2025-07-17 00:37:09,488 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33436, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_580844939_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752223_11399, duration(ns): 23808169
2025-07-17 00:37:09,489 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752223_11399, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-17 00:37:14,160 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752223_11399 replica FinalizedReplica, blk_1073752223_11399, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752223 for deletion
2025-07-17 00:37:14,161 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752223_11399 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752223
2025-07-17 00:41:29,477 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752227_11403 src: /192.168.158.5:47652 dest: /192.168.158.4:9866
2025-07-17 00:41:29,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47652, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1568363904_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752227_11403, duration(ns): 18456019
2025-07-17 00:41:29,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752227_11403, type=LAST_IN_PIPELINE terminating
2025-07-17 00:41:32,169 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752227_11403 replica FinalizedReplica, blk_1073752227_11403, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752227 for deletion
2025-07-17 00:41:32,170 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752227_11403 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752227
2025-07-17 00:42:29,469 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752228_11404 src: /192.168.158.1:47104 dest: /192.168.158.4:9866
2025-07-17 00:42:29,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47104, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1233213325_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752228_11404, duration(ns): 21669344
2025-07-17 00:42:29,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752228_11404, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-17 00:42:32,173 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752228_11404 replica FinalizedReplica, blk_1073752228_11404, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752228 for deletion
2025-07-17 00:42:32,174 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752228_11404 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752228
2025-07-17 00:44:34,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752230_11406 src: /192.168.158.1:56672 dest: /192.168.158.4:9866
2025-07-17 00:44:34,484 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1472924187_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752230_11406, duration(ns): 25822355
2025-07-17 00:44:34,484 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752230_11406, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-17 00:44:41,175 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752230_11406 replica FinalizedReplica, blk_1073752230_11406, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752230 for deletion
2025-07-17 00:44:41,177 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752230_11406 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752230
2025-07-17 00:46:39,472 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752232_11408 src: /192.168.158.1:52168 dest: /192.168.158.4:9866
2025-07-17 00:46:39,508 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52168, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1761647063_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752232_11408, duration(ns): 26838609
2025-07-17 00:46:39,509 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752232_11408, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-17 00:46:47,195 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752232_11408 replica FinalizedReplica, blk_1073752232_11408, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752232 for deletion
2025-07-17 00:46:47,196 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752232_11408 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752232
2025-07-17 00:47:39,454 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752233_11409 src: /192.168.158.1:37748 dest: /192.168.158.4:9866
2025-07-17 00:47:39,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37748, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1092779525_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752233_11409, duration(ns): 22763052
2025-07-17 00:47:39,487 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752233_11409, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-17 00:47:44,182 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752233_11409 replica FinalizedReplica, blk_1073752233_11409, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752233 for deletion
2025-07-17 00:47:44,183 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752233_11409 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752233
2025-07-17 00:53:44,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752239_11415 src: /192.168.158.1:43332 dest: /192.168.158.4:9866
2025-07-17 00:53:44,530 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43332, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1325403688_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752239_11415, duration(ns): 24561864
2025-07-17 00:53:44,530 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752239_11415, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-17 00:53:50,190 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752239_11415 replica FinalizedReplica, blk_1073752239_11415, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752239 for deletion
2025-07-17 00:53:50,191 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752239_11415 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752239
2025-07-17 00:55:44,472 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752241_11417 src: /192.168.158.8:38426 dest: /192.168.158.4:9866
2025-07-17 00:55:44,489 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38426, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1592283668_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752241_11417, duration(ns): 15763012
2025-07-17 00:55:44,490 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752241_11417, type=LAST_IN_PIPELINE terminating
2025-07-17 00:55:50,193 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752241_11417 replica FinalizedReplica, blk_1073752241_11417, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752241 for deletion
2025-07-17 00:55:50,194 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752241_11417 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752241
2025-07-17 00:56:44,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752242_11418 src: /192.168.158.1:36484 dest: /192.168.158.4:9866
2025-07-17 00:56:44,507 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36484, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_776432144_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752242_11418, duration(ns): 26354652
2025-07-17 00:56:44,507 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752242_11418, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-17 00:56:47,195 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752242_11418 replica FinalizedReplica, blk_1073752242_11418, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752242 for deletion
2025-07-17 00:56:47,196 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752242_11418 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752242
2025-07-17 00:57:44,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752243_11419 src: /192.168.158.9:43518 dest: /192.168.158.4:9866
2025-07-17 00:57:44,499 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43518, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-823115298_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752243_11419, duration(ns): 23396900
2025-07-17 00:57:44,499 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752243_11419, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 00:57:47,195 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752243_11419 replica FinalizedReplica, blk_1073752243_11419, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752243 for deletion
2025-07-17 00:57:47,196 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752243_11419 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752243
2025-07-17 01:00:44,477 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752246_11422 src: /192.168.158.1:58920 dest: /192.168.158.4:9866
2025-07-17 01:00:44,508 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58920,
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_291302617_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752246_11422, duration(ns): 21441664 2025-07-17 01:00:44,508 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752246_11422, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-17 01:00:50,202 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752246_11422 replica FinalizedReplica, blk_1073752246_11422, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752246 for deletion 2025-07-17 01:00:50,203 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752246_11422 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752246 2025-07-17 01:02:44,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752248_11424 src: /192.168.158.1:51998 dest: /192.168.158.4:9866 2025-07-17 01:02:44,507 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51998, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1257162439_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752248_11424, duration(ns): 23473354 2025-07-17 01:02:44,507 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752248_11424, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-17 01:02:50,206 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752248_11424 replica FinalizedReplica, blk_1073752248_11424, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752248 for deletion 2025-07-17 01:02:50,207 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752248_11424 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752248 2025-07-17 01:03:44,478 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752249_11425 src: /192.168.158.1:50172 dest: /192.168.158.4:9866 2025-07-17 01:03:44,511 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50172, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2060524494_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752249_11425, duration(ns): 24239618 2025-07-17 01:03:44,511 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752249_11425, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-17 01:03:50,210 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752249_11425 replica FinalizedReplica, blk_1073752249_11425, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752249 for deletion 2025-07-17 01:03:50,211 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752249_11425 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752249 2025-07-17 01:04:44,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752250_11426 src: /192.168.158.6:59830 dest: /192.168.158.4:9866 2025-07-17 01:04:44,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59830, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1923301167_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752250_11426, duration(ns): 16794073 2025-07-17 01:04:44,513 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752250_11426, type=LAST_IN_PIPELINE terminating 2025-07-17 01:04:47,212 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752250_11426 replica FinalizedReplica, blk_1073752250_11426, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752250 for deletion 2025-07-17 01:04:47,213 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752250_11426 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752250 2025-07-17 01:05:44,491 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752251_11427 src: /192.168.158.6:52432 dest: /192.168.158.4:9866 2025-07-17 01:05:44,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52432, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_549896494_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752251_11427, duration(ns): 23168985 2025-07-17 01:05:44,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752251_11427, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-17 01:05:47,215 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752251_11427 replica FinalizedReplica, blk_1073752251_11427, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752251 for deletion 2025-07-17 01:05:47,216 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752251_11427 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752251 2025-07-17 01:06:44,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752252_11428 src: /192.168.158.8:38850 dest: /192.168.158.4:9866 2025-07-17 01:06:44,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38850, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1528830045_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752252_11428, duration(ns): 21013769 2025-07-17 01:06:44,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752252_11428, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-17 01:06:47,214 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752252_11428 replica FinalizedReplica, blk_1073752252_11428, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752252 for deletion 2025-07-17 01:06:47,215 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752252_11428 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752252 2025-07-17 01:09:49,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752255_11431 src: /192.168.158.7:36924 dest: /192.168.158.4:9866 2025-07-17 01:09:49,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36924, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_184729496_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752255_11431, duration(ns): 17087877 2025-07-17 01:09:49,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752255_11431, type=LAST_IN_PIPELINE terminating 2025-07-17 01:09:53,218 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073752255_11431 replica FinalizedReplica, blk_1073752255_11431, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752255 for deletion 2025-07-17 01:09:53,219 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752255_11431 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752255 2025-07-17 01:10:49,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752256_11432 src: /192.168.158.8:35420 dest: /192.168.158.4:9866 2025-07-17 01:10:49,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35420, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1287187115_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752256_11432, duration(ns): 15462303 2025-07-17 01:10:49,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752256_11432, type=LAST_IN_PIPELINE terminating 2025-07-17 01:10:53,221 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752256_11432 replica FinalizedReplica, blk_1073752256_11432, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752256 for deletion 2025-07-17 01:10:53,222 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073752256_11432 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752256 2025-07-17 01:15:54,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752261_11437 src: /192.168.158.9:43220 dest: /192.168.158.4:9866 2025-07-17 01:15:54,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43220, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1306276605_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752261_11437, duration(ns): 16089979 2025-07-17 01:15:54,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752261_11437, type=LAST_IN_PIPELINE terminating 2025-07-17 01:16:02,227 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752261_11437 replica FinalizedReplica, blk_1073752261_11437, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752261 for deletion 2025-07-17 01:16:02,228 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752261_11437 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752261 2025-07-17 01:17:54,506 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752263_11439 src: /192.168.158.1:53034 dest: /192.168.158.4:9866 2025-07-17 01:17:54,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:53034, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1930361226_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752263_11439, duration(ns): 25628619 2025-07-17 01:17:54,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752263_11439, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-17 01:17:59,229 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752263_11439 replica FinalizedReplica, blk_1073752263_11439, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752263 for deletion 2025-07-17 01:17:59,230 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752263_11439 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752263 2025-07-17 01:18:59,508 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752264_11440 src: /192.168.158.1:60520 dest: /192.168.158.4:9866 2025-07-17 01:18:59,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60520, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2128285280_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752264_11440, duration(ns): 25356293 2025-07-17 01:18:59,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073752264_11440, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-17 01:19:02,231 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752264_11440 replica FinalizedReplica, blk_1073752264_11440, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752264 for deletion 2025-07-17 01:19:02,233 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752264_11440 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752264 2025-07-17 01:24:04,515 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752269_11445 src: /192.168.158.5:54420 dest: /192.168.158.4:9866 2025-07-17 01:24:04,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54420, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1660364403_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752269_11445, duration(ns): 16558262 2025-07-17 01:24:04,534 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752269_11445, type=LAST_IN_PIPELINE terminating 2025-07-17 01:24:11,247 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752269_11445 replica FinalizedReplica, blk_1073752269_11445, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752269 for deletion 2025-07-17 01:24:11,248 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752269_11445 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752269 2025-07-17 01:25:04,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752270_11446 src: /192.168.158.6:36216 dest: /192.168.158.4:9866 2025-07-17 01:25:04,540 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36216, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1618176709_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752270_11446, duration(ns): 19354473 2025-07-17 01:25:04,540 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752270_11446, type=LAST_IN_PIPELINE terminating 2025-07-17 01:25:08,249 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752270_11446 replica FinalizedReplica, blk_1073752270_11446, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752270 for deletion 2025-07-17 01:25:08,250 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752270_11446 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752270 2025-07-17 01:26:04,518 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752271_11447 src: /192.168.158.8:43880 dest: /192.168.158.4:9866 2025-07-17 01:26:04,538 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43880, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-446209372_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752271_11447, duration(ns): 17689731 2025-07-17 01:26:04,538 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752271_11447, type=LAST_IN_PIPELINE terminating 2025-07-17 01:26:08,250 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752271_11447 replica FinalizedReplica, blk_1073752271_11447, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752271 for deletion 2025-07-17 01:26:08,252 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752271_11447 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752271 2025-07-17 01:29:04,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752274_11450 src: /192.168.158.1:33608 dest: /192.168.158.4:9866 2025-07-17 01:29:04,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33608, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-426209900_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073752274_11450, duration(ns): 28474499 2025-07-17 01:29:04,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752274_11450, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-17 01:29:11,262 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752274_11450 replica FinalizedReplica, blk_1073752274_11450, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752274 for deletion 2025-07-17 01:29:11,263 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752274_11450 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752274 2025-07-17 01:39:19,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752284_11460 src: /192.168.158.5:33766 dest: /192.168.158.4:9866 2025-07-17 01:39:19,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33766, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-22187839_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752284_11460, duration(ns): 15858180 2025-07-17 01:39:19,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752284_11460, type=LAST_IN_PIPELINE terminating 2025-07-17 01:39:23,291 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752284_11460 
replica FinalizedReplica, blk_1073752284_11460, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752284 for deletion 2025-07-17 01:39:23,292 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752284_11460 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752284 2025-07-17 01:40:19,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752285_11461 src: /192.168.158.5:41900 dest: /192.168.158.4:9866 2025-07-17 01:40:19,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41900, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1461018253_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752285_11461, duration(ns): 16778932 2025-07-17 01:40:19,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752285_11461, type=LAST_IN_PIPELINE terminating 2025-07-17 01:40:23,295 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752285_11461 replica FinalizedReplica, blk_1073752285_11461, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752285 for deletion 2025-07-17 01:40:23,296 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752285_11461 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752285 2025-07-17 01:41:24,540 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752286_11462 src: /192.168.158.1:57032 dest: /192.168.158.4:9866 2025-07-17 01:41:24,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57032, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-598924844_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752286_11462, duration(ns): 23252565 2025-07-17 01:41:24,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752286_11462, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-17 01:41:32,298 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752286_11462 replica FinalizedReplica, blk_1073752286_11462, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752286 for deletion 2025-07-17 01:41:32,299 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752286_11462 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752286 2025-07-17 01:44:29,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752289_11465 src: /192.168.158.6:56444 dest: /192.168.158.4:9866 2025-07-17 01:44:29,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.6:56444, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_621784396_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752289_11465, duration(ns): 21771590 2025-07-17 01:44:29,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752289_11465, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-17 01:44:35,306 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752289_11465 replica FinalizedReplica, blk_1073752289_11465, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752289 for deletion 2025-07-17 01:44:35,307 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752289_11465 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752289 2025-07-17 01:47:39,547 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752292_11468 src: /192.168.158.1:49332 dest: /192.168.158.4:9866 2025-07-17 01:47:39,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49332, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1729054248_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752292_11468, duration(ns): 25392025 2025-07-17 01:47:39,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752292_11468, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-17 01:47:47,312 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752292_11468 replica FinalizedReplica, blk_1073752292_11468, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752292 for deletion 2025-07-17 01:47:47,314 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752292_11468 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752292 2025-07-17 01:48:39,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752293_11469 src: /192.168.158.9:35450 dest: /192.168.158.4:9866 2025-07-17 01:48:39,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35450, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1857456200_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752293_11469, duration(ns): 17350431 2025-07-17 01:48:39,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752293_11469, type=LAST_IN_PIPELINE terminating 2025-07-17 01:48:44,314 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752293_11469 replica FinalizedReplica, blk_1073752293_11469, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752293 for deletion 2025-07-17 01:48:44,315 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752293_11469 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752293 2025-07-17 01:49:44,547 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752294_11470 src: /192.168.158.1:37380 dest: /192.168.158.4:9866 2025-07-17 01:49:44,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37380, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1258716555_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752294_11470, duration(ns): 22519801 2025-07-17 01:49:44,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752294_11470, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-17 01:49:50,317 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752294_11470 replica FinalizedReplica, blk_1073752294_11470, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752294 for deletion 2025-07-17 01:49:50,318 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752294_11470 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752294 2025-07-17 01:50:44,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752295_11471 src: /192.168.158.5:52322 dest: /192.168.158.4:9866 2025-07-17 01:50:44,576 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52322, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1708446585_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752295_11471, duration(ns): 21342866 2025-07-17 01:50:44,576 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752295_11471, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-17 01:50:50,317 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752295_11471 replica FinalizedReplica, blk_1073752295_11471, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752295 for deletion 2025-07-17 01:50:50,319 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752295_11471 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752295 2025-07-17 01:53:44,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752298_11474 src: /192.168.158.8:40688 dest: /192.168.158.4:9866 2025-07-17 01:53:44,611 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40688, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_924048700_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752298_11474, duration(ns): 20431472 2025-07-17 01:53:44,611 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752298_11474, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-17 01:53:47,323 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752298_11474 replica FinalizedReplica, blk_1073752298_11474, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752298 for deletion 2025-07-17 01:53:47,324 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752298_11474 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752298 2025-07-17 01:55:44,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752300_11476 src: /192.168.158.5:49202 dest: /192.168.158.4:9866 2025-07-17 01:55:44,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49202, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-590523311_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752300_11476, duration(ns): 24340630 2025-07-17 01:55:44,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752300_11476, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-17 01:55:47,327 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752300_11476 replica FinalizedReplica, blk_1073752300_11476, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752300 for deletion 2025-07-17 01:55:47,329 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752300_11476 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752300 2025-07-17 01:57:49,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752302_11478 src: /192.168.158.8:43504 dest: /192.168.158.4:9866 2025-07-17 01:57:49,628 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43504, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_751287772_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752302_11478, duration(ns): 21842757 2025-07-17 01:57:49,628 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752302_11478, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-17 01:57:53,332 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752302_11478 replica FinalizedReplica, blk_1073752302_11478, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752302 for deletion 2025-07-17 01:57:53,334 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752302_11478 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752302 2025-07-17 02:00:49,617 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752305_11481 src: /192.168.158.1:44616 dest: /192.168.158.4:9866 2025-07-17 02:00:49,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44616, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1428704888_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752305_11481, duration(ns): 24953852 2025-07-17 02:00:49,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752305_11481, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-17 02:00:53,342 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752305_11481 replica FinalizedReplica, blk_1073752305_11481, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752305 for deletion 2025-07-17 02:00:53,344 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752305_11481 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752305 2025-07-17 02:05:49,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752310_11486 src: /192.168.158.9:47152 dest: /192.168.158.4:9866 2025-07-17 02:05:49,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47152, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1487722695_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752310_11486, duration(ns): 17217291 2025-07-17 02:05:49,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752310_11486, type=LAST_IN_PIPELINE terminating 2025-07-17 02:05:53,354 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752310_11486 replica FinalizedReplica, blk_1073752310_11486, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752310 for deletion 2025-07-17 02:05:53,355 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752310_11486 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752310 2025-07-17 02:07:49,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752312_11488 src: /192.168.158.1:52514 dest: /192.168.158.4:9866 2025-07-17 02:07:49,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52514, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2010320905_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752312_11488, duration(ns): 24270184 2025-07-17 02:07:49,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752312_11488, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-17 02:07:53,354 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752312_11488 replica FinalizedReplica, blk_1073752312_11488, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752312 for deletion 2025-07-17 02:07:53,355 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752312_11488 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752312 2025-07-17 02:08:49,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752313_11489 src: /192.168.158.6:45870 dest: /192.168.158.4:9866 2025-07-17 02:08:49,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45870, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-540894390_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752313_11489, duration(ns): 18550782 2025-07-17 02:08:49,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752313_11489, type=LAST_IN_PIPELINE terminating 2025-07-17 
02:08:53,355 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752313_11489 replica FinalizedReplica, blk_1073752313_11489, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752313 for deletion 2025-07-17 02:08:53,357 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752313_11489 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752313 2025-07-17 02:12:49,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752317_11493 src: /192.168.158.5:56526 dest: /192.168.158.4:9866 2025-07-17 02:12:49,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56526, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_100478259_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752317_11493, duration(ns): 15393937 2025-07-17 02:12:49,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752317_11493, type=LAST_IN_PIPELINE terminating 2025-07-17 02:12:56,366 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752317_11493 replica FinalizedReplica, blk_1073752317_11493, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752317 for deletion 2025-07-17 02:12:56,367 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752317_11493 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752317 2025-07-17 02:13:49,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752318_11494 src: /192.168.158.1:54654 dest: /192.168.158.4:9866 2025-07-17 02:13:49,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54654, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2101430737_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752318_11494, duration(ns): 24150062 2025-07-17 02:13:49,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752318_11494, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-17 02:13:53,368 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752318_11494 replica FinalizedReplica, blk_1073752318_11494, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752318 for deletion 2025-07-17 02:13:53,369 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752318_11494 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752318 2025-07-17 02:14:49,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073752319_11495 src: /192.168.158.7:54198 dest: /192.168.158.4:9866 2025-07-17 02:14:49,660 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54198, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_475833561_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752319_11495, duration(ns): 19300006 2025-07-17 02:14:49,660 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752319_11495, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-17 02:14:53,369 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752319_11495 replica FinalizedReplica, blk_1073752319_11495, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752319 for deletion 2025-07-17 02:14:53,371 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752319_11495 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir8/blk_1073752319 2025-07-17 02:15:49,638 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752320_11496 src: /192.168.158.8:43414 dest: /192.168.158.4:9866 2025-07-17 02:15:49,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43414, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-799174316_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073752320_11496, duration(ns): 22846498 2025-07-17 02:15:49,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752320_11496, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-17 02:15:56,372 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752320_11496 replica FinalizedReplica, blk_1073752320_11496, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752320 for deletion 2025-07-17 02:15:56,373 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752320_11496 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752320 2025-07-17 02:16:49,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752321_11497 src: /192.168.158.7:47194 dest: /192.168.158.4:9866 2025-07-17 02:16:49,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47194, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-120542073_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752321_11497, duration(ns): 18438151 2025-07-17 02:16:49,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752321_11497, type=LAST_IN_PIPELINE terminating 2025-07-17 02:16:53,375 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752321_11497 replica 
FinalizedReplica, blk_1073752321_11497, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752321 for deletion 2025-07-17 02:16:53,377 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752321_11497 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752321 2025-07-17 02:21:49,657 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752326_11502 src: /192.168.158.5:37722 dest: /192.168.158.4:9866 2025-07-17 02:21:49,675 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37722, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1293299369_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752326_11502, duration(ns): 16062899 2025-07-17 02:21:49,675 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752326_11502, type=LAST_IN_PIPELINE terminating 2025-07-17 02:21:53,390 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752326_11502 replica FinalizedReplica, blk_1073752326_11502, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752326 for deletion 2025-07-17 02:21:53,391 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752326_11502 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752326 2025-07-17 02:23:49,690 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752328_11504 src: /192.168.158.9:37516 dest: /192.168.158.4:9866 2025-07-17 02:23:49,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37516, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1466064453_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752328_11504, duration(ns): 16961549 2025-07-17 02:23:49,710 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752328_11504, type=LAST_IN_PIPELINE terminating 2025-07-17 02:23:53,395 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752328_11504 replica FinalizedReplica, blk_1073752328_11504, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752328 for deletion 2025-07-17 02:23:53,396 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752328_11504 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752328 2025-07-17 02:25:49,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752330_11506 src: /192.168.158.1:58198 dest: /192.168.158.4:9866 2025-07-17 02:25:49,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58198, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1859419569_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752330_11506, duration(ns): 26302613 2025-07-17 02:25:49,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752330_11506, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-17 02:25:56,398 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752330_11506 replica FinalizedReplica, blk_1073752330_11506, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752330 for deletion 2025-07-17 02:25:56,399 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752330_11506 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752330 2025-07-17 02:27:54,686 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752332_11508 src: /192.168.158.1:33920 dest: /192.168.158.4:9866 2025-07-17 02:27:54,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33920, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_764356942_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752332_11508, duration(ns): 30297287 2025-07-17 02:27:54,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752332_11508, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-17 02:28:02,404 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752332_11508 replica FinalizedReplica, blk_1073752332_11508, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752332 for deletion 2025-07-17 02:28:02,405 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752332_11508 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752332 2025-07-17 02:28:54,690 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752333_11509 src: /192.168.158.1:51534 dest: /192.168.158.4:9866 2025-07-17 02:28:54,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51534, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-63480346_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752333_11509, duration(ns): 23884409 2025-07-17 02:28:54,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752333_11509, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-17 02:28:59,407 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752333_11509 replica FinalizedReplica, blk_1073752333_11509, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752333 for deletion 2025-07-17 02:28:59,409 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752333_11509 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752333 2025-07-17 02:30:54,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752335_11511 src: /192.168.158.8:46906 dest: /192.168.158.4:9866 2025-07-17 02:30:54,728 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46906, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2023313372_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752335_11511, duration(ns): 21238491 2025-07-17 02:30:54,728 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752335_11511, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-17 02:30:59,412 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752335_11511 replica FinalizedReplica, blk_1073752335_11511, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752335 for deletion 2025-07-17 02:30:59,413 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752335_11511 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752335 
2025-07-17 02:32:59,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752337_11513 src: /192.168.158.9:41826 dest: /192.168.158.4:9866
2025-07-17 02:32:59,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41826, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1342169014_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752337_11513, duration(ns): 17732303
2025-07-17 02:32:59,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752337_11513, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 02:33:02,415 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752337_11513 replica FinalizedReplica, blk_1073752337_11513, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752337 for deletion
2025-07-17 02:33:02,416 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752337_11513 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752337
2025-07-17 02:33:59,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752338_11514 src: /192.168.158.9:57620 dest: /192.168.158.4:9866
2025-07-17 02:33:59,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57620, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1350995785_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752338_11514, duration(ns): 16996069
2025-07-17 02:33:59,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752338_11514, type=LAST_IN_PIPELINE terminating
2025-07-17 02:34:02,418 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752338_11514 replica FinalizedReplica, blk_1073752338_11514, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752338 for deletion
2025-07-17 02:34:02,419 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752338_11514 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752338
2025-07-17 02:34:59,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752339_11515 src: /192.168.158.1:60616 dest: /192.168.158.4:9866
2025-07-17 02:34:59,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60616, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_666517697_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752339_11515, duration(ns): 23062364
2025-07-17 02:34:59,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752339_11515, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-17 02:35:05,423 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752339_11515 replica FinalizedReplica, blk_1073752339_11515, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752339 for deletion
2025-07-17 02:35:05,424 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752339_11515 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752339
2025-07-17 02:35:59,705 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752340_11516 src: /192.168.158.1:42388 dest: /192.168.158.4:9866
2025-07-17 02:35:59,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42388, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_748130951_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752340_11516, duration(ns): 25057664
2025-07-17 02:35:59,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752340_11516, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-17 02:36:02,424 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752340_11516 replica FinalizedReplica, blk_1073752340_11516, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752340 for deletion
2025-07-17 02:36:02,425 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752340_11516 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752340
2025-07-17 02:37:59,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752342_11518 src: /192.168.158.1:58452 dest: /192.168.158.4:9866
2025-07-17 02:37:59,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58452, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1761690512_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752342_11518, duration(ns): 21605600
2025-07-17 02:37:59,755 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752342_11518, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-17 02:38:05,428 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752342_11518 replica FinalizedReplica, blk_1073752342_11518, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752342 for deletion
2025-07-17 02:38:05,429 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752342_11518 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752342
2025-07-17 02:41:04,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752345_11521 src: /192.168.158.6:53488 dest: /192.168.158.4:9866
2025-07-17 02:41:04,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53488, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1862910869_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752345_11521, duration(ns): 21656715
2025-07-17 02:41:04,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752345_11521, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 02:41:11,435 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752345_11521 replica FinalizedReplica, blk_1073752345_11521, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752345 for deletion
2025-07-17 02:41:11,436 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752345_11521 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752345
2025-07-17 02:42:04,720 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752346_11522 src: /192.168.158.1:48138 dest: /192.168.158.4:9866
2025-07-17 02:42:04,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48138, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1584091842_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752346_11522, duration(ns): 23600129
2025-07-17 02:42:04,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752346_11522, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-17 02:42:11,438 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752346_11522 replica FinalizedReplica, blk_1073752346_11522, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752346 for deletion
2025-07-17 02:42:11,439 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752346_11522 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752346
2025-07-17 02:43:09,716 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752347_11523 src: /192.168.158.8:38900 dest: /192.168.158.4:9866
2025-07-17 02:43:09,739 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38900, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1964311044_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752347_11523, duration(ns): 17037497
2025-07-17 02:43:09,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752347_11523, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 02:43:14,439 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752347_11523 replica FinalizedReplica, blk_1073752347_11523, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752347 for deletion
2025-07-17 02:43:14,440 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752347_11523 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752347
2025-07-17 02:47:09,734 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752351_11527 src: /192.168.158.7:40154 dest: /192.168.158.4:9866
2025-07-17 02:47:09,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40154, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1295013779_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752351_11527, duration(ns): 22130009
2025-07-17 02:47:09,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752351_11527, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 02:47:14,448 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752351_11527 replica FinalizedReplica, blk_1073752351_11527, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752351 for deletion
2025-07-17 02:47:14,449 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752351_11527 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752351
2025-07-17 02:48:09,734 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752352_11528 src: /192.168.158.6:49714 dest: /192.168.158.4:9866
2025-07-17 02:48:09,760 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49714, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_51645600_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752352_11528, duration(ns): 20544946
2025-07-17 02:48:09,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752352_11528, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-17 02:48:11,450 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752352_11528 replica FinalizedReplica, blk_1073752352_11528, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752352 for deletion
2025-07-17 02:48:11,452 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752352_11528 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752352
2025-07-17 02:49:09,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752353_11529 src: /192.168.158.1:34528 dest: /192.168.158.4:9866
2025-07-17 02:49:09,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34528, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-271277358_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752353_11529, duration(ns): 22246477
2025-07-17 02:49:09,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752353_11529, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-17 02:49:11,453 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752353_11529 replica FinalizedReplica, blk_1073752353_11529, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752353 for deletion
2025-07-17 02:49:11,454 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752353_11529 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752353
2025-07-17 02:50:09,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752354_11530 src: /192.168.158.6:46346 dest: /192.168.158.4:9866
2025-07-17 02:50:09,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46346, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1617419118_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752354_11530, duration(ns): 15033230
2025-07-17 02:50:09,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752354_11530, type=LAST_IN_PIPELINE terminating
2025-07-17 02:50:11,454 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752354_11530 replica FinalizedReplica, blk_1073752354_11530, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752354 for deletion
2025-07-17 02:50:11,456 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752354_11530 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752354
2025-07-17 02:51:09,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752355_11531 src: /192.168.158.9:43638 dest: /192.168.158.4:9866
2025-07-17 02:51:09,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43638, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-815510268_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752355_11531, duration(ns): 20112441
2025-07-17 02:51:09,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752355_11531, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-17 02:51:14,456 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752355_11531 replica FinalizedReplica, blk_1073752355_11531, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752355 for deletion
2025-07-17 02:51:14,458 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752355_11531 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752355
2025-07-17 02:53:09,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752357_11533 src: /192.168.158.8:54332 dest: /192.168.158.4:9866
2025-07-17 02:53:09,759 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54332, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_49896385_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752357_11533, duration(ns): 19315628
2025-07-17 02:53:09,759 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752357_11533, type=LAST_IN_PIPELINE terminating
2025-07-17 02:53:11,460 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752357_11533 replica FinalizedReplica, blk_1073752357_11533, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752357 for deletion
2025-07-17 02:53:11,461 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752357_11533 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752357
2025-07-17 02:55:09,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752359_11535 src: /192.168.158.1:43660 dest: /192.168.158.4:9866
2025-07-17 02:55:09,772 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43660, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-827102520_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752359_11535, duration(ns): 25487760
2025-07-17 02:55:09,772 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752359_11535, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-17 02:55:11,465 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752359_11535 replica FinalizedReplica, blk_1073752359_11535, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752359 for deletion
2025-07-17 02:55:11,466 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752359_11535 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752359
2025-07-17 02:56:09,764 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752360_11536 src: /192.168.158.5:58048 dest: /192.168.158.4:9866
2025-07-17 02:56:09,790 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58048, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1670486850_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752360_11536, duration(ns): 20252456
2025-07-17 02:56:09,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752360_11536, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-17 02:56:11,469 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752360_11536 replica FinalizedReplica, blk_1073752360_11536, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752360 for deletion
2025-07-17 02:56:11,470 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752360_11536 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752360
2025-07-17 02:57:09,733 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752361_11537 src: /192.168.158.1:56278 dest: /192.168.158.4:9866
2025-07-17 02:57:09,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56278, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_440599881_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752361_11537, duration(ns): 23431022
2025-07-17 02:57:09,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752361_11537, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-17 02:57:11,469 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752361_11537 replica FinalizedReplica, blk_1073752361_11537, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752361 for deletion
2025-07-17 02:57:11,470 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752361_11537 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752361
2025-07-17 02:59:14,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752363_11539 src: /192.168.158.6:33268 dest: /192.168.158.4:9866
2025-07-17 02:59:14,781 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33268, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1391738616_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752363_11539, duration(ns): 22654435
2025-07-17 02:59:14,782 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752363_11539, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 02:59:17,474 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752363_11539 replica FinalizedReplica, blk_1073752363_11539, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752363 for deletion
2025-07-17 02:59:17,476 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752363_11539 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752363
2025-07-17 03:02:14,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752366_11542 src: /192.168.158.1:52352 dest: /192.168.158.4:9866
2025-07-17 03:02:14,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52352, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1015672296_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752366_11542, duration(ns): 24239835
2025-07-17 03:02:14,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752366_11542, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-17 03:02:17,484 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752366_11542 replica FinalizedReplica, blk_1073752366_11542, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752366 for deletion
2025-07-17 03:02:17,485 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752366_11542 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752366
2025-07-17 03:04:19,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752368_11544 src: /192.168.158.1:41548 dest: /192.168.158.4:9866
2025-07-17 03:04:19,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41548, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1680268191_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752368_11544, duration(ns): 22150770
2025-07-17 03:04:19,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752368_11544, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-17 03:04:26,488 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752368_11544 replica FinalizedReplica, blk_1073752368_11544, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752368 for deletion
2025-07-17 03:04:26,489 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752368_11544 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752368
2025-07-17 03:08:19,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752372_11548 src: /192.168.158.1:39480 dest: /192.168.158.4:9866
2025-07-17 03:08:19,790 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39480, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-681327940_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752372_11548, duration(ns): 23331244
2025-07-17 03:08:19,790 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752372_11548, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-17 03:08:23,500 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752372_11548 replica FinalizedReplica, blk_1073752372_11548, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752372 for deletion
2025-07-17 03:08:23,501 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752372_11548 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752372
2025-07-17 03:10:19,764 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752374_11550 src: /192.168.158.6:53698 dest: /192.168.158.4:9866
2025-07-17 03:10:19,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53698, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_381271172_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752374_11550, duration(ns): 16455606
2025-07-17 03:10:19,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752374_11550, type=LAST_IN_PIPELINE terminating
2025-07-17 03:10:23,504 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752374_11550 replica FinalizedReplica, blk_1073752374_11550, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752374 for deletion
2025-07-17 03:10:23,505 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752374_11550 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752374
2025-07-17 03:15:19,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752379_11555 src: /192.168.158.6:37492 dest: /192.168.158.4:9866
2025-07-17 03:15:19,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37492, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1476190400_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752379_11555, duration(ns): 20005644
2025-07-17 03:15:19,793 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752379_11555, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-17 03:15:26,518 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752379_11555 replica FinalizedReplica, blk_1073752379_11555, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752379 for deletion
2025-07-17 03:15:26,520 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752379_11555 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752379
2025-07-17 03:16:19,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752380_11556 src: /192.168.158.5:34792 dest: /192.168.158.4:9866
2025-07-17 03:16:19,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34792, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1779277955_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752380_11556, duration(ns): 20738486
2025-07-17 03:16:19,793 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752380_11556, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-17 03:16:23,520 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752380_11556 replica FinalizedReplica, blk_1073752380_11556, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752380 for deletion
2025-07-17 03:16:23,521 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752380_11556 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752380
2025-07-17 03:20:19,774 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752384_11560 src: /192.168.158.1:42412 dest: /192.168.158.4:9866
2025-07-17 03:20:19,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42412, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1931188196_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752384_11560, duration(ns): 23921997
2025-07-17 03:20:19,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752384_11560, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-17 03:20:26,531 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752384_11560 replica FinalizedReplica, blk_1073752384_11560, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752384 for deletion
2025-07-17 03:20:26,532 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752384_11560 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752384
2025-07-17 03:21:24,782 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752385_11561 src: /192.168.158.8:45052 dest: /192.168.158.4:9866
2025-07-17 03:21:24,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45052, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2133286602_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752385_11561, duration(ns): 21197805
2025-07-17 03:21:24,810 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752385_11561, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-17 03:21:26,532 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752385_11561 replica FinalizedReplica, blk_1073752385_11561, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752385 for deletion
2025-07-17 03:21:26,534 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752385_11561 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752385
2025-07-17 03:24:34,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752388_11564 src: /192.168.158.9:51042 dest: /192.168.158.4:9866
2025-07-17 03:24:34,810 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51042, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-436433657_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752388_11564, duration(ns): 19220907
2025-07-17 03:24:34,810 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752388_11564, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 03:24:41,543 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752388_11564 replica FinalizedReplica, blk_1073752388_11564, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752388 for deletion
2025-07-17 03:24:41,544 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752388_11564 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752388
2025-07-17 03:25:34,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752389_11565 src: /192.168.158.6:53628 dest: /192.168.158.4:9866
2025-07-17 03:25:34,820 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53628, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2060849018_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752389_11565, duration(ns): 22497347
2025-07-17 03:25:34,820 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752389_11565, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-17 03:25:38,545 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752389_11565 replica FinalizedReplica, blk_1073752389_11565, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752389 for deletion
2025-07-17 03:25:38,546 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752389_11565 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752389
2025-07-17 03:26:34,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752390_11566 src: /192.168.158.9:46132 dest: /192.168.158.4:9866
2025-07-17 03:26:34,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46132, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2081010273_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752390_11566, duration(ns): 21335568
2025-07-17 03:26:34,820 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752390_11566, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-17 03:26:38,546 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752390_11566 replica FinalizedReplica, blk_1073752390_11566, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752390 for deletion
2025-07-17 03:26:38,548 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752390_11566 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752390
2025-07-17 03:28:34,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752392_11568 src: /192.168.158.1:56558 dest: /192.168.158.4:9866
2025-07-17 03:28:34,821 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56558, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_370529735_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752392_11568, duration(ns): 24338344
2025-07-17 03:28:34,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752392_11568, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-17 03:28:41,550 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752392_11568 replica FinalizedReplica, blk_1073752392_11568, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752392 for deletion
2025-07-17 03:28:41,551 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752392_11568 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752392
2025-07-17 03:29:34,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752393_11569 src: /192.168.158.7:50046 dest: /192.168.158.4:9866
2025-07-17 03:29:34,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50046, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1465512890_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752393_11569, duration(ns): 21559759
2025-07-17 03:29:34,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752393_11569, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 03:29:41,553 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752393_11569 replica FinalizedReplica, blk_1073752393_11569, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752393 for deletion
2025-07-17 03:29:41,554 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752393_11569 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752393
2025-07-17 03:32:34,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752396_11572 src: /192.168.158.8:39882 dest: /192.168.158.4:9866
2025-07-17 03:32:34,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39882, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-359024895_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752396_11572, duration(ns): 17390795
2025-07-17 03:32:34,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752396_11572, type=LAST_IN_PIPELINE terminating
2025-07-17 03:32:41,556 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752396_11572 replica FinalizedReplica, blk_1073752396_11572, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752396 for deletion
2025-07-17 03:32:41,557 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752396_11572 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752396
2025-07-17 03:33:34,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752397_11573 src: /192.168.158.6:38152 dest: /192.168.158.4:9866
2025-07-17 03:33:34,817 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38152, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-304819651_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752397_11573, duration(ns): 17844607
2025-07-17 03:33:34,817 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752397_11573, type=LAST_IN_PIPELINE terminating
2025-07-17 03:33:38,558 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752397_11573 replica FinalizedReplica, blk_1073752397_11573, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752397 for deletion
2025-07-17 03:33:38,559 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752397_11573 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752397
2025-07-17 03:34:39,800 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752398_11574 src: /192.168.158.9:50872 dest: /192.168.158.4:9866
2025-07-17 03:34:39,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50872, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1911331157_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752398_11574, duration(ns): 19019666
2025-07-17 03:34:39,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752398_11574, type=LAST_IN_PIPELINE terminating
2025-07-17 03:34:41,560 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752398_11574 replica FinalizedReplica, blk_1073752398_11574, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752398 for deletion
2025-07-17 03:34:41,561 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752398_11574 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752398
2025-07-17 03:36:44,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752400_11576 src: /192.168.158.1:49938 dest: /192.168.158.4:9866
2025-07-17 03:36:44,863 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49938, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2131778631_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752400_11576, duration(ns): 23734047
2025-07-17 03:36:44,863 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752400_11576, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-17 03:36:50,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752400_11576 replica FinalizedReplica, blk_1073752400_11576, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752400 for deletion
2025-07-17 03:36:50,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752400_11576 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752400
2025-07-17 03:38:49,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752402_11578 src: /192.168.158.8:44900 dest: /192.168.158.4:9866
2025-07-17 03:38:49,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44900, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1621086547_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752402_11578, duration(ns): 20997423
2025-07-17 03:38:49,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752402_11578, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 03:38:53,564 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752402_11578 replica FinalizedReplica, blk_1073752402_11578, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752402 for deletion
2025-07-17 03:38:53,565 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752402_11578 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752402
2025-07-17 03:39:49,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752403_11579 src: /192.168.158.1:50458 dest: /192.168.158.4:9866
2025-07-17 03:39:49,878 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50458, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_667926362_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752403_11579, duration(ns): 26173984
2025-07-17 03:39:49,878 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752403_11579, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-17 03:39:53,569 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752403_11579 replica FinalizedReplica, blk_1073752403_11579, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752403 for deletion
2025-07-17 03:39:53,570 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752403_11579 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752403
2025-07-17 03:40:49,812 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752404_11580 src: /192.168.158.1:38520 dest: /192.168.158.4:9866
2025-07-17 03:40:49,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38520, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1461769630_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752404_11580, duration(ns): 26808791
2025-07-17 03:40:49,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752404_11580, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-17 03:40:53,573 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752404_11580 replica FinalizedReplica, blk_1073752404_11580, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752404 for deletion
2025-07-17 03:40:53,575 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752404_11580 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752404
2025-07-17 03:41:49,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752405_11581 src: /192.168.158.9:57112 dest: /192.168.158.4:9866
2025-07-17 03:41:49,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57112, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1180399948_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752405_11581, duration(ns): 21662990
2025-07-17 03:41:49,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752405_11581, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 03:41:53,575 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752405_11581 replica FinalizedReplica, blk_1073752405_11581, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752405 for deletion
2025-07-17 03:41:53,577 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752405_11581 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752405
2025-07-17 03:43:49,824 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752407_11583 src: /192.168.158.1:54268 dest: /192.168.158.4:9866
2025-07-17 03:43:49,859 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54268, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1299830219_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752407_11583, duration(ns): 24817565
2025-07-17 03:43:49,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752407_11583, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-17 03:43:53,579 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752407_11583 replica FinalizedReplica, blk_1073752407_11583, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752407 for deletion
2025-07-17 03:43:53,581 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752407_11583 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752407
2025-07-17 03:44:49,823 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752408_11584 src: /192.168.158.9:51964 dest: /192.168.158.4:9866
2025-07-17 03:44:49,851 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51964, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-824647028_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752408_11584, duration(ns): 21436660
2025-07-17 03:44:49,851 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752408_11584, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 03:44:53,581 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752408_11584 replica FinalizedReplica, blk_1073752408_11584, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752408 for deletion
2025-07-17 03:44:53,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752408_11584 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752408
2025-07-17 03:46:54,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752410_11586 src: /192.168.158.5:44474 dest: /192.168.158.4:9866
2025-07-17 03:46:54,852 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44474, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-620421989_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752410_11586, duration(ns): 20938140
2025-07-17 03:46:54,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752410_11586, type=LAST_IN_PIPELINE terminating
2025-07-17 03:46:59,585 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752410_11586 replica FinalizedReplica, blk_1073752410_11586, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752410 for deletion
2025-07-17 03:46:59,586 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752410_11586 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752410
2025-07-17 03:48:54,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752412_11588 src: /192.168.158.8:52904 dest: /192.168.158.4:9866
2025-07-17 03:48:54,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52904, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_143367061_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752412_11588, duration(ns): 21720382
2025-07-17 03:48:54,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752412_11588, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 03:48:59,591 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752412_11588 replica FinalizedReplica, blk_1073752412_11588, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752412 for deletion
2025-07-17 03:48:59,592 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752412_11588 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752412
2025-07-17 03:51:04,838 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752414_11590 src: /192.168.158.5:51796 dest: /192.168.158.4:9866 2025-07-17 03:51:04,857 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51796, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-115292571_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752414_11590, duration(ns): 16394394 2025-07-17 03:51:04,857 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752414_11590, type=LAST_IN_PIPELINE terminating 2025-07-17 03:51:08,595 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752414_11590 replica FinalizedReplica, blk_1073752414_11590, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752414 for deletion 2025-07-17 03:51:08,596 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752414_11590 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752414 2025-07-17 03:52:04,837 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752415_11591 src: /192.168.158.5:46794 dest: /192.168.158.4:9866 2025-07-17 03:52:04,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46794, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_102974418_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073752415_11591, duration(ns): 16553940 2025-07-17 03:52:04,856 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752415_11591, type=LAST_IN_PIPELINE terminating 2025-07-17 03:52:08,598 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752415_11591 replica FinalizedReplica, blk_1073752415_11591, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752415 for deletion 2025-07-17 03:52:08,599 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752415_11591 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752415 2025-07-17 03:53:09,824 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752416_11592 src: /192.168.158.7:52246 dest: /192.168.158.4:9866 2025-07-17 03:53:09,851 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52246, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1729905913_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752416_11592, duration(ns): 20799374 2025-07-17 03:53:09,851 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752416_11592, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-17 03:53:14,600 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752416_11592 replica 
FinalizedReplica, blk_1073752416_11592, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752416 for deletion 2025-07-17 03:53:14,601 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752416_11592 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752416 2025-07-17 03:54:09,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752417_11593 src: /192.168.158.1:36266 dest: /192.168.158.4:9866 2025-07-17 03:54:09,862 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36266, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-302171403_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752417_11593, duration(ns): 23324654 2025-07-17 03:54:09,862 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752417_11593, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-17 03:54:11,603 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752417_11593 replica FinalizedReplica, blk_1073752417_11593, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752417 for deletion 2025-07-17 03:54:11,604 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073752417_11593 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752417 2025-07-17 03:56:14,840 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752419_11595 src: /192.168.158.5:52114 dest: /192.168.158.4:9866 2025-07-17 03:56:14,858 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52114, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1962077524_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752419_11595, duration(ns): 16089492 2025-07-17 03:56:14,858 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752419_11595, type=LAST_IN_PIPELINE terminating 2025-07-17 03:56:17,608 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752419_11595 replica FinalizedReplica, blk_1073752419_11595, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752419 for deletion 2025-07-17 03:56:17,609 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752419_11595 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752419 2025-07-17 03:58:14,847 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752421_11597 src: /192.168.158.6:41272 dest: /192.168.158.4:9866 2025-07-17 03:58:14,868 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.6:41272, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1404957717_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752421_11597, duration(ns): 18837848 2025-07-17 03:58:14,868 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752421_11597, type=LAST_IN_PIPELINE terminating 2025-07-17 03:58:20,610 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752421_11597 replica FinalizedReplica, blk_1073752421_11597, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752421 for deletion 2025-07-17 03:58:20,611 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752421_11597 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752421 2025-07-17 03:59:14,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752422_11598 src: /192.168.158.6:36394 dest: /192.168.158.4:9866 2025-07-17 03:59:14,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36394, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_593719921_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752422_11598, duration(ns): 14833005 2025-07-17 03:59:14,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752422_11598, type=LAST_IN_PIPELINE terminating 2025-07-17 
03:59:17,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f44, containing 4 storage report(s), of which we sent 4. The reports had 9 total blocks and used 1 RPC(s). This took 0 msec to generate and 5 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5. 2025-07-17 03:59:17,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360 2025-07-17 03:59:20,613 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752422_11598 replica FinalizedReplica, blk_1073752422_11598, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752422 for deletion 2025-07-17 03:59:20,614 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752422_11598 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752422 2025-07-17 04:00:14,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752423_11599 src: /192.168.158.8:49064 dest: /192.168.158.4:9866 2025-07-17 04:00:14,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49064, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_702941095_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752423_11599, duration(ns): 16516987 2025-07-17 04:00:14,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752423_11599, type=LAST_IN_PIPELINE terminating 
2025-07-17 04:00:20,616 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752423_11599 replica FinalizedReplica, blk_1073752423_11599, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752423 for deletion 2025-07-17 04:00:20,617 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752423_11599 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752423 2025-07-17 04:01:14,852 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752424_11600 src: /192.168.158.9:40584 dest: /192.168.158.4:9866 2025-07-17 04:01:14,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40584, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_523928487_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752424_11600, duration(ns): 17065918 2025-07-17 04:01:14,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752424_11600, type=LAST_IN_PIPELINE terminating 2025-07-17 04:01:17,617 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752424_11600 replica FinalizedReplica, blk_1073752424_11600, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752424 for deletion 2025-07-17 04:01:17,618 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752424_11600 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752424 2025-07-17 04:03:19,848 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752426_11602 src: /192.168.158.1:55638 dest: /192.168.158.4:9866 2025-07-17 04:03:19,880 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55638, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2101433582_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752426_11602, duration(ns): 22728885 2025-07-17 04:03:19,880 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752426_11602, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-17 04:03:23,620 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752426_11602 replica FinalizedReplica, blk_1073752426_11602, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752426 for deletion 2025-07-17 04:03:23,621 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752426_11602 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752426 2025-07-17 04:06:19,856 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073752429_11605 src: /192.168.158.1:46804 dest: /192.168.158.4:9866 2025-07-17 04:06:19,891 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46804, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1391865054_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752429_11605, duration(ns): 24927426 2025-07-17 04:06:19,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752429_11605, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-17 04:06:23,624 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752429_11605 replica FinalizedReplica, blk_1073752429_11605, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752429 for deletion 2025-07-17 04:06:23,626 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752429_11605 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752429 2025-07-17 04:07:19,856 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752430_11606 src: /192.168.158.1:37398 dest: /192.168.158.4:9866 2025-07-17 04:07:19,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37398, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1365646515_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073752430_11606, duration(ns): 25678844 2025-07-17 04:07:19,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752430_11606, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-17 04:07:23,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752430_11606 replica FinalizedReplica, blk_1073752430_11606, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752430 for deletion 2025-07-17 04:07:23,626 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752430_11606 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752430 2025-07-17 04:08:19,851 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752431_11607 src: /192.168.158.1:42188 dest: /192.168.158.4:9866 2025-07-17 04:08:19,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42188, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2142866636_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752431_11607, duration(ns): 25547986 2025-07-17 04:08:19,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752431_11607, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-17 04:08:23,630 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752431_11607 replica FinalizedReplica, blk_1073752431_11607, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752431 for deletion 2025-07-17 04:08:23,631 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752431_11607 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752431 2025-07-17 04:10:19,878 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752433_11609 src: /192.168.158.9:39538 dest: /192.168.158.4:9866 2025-07-17 04:10:19,897 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39538, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-206296098_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752433_11609, duration(ns): 16391308 2025-07-17 04:10:19,897 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752433_11609, type=LAST_IN_PIPELINE terminating 2025-07-17 04:10:26,633 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752433_11609 replica FinalizedReplica, blk_1073752433_11609, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752433 for deletion 2025-07-17 04:10:26,634 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752433_11609 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752433 2025-07-17 04:12:24,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752435_11611 src: /192.168.158.6:36982 dest: /192.168.158.4:9866 2025-07-17 04:12:24,885 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36982, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_503961670_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752435_11611, duration(ns): 15904268 2025-07-17 04:12:24,885 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752435_11611, type=LAST_IN_PIPELINE terminating 2025-07-17 04:12:29,637 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752435_11611 replica FinalizedReplica, blk_1073752435_11611, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752435 for deletion 2025-07-17 04:12:29,638 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752435_11611 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752435 2025-07-17 04:15:29,867 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752438_11614 src: /192.168.158.5:59218 dest: /192.168.158.4:9866 
2025-07-17 04:15:29,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59218, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-295078468_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752438_11614, duration(ns): 19939371 2025-07-17 04:15:29,893 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752438_11614, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-17 04:15:32,648 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752438_11614 replica FinalizedReplica, blk_1073752438_11614, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752438 for deletion 2025-07-17 04:15:32,649 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752438_11614 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752438 2025-07-17 04:18:34,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752441_11617 src: /192.168.158.1:37484 dest: /192.168.158.4:9866 2025-07-17 04:18:34,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37484, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1073555183_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752441_11617, duration(ns): 23113044 2025-07-17 04:18:34,914 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752441_11617, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-17 04:18:38,652 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752441_11617 replica FinalizedReplica, blk_1073752441_11617, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752441 for deletion 2025-07-17 04:18:38,653 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752441_11617 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752441 2025-07-17 04:20:34,878 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752443_11619 src: /192.168.158.1:47946 dest: /192.168.158.4:9866 2025-07-17 04:20:34,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47946, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-987787784_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752443_11619, duration(ns): 23900737 2025-07-17 04:20:34,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752443_11619, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-17 04:20:38,654 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752443_11619 replica FinalizedReplica, blk_1073752443_11619, 
FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752443 for deletion 2025-07-17 04:20:38,655 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752443_11619 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752443 2025-07-17 04:23:34,928 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752446_11622 src: /192.168.158.7:33646 dest: /192.168.158.4:9866 2025-07-17 04:23:34,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33646, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_9239072_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752446_11622, duration(ns): 20554619 2025-07-17 04:23:34,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752446_11622, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-17 04:23:41,660 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752446_11622 replica FinalizedReplica, blk_1073752446_11622, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752446 for deletion 2025-07-17 04:23:41,661 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752446_11622 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752446
2025-07-17 04:24:34,887 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752447_11623 src: /192.168.158.1:44194 dest: /192.168.158.4:9866
2025-07-17 04:24:34,920 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44194, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-779288070_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752447_11623, duration(ns): 24546609
2025-07-17 04:24:34,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752447_11623, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-17 04:24:38,661 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752447_11623 replica FinalizedReplica, blk_1073752447_11623, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752447 for deletion
2025-07-17 04:24:38,662 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752447_11623 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752447
2025-07-17 04:25:34,887 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752448_11624 src: /192.168.158.9:56952 dest: /192.168.158.4:9866
2025-07-17 04:25:34,907 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56952, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1323501528_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752448_11624, duration(ns): 18200664
2025-07-17 04:25:34,908 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752448_11624, type=LAST_IN_PIPELINE terminating
2025-07-17 04:25:38,662 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752448_11624 replica FinalizedReplica, blk_1073752448_11624, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752448 for deletion
2025-07-17 04:25:38,665 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752448_11624 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752448
2025-07-17 04:27:34,905 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752450_11626 src: /192.168.158.7:42454 dest: /192.168.158.4:9866
2025-07-17 04:27:34,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42454, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-812238125_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752450_11626, duration(ns): 20527300
2025-07-17 04:27:34,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752450_11626, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 04:27:41,668 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752450_11626 replica FinalizedReplica, blk_1073752450_11626, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752450 for deletion
2025-07-17 04:27:41,669 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752450_11626 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752450
2025-07-17 04:28:34,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752451_11627 src: /192.168.158.1:35718 dest: /192.168.158.4:9866
2025-07-17 04:28:34,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35718, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2060921752_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752451_11627, duration(ns): 26502962
2025-07-17 04:28:34,935 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752451_11627, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-17 04:28:38,668 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752451_11627 replica FinalizedReplica, blk_1073752451_11627, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752451 for deletion
2025-07-17 04:28:38,670 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752451_11627 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752451
2025-07-17 04:30:34,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752453_11629 src: /192.168.158.6:49678 dest: /192.168.158.4:9866
2025-07-17 04:30:34,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49678, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1826011997_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752453_11629, duration(ns): 20730874
2025-07-17 04:30:34,939 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752453_11629, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 04:30:38,674 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752453_11629 replica FinalizedReplica, blk_1073752453_11629, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752453 for deletion
2025-07-17 04:30:38,675 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752453_11629 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752453
2025-07-17 04:33:39,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752456_11632 src: /192.168.158.1:44870 dest: /192.168.158.4:9866
2025-07-17 04:33:39,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44870, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_509579813_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752456_11632, duration(ns): 24214078
2025-07-17 04:33:39,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752456_11632, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-17 04:33:44,686 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752456_11632 replica FinalizedReplica, blk_1073752456_11632, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752456 for deletion
2025-07-17 04:33:44,687 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752456_11632 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752456
2025-07-17 04:34:39,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752457_11633 src: /192.168.158.6:52606 dest: /192.168.158.4:9866
2025-07-17 04:34:39,928 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52606, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1211218825_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752457_11633, duration(ns): 21437861
2025-07-17 04:34:39,928 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752457_11633, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-17 04:34:41,690 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752457_11633 replica FinalizedReplica, blk_1073752457_11633, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752457 for deletion
2025-07-17 04:34:41,691 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752457_11633 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752457
2025-07-17 04:37:44,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752460_11636 src: /192.168.158.6:50778 dest: /192.168.158.4:9866
2025-07-17 04:37:44,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50778, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-954883066_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752460_11636, duration(ns): 16195349
2025-07-17 04:37:44,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752460_11636, type=LAST_IN_PIPELINE terminating
2025-07-17 04:37:47,702 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752460_11636 replica FinalizedReplica, blk_1073752460_11636, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752460 for deletion
2025-07-17 04:37:47,703 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752460_11636 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752460
2025-07-17 04:41:44,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752464_11640 src: /192.168.158.1:59868 dest: /192.168.158.4:9866
2025-07-17 04:41:44,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59868, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1207130743_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752464_11640, duration(ns): 23888615
2025-07-17 04:41:44,935 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752464_11640, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-17 04:41:50,713 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752464_11640 replica FinalizedReplica, blk_1073752464_11640, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752464 for deletion
2025-07-17 04:41:50,714 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752464_11640 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752464
2025-07-17 04:45:54,915 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752468_11644 src: /192.168.158.1:52762 dest: /192.168.158.4:9866
2025-07-17 04:45:54,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52762, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_300831229_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752468_11644, duration(ns): 23606000
2025-07-17 04:45:54,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752468_11644, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-17 04:45:56,721 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752468_11644 replica FinalizedReplica, blk_1073752468_11644, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752468 for deletion
2025-07-17 04:45:56,722 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752468_11644 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752468
2025-07-17 04:48:54,923 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752471_11647 src: /192.168.158.9:58346 dest: /192.168.158.4:9866
2025-07-17 04:48:54,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58346, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2130179219_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752471_11647, duration(ns): 18175587
2025-07-17 04:48:54,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752471_11647, type=LAST_IN_PIPELINE terminating
2025-07-17 04:48:59,726 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752471_11647 replica FinalizedReplica, blk_1073752471_11647, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752471 for deletion
2025-07-17 04:48:59,727 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752471_11647 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752471
2025-07-17 04:50:54,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752473_11649 src: /192.168.158.5:48578 dest: /192.168.158.4:9866
2025-07-17 04:50:54,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48578, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2123255223_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752473_11649, duration(ns): 20324010
2025-07-17 04:50:54,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752473_11649, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 04:50:56,732 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752473_11649 replica FinalizedReplica, blk_1073752473_11649, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752473 for deletion
2025-07-17 04:50:56,733 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752473_11649 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752473
2025-07-17 04:52:54,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752475_11651 src: /192.168.158.7:44960 dest: /192.168.158.4:9866
2025-07-17 04:52:54,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44960, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-923174224_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752475_11651, duration(ns): 20896425
2025-07-17 04:52:54,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752475_11651, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 04:52:59,739 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752475_11651 replica FinalizedReplica, blk_1073752475_11651, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752475 for deletion
2025-07-17 04:52:59,740 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752475_11651 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752475
2025-07-17 04:53:54,935 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752476_11652 src: /192.168.158.1:46314 dest: /192.168.158.4:9866
2025-07-17 04:53:54,969 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46314, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2121840072_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752476_11652, duration(ns): 24687756
2025-07-17 04:53:54,969 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752476_11652, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-17 04:53:56,740 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752476_11652 replica FinalizedReplica, blk_1073752476_11652, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752476 for deletion
2025-07-17 04:53:56,741 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752476_11652 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752476
2025-07-17 04:54:54,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752477_11653 src: /192.168.158.1:43064 dest: /192.168.158.4:9866
2025-07-17 04:54:54,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43064, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1884295118_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752477_11653, duration(ns): 22129921
2025-07-17 04:54:54,964 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752477_11653, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-17 04:54:59,742 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752477_11653 replica FinalizedReplica, blk_1073752477_11653, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752477 for deletion
2025-07-17 04:54:59,743 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752477_11653 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752477
2025-07-17 04:55:54,942 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752478_11654 src: /192.168.158.5:56706 dest: /192.168.158.4:9866
2025-07-17 04:55:54,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56706, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-635998031_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752478_11654, duration(ns): 17019864
2025-07-17 04:55:54,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752478_11654, type=LAST_IN_PIPELINE terminating
2025-07-17 04:55:56,745 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752478_11654 replica FinalizedReplica, blk_1073752478_11654, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752478 for deletion
2025-07-17 04:55:56,746 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752478_11654 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752478
2025-07-17 04:56:54,935 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752479_11655 src: /192.168.158.1:51074 dest: /192.168.158.4:9866
2025-07-17 04:56:54,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51074, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_218027422_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752479_11655, duration(ns): 22883242
2025-07-17 04:56:54,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752479_11655, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-17 04:56:59,747 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752479_11655 replica FinalizedReplica, blk_1073752479_11655, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752479 for deletion
2025-07-17 04:56:59,749 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752479_11655 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752479
2025-07-17 04:57:54,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752480_11656 src: /192.168.158.9:35616 dest: /192.168.158.4:9866
2025-07-17 04:57:54,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35616, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-43339365_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752480_11656, duration(ns): 15264105
2025-07-17 04:57:54,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752480_11656, type=LAST_IN_PIPELINE terminating
2025-07-17 04:57:56,749 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752480_11656 replica FinalizedReplica, blk_1073752480_11656, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752480 for deletion
2025-07-17 04:57:56,750 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752480_11656 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752480
2025-07-17 05:00:54,953 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752483_11659 src: /192.168.158.1:34632 dest: /192.168.158.4:9866
2025-07-17 05:00:54,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34632, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1656982950_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752483_11659, duration(ns): 22175921
2025-07-17 05:00:54,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752483_11659, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-17 05:00:56,756 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752483_11659 replica FinalizedReplica, blk_1073752483_11659, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752483 for deletion
2025-07-17 05:00:56,757 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752483_11659 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752483
2025-07-17 05:01:54,952 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752484_11660 src: /192.168.158.1:41076 dest: /192.168.158.4:9866
2025-07-17 05:01:54,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41076, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_865524642_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752484_11660, duration(ns): 23950193
2025-07-17 05:01:54,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752484_11660, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-17 05:01:59,759 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752484_11660 replica FinalizedReplica, blk_1073752484_11660, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752484 for deletion
2025-07-17 05:01:59,760 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752484_11660 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752484
2025-07-17 05:02:54,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752485_11661 src: /192.168.158.8:53964 dest: /192.168.158.4:9866
2025-07-17 05:02:54,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53964, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-561959560_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752485_11661, duration(ns): 18668559
2025-07-17 05:02:54,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752485_11661, type=LAST_IN_PIPELINE terminating
2025-07-17 05:02:56,762 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752485_11661 replica FinalizedReplica, blk_1073752485_11661, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752485 for deletion
2025-07-17 05:02:56,763 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752485_11661 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752485
2025-07-17 05:04:59,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752487_11663 src: /192.168.158.1:56734 dest: /192.168.158.4:9866
2025-07-17 05:04:59,996 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56734, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_261487730_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752487_11663, duration(ns): 25395273
2025-07-17 05:04:59,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752487_11663, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-17 05:05:02,766 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752487_11663 replica FinalizedReplica, blk_1073752487_11663, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752487 for deletion
2025-07-17 05:05:02,767 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752487_11663 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752487
2025-07-17 05:06:04,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752488_11664 src: /192.168.158.7:52120 dest: /192.168.158.4:9866
2025-07-17 05:06:05,000 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52120, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_29723726_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752488_11664, duration(ns): 21307514
2025-07-17 05:06:05,001 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752488_11664, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 05:06:11,769 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752488_11664 replica FinalizedReplica, blk_1073752488_11664, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752488 for deletion
2025-07-17 05:06:11,771 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752488_11664 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752488
2025-07-17 05:08:04,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752490_11666 src: /192.168.158.1:57226 dest: /192.168.158.4:9866
2025-07-17 05:08:04,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57226, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1766335618_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752490_11666, duration(ns): 24692679
2025-07-17 05:08:04,993 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752490_11666, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-17 05:08:08,777 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752490_11666 replica FinalizedReplica, blk_1073752490_11666, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752490 for deletion
2025-07-17 05:08:08,779 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752490_11666 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752490
2025-07-17 05:11:04,969 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752493_11669 src: /192.168.158.1:33194 dest: /192.168.158.4:9866
2025-07-17 05:11:05,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33194, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1654696901_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752493_11669, duration(ns): 23336789
2025-07-17 05:11:05,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752493_11669, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-17 05:11:08,785 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752493_11669 replica FinalizedReplica, blk_1073752493_11669, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752493 for deletion
2025-07-17 05:11:08,786 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752493_11669 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752493
2025-07-17 05:18:19,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752500_11676 src: /192.168.158.9:38336 dest: /192.168.158.4:9866
2025-07-17 05:18:20,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-67676849_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752500_11676, duration(ns): 16473891
2025-07-17 05:18:20,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752500_11676, type=LAST_IN_PIPELINE terminating
2025-07-17 05:18:26,801 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752500_11676 replica FinalizedReplica, blk_1073752500_11676, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752500 for deletion
2025-07-17 05:18:26,803 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752500_11676 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752500
2025-07-17 05:22:20,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752504_11680 src: /192.168.158.8:42448 dest: /192.168.158.4:9866
2025-07-17 05:22:20,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42448, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-907778814_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752504_11680, duration(ns): 21918079
2025-07-17 05:22:20,034 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752504_11680, type=HAS_DOWNSTREAM_IN_PIPELINE,
downstreams=1:[192.168.158.6:9866] terminating 2025-07-17 05:22:23,807 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752504_11680 replica FinalizedReplica, blk_1073752504_11680, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752504 for deletion 2025-07-17 05:22:23,808 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752504_11680 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752504 2025-07-17 05:23:19,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752505_11681 src: /192.168.158.8:36518 dest: /192.168.158.4:9866 2025-07-17 05:23:20,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36518, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_414782251_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752505_11681, duration(ns): 19233450 2025-07-17 05:23:20,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752505_11681, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-17 05:23:23,810 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752505_11681 replica FinalizedReplica, blk_1073752505_11681, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752505 for deletion 2025-07-17 05:23:23,811 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752505_11681 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752505 2025-07-17 05:28:20,000 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752510_11686 src: /192.168.158.7:53798 dest: /192.168.158.4:9866 2025-07-17 05:28:20,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53798, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1355995536_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752510_11686, duration(ns): 16559629 2025-07-17 05:28:20,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752510_11686, type=LAST_IN_PIPELINE terminating 2025-07-17 05:28:23,819 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752510_11686 replica FinalizedReplica, blk_1073752510_11686, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752510 for deletion 2025-07-17 05:28:23,820 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752510_11686 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752510 2025-07-17 05:32:29,999 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752514_11690 src: /192.168.158.9:54274 dest: /192.168.158.4:9866 2025-07-17 05:32:30,018 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54274, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1745552388_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752514_11690, duration(ns): 16406849 2025-07-17 05:32:30,018 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752514_11690, type=LAST_IN_PIPELINE terminating 2025-07-17 05:32:32,831 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752514_11690 replica FinalizedReplica, blk_1073752514_11690, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752514 for deletion 2025-07-17 05:32:32,832 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752514_11690 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752514 2025-07-17 05:33:30,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752515_11691 src: /192.168.158.8:49350 dest: /192.168.158.4:9866 2025-07-17 05:33:30,021 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49350, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1276472246_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073752515_11691, duration(ns): 17361080 2025-07-17 05:33:30,022 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752515_11691, type=LAST_IN_PIPELINE terminating 2025-07-17 05:33:32,832 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752515_11691 replica FinalizedReplica, blk_1073752515_11691, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752515 for deletion 2025-07-17 05:33:32,834 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752515_11691 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752515 2025-07-17 05:34:34,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752516_11692 src: /192.168.158.7:48672 dest: /192.168.158.4:9866 2025-07-17 05:34:35,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-81790281_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752516_11692, duration(ns): 24068099 2025-07-17 05:34:35,029 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752516_11692, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-17 05:34:41,835 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752516_11692 replica 
FinalizedReplica, blk_1073752516_11692, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752516 for deletion 2025-07-17 05:34:41,837 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752516_11692 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752516 2025-07-17 05:36:13,271 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0 2025-07-17 05:38:40,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752520_11696 src: /192.168.158.5:42880 dest: /192.168.158.4:9866 2025-07-17 05:38:40,029 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42880, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-525883485_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752520_11696, duration(ns): 15707624 2025-07-17 05:38:40,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752520_11696, type=LAST_IN_PIPELINE terminating 2025-07-17 05:38:44,845 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752520_11696 replica FinalizedReplica, blk_1073752520_11696, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752520 for deletion 2025-07-17 05:38:44,847 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752520_11696 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752520 2025-07-17 05:39:40,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752521_11697 src: /192.168.158.7:44404 dest: /192.168.158.4:9866 2025-07-17 05:39:40,032 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44404, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1038386961_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752521_11697, duration(ns): 19017425 2025-07-17 05:39:40,032 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752521_11697, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-17 05:39:41,849 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752521_11697 replica FinalizedReplica, blk_1073752521_11697, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752521 for deletion 2025-07-17 05:39:41,850 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752521_11697 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752521 
2025-07-17 05:40:40,021 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752522_11698 src: /192.168.158.6:40356 dest: /192.168.158.4:9866 2025-07-17 05:40:40,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40356, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_647499161_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752522_11698, duration(ns): 19972624 2025-07-17 05:40:40,047 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752522_11698, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-17 05:40:41,851 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752522_11698 replica FinalizedReplica, blk_1073752522_11698, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752522 for deletion 2025-07-17 05:40:41,852 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752522_11698 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752522 2025-07-17 05:41:40,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752523_11699 src: /192.168.158.1:56320 dest: /192.168.158.4:9866 2025-07-17 05:41:40,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56320, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_563625228_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752523_11699, duration(ns): 26836515 2025-07-17 05:41:40,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752523_11699, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-17 05:41:44,855 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752523_11699 replica FinalizedReplica, blk_1073752523_11699, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752523 for deletion 2025-07-17 05:41:44,856 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752523_11699 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752523 2025-07-17 05:43:40,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752525_11701 src: /192.168.158.9:52398 dest: /192.168.158.4:9866 2025-07-17 05:43:40,051 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52398, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1809450873_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752525_11701, duration(ns): 21630240 2025-07-17 05:43:40,051 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752525_11701, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-17 05:43:41,861 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752525_11701 replica FinalizedReplica, blk_1073752525_11701, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752525 for deletion 2025-07-17 05:43:41,862 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752525_11701 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752525 2025-07-17 05:44:40,022 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752526_11702 src: /192.168.158.1:34704 dest: /192.168.158.4:9866 2025-07-17 05:44:40,054 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34704, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-426206639_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752526_11702, duration(ns): 23165475 2025-07-17 05:44:40,054 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752526_11702, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-17 05:44:41,865 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752526_11702 replica FinalizedReplica, blk_1073752526_11702, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752526 for deletion 
2025-07-17 05:44:41,866 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752526_11702 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752526 2025-07-17 05:46:40,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752528_11704 src: /192.168.158.9:37144 dest: /192.168.158.4:9866 2025-07-17 05:46:40,054 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37144, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1079938206_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752528_11704, duration(ns): 19870068 2025-07-17 05:46:40,054 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752528_11704, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-17 05:46:41,867 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752528_11704 replica FinalizedReplica, blk_1073752528_11704, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752528 for deletion 2025-07-17 05:46:41,868 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752528_11704 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752528 2025-07-17 05:53:45,038 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073752535_11711 src: /192.168.158.1:39486 dest: /192.168.158.4:9866 2025-07-17 05:53:45,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39486, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-563117361_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752535_11711, duration(ns): 24370276 2025-07-17 05:53:45,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752535_11711, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-17 05:53:50,887 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752535_11711 replica FinalizedReplica, blk_1073752535_11711, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752535 for deletion 2025-07-17 05:53:50,888 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752535_11711 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752535 2025-07-17 05:54:45,040 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752536_11712 src: /192.168.158.5:39402 dest: /192.168.158.4:9866 2025-07-17 05:54:45,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39402, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1964977218_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073752536_11712, duration(ns): 21156259 2025-07-17 05:54:45,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752536_11712, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-17 05:54:47,890 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752536_11712 replica FinalizedReplica, blk_1073752536_11712, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752536 for deletion 2025-07-17 05:54:47,891 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752536_11712 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752536 2025-07-17 05:55:45,045 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752537_11713 src: /192.168.158.9:32998 dest: /192.168.158.4:9866 2025-07-17 05:55:45,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:32998, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-556645949_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752537_11713, duration(ns): 17104665 2025-07-17 05:55:45,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752537_11713, type=LAST_IN_PIPELINE terminating 2025-07-17 05:55:47,893 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752537_11713 replica 
FinalizedReplica, blk_1073752537_11713, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752537 for deletion 2025-07-17 05:55:47,894 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752537_11713 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752537 2025-07-17 05:56:50,038 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752538_11714 src: /192.168.158.1:54282 dest: /192.168.158.4:9866 2025-07-17 05:56:50,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54282, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_726405291_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752538_11714, duration(ns): 24799463 2025-07-17 05:56:50,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752538_11714, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-17 05:56:53,894 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752538_11714 replica FinalizedReplica, blk_1073752538_11714, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752538 for deletion 2025-07-17 05:56:53,895 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073752538_11714 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752538
2025-07-17 05:58:50,040 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752540_11716 src: /192.168.158.1:47960 dest: /192.168.158.4:9866
2025-07-17 05:58:50,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47960, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1228640891_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752540_11716, duration(ns): 25308399
2025-07-17 05:58:50,078 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752540_11716, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-17 05:58:56,899 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752540_11716 replica FinalizedReplica, blk_1073752540_11716, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752540 for deletion
2025-07-17 05:58:56,900 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752540_11716 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752540
2025-07-17 06:00:55,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752542_11718 src: /192.168.158.1:48538 dest: /192.168.158.4:9866
2025-07-17 06:00:55,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48538, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1792287140_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752542_11718, duration(ns): 23526362
2025-07-17 06:00:55,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752542_11718, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-17 06:00:56,904 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752542_11718 replica FinalizedReplica, blk_1073752542_11718, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752542 for deletion
2025-07-17 06:00:56,906 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752542_11718 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752542
2025-07-17 06:02:55,053 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752544_11720 src: /192.168.158.5:38888 dest: /192.168.158.4:9866
2025-07-17 06:02:55,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38888, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1159739695_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752544_11720, duration(ns): 16548305
2025-07-17 06:02:55,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752544_11720, type=LAST_IN_PIPELINE terminating
2025-07-17 06:02:59,910 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752544_11720 replica FinalizedReplica, blk_1073752544_11720, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752544 for deletion
2025-07-17 06:02:59,911 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752544_11720 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752544
2025-07-17 06:04:55,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752546_11722 src: /192.168.158.8:53356 dest: /192.168.158.4:9866
2025-07-17 06:04:55,082 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53356, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-371245800_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752546_11722, duration(ns): 17523940
2025-07-17 06:04:55,082 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752546_11722, type=LAST_IN_PIPELINE terminating
2025-07-17 06:04:56,913 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752546_11722 replica FinalizedReplica, blk_1073752546_11722, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752546 for deletion
2025-07-17 06:04:56,914 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752546_11722 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752546
2025-07-17 06:08:00,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752549_11725 src: /192.168.158.9:57544 dest: /192.168.158.4:9866
2025-07-17 06:08:00,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57544, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-993431322_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752549_11725, duration(ns): 19683417
2025-07-17 06:08:00,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752549_11725, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 06:08:05,921 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752549_11725 replica FinalizedReplica, blk_1073752549_11725, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752549 for deletion
2025-07-17 06:08:05,922 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752549_11725 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752549
2025-07-17 06:11:05,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752552_11728 src: /192.168.158.9:35200 dest: /192.168.158.4:9866
2025-07-17 06:11:05,098 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35200, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-99601109_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752552_11728, duration(ns): 21620897
2025-07-17 06:11:05,098 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752552_11728, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 06:11:11,930 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752552_11728 replica FinalizedReplica, blk_1073752552_11728, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752552 for deletion
2025-07-17 06:11:11,931 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752552_11728 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752552
2025-07-17 06:16:10,095 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752557_11733 src: /192.168.158.8:46212 dest: /192.168.158.4:9866
2025-07-17 06:16:10,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46212, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-998009582_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752557_11733, duration(ns): 17442909
2025-07-17 06:16:10,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752557_11733, type=LAST_IN_PIPELINE terminating
2025-07-17 06:16:14,941 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752557_11733 replica FinalizedReplica, blk_1073752557_11733, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752557 for deletion
2025-07-17 06:16:14,942 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752557_11733 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752557
2025-07-17 06:17:10,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752558_11734 src: /192.168.158.8:44218 dest: /192.168.158.4:9866
2025-07-17 06:17:10,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44218, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1186152822_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752558_11734, duration(ns): 17384630
2025-07-17 06:17:10,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752558_11734, type=LAST_IN_PIPELINE terminating
2025-07-17 06:17:11,943 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752558_11734 replica FinalizedReplica, blk_1073752558_11734, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752558 for deletion
2025-07-17 06:17:11,944 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752558_11734 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752558
2025-07-17 06:18:10,074 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752559_11735 src: /192.168.158.1:50140 dest: /192.168.158.4:9866
2025-07-17 06:18:10,112 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50140, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1734580303_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752559_11735, duration(ns): 28194186
2025-07-17 06:18:10,112 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752559_11735, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-17 06:18:11,946 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752559_11735 replica FinalizedReplica, blk_1073752559_11735, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752559 for deletion
2025-07-17 06:18:11,947 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752559_11735 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752559
2025-07-17 06:20:10,086 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752561_11737 src: /192.168.158.8:49608 dest: /192.168.158.4:9866
2025-07-17 06:20:10,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49608, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1689787603_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752561_11737, duration(ns): 16674979
2025-07-17 06:20:10,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752561_11737, type=LAST_IN_PIPELINE terminating
2025-07-17 06:20:11,948 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752561_11737 replica FinalizedReplica, blk_1073752561_11737, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752561 for deletion
2025-07-17 06:20:11,949 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752561_11737 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752561
2025-07-17 06:21:10,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752562_11738 src: /192.168.158.9:58286 dest: /192.168.158.4:9866
2025-07-17 06:21:10,121 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58286, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-337283804_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752562_11738, duration(ns): 22465102
2025-07-17 06:21:10,121 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752562_11738, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 06:21:11,948 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752562_11738 replica FinalizedReplica, blk_1073752562_11738, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752562 for deletion
2025-07-17 06:21:11,949 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752562_11738 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752562
2025-07-17 06:24:10,098 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752565_11741 src: /192.168.158.1:48652 dest: /192.168.158.4:9866
2025-07-17 06:24:10,136 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48652, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_466279472_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752565_11741, duration(ns): 29053262
2025-07-17 06:24:10,136 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752565_11741, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-17 06:24:11,956 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752565_11741 replica FinalizedReplica, blk_1073752565_11741, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752565 for deletion
2025-07-17 06:24:11,957 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752565_11741 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752565
2025-07-17 06:26:10,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752567_11743 src: /192.168.158.1:50108 dest: /192.168.158.4:9866
2025-07-17 06:26:10,143 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50108, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-143072856_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752567_11743, duration(ns): 26389623
2025-07-17 06:26:10,143 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752567_11743, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-17 06:26:14,960 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752567_11743 replica FinalizedReplica, blk_1073752567_11743, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752567 for deletion
2025-07-17 06:26:14,961 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752567_11743 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752567
2025-07-17 06:28:15,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752569_11745 src: /192.168.158.5:37566 dest: /192.168.158.4:9866
2025-07-17 06:28:15,134 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37566, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1503105584_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752569_11745, duration(ns): 23425379
2025-07-17 06:28:15,134 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752569_11745, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-17 06:28:17,965 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752569_11745 replica FinalizedReplica, blk_1073752569_11745, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752569 for deletion
2025-07-17 06:28:17,966 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752569_11745 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752569
2025-07-17 06:29:15,102 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752570_11746 src: /192.168.158.1:58958 dest: /192.168.158.4:9866
2025-07-17 06:29:15,139 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58958, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1101738045_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752570_11746, duration(ns): 27875669
2025-07-17 06:29:15,140 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752570_11746, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-17 06:29:20,968 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752570_11746 replica FinalizedReplica, blk_1073752570_11746, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752570 for deletion
2025-07-17 06:29:20,969 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752570_11746 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752570
2025-07-17 06:34:25,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752575_11751 src: /192.168.158.5:46334 dest: /192.168.158.4:9866
2025-07-17 06:34:25,141 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46334, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1435935173_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752575_11751, duration(ns): 21078408
2025-07-17 06:34:25,143 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752575_11751, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-17 06:34:26,980 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752575_11751 replica FinalizedReplica, blk_1073752575_11751, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752575 for deletion
2025-07-17 06:34:26,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752575_11751 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir9/blk_1073752575
2025-07-17 06:36:25,121 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752577_11753 src: /192.168.158.5:47894 dest: /192.168.158.4:9866
2025-07-17 06:36:25,140 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47894, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_869885288_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752577_11753, duration(ns): 16867334
2025-07-17 06:36:25,141 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752577_11753, type=LAST_IN_PIPELINE terminating
2025-07-17 06:36:26,984 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752577_11753 replica FinalizedReplica, blk_1073752577_11753, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752577 for deletion
2025-07-17 06:36:26,985 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752577_11753 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752577
2025-07-17 06:42:30,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752583_11759 src: /192.168.158.5:58698 dest: /192.168.158.4:9866
2025-07-17 06:42:30,172 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58698, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_91127580_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752583_11759, duration(ns): 18569029
2025-07-17 06:42:30,172 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752583_11759, type=LAST_IN_PIPELINE terminating
2025-07-17 06:42:32,998 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752583_11759 replica FinalizedReplica, blk_1073752583_11759, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752583 for deletion
2025-07-17 06:42:32,999 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752583_11759 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752583
2025-07-17 06:48:35,148 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752589_11765 src: /192.168.158.1:56256 dest: /192.168.158.4:9866
2025-07-17 06:48:35,183 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56256, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_841860334_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752589_11765, duration(ns): 23666175
2025-07-17 06:48:35,183 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752589_11765, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-17 06:48:39,006 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752589_11765 replica FinalizedReplica, blk_1073752589_11765, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752589 for deletion
2025-07-17 06:48:39,007 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752589_11765 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752589
2025-07-17 06:49:35,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752590_11766 src: /192.168.158.1:43338 dest: /192.168.158.4:9866
2025-07-17 06:49:35,196 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43338, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1106349496_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752590_11766, duration(ns): 26019502
2025-07-17 06:49:35,196 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752590_11766, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-17 06:49:39,009 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752590_11766 replica FinalizedReplica, blk_1073752590_11766, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752590 for deletion
2025-07-17 06:49:39,010 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752590_11766 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752590
2025-07-17 06:50:35,152 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752591_11767 src: /192.168.158.1:53718 dest: /192.168.158.4:9866
2025-07-17 06:50:35,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53718, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1194157392_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752591_11767, duration(ns): 25768999
2025-07-17 06:50:35,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752591_11767, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-17 06:50:39,012 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752591_11767 replica FinalizedReplica, blk_1073752591_11767, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752591 for deletion
2025-07-17 06:50:39,013 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752591_11767 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752591
2025-07-17 06:52:35,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752593_11769 src: /192.168.158.8:52980 dest: /192.168.158.4:9866
2025-07-17 06:52:35,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52980, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2047253730_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752593_11769, duration(ns): 21200056
2025-07-17 06:52:35,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752593_11769, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-17 06:52:39,018 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752593_11769 replica FinalizedReplica, blk_1073752593_11769, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752593 for deletion
2025-07-17 06:52:39,019 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752593_11769 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752593
2025-07-17 06:53:35,153 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752594_11770 src: /192.168.158.5:47602 dest: /192.168.158.4:9866
2025-07-17 06:53:35,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47602, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1094106524_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752594_11770, duration(ns): 21517638
2025-07-17 06:53:35,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752594_11770, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-17 06:53:39,021 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752594_11770 replica FinalizedReplica, blk_1073752594_11770, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752594 for deletion
2025-07-17 06:53:39,022 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752594_11770 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752594
2025-07-17 06:54:35,149 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752595_11771 src: /192.168.158.1:54340 dest: /192.168.158.4:9866
2025-07-17 06:54:35,185 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54340, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-131415204_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752595_11771, duration(ns): 26622800
2025-07-17 06:54:35,185 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752595_11771, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-17 06:54:39,021 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752595_11771 replica FinalizedReplica, blk_1073752595_11771, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752595 for deletion
2025-07-17 06:54:39,022 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752595_11771 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752595
2025-07-17 06:55:35,158 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752596_11772 src: /192.168.158.5:49926 dest: /192.168.158.4:9866
2025-07-17 06:55:35,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49926, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-925933671_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752596_11772, duration(ns): 16293668
2025-07-17 06:55:35,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752596_11772, type=LAST_IN_PIPELINE terminating
2025-07-17 06:55:42,021 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752596_11772 replica FinalizedReplica, blk_1073752596_11772, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752596 for deletion
2025-07-17 06:55:42,022 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752596_11772 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752596
2025-07-17 06:58:35,162 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752599_11775 src: /192.168.158.6:39446 dest: /192.168.158.4:9866
2025-07-17 06:58:35,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39446, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1238589995_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752599_11775, duration(ns): 20973671
2025-07-17 06:58:35,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752599_11775, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-17 06:58:39,027 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752599_11775 replica FinalizedReplica, blk_1073752599_11775, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752599 for deletion 2025-07-17 06:58:39,028 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752599_11775 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752599 2025-07-17 07:00:35,195 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752601_11777 src: /192.168.158.5:59454 dest: /192.168.158.4:9866 2025-07-17 07:00:35,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59454, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1890426842_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752601_11777, duration(ns): 16806984 2025-07-17 07:00:35,215 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752601_11777, type=LAST_IN_PIPELINE terminating 2025-07-17 07:00:39,030 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752601_11777 replica FinalizedReplica, blk_1073752601_11777, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752601 for deletion 2025-07-17 07:00:39,032 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752601_11777 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752601 2025-07-17 07:02:35,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752603_11779 src: /192.168.158.1:47918 dest: /192.168.158.4:9866 2025-07-17 07:02:35,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47918, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1165267599_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752603_11779, duration(ns): 25724716 2025-07-17 07:02:35,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752603_11779, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-17 07:02:39,036 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752603_11779 replica FinalizedReplica, blk_1073752603_11779, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752603 for deletion 2025-07-17 07:02:39,037 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752603_11779 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752603 2025-07-17 07:03:35,172 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752604_11780 src: /192.168.158.6:57428 dest: /192.168.158.4:9866 2025-07-17 07:03:35,192 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57428, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_320837450_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752604_11780, duration(ns): 17980339 2025-07-17 07:03:35,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752604_11780, type=LAST_IN_PIPELINE terminating 2025-07-17 07:03:39,039 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752604_11780 replica FinalizedReplica, blk_1073752604_11780, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752604 for deletion 2025-07-17 07:03:39,040 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752604_11780 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752604 2025-07-17 07:04:35,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752605_11781 src: /192.168.158.9:45490 dest: /192.168.158.4:9866 2025-07-17 07:04:35,199 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45490, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1770462658_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752605_11781, duration(ns): 19577274 2025-07-17 07:04:35,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752605_11781, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-17 07:04:39,043 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752605_11781 replica FinalizedReplica, blk_1073752605_11781, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752605 for deletion 2025-07-17 07:04:39,044 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752605_11781 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752605 2025-07-17 07:05:35,173 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752606_11782 src: /192.168.158.1:45174 dest: /192.168.158.4:9866 2025-07-17 07:05:35,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45174, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_59632299_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752606_11782, duration(ns): 27991008 2025-07-17 07:05:35,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752606_11782, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 
192.168.158.6:9866] terminating 2025-07-17 07:05:39,043 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752606_11782 replica FinalizedReplica, blk_1073752606_11782, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752606 for deletion 2025-07-17 07:05:39,045 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752606_11782 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752606 2025-07-17 07:08:40,176 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752609_11785 src: /192.168.158.6:59194 dest: /192.168.158.4:9866 2025-07-17 07:08:40,204 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59194, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1434218848_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752609_11785, duration(ns): 21893539 2025-07-17 07:08:40,204 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752609_11785, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-17 07:08:45,046 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752609_11785 replica FinalizedReplica, blk_1073752609_11785, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752609 for deletion 2025-07-17 07:08:45,048 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752609_11785 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752609 2025-07-17 07:09:45,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752610_11786 src: /192.168.158.5:51096 dest: /192.168.158.4:9866 2025-07-17 07:09:45,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51096, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1539217791_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752610_11786, duration(ns): 22960786 2025-07-17 07:09:45,208 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752610_11786, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-17 07:09:48,047 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752610_11786 replica FinalizedReplica, blk_1073752610_11786, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752610 for deletion 2025-07-17 07:09:48,048 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752610_11786 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752610 2025-07-17 07:11:50,188 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752612_11788 src: /192.168.158.7:34418 dest: /192.168.158.4:9866 2025-07-17 07:11:50,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34418, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_50446445_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752612_11788, duration(ns): 19418554 2025-07-17 07:11:50,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752612_11788, type=LAST_IN_PIPELINE terminating 2025-07-17 07:11:57,054 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752612_11788 replica FinalizedReplica, blk_1073752612_11788, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752612 for deletion 2025-07-17 07:11:57,055 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752612_11788 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752612 2025-07-17 07:13:50,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752614_11790 src: /192.168.158.8:45974 dest: /192.168.158.4:9866 2025-07-17 07:13:50,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45974, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_923990779_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752614_11790, duration(ns): 16859741 2025-07-17 07:13:50,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752614_11790, type=LAST_IN_PIPELINE terminating 2025-07-17 07:13:54,057 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752614_11790 replica FinalizedReplica, blk_1073752614_11790, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752614 for deletion 2025-07-17 07:13:54,058 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752614_11790 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752614 2025-07-17 07:14:50,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752615_11791 src: /192.168.158.9:48150 dest: /192.168.158.4:9866 2025-07-17 07:14:50,215 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48150, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-65694926_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752615_11791, duration(ns): 21932940 2025-07-17 07:14:50,215 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752615_11791, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-17 07:14:57,061 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752615_11791 replica FinalizedReplica, blk_1073752615_11791, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752615 for deletion 2025-07-17 07:14:57,062 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752615_11791 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752615 2025-07-17 07:16:50,192 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752617_11793 src: /192.168.158.1:42900 dest: /192.168.158.4:9866 2025-07-17 07:16:50,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42900, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-733378548_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752617_11793, duration(ns): 25275603 2025-07-17 07:16:50,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752617_11793, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-17 07:16:54,069 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752617_11793 replica FinalizedReplica, blk_1073752617_11793, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752617 for deletion 
2025-07-17 07:16:54,070 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752617_11793 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752617 2025-07-17 07:18:55,202 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752619_11795 src: /192.168.158.5:44424 dest: /192.168.158.4:9866 2025-07-17 07:18:55,221 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44424, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-655249050_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752619_11795, duration(ns): 16770356 2025-07-17 07:18:55,222 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752619_11795, type=LAST_IN_PIPELINE terminating 2025-07-17 07:18:57,074 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752619_11795 replica FinalizedReplica, blk_1073752619_11795, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752619 for deletion 2025-07-17 07:18:57,075 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752619_11795 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752619 2025-07-17 07:19:55,205 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752620_11796 src: /192.168.158.8:45516 
dest: /192.168.158.4:9866 2025-07-17 07:19:55,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45516, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-570180341_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752620_11796, duration(ns): 19849646 2025-07-17 07:19:55,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752620_11796, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-17 07:19:57,075 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752620_11796 replica FinalizedReplica, blk_1073752620_11796, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752620 for deletion 2025-07-17 07:19:57,077 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752620_11796 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752620 2025-07-17 07:20:55,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752621_11797 src: /192.168.158.7:56928 dest: /192.168.158.4:9866 2025-07-17 07:20:55,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56928, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-304589361_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752621_11797, duration(ns): 18869287 2025-07-17 07:20:55,226 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752621_11797, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-17 07:20:57,076 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752621_11797 replica FinalizedReplica, blk_1073752621_11797, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752621 for deletion 2025-07-17 07:20:57,077 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752621_11797 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752621 2025-07-17 07:21:55,204 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752622_11798 src: /192.168.158.8:59972 dest: /192.168.158.4:9866 2025-07-17 07:21:55,230 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59972, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1444710644_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752622_11798, duration(ns): 20367229 2025-07-17 07:21:55,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752622_11798, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-17 07:21:57,079 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752622_11798 replica FinalizedReplica, blk_1073752622_11798, FINALIZED getNumBytes() = 56 
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752622 for deletion 2025-07-17 07:21:57,080 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752622_11798 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752622 2025-07-17 07:22:55,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752623_11799 src: /192.168.158.8:45172 dest: /192.168.158.4:9866 2025-07-17 07:22:55,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45172, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2127010025_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752623_11799, duration(ns): 18584205 2025-07-17 07:22:55,228 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752623_11799, type=LAST_IN_PIPELINE terminating 2025-07-17 07:22:57,084 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752623_11799 replica FinalizedReplica, blk_1073752623_11799, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752623 for deletion 2025-07-17 07:22:57,085 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752623_11799 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752623 2025-07-17 07:24:55,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752625_11801 src: /192.168.158.7:36942 dest: /192.168.158.4:9866 2025-07-17 07:24:55,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36942, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_764661509_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752625_11801, duration(ns): 22788717 2025-07-17 07:24:55,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752625_11801, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-17 07:25:00,087 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752625_11801 replica FinalizedReplica, blk_1073752625_11801, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752625 for deletion 2025-07-17 07:25:00,088 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752625_11801 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752625 2025-07-17 07:29:00,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752629_11805 src: /192.168.158.5:42176 dest: /192.168.158.4:9866 2025-07-17 07:29:00,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.5:42176, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1472667134_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752629_11805, duration(ns): 20688870
2025-07-17 07:29:00,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752629_11805, type=LAST_IN_PIPELINE terminating
2025-07-17 07:29:06,099 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752629_11805 replica FinalizedReplica, blk_1073752629_11805, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752629 for deletion
2025-07-17 07:29:06,100 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752629_11805 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752629
2025-07-17 07:30:00,215 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752630_11806 src: /192.168.158.1:33788 dest: /192.168.158.4:9866
2025-07-17 07:30:00,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33788, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-138287971_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752630_11806, duration(ns): 27273462
2025-07-17 07:30:00,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752630_11806, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-17 07:30:06,103 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752630_11806 replica FinalizedReplica, blk_1073752630_11806, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752630 for deletion
2025-07-17 07:30:06,104 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752630_11806 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752630
2025-07-17 07:31:00,222 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752631_11807 src: /192.168.158.8:54558 dest: /192.168.158.4:9866
2025-07-17 07:31:00,242 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54558, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2076279639_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752631_11807, duration(ns): 17240318
2025-07-17 07:31:00,242 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752631_11807, type=LAST_IN_PIPELINE terminating
2025-07-17 07:31:06,106 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752631_11807 replica FinalizedReplica, blk_1073752631_11807, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752631 for deletion
2025-07-17 07:31:06,107 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752631_11807 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752631
2025-07-17 07:32:00,222 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752632_11808 src: /192.168.158.1:43386 dest: /192.168.158.4:9866
2025-07-17 07:32:00,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43386, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-625585451_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752632_11808, duration(ns): 26250646
2025-07-17 07:32:00,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752632_11808, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-17 07:32:06,109 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752632_11808 replica FinalizedReplica, blk_1073752632_11808, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752632 for deletion
2025-07-17 07:32:06,110 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752632_11808 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752632
2025-07-17 07:33:00,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752633_11809 src: /192.168.158.1:49304 dest: /192.168.158.4:9866
2025-07-17 07:33:00,255 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49304, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-578390342_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752633_11809, duration(ns): 24840411
2025-07-17 07:33:00,255 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752633_11809, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-17 07:33:03,112 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752633_11809 replica FinalizedReplica, blk_1073752633_11809, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752633 for deletion
2025-07-17 07:33:03,113 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752633_11809 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752633
2025-07-17 07:36:00,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752636_11812 src: /192.168.158.1:44294 dest: /192.168.158.4:9866
2025-07-17 07:36:00,261 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44294, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_921812329_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752636_11812, duration(ns): 24858186
2025-07-17 07:36:00,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752636_11812, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-17 07:36:03,118 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752636_11812 replica FinalizedReplica, blk_1073752636_11812, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752636 for deletion
2025-07-17 07:36:03,119 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752636_11812 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752636
2025-07-17 07:37:00,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752637_11813 src: /192.168.158.6:36354 dest: /192.168.158.4:9866
2025-07-17 07:37:00,249 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36354, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1741941371_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752637_11813, duration(ns): 16148118
2025-07-17 07:37:00,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752637_11813, type=LAST_IN_PIPELINE terminating
2025-07-17 07:37:03,122 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752637_11813 replica FinalizedReplica, blk_1073752637_11813, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752637 for deletion
2025-07-17 07:37:03,124 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752637_11813 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752637
2025-07-17 07:38:00,232 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752638_11814 src: /192.168.158.8:44708 dest: /192.168.158.4:9866
2025-07-17 07:38:00,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44708, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-144936810_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752638_11814, duration(ns): 17385389
2025-07-17 07:38:00,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752638_11814, type=LAST_IN_PIPELINE terminating
2025-07-17 07:38:03,122 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752638_11814 replica FinalizedReplica, blk_1073752638_11814, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752638 for deletion
2025-07-17 07:38:03,124 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752638_11814 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752638
2025-07-17 07:39:00,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752639_11815 src: /192.168.158.9:53868 dest: /192.168.158.4:9866
2025-07-17 07:39:00,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53868, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1167942599_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752639_11815, duration(ns): 19284128
2025-07-17 07:39:00,263 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752639_11815, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-17 07:39:06,128 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752639_11815 replica FinalizedReplica, blk_1073752639_11815, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752639 for deletion
2025-07-17 07:39:06,129 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752639_11815 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752639
2025-07-17 07:40:00,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752640_11816 src: /192.168.158.1:41324 dest: /192.168.158.4:9866
2025-07-17 07:40:00,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41324, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1725506620_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752640_11816, duration(ns): 26306949
2025-07-17 07:40:00,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752640_11816, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-17 07:40:03,131 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752640_11816 replica FinalizedReplica, blk_1073752640_11816, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752640 for deletion
2025-07-17 07:40:03,132 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752640_11816 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752640
2025-07-17 07:42:05,246 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752642_11818 src: /192.168.158.1:59896 dest: /192.168.158.4:9866
2025-07-17 07:42:05,280 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59896, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_433639162_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752642_11818, duration(ns): 24197392
2025-07-17 07:42:05,280 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752642_11818, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-17 07:42:06,134 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752642_11818 replica FinalizedReplica, blk_1073752642_11818, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752642 for deletion
2025-07-17 07:42:06,135 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752642_11818 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752642
2025-07-17 07:43:05,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752643_11819 src: /192.168.158.5:41708 dest: /192.168.158.4:9866
2025-07-17 07:43:05,265 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41708, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1583704907_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752643_11819, duration(ns): 18331616
2025-07-17 07:43:05,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752643_11819, type=LAST_IN_PIPELINE terminating
2025-07-17 07:43:09,134 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752643_11819 replica FinalizedReplica, blk_1073752643_11819, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752643 for deletion
2025-07-17 07:43:09,136 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752643_11819 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752643
2025-07-17 07:48:10,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752648_11824 src: /192.168.158.9:52756 dest: /192.168.158.4:9866
2025-07-17 07:48:10,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52756, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-330371232_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752648_11824, duration(ns): 16708801
2025-07-17 07:48:10,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752648_11824, type=LAST_IN_PIPELINE terminating
2025-07-17 07:48:12,143 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752648_11824 replica FinalizedReplica, blk_1073752648_11824, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752648 for deletion
2025-07-17 07:48:12,144 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752648_11824 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752648
2025-07-17 07:56:20,270 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752656_11832 src: /192.168.158.1:46396 dest: /192.168.158.4:9866
2025-07-17 07:56:20,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46396, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_776936635_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752656_11832, duration(ns): 26373397
2025-07-17 07:56:20,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752656_11832, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-17 07:56:21,165 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752656_11832 replica FinalizedReplica, blk_1073752656_11832, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752656 for deletion
2025-07-17 07:56:21,166 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752656_11832 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752656
2025-07-17 07:57:20,274 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752657_11833 src: /192.168.158.9:52294 dest: /192.168.158.4:9866
2025-07-17 07:57:20,299 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52294, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_578798313_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752657_11833, duration(ns): 19401402
2025-07-17 07:57:20,299 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752657_11833, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-17 07:57:21,168 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752657_11833 replica FinalizedReplica, blk_1073752657_11833, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752657 for deletion
2025-07-17 07:57:21,169 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752657_11833 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752657
2025-07-17 07:59:20,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752659_11835 src: /192.168.158.8:60402 dest: /192.168.158.4:9866
2025-07-17 07:59:20,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60402, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-223608397_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752659_11835, duration(ns): 18709676
2025-07-17 07:59:20,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752659_11835, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 07:59:24,171 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752659_11835 replica FinalizedReplica, blk_1073752659_11835, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752659 for deletion
2025-07-17 07:59:24,173 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752659_11835 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752659
2025-07-17 08:02:25,288 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752662_11838 src: /192.168.158.5:48084 dest: /192.168.158.4:9866
2025-07-17 08:02:25,315 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48084, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1223637244_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752662_11838, duration(ns): 21881183
2025-07-17 08:02:25,315 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752662_11838, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-17 08:02:27,180 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752662_11838 replica FinalizedReplica, blk_1073752662_11838, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752662 for deletion
2025-07-17 08:02:27,181 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752662_11838 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752662
2025-07-17 08:03:25,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752663_11839 src: /192.168.158.9:52000 dest: /192.168.158.4:9866
2025-07-17 08:03:25,315 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52000, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1773594111_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752663_11839, duration(ns): 21489939
2025-07-17 08:03:25,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752663_11839, type=LAST_IN_PIPELINE terminating
2025-07-17 08:03:27,182 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752663_11839 replica FinalizedReplica, blk_1073752663_11839, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752663 for deletion
2025-07-17 08:03:27,183 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752663_11839 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752663
2025-07-17 08:04:25,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752664_11840 src: /192.168.158.1:42936 dest: /192.168.158.4:9866
2025-07-17 08:04:25,320 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42936, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-668372893_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752664_11840, duration(ns): 24303127
2025-07-17 08:04:25,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752664_11840, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-17 08:04:27,184 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752664_11840 replica FinalizedReplica, blk_1073752664_11840, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752664 for deletion
2025-07-17 08:04:27,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752664_11840 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752664
2025-07-17 08:09:30,298 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752669_11845 src: /192.168.158.5:40914 dest: /192.168.158.4:9866
2025-07-17 08:09:30,323 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40914, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1559682173_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752669_11845, duration(ns): 19831206
2025-07-17 08:09:30,323 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752669_11845, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-17 08:09:36,193 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752669_11845 replica FinalizedReplica, blk_1073752669_11845, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752669 for deletion
2025-07-17 08:09:36,194 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752669_11845 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752669
2025-07-17 08:12:30,299 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752672_11848 src: /192.168.158.1:32978 dest: /192.168.158.4:9866
2025-07-17 08:12:30,335 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:32978, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_919342793_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752672_11848, duration(ns): 26433915
2025-07-17 08:12:30,335 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752672_11848, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-17 08:12:33,198 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752672_11848 replica FinalizedReplica, blk_1073752672_11848, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752672 for deletion
2025-07-17 08:12:33,199 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752672_11848 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752672
2025-07-17 08:13:30,307 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752673_11849 src: /192.168.158.6:42778 dest: /192.168.158.4:9866
2025-07-17 08:13:30,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42778, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-716384624_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752673_11849, duration(ns): 20891934
2025-07-17 08:13:30,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752673_11849, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-17 08:13:36,200 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752673_11849 replica FinalizedReplica, blk_1073752673_11849, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752673 for deletion
2025-07-17 08:13:36,201 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752673_11849 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752673
2025-07-17 08:14:30,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752674_11850 src: /192.168.158.9:42020 dest: /192.168.158.4:9866
2025-07-17 08:14:30,331 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42020, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1881645108_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752674_11850, duration(ns): 19813262
2025-07-17 08:14:30,332 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752674_11850, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-17 08:14:33,201 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752674_11850 replica FinalizedReplica, blk_1073752674_11850, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752674 for deletion
2025-07-17 08:14:33,203 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752674_11850 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752674
2025-07-17 08:19:45,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752679_11855 src: /192.168.158.9:38610 dest: /192.168.158.4:9866
2025-07-17 08:19:45,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38610, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-65432803_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752679_11855, duration(ns): 20650048
2025-07-17 08:19:45,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752679_11855, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 08:19:51,216 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752679_11855 replica FinalizedReplica, blk_1073752679_11855, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752679 for deletion
2025-07-17 08:19:51,217 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752679_11855 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752679
2025-07-17 08:22:45,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752682_11858 src: /192.168.158.1:37830 dest: /192.168.158.4:9866
2025-07-17 08:22:45,350 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37830, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1644448713_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752682_11858, duration(ns): 25547582
2025-07-17 08:22:45,350 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752682_11858, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-17 08:22:51,226 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752682_11858 replica FinalizedReplica, blk_1073752682_11858, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752682 for deletion
2025-07-17 08:22:51,227 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752682_11858 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752682
2025-07-17 08:23:50,318 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752683_11859 src: /192.168.158.7:59510 dest: /192.168.158.4:9866
2025-07-17 08:23:50,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.7:59510, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1724513593_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752683_11859, duration(ns): 20953443 2025-07-17 08:23:50,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752683_11859, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-17 08:23:51,227 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752683_11859 replica FinalizedReplica, blk_1073752683_11859, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752683 for deletion 2025-07-17 08:23:51,229 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752683_11859 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752683 2025-07-17 08:24:50,340 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752684_11860 src: /192.168.158.9:48674 dest: /192.168.158.4:9866 2025-07-17 08:24:50,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48674, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-991897411_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752684_11860, duration(ns): 19914512 2025-07-17 08:24:50,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752684_11860, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-17 08:24:54,228 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752684_11860 replica FinalizedReplica, blk_1073752684_11860, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752684 for deletion 2025-07-17 08:24:54,229 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752684_11860 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752684 2025-07-17 08:26:50,332 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752686_11862 src: /192.168.158.9:49278 dest: /192.168.158.4:9866 2025-07-17 08:26:50,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49278, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1741599149_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752686_11862, duration(ns): 19371794 2025-07-17 08:26:50,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752686_11862, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-17 08:26:54,231 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752686_11862 replica FinalizedReplica, blk_1073752686_11862, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752686 for deletion 2025-07-17 08:26:54,232 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752686_11862 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752686 2025-07-17 08:27:50,338 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752687_11863 src: /192.168.158.8:34088 dest: /192.168.158.4:9866 2025-07-17 08:27:50,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34088, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_248438755_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752687_11863, duration(ns): 21276268 2025-07-17 08:27:50,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752687_11863, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-17 08:27:51,235 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752687_11863 replica FinalizedReplica, blk_1073752687_11863, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752687 for deletion 2025-07-17 08:27:51,237 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752687_11863 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752687 
2025-07-17 08:28:50,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752688_11864 src: /192.168.158.1:49816 dest: /192.168.158.4:9866 2025-07-17 08:28:50,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49816, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-730010542_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752688_11864, duration(ns): 26892314 2025-07-17 08:28:50,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752688_11864, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-17 08:28:51,238 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752688_11864 replica FinalizedReplica, blk_1073752688_11864, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752688 for deletion 2025-07-17 08:28:51,240 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752688_11864 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752688 2025-07-17 08:29:55,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752689_11865 src: /192.168.158.8:55908 dest: /192.168.158.4:9866 2025-07-17 08:29:55,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55908, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: 
DFSClient_NONMAPREDUCE_-1587172159_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752689_11865, duration(ns): 17776504 2025-07-17 08:29:55,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752689_11865, type=LAST_IN_PIPELINE terminating 2025-07-17 08:29:57,239 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752689_11865 replica FinalizedReplica, blk_1073752689_11865, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752689 for deletion 2025-07-17 08:29:57,241 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752689_11865 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752689 2025-07-17 08:30:55,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752690_11866 src: /192.168.158.1:50138 dest: /192.168.158.4:9866 2025-07-17 08:30:55,369 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50138, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_497430472_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752690_11866, duration(ns): 26553990 2025-07-17 08:30:55,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752690_11866, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-17 08:30:57,243 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752690_11866 replica FinalizedReplica, blk_1073752690_11866, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752690 for deletion 2025-07-17 08:30:57,244 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752690_11866 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752690 2025-07-17 08:31:55,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752691_11867 src: /192.168.158.5:53564 dest: /192.168.158.4:9866 2025-07-17 08:31:55,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53564, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1677236903_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752691_11867, duration(ns): 22958304 2025-07-17 08:31:55,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752691_11867, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-17 08:31:57,248 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752691_11867 replica FinalizedReplica, blk_1073752691_11867, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752691 for deletion 2025-07-17 
08:31:57,249 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752691_11867 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752691 2025-07-17 08:35:55,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752695_11871 src: /192.168.158.8:45834 dest: /192.168.158.4:9866 2025-07-17 08:35:55,371 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45834, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-84307436_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752695_11871, duration(ns): 17928248 2025-07-17 08:35:55,371 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752695_11871, type=LAST_IN_PIPELINE terminating 2025-07-17 08:35:57,257 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752695_11871 replica FinalizedReplica, blk_1073752695_11871, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752695 for deletion 2025-07-17 08:35:57,259 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752695_11871 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752695 2025-07-17 08:36:55,353 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752696_11872 src: /192.168.158.5:45502 dest: 
/192.168.158.4:9866 2025-07-17 08:36:55,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45502, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-591425888_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752696_11872, duration(ns): 20076716 2025-07-17 08:36:55,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752696_11872, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-17 08:37:00,259 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752696_11872 replica FinalizedReplica, blk_1073752696_11872, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752696 for deletion 2025-07-17 08:37:00,260 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752696_11872 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752696 2025-07-17 08:38:00,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752697_11873 src: /192.168.158.9:40154 dest: /192.168.158.4:9866 2025-07-17 08:38:00,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40154, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1985920179_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752697_11873, duration(ns): 20484459 2025-07-17 08:38:00,378 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752697_11873, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-17 08:38:03,262 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752697_11873 replica FinalizedReplica, blk_1073752697_11873, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752697 for deletion 2025-07-17 08:38:03,263 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752697_11873 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752697 2025-07-17 08:40:00,353 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752699_11875 src: /192.168.158.1:52720 dest: /192.168.158.4:9866 2025-07-17 08:40:00,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52720, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1235799787_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752699_11875, duration(ns): 24279478 2025-07-17 08:40:00,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752699_11875, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-17 08:40:03,268 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752699_11875 replica FinalizedReplica, blk_1073752699_11875, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752699 for deletion 2025-07-17 08:40:03,269 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752699_11875 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752699 2025-07-17 08:41:05,361 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752700_11876 src: /192.168.158.6:44214 dest: /192.168.158.4:9866 2025-07-17 08:41:05,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44214, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-295232604_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752700_11876, duration(ns): 15414762 2025-07-17 08:41:05,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752700_11876, type=LAST_IN_PIPELINE terminating 2025-07-17 08:41:06,270 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752700_11876 replica FinalizedReplica, blk_1073752700_11876, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752700 for deletion 2025-07-17 08:41:06,271 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752700_11876 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752700 2025-07-17 08:43:05,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752702_11878 src: /192.168.158.1:39296 dest: /192.168.158.4:9866 2025-07-17 08:43:05,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39296, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1592799748_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752702_11878, duration(ns): 22016098 2025-07-17 08:43:05,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752702_11878, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-17 08:43:09,275 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752702_11878 replica FinalizedReplica, blk_1073752702_11878, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752702 for deletion 2025-07-17 08:43:09,276 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752702_11878 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752702 2025-07-17 08:45:05,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752704_11880 src: /192.168.158.1:47872 dest: /192.168.158.4:9866 2025-07-17 08:45:05,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:47872, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-696346501_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752704_11880, duration(ns): 24690033 2025-07-17 08:45:05,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752704_11880, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-17 08:45:06,281 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752704_11880 replica FinalizedReplica, blk_1073752704_11880, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752704 for deletion 2025-07-17 08:45:06,282 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752704_11880 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752704 2025-07-17 08:48:05,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752707_11883 src: /192.168.158.6:42690 dest: /192.168.158.4:9866 2025-07-17 08:48:05,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42690, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-628501049_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752707_11883, duration(ns): 17483971 2025-07-17 08:48:05,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073752707_11883, type=LAST_IN_PIPELINE terminating 2025-07-17 08:48:06,288 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752707_11883 replica FinalizedReplica, blk_1073752707_11883, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752707 for deletion 2025-07-17 08:48:06,289 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752707_11883 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752707 2025-07-17 08:51:10,367 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752710_11886 src: /192.168.158.1:52002 dest: /192.168.158.4:9866 2025-07-17 08:51:10,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52002, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1234280651_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752710_11886, duration(ns): 24014379 2025-07-17 08:51:10,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752710_11886, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-17 08:51:12,294 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752710_11886 replica FinalizedReplica, blk_1073752710_11886, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752710 for deletion 2025-07-17 08:51:12,295 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752710_11886 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752710 2025-07-17 08:54:15,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752713_11889 src: /192.168.158.1:40586 dest: /192.168.158.4:9866 2025-07-17 08:54:15,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40586, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1913408645_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752713_11889, duration(ns): 25917354 2025-07-17 08:54:15,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752713_11889, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-17 08:54:18,303 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752713_11889 replica FinalizedReplica, blk_1073752713_11889, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752713 for deletion 2025-07-17 08:54:18,304 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752713_11889 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752713 2025-07-17 08:56:15,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752715_11891 src: /192.168.158.1:45512 dest: /192.168.158.4:9866 2025-07-17 08:56:15,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45512, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-312247002_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752715_11891, duration(ns): 23318218 2025-07-17 08:56:15,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752715_11891, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-17 08:56:18,308 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752715_11891 replica FinalizedReplica, blk_1073752715_11891, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752715 for deletion 2025-07-17 08:56:18,309 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752715_11891 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752715 2025-07-17 08:58:15,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752717_11893 src: /192.168.158.7:58174 dest: /192.168.158.4:9866 2025-07-17 08:58:15,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.7:58174, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_307134922_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752717_11893, duration(ns): 16008869 2025-07-17 08:58:15,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752717_11893, type=LAST_IN_PIPELINE terminating 2025-07-17 08:58:18,309 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752717_11893 replica FinalizedReplica, blk_1073752717_11893, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752717 for deletion 2025-07-17 08:58:18,310 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752717_11893 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752717 2025-07-17 09:00:15,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752719_11895 src: /192.168.158.9:50170 dest: /192.168.158.4:9866 2025-07-17 09:00:15,404 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50170, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1195840734_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752719_11895, duration(ns): 19581859 2025-07-17 09:00:15,404 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752719_11895, type=LAST_IN_PIPELINE terminating 2025-07-17 
09:00:18,312 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752719_11895 replica FinalizedReplica, blk_1073752719_11895, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752719 for deletion 2025-07-17 09:00:18,313 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752719_11895 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752719 2025-07-17 09:02:15,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752721_11897 src: /192.168.158.1:44464 dest: /192.168.158.4:9866 2025-07-17 09:02:15,404 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44464, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1413639404_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752721_11897, duration(ns): 24665903 2025-07-17 09:02:15,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752721_11897, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-17 09:02:21,315 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752721_11897 replica FinalizedReplica, blk_1073752721_11897, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752721 for deletion 2025-07-17 09:02:21,316 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752721_11897 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752721 2025-07-17 09:03:15,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752722_11898 src: /192.168.158.9:52260 dest: /192.168.158.4:9866 2025-07-17 09:03:15,404 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52260, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1818882029_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752722_11898, duration(ns): 19420756 2025-07-17 09:03:15,404 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752722_11898, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-17 09:03:18,316 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752722_11898 replica FinalizedReplica, blk_1073752722_11898, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752722 for deletion 2025-07-17 09:03:18,317 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752722_11898 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752722 2025-07-17 09:04:15,376 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752723_11899 src: /192.168.158.1:39864 dest: /192.168.158.4:9866 2025-07-17 09:04:15,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39864, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1254578544_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752723_11899, duration(ns): 24687775 2025-07-17 09:04:15,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752723_11899, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-17 09:04:18,317 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752723_11899 replica FinalizedReplica, blk_1073752723_11899, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752723 for deletion 2025-07-17 09:04:18,319 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752723_11899 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752723 2025-07-17 09:05:15,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752724_11900 src: /192.168.158.7:42472 dest: /192.168.158.4:9866 2025-07-17 09:05:15,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: 
src: /192.168.158.7:42472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-946965592_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752724_11900, duration(ns): 16148818 2025-07-17 09:05:15,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752724_11900, type=LAST_IN_PIPELINE terminating 2025-07-17 09:05:18,322 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752724_11900 replica FinalizedReplica, blk_1073752724_11900, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752724 for deletion 2025-07-17 09:05:18,323 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752724_11900 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752724 2025-07-17 09:06:15,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752725_11901 src: /192.168.158.7:46524 dest: /192.168.158.4:9866 2025-07-17 09:06:15,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46524, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1217934568_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752725_11901, duration(ns): 17862487 2025-07-17 09:06:15,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752725_11901, type=LAST_IN_PIPELINE terminating 2025-07-17 
09:06:18,322 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752725_11901 replica FinalizedReplica, blk_1073752725_11901, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752725 for deletion 2025-07-17 09:06:18,323 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752725_11901 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752725 2025-07-17 09:08:15,387 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752727_11903 src: /192.168.158.5:42180 dest: /192.168.158.4:9866 2025-07-17 09:08:15,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42180, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1110423833_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752727_11903, duration(ns): 22268278 2025-07-17 09:08:15,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752727_11903, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-17 09:08:21,326 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752727_11903 replica FinalizedReplica, blk_1073752727_11903, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752727 for deletion 
2025-07-17 09:08:21,327 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752727_11903 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752727 2025-07-17 09:10:15,392 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752729_11905 src: /192.168.158.1:54352 dest: /192.168.158.4:9866 2025-07-17 09:10:15,425 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54352, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-617573674_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752729_11905, duration(ns): 23186021 2025-07-17 09:10:15,425 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752729_11905, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-17 09:10:18,329 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752729_11905 replica FinalizedReplica, blk_1073752729_11905, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752729 for deletion 2025-07-17 09:10:18,331 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752729_11905 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752729 2025-07-17 09:11:20,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073752730_11906 src: /192.168.158.9:54880 dest: /192.168.158.4:9866 2025-07-17 09:11:20,426 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54880, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_503655660_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752730_11906, duration(ns): 25467021 2025-07-17 09:11:20,426 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752730_11906, type=LAST_IN_PIPELINE terminating 2025-07-17 09:11:24,333 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752730_11906 replica FinalizedReplica, blk_1073752730_11906, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752730 for deletion 2025-07-17 09:11:24,334 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752730_11906 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752730 2025-07-17 09:12:25,391 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752731_11907 src: /192.168.158.1:36548 dest: /192.168.158.4:9866 2025-07-17 09:12:25,426 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36548, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1495461744_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752731_11907, duration(ns): 26266250 
2025-07-17 09:12:25,426 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752731_11907, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-17 09:12:30,336 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752731_11907 replica FinalizedReplica, blk_1073752731_11907, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752731 for deletion 2025-07-17 09:12:30,337 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752731_11907 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752731 2025-07-17 09:14:25,404 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752733_11909 src: /192.168.158.6:49594 dest: /192.168.158.4:9866 2025-07-17 09:14:25,424 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49594, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1321299521_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752733_11909, duration(ns): 17818614 2025-07-17 09:14:25,424 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752733_11909, type=LAST_IN_PIPELINE terminating 2025-07-17 09:14:30,338 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752733_11909 replica FinalizedReplica, blk_1073752733_11909, FINALIZED getNumBytes() = 56 
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752733 for deletion 2025-07-17 09:14:30,339 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752733_11909 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752733 2025-07-17 09:15:25,408 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752734_11910 src: /192.168.158.7:42766 dest: /192.168.158.4:9866 2025-07-17 09:15:25,434 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42766, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1416073390_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752734_11910, duration(ns): 19482518 2025-07-17 09:15:25,434 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752734_11910, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-17 09:15:30,340 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752734_11910 replica FinalizedReplica, blk_1073752734_11910, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752734 for deletion 2025-07-17 09:15:30,341 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752734_11910 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752734 2025-07-17 09:17:25,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752736_11912 src: /192.168.158.7:56868 dest: /192.168.158.4:9866 2025-07-17 09:17:25,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56868, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_827070641_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752736_11912, duration(ns): 15968147 2025-07-17 09:17:25,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752736_11912, type=LAST_IN_PIPELINE terminating 2025-07-17 09:17:27,343 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752736_11912 replica FinalizedReplica, blk_1073752736_11912, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752736 for deletion 2025-07-17 09:17:27,344 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752736_11912 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752736 2025-07-17 09:18:25,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752737_11913 src: /192.168.158.5:48320 dest: /192.168.158.4:9866 2025-07-17 09:18:25,437 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48320, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_36616034_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752737_11913, duration(ns): 21566110 2025-07-17 09:18:25,437 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752737_11913, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-17 09:18:30,345 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752737_11913 replica FinalizedReplica, blk_1073752737_11913, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752737 for deletion 2025-07-17 09:18:30,346 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752737_11913 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752737 2025-07-17 09:21:25,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752740_11916 src: /192.168.158.7:57750 dest: /192.168.158.4:9866 2025-07-17 09:21:25,440 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57750, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_481298848_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752740_11916, duration(ns): 19998310 2025-07-17 09:21:25,441 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752740_11916, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] 
terminating 2025-07-17 09:21:27,349 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752740_11916 replica FinalizedReplica, blk_1073752740_11916, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752740 for deletion 2025-07-17 09:21:27,350 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752740_11916 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752740 2025-07-17 09:22:25,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752741_11917 src: /192.168.158.1:34814 dest: /192.168.158.4:9866 2025-07-17 09:22:25,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34814, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1891486598_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752741_11917, duration(ns): 28131044 2025-07-17 09:22:25,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752741_11917, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-17 09:22:30,350 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752741_11917 replica FinalizedReplica, blk_1073752741_11917, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752741 for deletion 2025-07-17 09:22:30,351 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752741_11917 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752741 2025-07-17 09:23:30,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752742_11918 src: /192.168.158.1:46526 dest: /192.168.158.4:9866 2025-07-17 09:23:30,450 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46526, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1384450054_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752742_11918, duration(ns): 23055281 2025-07-17 09:23:30,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752742_11918, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-17 09:23:33,352 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752742_11918 replica FinalizedReplica, blk_1073752742_11918, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752742 for deletion 2025-07-17 09:23:33,354 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752742_11918 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752742 2025-07-17 09:25:30,427 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752744_11920 src: /192.168.158.7:57278 dest: /192.168.158.4:9866 2025-07-17 09:25:30,448 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57278, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1820464330_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752744_11920, duration(ns): 18826288 2025-07-17 09:25:30,448 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752744_11920, type=LAST_IN_PIPELINE terminating 2025-07-17 09:25:33,355 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752744_11920 replica FinalizedReplica, blk_1073752744_11920, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752744 for deletion 2025-07-17 09:25:33,357 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752744_11920 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752744 2025-07-17 09:26:30,423 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752745_11921 src: /192.168.158.7:60202 dest: /192.168.158.4:9866 2025-07-17 09:26:30,450 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60202, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_57670151_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752745_11921, duration(ns): 21241056 2025-07-17 09:26:30,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752745_11921, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-17 09:26:33,359 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752745_11921 replica FinalizedReplica, blk_1073752745_11921, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752745 for deletion 2025-07-17 09:26:33,360 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752745_11921 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752745 2025-07-17 09:27:30,422 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752746_11922 src: /192.168.158.1:46482 dest: /192.168.158.4:9866 2025-07-17 09:27:30,454 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46482, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1002953870_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752746_11922, duration(ns): 23243988 2025-07-17 09:27:30,455 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752746_11922, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 
192.168.158.8:9866] terminating 2025-07-17 09:27:36,361 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752746_11922 replica FinalizedReplica, blk_1073752746_11922, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752746 for deletion 2025-07-17 09:27:36,362 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752746_11922 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752746 2025-07-17 09:30:30,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752749_11925 src: /192.168.158.9:47280 dest: /192.168.158.4:9866 2025-07-17 09:30:30,457 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47280, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_699537465_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752749_11925, duration(ns): 16429776 2025-07-17 09:30:30,457 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752749_11925, type=LAST_IN_PIPELINE terminating 2025-07-17 09:30:33,367 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752749_11925 replica FinalizedReplica, blk_1073752749_11925, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752749 for deletion 
2025-07-17 09:30:33,368 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752749_11925 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752749
2025-07-17 09:31:35,435 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752750_11926 src: /192.168.158.6:46440 dest: /192.168.158.4:9866
2025-07-17 09:31:35,463 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46440, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-395962498_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752750_11926, duration(ns): 22160139
2025-07-17 09:31:35,463 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752750_11926, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 09:31:36,367 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752750_11926 replica FinalizedReplica, blk_1073752750_11926, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752750 for deletion
2025-07-17 09:31:36,368 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752750_11926 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752750
2025-07-17 09:34:40,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752753_11929 src: /192.168.158.1:57568 dest: /192.168.158.4:9866
2025-07-17 09:34:40,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57568, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1991955747_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752753_11929, duration(ns): 24736827
2025-07-17 09:34:40,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752753_11929, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-17 09:34:42,374 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752753_11929 replica FinalizedReplica, blk_1073752753_11929, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752753 for deletion
2025-07-17 09:34:42,375 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752753_11929 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752753
2025-07-17 09:35:45,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752754_11930 src: /192.168.158.7:53186 dest: /192.168.158.4:9866
2025-07-17 09:35:45,471 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53186, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-4472219_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752754_11930, duration(ns): 21308660
2025-07-17 09:35:45,471 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752754_11930, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-17 09:35:48,377 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752754_11930 replica FinalizedReplica, blk_1073752754_11930, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752754 for deletion
2025-07-17 09:35:48,378 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752754_11930 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752754
2025-07-17 09:39:50,441 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752758_11934 src: /192.168.158.1:49270 dest: /192.168.158.4:9866
2025-07-17 09:39:50,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49270, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1449162675_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752758_11934, duration(ns): 24223235
2025-07-17 09:39:50,477 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752758_11934, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-17 09:39:51,383 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752758_11934 replica FinalizedReplica, blk_1073752758_11934, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752758 for deletion
2025-07-17 09:39:51,385 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752758_11934 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752758
2025-07-17 09:42:55,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752761_11937 src: /192.168.158.1:44784 dest: /192.168.158.4:9866
2025-07-17 09:42:55,479 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44784, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-810675875_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752761_11937, duration(ns): 25828139
2025-07-17 09:42:55,479 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752761_11937, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-17 09:42:57,389 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752761_11937 replica FinalizedReplica, blk_1073752761_11937, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752761 for deletion
2025-07-17 09:42:57,390 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752761_11937 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752761
2025-07-17 09:44:55,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752763_11939 src: /192.168.158.1:54614 dest: /192.168.158.4:9866
2025-07-17 09:44:55,480 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54614, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1346775903_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752763_11939, duration(ns): 26720421
2025-07-17 09:44:55,480 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752763_11939, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-17 09:44:57,390 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752763_11939 replica FinalizedReplica, blk_1073752763_11939, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752763 for deletion
2025-07-17 09:44:57,391 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752763_11939 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752763
2025-07-17 09:49:00,463 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752767_11943 src: /192.168.158.6:42152 dest: /192.168.158.4:9866
2025-07-17 09:49:00,481 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42152, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1341197926_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752767_11943, duration(ns): 15947324
2025-07-17 09:49:00,481 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752767_11943, type=LAST_IN_PIPELINE terminating
2025-07-17 09:49:03,399 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752767_11943 replica FinalizedReplica, blk_1073752767_11943, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752767 for deletion
2025-07-17 09:49:03,401 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752767_11943 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752767
2025-07-17 09:50:00,461 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752768_11944 src: /192.168.158.1:51176 dest: /192.168.158.4:9866
2025-07-17 09:50:00,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51176, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-161782284_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752768_11944, duration(ns): 23802601
2025-07-17 09:50:00,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752768_11944, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-17 09:50:03,402 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752768_11944 replica FinalizedReplica, blk_1073752768_11944, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752768 for deletion
2025-07-17 09:50:03,403 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752768_11944 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752768
2025-07-17 09:51:05,466 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752769_11945 src: /192.168.158.5:54740 dest: /192.168.158.4:9866
2025-07-17 09:51:05,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54740, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1338300914_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752769_11945, duration(ns): 16809546
2025-07-17 09:51:05,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752769_11945, type=LAST_IN_PIPELINE terminating
2025-07-17 09:51:06,404 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752769_11945 replica FinalizedReplica, blk_1073752769_11945, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752769 for deletion
2025-07-17 09:51:06,405 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752769_11945 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752769
2025-07-17 09:54:10,474 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752772_11948 src: /192.168.158.8:36868 dest: /192.168.158.4:9866
2025-07-17 09:54:10,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36868, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-73844644_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752772_11948, duration(ns): 18612617
2025-07-17 09:54:10,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752772_11948, type=LAST_IN_PIPELINE terminating
2025-07-17 09:54:12,408 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752772_11948 replica FinalizedReplica, blk_1073752772_11948, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752772 for deletion
2025-07-17 09:54:12,409 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752772_11948 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752772
2025-07-17 09:56:10,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752774_11950 src: /192.168.158.1:59848 dest: /192.168.158.4:9866
2025-07-17 09:56:10,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59848, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_493180570_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752774_11950, duration(ns): 23533162
2025-07-17 09:56:10,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752774_11950, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-17 09:56:12,409 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752774_11950 replica FinalizedReplica, blk_1073752774_11950, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752774 for deletion
2025-07-17 09:56:12,410 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752774_11950 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752774
2025-07-17 09:57:10,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752775_11951 src: /192.168.158.8:33156 dest: /192.168.158.4:9866
2025-07-17 09:57:10,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33156, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1048590640_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752775_11951, duration(ns): 19395046
2025-07-17 09:57:10,499 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752775_11951, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 09:57:12,409 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752775_11951 replica FinalizedReplica, blk_1073752775_11951, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752775 for deletion
2025-07-17 09:57:12,410 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752775_11951 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752775
2025-07-17 09:58:10,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752776_11952 src: /192.168.158.9:34042 dest: /192.168.158.4:9866
2025-07-17 09:58:10,513 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34042, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-234286070_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752776_11952, duration(ns): 16295021
2025-07-17 09:58:10,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752776_11952, type=LAST_IN_PIPELINE terminating
2025-07-17 09:58:15,412 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752776_11952 replica FinalizedReplica, blk_1073752776_11952, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752776 for deletion
2025-07-17 09:58:15,413 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752776_11952 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752776
2025-07-17 09:59:18,417 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f45, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 1 msec to generate and 3 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-17 09:59:18,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-17 10:00:10,478 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752778_11954 src: /192.168.158.1:57572 dest: /192.168.158.4:9866
2025-07-17 10:00:10,515 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57572, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-709630293_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752778_11954, duration(ns): 27229778
2025-07-17 10:00:10,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752778_11954, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-17 10:00:15,415 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752778_11954 replica FinalizedReplica, blk_1073752778_11954, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752778 for deletion
2025-07-17 10:00:15,416 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752778_11954 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752778
2025-07-17 10:01:15,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752779_11955 src: /192.168.158.9:56700 dest: /192.168.158.4:9866
2025-07-17 10:01:15,508 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56700, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_565844699_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752779_11955, duration(ns): 19592540
2025-07-17 10:01:15,508 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752779_11955, type=LAST_IN_PIPELINE terminating
2025-07-17 10:01:18,415 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752779_11955 replica FinalizedReplica, blk_1073752779_11955, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752779 for deletion
2025-07-17 10:01:18,416 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752779_11955 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752779
2025-07-17 10:02:15,490 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752780_11956 src: /192.168.158.1:38906 dest: /192.168.158.4:9866
2025-07-17 10:02:15,522 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38906, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2038958038_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752780_11956, duration(ns): 23469558
2025-07-17 10:02:15,523 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752780_11956, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-17 10:02:21,417 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752780_11956 replica FinalizedReplica, blk_1073752780_11956, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752780 for deletion
2025-07-17 10:02:21,419 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752780_11956 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752780
2025-07-17 10:04:15,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752782_11958 src: /192.168.158.7:36796 dest: /192.168.158.4:9866
2025-07-17 10:04:15,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36796, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1294005121_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752782_11958, duration(ns): 15846830
2025-07-17 10:04:15,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752782_11958, type=LAST_IN_PIPELINE terminating
2025-07-17 10:04:21,423 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752782_11958 replica FinalizedReplica, blk_1073752782_11958, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752782 for deletion
2025-07-17 10:04:21,424 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752782_11958 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752782
2025-07-17 10:10:15,501 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752788_11964 src: /192.168.158.7:54622 dest: /192.168.158.4:9866
2025-07-17 10:10:15,529 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54622, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1007494627_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752788_11964, duration(ns): 22193186
2025-07-17 10:10:15,529 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752788_11964, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 10:10:18,440 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752788_11964 replica FinalizedReplica, blk_1073752788_11964, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752788 for deletion
2025-07-17 10:10:18,442 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752788_11964 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752788
2025-07-17 10:13:20,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752791_11967 src: /192.168.158.1:41994 dest: /192.168.158.4:9866
2025-07-17 10:13:20,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41994, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_962861215_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752791_11967, duration(ns): 23069289
2025-07-17 10:13:20,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752791_11967, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-17 10:13:21,447 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752791_11967 replica FinalizedReplica, blk_1073752791_11967, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752791 for deletion
2025-07-17 10:13:21,448 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752791_11967 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752791
2025-07-17 10:14:20,510 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752792_11968 src: /192.168.158.7:57496 dest: /192.168.158.4:9866
2025-07-17 10:14:20,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57496, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1149853263_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752792_11968, duration(ns): 21838843
2025-07-17 10:14:20,538 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752792_11968, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-17 10:14:21,450 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752792_11968 replica FinalizedReplica, blk_1073752792_11968, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752792 for deletion
2025-07-17 10:14:21,451 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752792_11968 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752792
2025-07-17 10:15:20,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752793_11969 src: /192.168.158.9:39162 dest: /192.168.158.4:9866
2025-07-17 10:15:20,535 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39162, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_225295678_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752793_11969, duration(ns): 16215302
2025-07-17 10:15:20,535 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752793_11969, type=LAST_IN_PIPELINE terminating
2025-07-17 10:15:21,453 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752793_11969 replica FinalizedReplica, blk_1073752793_11969, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752793 for deletion
2025-07-17 10:15:21,454 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752793_11969 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752793
2025-07-17 10:16:20,510 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752794_11970 src: /192.168.158.8:35542 dest: /192.168.158.4:9866
2025-07-17 10:16:20,539 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35542, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_488595844_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752794_11970, duration(ns): 22862314
2025-07-17 10:16:20,539 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752794_11970, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 10:16:21,454 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752794_11970 replica FinalizedReplica, blk_1073752794_11970, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752794 for deletion
2025-07-17 10:16:21,455 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752794_11970 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752794
2025-07-17 10:17:20,508 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752795_11971 src: /192.168.158.1:42972 dest: /192.168.158.4:9866
2025-07-17 10:17:20,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42972, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-312719450_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752795_11971, duration(ns): 24032182
2025-07-17 10:17:20,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752795_11971, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-17 10:17:21,455 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752795_11971 replica FinalizedReplica, blk_1073752795_11971, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752795 for deletion
2025-07-17 10:17:21,457 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752795_11971 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752795
2025-07-17 10:18:20,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752796_11972 src: /192.168.158.9:44004 dest: /192.168.158.4:9866
2025-07-17 10:18:20,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44004, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-498840541_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752796_11972, duration(ns): 16173415
2025-07-17 10:18:20,538 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752796_11972, type=LAST_IN_PIPELINE terminating
2025-07-17 10:18:21,457 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752796_11972 replica FinalizedReplica, blk_1073752796_11972, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752796 for deletion
2025-07-17 10:18:21,458 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752796_11972 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752796
2025-07-17 10:26:25,528 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752804_11980 src: /192.168.158.1:55288 dest: /192.168.158.4:9866
2025-07-17 10:26:25,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55288, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_213996488_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752804_11980, duration(ns): 25420245 2025-07-17 10:26:25,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752804_11980, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-17 10:26:27,474 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752804_11980 replica FinalizedReplica, blk_1073752804_11980, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752804 for deletion 2025-07-17 10:26:27,475 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752804_11980 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752804 2025-07-17 10:27:25,535 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752805_11981 src: /192.168.158.9:60570 dest: /192.168.158.4:9866 2025-07-17 10:27:25,554 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60570, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-89624878_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752805_11981, duration(ns): 16981531 2025-07-17 10:27:25,554 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752805_11981, type=LAST_IN_PIPELINE terminating 2025-07-17 
10:27:27,477 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752805_11981 replica FinalizedReplica, blk_1073752805_11981, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752805 for deletion 2025-07-17 10:27:27,478 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752805_11981 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752805 2025-07-17 10:30:30,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752808_11984 src: /192.168.158.1:54240 dest: /192.168.158.4:9866 2025-07-17 10:30:30,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54240, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1297139727_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752808_11984, duration(ns): 23386662 2025-07-17 10:30:30,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752808_11984, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-17 10:30:33,484 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752808_11984 replica FinalizedReplica, blk_1073752808_11984, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752808 for deletion 2025-07-17 10:30:33,485 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752808_11984 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752808 2025-07-17 10:33:30,547 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752811_11987 src: /192.168.158.8:56880 dest: /192.168.158.4:9866 2025-07-17 10:33:30,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56880, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-748336646_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752811_11987, duration(ns): 17400507 2025-07-17 10:33:30,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752811_11987, type=LAST_IN_PIPELINE terminating 2025-07-17 10:33:33,491 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752811_11987 replica FinalizedReplica, blk_1073752811_11987, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752811 for deletion 2025-07-17 10:33:33,492 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752811_11987 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752811 2025-07-17 10:36:30,572 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752814_11990 src: /192.168.158.7:56362 dest: /192.168.158.4:9866 2025-07-17 10:36:30,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56362, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-788979781_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752814_11990, duration(ns): 20037442 2025-07-17 10:36:30,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752814_11990, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-17 10:36:33,500 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752814_11990 replica FinalizedReplica, blk_1073752814_11990, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752814 for deletion 2025-07-17 10:36:33,501 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752814_11990 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752814 2025-07-17 10:39:30,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752817_11993 src: /192.168.158.1:57410 dest: /192.168.158.4:9866 2025-07-17 10:39:30,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57410, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1025085896_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752817_11993, duration(ns): 23788268 2025-07-17 10:39:30,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752817_11993, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-17 10:39:33,509 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752817_11993 replica FinalizedReplica, blk_1073752817_11993, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752817 for deletion 2025-07-17 10:39:33,511 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752817_11993 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752817 2025-07-17 10:40:30,575 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752818_11994 src: /192.168.158.7:33542 dest: /192.168.158.4:9866 2025-07-17 10:40:30,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33542, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1563437599_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752818_11994, duration(ns): 20897857 2025-07-17 10:40:30,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752818_11994, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-17 10:40:36,514 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752818_11994 replica FinalizedReplica, blk_1073752818_11994, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752818 for deletion 2025-07-17 10:40:36,515 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752818_11994 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752818 2025-07-17 10:41:30,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752819_11995 src: /192.168.158.1:51092 dest: /192.168.158.4:9866 2025-07-17 10:41:30,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51092, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-26822543_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752819_11995, duration(ns): 23506210 2025-07-17 10:41:30,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752819_11995, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-17 10:41:33,515 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752819_11995 replica FinalizedReplica, blk_1073752819_11995, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752819 for deletion 
2025-07-17 10:41:33,516 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752819_11995 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752819 2025-07-17 10:42:30,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752820_11996 src: /192.168.158.1:34262 dest: /192.168.158.4:9866 2025-07-17 10:42:30,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34262, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-247460705_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752820_11996, duration(ns): 23293802 2025-07-17 10:42:30,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752820_11996, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-17 10:42:36,516 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752820_11996 replica FinalizedReplica, blk_1073752820_11996, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752820 for deletion 2025-07-17 10:42:36,517 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752820_11996 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752820 2025-07-17 10:43:30,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073752821_11997 src: /192.168.158.6:32776 dest: /192.168.158.4:9866 2025-07-17 10:43:30,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:32776, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1846116360_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752821_11997, duration(ns): 22621121 2025-07-17 10:43:30,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752821_11997, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-17 10:43:36,517 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752821_11997 replica FinalizedReplica, blk_1073752821_11997, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752821 for deletion 2025-07-17 10:43:36,517 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752821_11997 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752821 2025-07-17 10:44:30,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752822_11998 src: /192.168.158.9:43676 dest: /192.168.158.4:9866 2025-07-17 10:44:30,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43676, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-353987230_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073752822_11998, duration(ns): 21028778 2025-07-17 10:44:30,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752822_11998, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-17 10:44:33,518 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752822_11998 replica FinalizedReplica, blk_1073752822_11998, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752822 for deletion 2025-07-17 10:44:33,519 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752822_11998 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752822 2025-07-17 10:46:40,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752824_12000 src: /192.168.158.8:41018 dest: /192.168.158.4:9866 2025-07-17 10:46:40,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41018, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-768731962_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752824_12000, duration(ns): 18032939 2025-07-17 10:46:40,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752824_12000, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-17 10:46:42,523 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073752824_12000 replica FinalizedReplica, blk_1073752824_12000, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752824 for deletion 2025-07-17 10:46:42,524 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752824_12000 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752824 2025-07-17 10:47:40,584 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752825_12001 src: /192.168.158.7:38524 dest: /192.168.158.4:9866 2025-07-17 10:47:40,603 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38524, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_383433884_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752825_12001, duration(ns): 16235068 2025-07-17 10:47:40,603 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752825_12001, type=LAST_IN_PIPELINE terminating 2025-07-17 10:47:45,525 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752825_12001 replica FinalizedReplica, blk_1073752825_12001, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752825 for deletion 2025-07-17 10:47:45,526 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073752825_12001 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752825 2025-07-17 10:48:45,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752826_12002 src: /192.168.158.5:56476 dest: /192.168.158.4:9866 2025-07-17 10:48:45,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56476, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-513442223_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752826_12002, duration(ns): 17347339 2025-07-17 10:48:45,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752826_12002, type=LAST_IN_PIPELINE terminating 2025-07-17 10:48:48,527 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752826_12002 replica FinalizedReplica, blk_1073752826_12002, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752826 for deletion 2025-07-17 10:48:48,529 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752826_12002 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752826 2025-07-17 10:49:45,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752827_12003 src: /192.168.158.9:46194 dest: /192.168.158.4:9866 2025-07-17 10:49:45,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: 
src: /192.168.158.9:46194, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1143631048_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752827_12003, duration(ns): 19487426 2025-07-17 10:49:45,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752827_12003, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-17 10:49:51,532 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752827_12003 replica FinalizedReplica, blk_1073752827_12003, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752827 for deletion 2025-07-17 10:49:51,533 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752827_12003 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752827 2025-07-17 10:50:45,591 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752828_12004 src: /192.168.158.7:53622 dest: /192.168.158.4:9866 2025-07-17 10:50:45,617 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53622, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2008018491_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752828_12004, duration(ns): 20757408 2025-07-17 10:50:45,617 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752828_12004, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-17 10:50:48,532 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752828_12004 replica FinalizedReplica, blk_1073752828_12004, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752828 for deletion 2025-07-17 10:50:48,533 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752828_12004 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752828 2025-07-17 10:51:45,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752829_12005 src: /192.168.158.6:60868 dest: /192.168.158.4:9866 2025-07-17 10:51:45,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60868, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1080420180_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752829_12005, duration(ns): 16283409 2025-07-17 10:51:45,617 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752829_12005, type=LAST_IN_PIPELINE terminating 2025-07-17 10:51:48,534 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752829_12005 replica FinalizedReplica, blk_1073752829_12005, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752829 for deletion
2025-07-17 10:51:48,536 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752829_12005 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752829
2025-07-17 10:52:50,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752830_12006 src: /192.168.158.5:47656 dest: /192.168.158.4:9866
2025-07-17 10:52:50,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47656, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_191046445_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752830_12006, duration(ns): 17447123
2025-07-17 10:52:50,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752830_12006, type=LAST_IN_PIPELINE terminating
2025-07-17 10:52:51,536 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752830_12006 replica FinalizedReplica, blk_1073752830_12006, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752830 for deletion
2025-07-17 10:52:51,537 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752830_12006 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752830
2025-07-17 10:53:50,594 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752831_12007 src: /192.168.158.5:58442 dest: /192.168.158.4:9866
2025-07-17 10:53:50,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58442, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-181392412_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752831_12007, duration(ns): 15770279
2025-07-17 10:53:50,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752831_12007, type=LAST_IN_PIPELINE terminating
2025-07-17 10:53:51,537 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752831_12007 replica FinalizedReplica, blk_1073752831_12007, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752831 for deletion
2025-07-17 10:53:51,538 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752831_12007 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir10/blk_1073752831
2025-07-17 10:54:50,624 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752832_12008 src: /192.168.158.1:43576 dest: /192.168.158.4:9866
2025-07-17 10:54:50,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43576, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-475781113_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752832_12008, duration(ns): 24618757
2025-07-17 10:54:50,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752832_12008, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-17 10:54:51,539 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752832_12008 replica FinalizedReplica, blk_1073752832_12008, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752832 for deletion
2025-07-17 10:54:51,540 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752832_12008 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752832
2025-07-17 10:55:50,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752833_12009 src: /192.168.158.9:48724 dest: /192.168.158.4:9866
2025-07-17 10:55:50,623 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48724, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2081302261_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752833_12009, duration(ns): 20083497
2025-07-17 10:55:50,624 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752833_12009, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 10:55:51,541 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752833_12009 replica FinalizedReplica, blk_1073752833_12009, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752833 for deletion
2025-07-17 10:55:51,542 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752833_12009 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752833
2025-07-17 10:56:50,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752834_12010 src: /192.168.158.1:42426 dest: /192.168.158.4:9866
2025-07-17 10:56:50,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42426, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_292706869_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752834_12010, duration(ns): 23826689
2025-07-17 10:56:50,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752834_12010, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-17 10:56:51,542 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752834_12010 replica FinalizedReplica, blk_1073752834_12010, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752834 for deletion
2025-07-17 10:56:51,544 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752834_12010 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752834
2025-07-17 10:57:50,610 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752835_12011 src: /192.168.158.7:33804 dest: /192.168.158.4:9866
2025-07-17 10:57:50,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33804, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1338880041_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752835_12011, duration(ns): 17945823
2025-07-17 10:57:50,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752835_12011, type=LAST_IN_PIPELINE terminating
2025-07-17 10:57:54,543 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752835_12011 replica FinalizedReplica, blk_1073752835_12011, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752835 for deletion
2025-07-17 10:57:54,545 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752835_12011 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752835
2025-07-17 10:58:50,611 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752836_12012 src: /192.168.158.8:48046 dest: /192.168.158.4:9866
2025-07-17 10:58:50,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48046, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_529895507_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752836_12012, duration(ns): 20903659
2025-07-17 10:58:50,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752836_12012, type=LAST_IN_PIPELINE terminating
2025-07-17 10:58:51,546 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752836_12012 replica FinalizedReplica, blk_1073752836_12012, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752836 for deletion
2025-07-17 10:58:51,547 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752836_12012 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752836
2025-07-17 11:03:50,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752841_12017 src: /192.168.158.9:47330 dest: /192.168.158.4:9866
2025-07-17 11:03:50,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47330, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-135136617_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752841_12017, duration(ns): 20356589
2025-07-17 11:03:50,640 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752841_12017, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 11:03:54,556 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752841_12017 replica FinalizedReplica, blk_1073752841_12017, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752841 for deletion
2025-07-17 11:03:54,557 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752841_12017 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752841
2025-07-17 11:04:50,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752842_12018 src: /192.168.158.9:53136 dest: /192.168.158.4:9866
2025-07-17 11:04:50,640 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_766881996_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752842_12018, duration(ns): 18800774
2025-07-17 11:04:50,641 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752842_12018, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-17 11:04:51,558 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752842_12018 replica FinalizedReplica, blk_1073752842_12018, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752842 for deletion
2025-07-17 11:04:51,559 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752842_12018 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752842
2025-07-17 11:05:50,660 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752843_12019 src: /192.168.158.1:46508 dest: /192.168.158.4:9866
2025-07-17 11:05:50,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46508, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1039972026_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752843_12019, duration(ns): 24424408
2025-07-17 11:05:50,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752843_12019, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-17 11:05:54,559 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752843_12019 replica FinalizedReplica, blk_1073752843_12019, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752843 for deletion
2025-07-17 11:05:54,561 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752843_12019 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752843
2025-07-17 11:07:55,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752845_12021 src: /192.168.158.9:51592 dest: /192.168.158.4:9866
2025-07-17 11:07:55,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51592, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1006852523_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752845_12021, duration(ns): 16695859
2025-07-17 11:07:55,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752845_12021, type=LAST_IN_PIPELINE terminating
2025-07-17 11:07:57,564 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752845_12021 replica FinalizedReplica, blk_1073752845_12021, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752845 for deletion
2025-07-17 11:07:57,565 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752845_12021 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752845
2025-07-17 11:09:00,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752846_12022 src: /192.168.158.9:40790 dest: /192.168.158.4:9866
2025-07-17 11:09:00,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40790, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1031749390_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752846_12022, duration(ns): 15055794
2025-07-17 11:09:00,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752846_12022, type=LAST_IN_PIPELINE terminating
2025-07-17 11:09:06,565 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752846_12022 replica FinalizedReplica, blk_1073752846_12022, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752846 for deletion
2025-07-17 11:09:06,566 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752846_12022 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752846
2025-07-17 11:10:05,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752847_12023 src: /192.168.158.8:45500 dest: /192.168.158.4:9866
2025-07-17 11:10:05,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45500, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2115481634_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752847_12023, duration(ns): 17776195
2025-07-17 11:10:05,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752847_12023, type=LAST_IN_PIPELINE terminating
2025-07-17 11:10:09,569 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752847_12023 replica FinalizedReplica, blk_1073752847_12023, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752847 for deletion
2025-07-17 11:10:09,570 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752847_12023 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752847
2025-07-17 11:17:10,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752854_12030 src: /192.168.158.7:54678 dest: /192.168.158.4:9866
2025-07-17 11:17:10,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54678, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1375141324_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752854_12030, duration(ns): 18737524
2025-07-17 11:17:10,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752854_12030, type=LAST_IN_PIPELINE terminating
2025-07-17 11:17:15,582 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752854_12030 replica FinalizedReplica, blk_1073752854_12030, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752854 for deletion
2025-07-17 11:17:15,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752854_12030 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752854
2025-07-17 11:18:10,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752855_12031 src: /192.168.158.1:59266 dest: /192.168.158.4:9866
2025-07-17 11:18:10,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59266, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1933647851_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752855_12031, duration(ns): 22680817
2025-07-17 11:18:10,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752855_12031, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-17 11:18:12,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752855_12031 replica FinalizedReplica, blk_1073752855_12031, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752855 for deletion
2025-07-17 11:18:12,584 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752855_12031 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752855
2025-07-17 11:19:10,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752856_12032 src: /192.168.158.6:42954 dest: /192.168.158.4:9866
2025-07-17 11:19:10,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42954, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1370449981_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752856_12032, duration(ns): 19905541
2025-07-17 11:19:10,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752856_12032, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-17 11:19:12,582 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752856_12032 replica FinalizedReplica, blk_1073752856_12032, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752856 for deletion
2025-07-17 11:19:12,584 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752856_12032 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752856
2025-07-17 11:21:10,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752858_12034 src: /192.168.158.6:39980 dest: /192.168.158.4:9866
2025-07-17 11:21:10,673 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39980, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-759227215_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752858_12034, duration(ns): 17300924
2025-07-17 11:21:10,673 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752858_12034, type=LAST_IN_PIPELINE terminating
2025-07-17 11:21:15,589 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752858_12034 replica FinalizedReplica, blk_1073752858_12034, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752858 for deletion
2025-07-17 11:21:15,590 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752858_12034 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752858
2025-07-17 11:23:15,665 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752860_12036 src: /192.168.158.8:36392 dest: /192.168.158.4:9866
2025-07-17 11:23:15,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36392, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-79537250_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752860_12036, duration(ns): 20456051
2025-07-17 11:23:15,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752860_12036, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-17 11:23:18,593 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752860_12036 replica FinalizedReplica, blk_1073752860_12036, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752860 for deletion
2025-07-17 11:23:18,594 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752860_12036 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752860
2025-07-17 11:24:15,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752861_12037 src: /192.168.158.1:44168 dest: /192.168.158.4:9866
2025-07-17 11:24:15,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44168, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1731396790_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752861_12037, duration(ns): 24354503
2025-07-17 11:24:15,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752861_12037, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-17 11:24:18,596 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752861_12037 replica FinalizedReplica, blk_1073752861_12037, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752861 for deletion
2025-07-17 11:24:18,597 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752861_12037 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752861
2025-07-17 11:25:15,660 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752862_12038 src: /192.168.158.1:38722 dest: /192.168.158.4:9866
2025-07-17 11:25:15,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38722, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1950416522_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752862_12038, duration(ns): 22898885
2025-07-17 11:25:15,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752862_12038, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-17 11:25:18,597 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752862_12038 replica FinalizedReplica, blk_1073752862_12038, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752862 for deletion
2025-07-17 11:25:18,600 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752862_12038 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752862
2025-07-17 11:27:15,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752864_12040 src: /192.168.158.1:51814 dest: /192.168.158.4:9866
2025-07-17 11:27:15,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51814, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1595847001_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752864_12040, duration(ns): 23318509
2025-07-17 11:27:15,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752864_12040, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-17 11:27:18,600 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752864_12040 replica FinalizedReplica, blk_1073752864_12040, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752864 for deletion
2025-07-17 11:27:18,601 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752864_12040 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752864
2025-07-17 11:29:20,657 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752866_12042 src: /192.168.158.5:50134 dest: /192.168.158.4:9866
2025-07-17 11:29:20,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50134, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1918253935_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752866_12042, duration(ns): 20006788
2025-07-17 11:29:20,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752866_12042, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-17 11:29:24,608 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752866_12042 replica FinalizedReplica, blk_1073752866_12042, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752866 for deletion
2025-07-17 11:29:24,609 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752866_12042 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752866
2025-07-17 11:32:25,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752869_12045 src: /192.168.158.1:52432 dest: /192.168.158.4:9866
2025-07-17 11:32:25,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52432, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_548291141_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752869_12045, duration(ns): 24963459
2025-07-17 11:32:25,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752869_12045, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-17 11:32:27,614 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752869_12045 replica FinalizedReplica, blk_1073752869_12045, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752869 for deletion
2025-07-17 11:32:27,615 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752869_12045 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752869
2025-07-17 11:33:25,665 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752870_12046 src: /192.168.158.1:55714 dest: /192.168.158.4:9866
2025-07-17 11:33:25,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55714, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1913253542_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752870_12046, duration(ns): 26477039
2025-07-17 11:33:25,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752870_12046, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-17 11:33:30,616 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752870_12046 replica FinalizedReplica, blk_1073752870_12046, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752870 for deletion
2025-07-17 11:33:30,617 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752870_12046 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752870
2025-07-17 11:34:25,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752871_12047 src: /192.168.158.1:55508 dest: /192.168.158.4:9866
2025-07-17 11:34:25,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55508, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1653384844_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752871_12047, duration(ns): 23308581
2025-07-17 11:34:25,697 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752871_12047, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-17 11:34:27,617 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752871_12047 replica FinalizedReplica, blk_1073752871_12047, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752871 for deletion
2025-07-17 11:34:27,618 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752871_12047 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752871
2025-07-17 11:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-17 11:36:25,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752873_12049 src: /192.168.158.5:35582 dest: /192.168.158.4:9866
2025-07-17 11:36:25,704 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35582, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-617838728_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752873_12049, duration(ns): 21576523
2025-07-17 11:36:25,705 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752873_12049, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-17 11:36:30,619 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752873_12049 replica FinalizedReplica, blk_1073752873_12049, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752873 for deletion
2025-07-17 11:36:30,620 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752873_12049 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752873
2025-07-17 11:38:25,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752875_12051 src: /192.168.158.8:40878 dest: /192.168.158.4:9866
2025-07-17 11:38:25,704 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40878, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_879614514_236, offset: 0, srvID:
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752875_12051, duration(ns): 18127732 2025-07-17 11:38:25,704 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752875_12051, type=LAST_IN_PIPELINE terminating 2025-07-17 11:38:27,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752875_12051 replica FinalizedReplica, blk_1073752875_12051, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752875 for deletion 2025-07-17 11:38:27,626 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752875_12051 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752875 2025-07-17 11:39:25,676 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752876_12052 src: /192.168.158.1:48144 dest: /192.168.158.4:9866 2025-07-17 11:39:25,708 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48144, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-387442611_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752876_12052, duration(ns): 22551506 2025-07-17 11:39:25,708 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752876_12052, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-17 11:39:30,626 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752876_12052 replica FinalizedReplica, blk_1073752876_12052, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752876 for deletion 2025-07-17 11:39:30,627 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752876_12052 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752876 2025-07-17 11:40:30,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752877_12053 src: /192.168.158.8:51646 dest: /192.168.158.4:9866 2025-07-17 11:40:30,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51646, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_605144628_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752877_12053, duration(ns): 18176474 2025-07-17 11:40:30,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752877_12053, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-17 11:40:36,626 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752877_12053 replica FinalizedReplica, blk_1073752877_12053, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752877 for deletion 2025-07-17 11:40:36,628 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752877_12053 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752877 2025-07-17 11:41:35,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752878_12054 src: /192.168.158.8:47416 dest: /192.168.158.4:9866 2025-07-17 11:41:35,705 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47416, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-360682777_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752878_12054, duration(ns): 16368388 2025-07-17 11:41:35,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752878_12054, type=LAST_IN_PIPELINE terminating 2025-07-17 11:41:39,631 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752878_12054 replica FinalizedReplica, blk_1073752878_12054, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752878 for deletion 2025-07-17 11:41:39,632 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752878_12054 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752878 2025-07-17 11:43:35,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752880_12056 src: /192.168.158.5:38406 dest: /192.168.158.4:9866 
2025-07-17 11:43:35,719 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38406, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1443279460_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752880_12056, duration(ns): 15910519 2025-07-17 11:43:35,719 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752880_12056, type=LAST_IN_PIPELINE terminating 2025-07-17 11:43:39,635 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752880_12056 replica FinalizedReplica, blk_1073752880_12056, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752880 for deletion 2025-07-17 11:43:39,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752880_12056 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752880 2025-07-17 11:45:35,690 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752882_12058 src: /192.168.158.1:46178 dest: /192.168.158.4:9866 2025-07-17 11:45:35,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46178, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_291116742_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752882_12058, duration(ns): 23925748 2025-07-17 11:45:35,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073752882_12058, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-17 11:45:36,638 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752882_12058 replica FinalizedReplica, blk_1073752882_12058, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752882 for deletion 2025-07-17 11:45:36,639 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752882_12058 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752882 2025-07-17 11:49:35,708 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752886_12062 src: /192.168.158.9:34390 dest: /192.168.158.4:9866 2025-07-17 11:49:35,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34390, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-598726351_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752886_12062, duration(ns): 15841536 2025-07-17 11:49:35,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752886_12062, type=LAST_IN_PIPELINE terminating 2025-07-17 11:49:36,646 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752886_12062 replica FinalizedReplica, blk_1073752886_12062, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752886 for deletion 2025-07-17 11:49:36,647 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752886_12062 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752886 2025-07-17 11:52:40,703 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752889_12065 src: /192.168.158.1:55692 dest: /192.168.158.4:9866 2025-07-17 11:52:40,739 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55692, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1243824460_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752889_12065, duration(ns): 27075018 2025-07-17 11:52:40,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752889_12065, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-17 11:52:42,655 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752889_12065 replica FinalizedReplica, blk_1073752889_12065, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752889 for deletion 2025-07-17 11:52:42,656 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752889_12065 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752889 2025-07-17 11:53:40,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752890_12066 src: /192.168.158.1:59410 dest: /192.168.158.4:9866 2025-07-17 11:53:40,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59410, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-631217154_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752890_12066, duration(ns): 25713964 2025-07-17 11:53:40,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752890_12066, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-17 11:53:42,655 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752890_12066 replica FinalizedReplica, blk_1073752890_12066, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752890 for deletion 2025-07-17 11:53:42,656 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752890_12066 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752890 2025-07-17 11:56:40,712 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752893_12069 src: /192.168.158.8:34782 dest: /192.168.158.4:9866 2025-07-17 11:56:40,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.8:34782, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1316834765_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752893_12069, duration(ns): 20864863 2025-07-17 11:56:40,739 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752893_12069, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-17 11:56:42,659 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752893_12069 replica FinalizedReplica, blk_1073752893_12069, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752893 for deletion 2025-07-17 11:56:42,661 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752893_12069 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752893 2025-07-17 12:00:40,719 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752897_12073 src: /192.168.158.7:41668 dest: /192.168.158.4:9866 2025-07-17 12:00:40,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41668, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1077354231_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752897_12073, duration(ns): 22101104 2025-07-17 12:00:40,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752897_12073, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-17 12:00:45,672 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752897_12073 replica FinalizedReplica, blk_1073752897_12073, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752897 for deletion 2025-07-17 12:00:45,674 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752897_12073 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752897 2025-07-17 12:01:45,716 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752898_12074 src: /192.168.158.7:56416 dest: /192.168.158.4:9866 2025-07-17 12:01:45,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56416, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_174952302_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752898_12074, duration(ns): 20000110 2025-07-17 12:01:45,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752898_12074, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-17 12:01:48,678 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752898_12074 replica FinalizedReplica, blk_1073752898_12074, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752898 for deletion 2025-07-17 12:01:48,679 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752898_12074 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752898 2025-07-17 12:02:45,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752899_12075 src: /192.168.158.8:37308 dest: /192.168.158.4:9866 2025-07-17 12:02:45,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37308, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-301343492_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752899_12075, duration(ns): 17244106 2025-07-17 12:02:45,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752899_12075, type=LAST_IN_PIPELINE terminating 2025-07-17 12:02:48,681 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752899_12075 replica FinalizedReplica, blk_1073752899_12075, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752899 for deletion 2025-07-17 12:02:48,682 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752899_12075 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752899 2025-07-17 12:03:50,719 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752900_12076 src: /192.168.158.7:56018 dest: /192.168.158.4:9866 2025-07-17 12:03:50,746 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56018, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_177003557_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752900_12076, duration(ns): 21293369 2025-07-17 12:03:50,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752900_12076, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-17 12:03:51,680 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752900_12076 replica FinalizedReplica, blk_1073752900_12076, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752900 for deletion 2025-07-17 12:03:51,682 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752900_12076 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752900 2025-07-17 12:05:50,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752902_12078 src: /192.168.158.5:44120 dest: /192.168.158.4:9866 2025-07-17 12:05:50,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44120, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1420388166_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752902_12078, duration(ns): 19805075 2025-07-17 12:05:50,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752902_12078, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-17 12:05:54,687 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752902_12078 replica FinalizedReplica, blk_1073752902_12078, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752902 for deletion 2025-07-17 12:05:54,688 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752902_12078 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752902 2025-07-17 12:07:55,734 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752904_12080 src: /192.168.158.5:48836 dest: /192.168.158.4:9866 2025-07-17 12:07:55,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48836, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1615015483_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752904_12080, duration(ns): 16759434 2025-07-17 12:07:55,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752904_12080, type=LAST_IN_PIPELINE terminating 2025-07-17 12:07:57,691 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073752904_12080 replica FinalizedReplica, blk_1073752904_12080, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752904 for deletion 2025-07-17 12:07:57,692 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752904_12080 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752904 2025-07-17 12:10:00,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752906_12082 src: /192.168.158.1:39732 dest: /192.168.158.4:9866 2025-07-17 12:10:00,780 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39732, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1378358741_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752906_12082, duration(ns): 25359454 2025-07-17 12:10:00,780 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752906_12082, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-17 12:10:06,696 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752906_12082 replica FinalizedReplica, blk_1073752906_12082, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752906 for deletion 2025-07-17 12:10:06,697 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752906_12082 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752906
2025-07-17 12:11:00,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752907_12083 src: /192.168.158.6:51142 dest: /192.168.158.4:9866
2025-07-17 12:11:00,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51142, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1574018269_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752907_12083, duration(ns): 23151548
2025-07-17 12:11:00,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752907_12083, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-17 12:11:03,696 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752907_12083 replica FinalizedReplica, blk_1073752907_12083, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752907 for deletion
2025-07-17 12:11:03,697 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752907_12083 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752907
2025-07-17 12:12:00,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752908_12084 src: /192.168.158.8:51684 dest: /192.168.158.4:9866
2025-07-17 12:12:00,775 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51684, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1237716380_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752908_12084, duration(ns): 18027362
2025-07-17 12:12:00,775 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752908_12084, type=LAST_IN_PIPELINE terminating
2025-07-17 12:12:03,697 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752908_12084 replica FinalizedReplica, blk_1073752908_12084, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752908 for deletion
2025-07-17 12:12:03,699 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752908_12084 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752908
2025-07-17 12:13:00,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752909_12085 src: /192.168.158.1:54670 dest: /192.168.158.4:9866
2025-07-17 12:13:00,824 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54670, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-398608740_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752909_12085, duration(ns): 27257482
2025-07-17 12:13:00,824 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752909_12085, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-17 12:13:03,702 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752909_12085 replica FinalizedReplica, blk_1073752909_12085, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752909 for deletion
2025-07-17 12:13:03,703 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752909_12085 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752909
2025-07-17 12:17:00,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752913_12089 src: /192.168.158.1:45350 dest: /192.168.158.4:9866
2025-07-17 12:17:00,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45350, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2022049130_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752913_12089, duration(ns): 21687949
2025-07-17 12:17:00,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752913_12089, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-17 12:17:03,709 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752913_12089 replica FinalizedReplica, blk_1073752913_12089, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752913 for deletion
2025-07-17 12:17:03,710 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752913_12089 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752913
2025-07-17 12:18:00,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752914_12090 src: /192.168.158.8:50526 dest: /192.168.158.4:9866
2025-07-17 12:18:00,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50526, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1626887372_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752914_12090, duration(ns): 17029690
2025-07-17 12:18:00,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752914_12090, type=LAST_IN_PIPELINE terminating
2025-07-17 12:18:03,710 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752914_12090 replica FinalizedReplica, blk_1073752914_12090, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752914 for deletion
2025-07-17 12:18:03,712 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752914_12090 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752914
2025-07-17 12:24:05,774 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752920_12096 src: /192.168.158.6:44988 dest: /192.168.158.4:9866
2025-07-17 12:24:05,793 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44988, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_540941308_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752920_12096, duration(ns): 16588667
2025-07-17 12:24:05,793 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752920_12096, type=LAST_IN_PIPELINE terminating
2025-07-17 12:24:12,724 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752920_12096 replica FinalizedReplica, blk_1073752920_12096, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752920 for deletion
2025-07-17 12:24:12,725 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752920_12096 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752920
2025-07-17 12:25:05,775 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752921_12097 src: /192.168.158.5:39672 dest: /192.168.158.4:9866
2025-07-17 12:25:05,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-394026874_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752921_12097, duration(ns): 17006133
2025-07-17 12:25:05,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752921_12097, type=LAST_IN_PIPELINE terminating
2025-07-17 12:25:12,727 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752921_12097 replica FinalizedReplica, blk_1073752921_12097, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752921 for deletion
2025-07-17 12:25:12,728 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752921_12097 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752921
2025-07-17 12:26:05,801 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752922_12098 src: /192.168.158.7:59484 dest: /192.168.158.4:9866
2025-07-17 12:26:05,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59484, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2061162068_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752922_12098, duration(ns): 18244574
2025-07-17 12:26:05,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752922_12098, type=LAST_IN_PIPELINE terminating
2025-07-17 12:26:12,728 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752922_12098 replica FinalizedReplica, blk_1073752922_12098, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752922 for deletion
2025-07-17 12:26:12,730 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752922_12098 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752922
2025-07-17 12:27:05,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752923_12099 src: /192.168.158.9:42226 dest: /192.168.158.4:9866
2025-07-17 12:27:05,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42226, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1937171410_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752923_12099, duration(ns): 16387536
2025-07-17 12:27:05,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752923_12099, type=LAST_IN_PIPELINE terminating
2025-07-17 12:27:09,729 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752923_12099 replica FinalizedReplica, blk_1073752923_12099, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752923 for deletion
2025-07-17 12:27:09,730 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752923_12099 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752923
2025-07-17 12:29:05,762 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752925_12101 src: /192.168.158.6:42506 dest: /192.168.158.4:9866
2025-07-17 12:29:05,789 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42506, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1090469686_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752925_12101, duration(ns): 21442054
2025-07-17 12:29:05,790 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752925_12101, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-17 12:29:09,733 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752925_12101 replica FinalizedReplica, blk_1073752925_12101, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752925 for deletion
2025-07-17 12:29:09,734 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752925_12101 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752925
2025-07-17 12:32:10,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752928_12104 src: /192.168.158.5:42656 dest: /192.168.158.4:9866
2025-07-17 12:32:10,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42656, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1680041931_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752928_12104, duration(ns): 16731729
2025-07-17 12:32:10,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752928_12104, type=LAST_IN_PIPELINE terminating
2025-07-17 12:32:15,738 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752928_12104 replica FinalizedReplica, blk_1073752928_12104, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752928 for deletion
2025-07-17 12:32:15,739 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752928_12104 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752928
2025-07-17 12:35:10,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752931_12107 src: /192.168.158.1:58020 dest: /192.168.158.4:9866
2025-07-17 12:35:10,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58020, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_938403126_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752931_12107, duration(ns): 25542463
2025-07-17 12:35:10,823 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752931_12107, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-17 12:35:15,743 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752931_12107 replica FinalizedReplica, blk_1073752931_12107, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752931 for deletion
2025-07-17 12:35:15,744 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752931_12107 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752931
2025-07-17 12:38:15,781 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752934_12110 src: /192.168.158.8:51374 dest: /192.168.158.4:9866
2025-07-17 12:38:15,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51374, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1861432785_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752934_12110, duration(ns): 21546345
2025-07-17 12:38:15,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752934_12110, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 12:38:21,747 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752934_12110 replica FinalizedReplica, blk_1073752934_12110, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752934 for deletion
2025-07-17 12:38:21,749 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752934_12110 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752934
2025-07-17 12:42:20,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752938_12114 src: /192.168.158.1:44178 dest: /192.168.158.4:9866
2025-07-17 12:42:20,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44178, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-860079835_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752938_12114, duration(ns): 23677019
2025-07-17 12:42:20,812 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752938_12114, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-17 12:42:27,755 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752938_12114 replica FinalizedReplica, blk_1073752938_12114, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752938 for deletion
2025-07-17 12:42:27,756 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752938_12114 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752938
2025-07-17 12:45:20,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752941_12117 src: /192.168.158.8:50294 dest: /192.168.158.4:9866
2025-07-17 12:45:20,823 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50294, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1939124137_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752941_12117, duration(ns): 21905586
2025-07-17 12:45:20,823 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752941_12117, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 12:45:24,759 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752941_12117 replica FinalizedReplica, blk_1073752941_12117, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752941 for deletion
2025-07-17 12:45:24,760 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752941_12117 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752941
2025-07-17 12:47:20,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752943_12119 src: /192.168.158.5:39228 dest: /192.168.158.4:9866
2025-07-17 12:47:20,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39228, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1109595297_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752943_12119, duration(ns): 18660889
2025-07-17 12:47:20,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752943_12119, type=LAST_IN_PIPELINE terminating
2025-07-17 12:47:24,764 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752943_12119 replica FinalizedReplica, blk_1073752943_12119, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752943 for deletion
2025-07-17 12:47:24,765 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752943_12119 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752943
2025-07-17 12:48:25,801 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752944_12120 src: /192.168.158.6:36678 dest: /192.168.158.4:9866
2025-07-17 12:48:25,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36678, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1460946124_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752944_12120, duration(ns): 20941573
2025-07-17 12:48:25,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752944_12120, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-17 12:48:30,765 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752944_12120 replica FinalizedReplica, blk_1073752944_12120, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752944 for deletion
2025-07-17 12:48:30,766 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752944_12120 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752944
2025-07-17 12:51:30,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752947_12123 src: /192.168.158.6:57174 dest: /192.168.158.4:9866
2025-07-17 12:51:30,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57174, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1741955855_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752947_12123, duration(ns): 15627472
2025-07-17 12:51:30,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752947_12123, type=LAST_IN_PIPELINE terminating
2025-07-17 12:51:33,772 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752947_12123 replica FinalizedReplica, blk_1073752947_12123, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752947 for deletion
2025-07-17 12:51:33,773 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752947_12123 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752947
2025-07-17 12:52:30,812 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752948_12124 src: /192.168.158.7:58202 dest: /192.168.158.4:9866
2025-07-17 12:52:30,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58202, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1891693106_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752948_12124, duration(ns): 17714404
2025-07-17 12:52:30,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752948_12124, type=LAST_IN_PIPELINE terminating
2025-07-17 12:52:33,775 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752948_12124 replica FinalizedReplica, blk_1073752948_12124, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752948 for deletion
2025-07-17 12:52:33,776 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752948_12124 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752948
2025-07-17 12:53:35,814 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752949_12125 src: /192.168.158.5:57512 dest: /192.168.158.4:9866
2025-07-17 12:53:35,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57512, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1434903848_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752949_12125, duration(ns): 14959668
2025-07-17 12:53:35,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752949_12125, type=LAST_IN_PIPELINE terminating
2025-07-17 12:53:39,775 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752949_12125 replica FinalizedReplica, blk_1073752949_12125, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752949 for deletion
2025-07-17 12:53:39,776 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752949_12125 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752949
2025-07-17 12:55:35,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752951_12127 src: /192.168.158.1:59300 dest: /192.168.158.4:9866
2025-07-17 12:55:35,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59300, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-329296714_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752951_12127, duration(ns): 24305242
2025-07-17 12:55:35,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752951_12127, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-17 12:55:42,779 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752951_12127 replica FinalizedReplica, blk_1073752951_12127, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752951 for deletion
2025-07-17 12:55:42,781 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752951_12127 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752951
2025-07-17 12:56:35,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752952_12128 src: /192.168.158.5:48140 dest: /192.168.158.4:9866
2025-07-17 12:56:35,863 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48140, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-183962712_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752952_12128, duration(ns): 18389616
2025-07-17 12:56:35,863 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752952_12128, type=LAST_IN_PIPELINE terminating
2025-07-17 12:56:39,782 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752952_12128 replica FinalizedReplica, blk_1073752952_12128, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752952 for deletion
2025-07-17 12:56:39,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752952_12128 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752952
2025-07-17 12:57:35,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752953_12129 src: /192.168.158.6:36690 dest: /192.168.158.4:9866
2025-07-17 12:57:35,841 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36690, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_726479102_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752953_12129, duration(ns): 19176044
2025-07-17 12:57:35,841 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752953_12129, type=LAST_IN_PIPELINE terminating
2025-07-17 12:57:39,784 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752953_12129 replica FinalizedReplica, blk_1073752953_12129, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752953 for deletion
2025-07-17 12:57:39,786 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752953_12129 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752953
2025-07-17 13:00:35,810 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752956_12132 src: /192.168.158.1:34700 dest: /192.168.158.4:9866
2025-07-17 13:00:35,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34700, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1443683092_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752956_12132, duration(ns): 23767855
2025-07-17 13:00:35,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752956_12132, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-17 13:00:39,791 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752956_12132 replica FinalizedReplica, blk_1073752956_12132, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752956 for deletion
2025-07-17 13:00:39,792 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752956_12132 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752956
2025-07-17 13:01:40,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752957_12133 src: /192.168.158.1:44770 dest: /192.168.158.4:9866
2025-07-17 13:01:40,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44770, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_144415982_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752957_12133, duration(ns): 22207581
2025-07-17 13:01:40,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752957_12133, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-17 13:01:48,797 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752957_12133 replica FinalizedReplica, blk_1073752957_12133, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752957 for deletion
2025-07-17 13:01:48,798 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752957_12133 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752957
2025-07-17 13:02:40,814 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752958_12134 src: /192.168.158.5:59192 dest: /192.168.158.4:9866
2025-07-17 13:02:40,834 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59192, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-525922776_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752958_12134, duration(ns): 17039172
2025-07-17 13:02:40,834 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752958_12134, type=LAST_IN_PIPELINE terminating
2025-07-17 13:02:48,801 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752958_12134 replica FinalizedReplica, blk_1073752958_12134, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752958 for deletion
2025-07-17 13:02:48,802 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752958_12134 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752958
2025-07-17 13:07:40,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752963_12139 src: /192.168.158.5:53220 dest: /192.168.158.4:9866
2025-07-17 13:07:40,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53220, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1958969424_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752963_12139, duration(ns): 20774873
2025-07-17 13:07:40,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752963_12139, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-17 13:07:45,811 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752963_12139 replica FinalizedReplica, blk_1073752963_12139, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() =
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752963 for deletion 2025-07-17 13:07:45,812 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752963_12139 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752963 2025-07-17 13:11:45,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752967_12143 src: /192.168.158.7:36184 dest: /192.168.158.4:9866 2025-07-17 13:11:45,856 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36184, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1892678411_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752967_12143, duration(ns): 18313174 2025-07-17 13:11:45,857 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752967_12143, type=LAST_IN_PIPELINE terminating 2025-07-17 13:11:48,817 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752967_12143 replica FinalizedReplica, blk_1073752967_12143, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752967 for deletion 2025-07-17 13:11:48,818 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752967_12143 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752967 2025-07-17 13:12:45,835 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752968_12144 src: /192.168.158.1:53274 dest: /192.168.158.4:9866 2025-07-17 13:12:45,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53274, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1693665846_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752968_12144, duration(ns): 25425175 2025-07-17 13:12:45,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752968_12144, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-17 13:12:48,819 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752968_12144 replica FinalizedReplica, blk_1073752968_12144, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752968 for deletion 2025-07-17 13:12:48,821 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752968_12144 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752968 2025-07-17 13:14:45,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752970_12146 src: /192.168.158.6:59812 dest: /192.168.158.4:9866 2025-07-17 13:14:45,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59812, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1171420458_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752970_12146, duration(ns): 18020869 2025-07-17 13:14:45,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752970_12146, type=LAST_IN_PIPELINE terminating 2025-07-17 13:14:48,823 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752970_12146 replica FinalizedReplica, blk_1073752970_12146, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752970 for deletion 2025-07-17 13:14:48,824 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752970_12146 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752970 2025-07-17 13:17:50,859 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752973_12149 src: /192.168.158.9:47540 dest: /192.168.158.4:9866 2025-07-17 13:17:50,885 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47540, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2096048032_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752973_12149, duration(ns): 20043757 2025-07-17 13:17:50,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752973_12149, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-17 13:17:54,831 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073752973_12149 replica FinalizedReplica, blk_1073752973_12149, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752973 for deletion 2025-07-17 13:17:54,832 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752973_12149 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752973 2025-07-17 13:18:50,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752974_12150 src: /192.168.158.1:41150 dest: /192.168.158.4:9866 2025-07-17 13:18:50,915 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41150, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-884061805_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752974_12150, duration(ns): 21565778 2025-07-17 13:18:50,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752974_12150, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-17 13:18:54,834 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752974_12150 replica FinalizedReplica, blk_1073752974_12150, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752974 for deletion 2025-07-17 13:18:54,835 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752974_12150 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752974 2025-07-17 13:19:50,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752975_12151 src: /192.168.158.1:48698 dest: /192.168.158.4:9866 2025-07-17 13:19:50,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48698, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1266516732_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752975_12151, duration(ns): 23889221 2025-07-17 13:19:50,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752975_12151, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-17 13:19:54,835 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752975_12151 replica FinalizedReplica, blk_1073752975_12151, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752975 for deletion 2025-07-17 13:19:54,837 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752975_12151 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752975 2025-07-17 13:25:50,856 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073752981_12157 src: /192.168.158.1:42154 dest: /192.168.158.4:9866 2025-07-17 13:25:50,891 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42154, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1241121062_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752981_12157, duration(ns): 26078950 2025-07-17 13:25:50,891 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752981_12157, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-17 13:25:57,851 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752981_12157 replica FinalizedReplica, blk_1073752981_12157, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752981 for deletion 2025-07-17 13:25:57,852 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752981_12157 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752981 2025-07-17 13:26:50,870 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752982_12158 src: /192.168.158.7:55048 dest: /192.168.158.4:9866 2025-07-17 13:26:50,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55048, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2079634562_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073752982_12158, duration(ns): 19194964 2025-07-17 13:26:50,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752982_12158, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-17 13:26:57,851 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752982_12158 replica FinalizedReplica, blk_1073752982_12158, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752982 for deletion 2025-07-17 13:26:57,852 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752982_12158 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752982 2025-07-17 13:27:50,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752983_12159 src: /192.168.158.8:32850 dest: /192.168.158.4:9866 2025-07-17 13:27:50,924 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:32850, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_331862084_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752983_12159, duration(ns): 19694972 2025-07-17 13:27:50,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752983_12159, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-17 13:27:54,852 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073752983_12159 replica FinalizedReplica, blk_1073752983_12159, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752983 for deletion 2025-07-17 13:27:54,854 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752983_12159 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752983 2025-07-17 13:30:50,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752986_12162 src: /192.168.158.1:57940 dest: /192.168.158.4:9866 2025-07-17 13:30:50,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57940, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1210993890_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752986_12162, duration(ns): 22776240 2025-07-17 13:30:50,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752986_12162, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-17 13:30:54,856 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752986_12162 replica FinalizedReplica, blk_1073752986_12162, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752986 for deletion 2025-07-17 13:30:54,857 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752986_12162 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752986 2025-07-17 13:31:55,870 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752987_12163 src: /192.168.158.1:57360 dest: /192.168.158.4:9866 2025-07-17 13:31:55,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57360, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1099060063_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752987_12163, duration(ns): 24862887 2025-07-17 13:31:55,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752987_12163, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-17 13:32:03,857 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752987_12163 replica FinalizedReplica, blk_1073752987_12163, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752987 for deletion 2025-07-17 13:32:03,859 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752987_12163 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752987 2025-07-17 13:33:55,880 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073752989_12165 src: /192.168.158.6:35354 dest: /192.168.158.4:9866 2025-07-17 13:33:55,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35354, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1830975520_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752989_12165, duration(ns): 15755695 2025-07-17 13:33:55,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752989_12165, type=LAST_IN_PIPELINE terminating 2025-07-17 13:34:03,861 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752989_12165 replica FinalizedReplica, blk_1073752989_12165, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752989 for deletion 2025-07-17 13:34:03,862 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752989_12165 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752989 2025-07-17 13:36:55,887 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752992_12168 src: /192.168.158.8:60284 dest: /192.168.158.4:9866 2025-07-17 13:36:55,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60284, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_400983344_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752992_12168, duration(ns): 22767304 
2025-07-17 13:36:55,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752992_12168, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-17 13:37:03,869 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752992_12168 replica FinalizedReplica, blk_1073752992_12168, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752992 for deletion 2025-07-17 13:37:03,870 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752992_12168 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752992 2025-07-17 13:38:55,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752994_12170 src: /192.168.158.1:58010 dest: /192.168.158.4:9866 2025-07-17 13:38:55,922 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58010, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1889992758_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752994_12170, duration(ns): 26521457 2025-07-17 13:38:55,923 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752994_12170, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-17 13:39:00,869 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752994_12170 replica FinalizedReplica, 
blk_1073752994_12170, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752994 for deletion 2025-07-17 13:39:00,870 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752994_12170 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752994 2025-07-17 13:41:00,920 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752996_12172 src: /192.168.158.8:44374 dest: /192.168.158.4:9866 2025-07-17 13:41:00,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44374, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1366624544_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752996_12172, duration(ns): 21512508 2025-07-17 13:41:00,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752996_12172, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-17 13:41:03,875 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752996_12172 replica FinalizedReplica, blk_1073752996_12172, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752996 for deletion 2025-07-17 13:41:03,876 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 
blk_1073752996_12172 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752996 2025-07-17 13:42:00,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752997_12173 src: /192.168.158.8:49402 dest: /192.168.158.4:9866 2025-07-17 13:42:00,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49402, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-757211297_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752997_12173, duration(ns): 20034351 2025-07-17 13:42:00,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752997_12173, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-17 13:42:03,876 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752997_12173 replica FinalizedReplica, blk_1073752997_12173, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752997 for deletion 2025-07-17 13:42:03,877 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752997_12173 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752997 2025-07-17 13:44:00,894 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073752999_12175 src: /192.168.158.5:50742 dest: /192.168.158.4:9866 2025-07-17 13:44:00,911 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: 
src: /192.168.158.5:50742, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2019819094_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073752999_12175, duration(ns): 14795327
2025-07-17 13:44:00,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073752999_12175, type=LAST_IN_PIPELINE terminating
2025-07-17 13:44:06,882 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073752999_12175 replica FinalizedReplica, blk_1073752999_12175, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752999 for deletion
2025-07-17 13:44:06,883 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073752999_12175 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073752999
2025-07-17 13:45:00,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753000_12176 src: /192.168.158.1:39596 dest: /192.168.158.4:9866
2025-07-17 13:45:00,924 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39596, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-775235207_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753000_12176, duration(ns): 22379726
2025-07-17 13:45:00,924 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753000_12176, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-17 13:45:06,884 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753000_12176 replica FinalizedReplica, blk_1073753000_12176, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753000 for deletion
2025-07-17 13:45:06,885 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753000_12176 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753000
2025-07-17 13:46:00,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753001_12177 src: /192.168.158.1:55492 dest: /192.168.158.4:9866
2025-07-17 13:46:00,924 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55492, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1719191612_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753001_12177, duration(ns): 22578432
2025-07-17 13:46:00,924 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753001_12177, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-17 13:46:03,885 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753001_12177 replica FinalizedReplica, blk_1073753001_12177, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753001 for deletion
2025-07-17 13:46:03,886 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753001_12177 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753001
2025-07-17 13:47:00,893 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753002_12178 src: /192.168.158.1:36448 dest: /192.168.158.4:9866
2025-07-17 13:47:00,928 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36448, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_9443712_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753002_12178, duration(ns): 25517000
2025-07-17 13:47:00,928 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753002_12178, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-17 13:47:03,886 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753002_12178 replica FinalizedReplica, blk_1073753002_12178, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753002 for deletion
2025-07-17 13:47:03,888 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753002_12178 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753002
2025-07-17 13:49:00,905 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753004_12180 src: /192.168.158.5:38344 dest: /192.168.158.4:9866
2025-07-17 13:49:00,924 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38344, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_196289707_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753004_12180, duration(ns): 16646514
2025-07-17 13:49:00,924 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753004_12180, type=LAST_IN_PIPELINE terminating
2025-07-17 13:49:03,890 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753004_12180 replica FinalizedReplica, blk_1073753004_12180, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753004 for deletion
2025-07-17 13:49:03,891 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753004_12180 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753004
2025-07-17 13:51:05,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753006_12182 src: /192.168.158.1:38300 dest: /192.168.158.4:9866
2025-07-17 13:51:05,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38300, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1663086732_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753006_12182, duration(ns): 24147176
2025-07-17 13:51:05,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753006_12182, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-17 13:51:09,893 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753006_12182 replica FinalizedReplica, blk_1073753006_12182, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753006 for deletion
2025-07-17 13:51:09,894 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753006_12182 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753006
2025-07-17 13:53:05,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753008_12184 src: /192.168.158.1:57722 dest: /192.168.158.4:9866
2025-07-17 13:53:05,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57722, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-351292344_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753008_12184, duration(ns): 24537345
2025-07-17 13:53:05,935 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753008_12184, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-17 13:53:09,898 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753008_12184 replica FinalizedReplica, blk_1073753008_12184, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753008 for deletion
2025-07-17 13:53:09,899 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753008_12184 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753008
2025-07-17 13:55:05,913 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753010_12186 src: /192.168.158.8:60332 dest: /192.168.158.4:9866
2025-07-17 13:55:05,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60332, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-843457272_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753010_12186, duration(ns): 18942100
2025-07-17 13:55:05,935 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753010_12186, type=LAST_IN_PIPELINE terminating
2025-07-17 13:55:09,902 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753010_12186 replica FinalizedReplica, blk_1073753010_12186, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753010 for deletion
2025-07-17 13:55:09,904 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753010_12186 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753010
2025-07-17 13:56:10,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753011_12187 src: /192.168.158.6:34890 dest: /192.168.158.4:9866
2025-07-17 13:56:10,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34890, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_601438260_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753011_12187, duration(ns): 18225973
2025-07-17 13:56:10,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753011_12187, type=LAST_IN_PIPELINE terminating
2025-07-17 13:56:15,904 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753011_12187 replica FinalizedReplica, blk_1073753011_12187, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753011 for deletion
2025-07-17 13:56:15,905 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753011_12187 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753011
2025-07-17 13:57:10,911 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753012_12188 src: /192.168.158.1:40594 dest: /192.168.158.4:9866
2025-07-17 13:57:10,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40594, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2138340724_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753012_12188, duration(ns): 25467459
2025-07-17 13:57:10,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753012_12188, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-17 13:57:15,908 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753012_12188 replica FinalizedReplica, blk_1073753012_12188, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753012 for deletion
2025-07-17 13:57:15,909 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753012_12188 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753012
2025-07-17 13:59:15,919 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753014_12190 src: /192.168.158.8:60760 dest: /192.168.158.4:9866
2025-07-17 13:59:15,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60760, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1265295945_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753014_12190, duration(ns): 19687709
2025-07-17 13:59:15,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753014_12190, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-17 13:59:21,914 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753014_12190 replica FinalizedReplica, blk_1073753014_12190, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753014 for deletion
2025-07-17 13:59:21,915 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753014_12190 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753014
2025-07-17 14:00:15,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753015_12191 src: /192.168.158.1:55750 dest: /192.168.158.4:9866
2025-07-17 14:00:15,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55750, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1076352078_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753015_12191, duration(ns): 25952438
2025-07-17 14:00:15,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753015_12191, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-17 14:00:21,917 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753015_12191 replica FinalizedReplica, blk_1073753015_12191, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753015 for deletion
2025-07-17 14:00:21,918 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753015_12191 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753015
2025-07-17 14:01:20,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753016_12192 src: /192.168.158.1:40548 dest: /192.168.158.4:9866
2025-07-17 14:01:20,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40548, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1416530932_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753016_12192, duration(ns): 23891089
2025-07-17 14:01:20,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753016_12192, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-17 14:01:24,919 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753016_12192 replica FinalizedReplica, blk_1073753016_12192, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753016 for deletion
2025-07-17 14:01:24,920 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753016_12192 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753016
2025-07-17 14:03:25,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753018_12194 src: /192.168.158.1:39666 dest: /192.168.158.4:9866
2025-07-17 14:03:25,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39666, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_819901362_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753018_12194, duration(ns): 21916331
2025-07-17 14:03:25,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753018_12194, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-17 14:03:30,921 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753018_12194 replica FinalizedReplica, blk_1073753018_12194, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753018 for deletion
2025-07-17 14:03:30,922 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753018_12194 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753018
2025-07-17 14:06:30,939 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753021_12197 src: /192.168.158.9:53474 dest: /192.168.158.4:9866
2025-07-17 14:06:30,960 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53474, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1805642621_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753021_12197, duration(ns): 18993140
2025-07-17 14:06:30,960 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753021_12197, type=LAST_IN_PIPELINE terminating
2025-07-17 14:06:36,926 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753021_12197 replica FinalizedReplica, blk_1073753021_12197, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753021 for deletion
2025-07-17 14:06:36,927 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753021_12197 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753021
2025-07-17 14:07:30,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753022_12198 src: /192.168.158.5:34578 dest: /192.168.158.4:9866
2025-07-17 14:07:30,975 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34578, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1799264548_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753022_12198, duration(ns): 17123707
2025-07-17 14:07:30,975 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753022_12198, type=LAST_IN_PIPELINE terminating
2025-07-17 14:07:33,929 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753022_12198 replica FinalizedReplica, blk_1073753022_12198, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753022 for deletion
2025-07-17 14:07:33,930 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753022_12198 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753022
2025-07-17 14:08:35,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753023_12199 src: /192.168.158.1:47324 dest: /192.168.158.4:9866
2025-07-17 14:08:35,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47324, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1525024995_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753023_12199, duration(ns): 29259589
2025-07-17 14:08:35,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753023_12199, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-17 14:08:39,931 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753023_12199 replica FinalizedReplica, blk_1073753023_12199, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753023 for deletion
2025-07-17 14:08:39,932 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753023_12199 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753023
2025-07-17 14:09:35,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753024_12200 src: /192.168.158.5:45878 dest: /192.168.158.4:9866
2025-07-17 14:09:35,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45878, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1160175267_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753024_12200, duration(ns): 46814906
2025-07-17 14:09:35,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753024_12200, type=LAST_IN_PIPELINE terminating
2025-07-17 14:09:39,932 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753024_12200 replica FinalizedReplica, blk_1073753024_12200, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753024 for deletion
2025-07-17 14:09:39,933 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753024_12200 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753024
2025-07-17 14:11:35,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753026_12202 src: /192.168.158.7:52002 dest: /192.168.158.4:9866
2025-07-17 14:11:35,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52002, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-681995346_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753026_12202, duration(ns): 16859614
2025-07-17 14:11:35,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753026_12202, type=LAST_IN_PIPELINE terminating
2025-07-17 14:11:39,938 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753026_12202 replica FinalizedReplica, blk_1073753026_12202, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753026 for deletion
2025-07-17 14:11:39,940 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753026_12202 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753026
2025-07-17 14:14:35,953 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753029_12205 src: /192.168.158.1:52454 dest: /192.168.158.4:9866
2025-07-17 14:14:35,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52454, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_588226015_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753029_12205, duration(ns): 24106301
2025-07-17 14:14:35,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753029_12205, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-17 14:14:42,945 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753029_12205 replica FinalizedReplica, blk_1073753029_12205, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753029 for deletion
2025-07-17 14:14:42,946 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753029_12205 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753029
2025-07-17 14:15:35,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753030_12206 src: /192.168.158.6:45712 dest: /192.168.158.4:9866
2025-07-17 14:15:35,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_657328692_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753030_12206, duration(ns): 20173673
2025-07-17 14:15:35,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753030_12206, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-17 14:15:42,948 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753030_12206 replica FinalizedReplica, blk_1073753030_12206, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753030 for deletion
2025-07-17 14:15:42,949 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753030_12206 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753030
2025-07-17 14:16:35,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753031_12207 src: /192.168.158.1:33770 dest: /192.168.158.4:9866
2025-07-17 14:16:35,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33770, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194008757_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753031_12207, duration(ns): 22447852
2025-07-17 14:16:35,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753031_12207, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-17 14:16:39,950 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753031_12207 replica FinalizedReplica, blk_1073753031_12207, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753031 for deletion
2025-07-17 14:16:39,951 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753031_12207 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753031
2025-07-17 14:17:35,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753032_12208 src: /192.168.158.9:32816 dest: /192.168.158.4:9866
2025-07-17 14:17:36,004 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:32816, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1460792434_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753032_12208, duration(ns): 20447638
2025-07-17 14:17:36,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753032_12208, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-17 14:17:39,953 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753032_12208 replica FinalizedReplica, blk_1073753032_12208, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753032 for deletion
2025-07-17 14:17:39,954 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753032_12208 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753032
2025-07-17 14:21:40,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753036_12212 src: /192.168.158.1:54022 dest: /192.168.158.4:9866
2025-07-17 14:21:41,021 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54022, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1804330540_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753036_12212, duration(ns): 22348542
2025-07-17 14:21:41,021 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753036_12212, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-17 14:21:45,958 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753036_12212 replica FinalizedReplica, blk_1073753036_12212, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753036 for deletion
2025-07-17 14:21:45,959 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753036_12212 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753036
2025-07-17 14:26:40,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753041_12217 src: /192.168.158.7:44768 dest: /192.168.158.4:9866
2025-07-17 14:26:41,014 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44768, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-747616097_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753041_12217, duration(ns): 16209137
2025-07-17 14:26:41,014 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753041_12217, type=LAST_IN_PIPELINE terminating
2025-07-17 14:26:45,973 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753041_12217 replica FinalizedReplica, blk_1073753041_12217, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753041 for deletion
2025-07-17 14:26:45,974 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753041_12217 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753041
2025-07-17 14:27:45,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753042_12218 src: /192.168.158.9:58048 dest: /192.168.158.4:9866
2025-07-17 14:27:46,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58048, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1444583823_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753042_12218, duration(ns): 15669342
2025-07-17 14:27:46,012 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753042_12218, type=LAST_IN_PIPELINE terminating
2025-07-17
14:27:48,976 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753042_12218 replica FinalizedReplica, blk_1073753042_12218, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753042 for deletion 2025-07-17 14:27:48,977 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753042_12218 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753042 2025-07-17 14:29:45,996 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753044_12220 src: /192.168.158.8:33758 dest: /192.168.158.4:9866 2025-07-17 14:29:46,014 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33758, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_10656735_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753044_12220, duration(ns): 15741415 2025-07-17 14:29:46,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753044_12220, type=LAST_IN_PIPELINE terminating 2025-07-17 14:29:48,982 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753044_12220 replica FinalizedReplica, blk_1073753044_12220, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753044 for deletion 2025-07-17 14:29:48,983 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753044_12220 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753044 2025-07-17 14:30:45,993 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753045_12221 src: /192.168.158.1:43676 dest: /192.168.158.4:9866 2025-07-17 14:30:46,029 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43676, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1464440283_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753045_12221, duration(ns): 27114018 2025-07-17 14:30:46,029 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753045_12221, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-17 14:30:48,986 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753045_12221 replica FinalizedReplica, blk_1073753045_12221, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753045 for deletion 2025-07-17 14:30:48,987 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753045_12221 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753045 2025-07-17 14:31:51,010 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073753046_12222 src: /192.168.158.6:44736 dest: /192.168.158.4:9866 2025-07-17 14:31:51,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44736, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1092300421_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753046_12222, duration(ns): 17074639 2025-07-17 14:31:51,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753046_12222, type=LAST_IN_PIPELINE terminating 2025-07-17 14:31:54,987 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753046_12222 replica FinalizedReplica, blk_1073753046_12222, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753046 for deletion 2025-07-17 14:31:54,988 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753046_12222 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753046 2025-07-17 14:33:56,005 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753048_12224 src: /192.168.158.7:48656 dest: /192.168.158.4:9866 2025-07-17 14:33:56,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48656, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2143059389_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753048_12224, duration(ns): 
16524042 2025-07-17 14:33:56,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753048_12224, type=LAST_IN_PIPELINE terminating 2025-07-17 14:34:03,993 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753048_12224 replica FinalizedReplica, blk_1073753048_12224, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753048 for deletion 2025-07-17 14:34:03,994 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753048_12224 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753048 2025-07-17 14:35:56,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753050_12226 src: /192.168.158.1:56976 dest: /192.168.158.4:9866 2025-07-17 14:35:56,055 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56976, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1107253088_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753050_12226, duration(ns): 26746356 2025-07-17 14:35:56,055 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753050_12226, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-17 14:36:00,996 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753050_12226 replica FinalizedReplica, blk_1073753050_12226, FINALIZED getNumBytes() = 
56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753050 for deletion 2025-07-17 14:36:00,997 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753050_12226 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753050 2025-07-17 14:37:01,010 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753051_12227 src: /192.168.158.1:46794 dest: /192.168.158.4:9866 2025-07-17 14:37:01,045 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46794, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1954921159_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753051_12227, duration(ns): 24840615 2025-07-17 14:37:01,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753051_12227, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-17 14:37:03,998 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753051_12227 replica FinalizedReplica, blk_1073753051_12227, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753051 for deletion 2025-07-17 14:37:03,999 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753051_12227 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753051 2025-07-17 14:38:01,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753052_12228 src: /192.168.158.1:58980 dest: /192.168.158.4:9866 2025-07-17 14:38:01,058 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58980, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_109534084_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753052_12228, duration(ns): 29297993 2025-07-17 14:38:01,058 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753052_12228, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-17 14:38:07,000 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753052_12228 replica FinalizedReplica, blk_1073753052_12228, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753052 for deletion 2025-07-17 14:38:07,001 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753052_12228 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753052 2025-07-17 14:43:06,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753057_12233 src: /192.168.158.6:34438 dest: /192.168.158.4:9866 2025-07-17 14:43:06,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.6:34438, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-580165744_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753057_12233, duration(ns): 15534858 2025-07-17 14:43:06,037 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753057_12233, type=LAST_IN_PIPELINE terminating 2025-07-17 14:43:10,008 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753057_12233 replica FinalizedReplica, blk_1073753057_12233, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753057 for deletion 2025-07-17 14:43:10,009 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753057_12233 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753057 2025-07-17 14:45:11,018 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753059_12235 src: /192.168.158.1:37234 dest: /192.168.158.4:9866 2025-07-17 14:45:11,051 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37234, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2137599022_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753059_12235, duration(ns): 24199252 2025-07-17 14:45:11,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753059_12235, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-17 14:45:16,010 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753059_12235 replica FinalizedReplica, blk_1073753059_12235, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753059 for deletion 2025-07-17 14:45:16,011 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753059_12235 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753059 2025-07-17 14:46:11,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753060_12236 src: /192.168.158.1:48890 dest: /192.168.158.4:9866 2025-07-17 14:46:11,047 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48890, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_323069008_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753060_12236, duration(ns): 23439072 2025-07-17 14:46:11,048 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753060_12236, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-17 14:46:16,011 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753060_12236 replica FinalizedReplica, blk_1073753060_12236, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753060 for deletion 2025-07-17 14:46:16,013 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753060_12236 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753060 2025-07-17 14:47:11,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753061_12237 src: /192.168.158.7:59234 dest: /192.168.158.4:9866 2025-07-17 14:47:11,048 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59234, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2085535240_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753061_12237, duration(ns): 17101842 2025-07-17 14:47:11,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753061_12237, type=LAST_IN_PIPELINE terminating 2025-07-17 14:47:16,014 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753061_12237 replica FinalizedReplica, blk_1073753061_12237, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753061 for deletion 2025-07-17 14:47:16,016 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753061_12237 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753061 2025-07-17 14:49:11,031 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753063_12239 src: /192.168.158.9:49124 dest: /192.168.158.4:9866 2025-07-17 14:49:11,053 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49124, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_395174650_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753063_12239, duration(ns): 20003178 2025-07-17 14:49:11,053 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753063_12239, type=LAST_IN_PIPELINE terminating 2025-07-17 14:49:16,018 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753063_12239 replica FinalizedReplica, blk_1073753063_12239, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753063 for deletion 2025-07-17 14:49:16,019 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753063_12239 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753063 2025-07-17 14:52:11,021 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753066_12242 src: /192.168.158.5:49354 dest: /192.168.158.4:9866 2025-07-17 14:52:11,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49354, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1440975286_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073753066_12242, duration(ns): 22156616 2025-07-17 14:52:11,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753066_12242, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-17 14:52:16,024 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753066_12242 replica FinalizedReplica, blk_1073753066_12242, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753066 for deletion 2025-07-17 14:52:16,025 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753066_12242 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753066 2025-07-17 14:53:11,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753067_12243 src: /192.168.158.7:51312 dest: /192.168.158.4:9866 2025-07-17 14:53:11,048 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51312, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_353198079_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753067_12243, duration(ns): 19585403 2025-07-17 14:53:11,048 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753067_12243, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-17 14:53:16,027 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073753067_12243 replica FinalizedReplica, blk_1073753067_12243, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753067 for deletion 2025-07-17 14:53:16,028 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753067_12243 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753067 2025-07-17 14:57:21,039 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753071_12247 src: /192.168.158.1:48072 dest: /192.168.158.4:9866 2025-07-17 14:57:21,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48072, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-315273022_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753071_12247, duration(ns): 22462706 2025-07-17 14:57:21,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753071_12247, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-17 14:57:28,035 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753071_12247 replica FinalizedReplica, blk_1073753071_12247, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753071 for deletion 2025-07-17 14:57:28,036 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753071_12247 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753071 2025-07-17 14:58:26,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753072_12248 src: /192.168.158.7:54248 dest: /192.168.158.4:9866 2025-07-17 14:58:26,086 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54248, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1635641613_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753072_12248, duration(ns): 17572023 2025-07-17 14:58:26,087 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753072_12248, type=LAST_IN_PIPELINE terminating 2025-07-17 14:58:31,038 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753072_12248 replica FinalizedReplica, blk_1073753072_12248, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753072 for deletion 2025-07-17 14:58:31,039 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753072_12248 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753072 2025-07-17 15:06:41,043 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753080_12256 src: /192.168.158.7:36944 dest: /192.168.158.4:9866 
2025-07-17 15:06:41,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36944, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2109030351_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753080_12256, duration(ns): 16800226 2025-07-17 15:06:41,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753080_12256, type=LAST_IN_PIPELINE terminating 2025-07-17 15:06:49,052 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753080_12256 replica FinalizedReplica, blk_1073753080_12256, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753080 for deletion 2025-07-17 15:06:49,053 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753080_12256 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753080 2025-07-17 15:09:41,044 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753083_12259 src: /192.168.158.5:58622 dest: /192.168.158.4:9866 2025-07-17 15:09:41,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58622, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1574662142_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753083_12259, duration(ns): 16211414 2025-07-17 15:09:41,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073753083_12259, type=LAST_IN_PIPELINE terminating 2025-07-17 15:09:46,062 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753083_12259 replica FinalizedReplica, blk_1073753083_12259, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753083 for deletion 2025-07-17 15:09:46,063 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753083_12259 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753083 2025-07-17 15:12:41,044 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753086_12262 src: /192.168.158.1:34416 dest: /192.168.158.4:9866 2025-07-17 15:12:41,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34416, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-385804971_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753086_12262, duration(ns): 26968970 2025-07-17 15:12:41,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753086_12262, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-17 15:12:49,068 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753086_12262 replica FinalizedReplica, blk_1073753086_12262, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753086 for deletion 2025-07-17 15:12:49,070 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753086_12262 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir11/blk_1073753086 2025-07-17 15:14:46,044 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753088_12264 src: /192.168.158.1:33902 dest: /192.168.158.4:9866 2025-07-17 15:14:46,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33902, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-995285692_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753088_12264, duration(ns): 22800904 2025-07-17 15:14:46,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753088_12264, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-17 15:14:49,071 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753088_12264 replica FinalizedReplica, blk_1073753088_12264, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753088 for deletion 2025-07-17 15:14:49,072 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753088_12264 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753088 2025-07-17 15:15:46,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753089_12265 src: /192.168.158.6:58008 dest: /192.168.158.4:9866 2025-07-17 15:15:46,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58008, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1507732496_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753089_12265, duration(ns): 15921032 2025-07-17 15:15:46,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753089_12265, type=LAST_IN_PIPELINE terminating 2025-07-17 15:15:52,073 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753089_12265 replica FinalizedReplica, blk_1073753089_12265, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753089 for deletion 2025-07-17 15:15:52,074 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753089_12265 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753089 2025-07-17 15:18:46,055 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753092_12268 src: /192.168.158.1:40154 dest: /192.168.158.4:9866 2025-07-17 15:18:46,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40154, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-156142289_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753092_12268, duration(ns): 28202735 2025-07-17 15:18:46,095 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753092_12268, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-17 15:18:52,075 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753092_12268 replica FinalizedReplica, blk_1073753092_12268, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753092 for deletion 2025-07-17 15:18:52,076 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753092_12268 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753092 2025-07-17 15:19:46,058 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753093_12269 src: /192.168.158.1:59154 dest: /192.168.158.4:9866 2025-07-17 15:19:46,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59154, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-340721847_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753093_12269, duration(ns): 24616496 2025-07-17 15:19:46,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753093_12269, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-17 15:19:52,078 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753093_12269 replica FinalizedReplica, blk_1073753093_12269, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753093 for deletion 2025-07-17 15:19:52,079 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753093_12269 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753093 2025-07-17 15:22:51,057 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753096_12272 src: /192.168.158.1:57714 dest: /192.168.158.4:9866 2025-07-17 15:22:51,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57714, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1570905990_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753096_12272, duration(ns): 24776807 2025-07-17 15:22:51,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753096_12272, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-17 15:22:55,086 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753096_12272 replica FinalizedReplica, blk_1073753096_12272, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753096 for deletion 2025-07-17 15:22:55,087 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753096_12272 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753096 2025-07-17 15:29:01,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753102_12278 src: /192.168.158.7:42294 dest: /192.168.158.4:9866 2025-07-17 15:29:01,095 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42294, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2076258998_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753102_12278, duration(ns): 20188719 2025-07-17 15:29:01,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753102_12278, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-17 15:29:04,098 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753102_12278 replica FinalizedReplica, blk_1073753102_12278, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753102 for deletion 2025-07-17 15:29:04,100 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753102_12278 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753102 2025-07-17 15:30:06,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753103_12279 src: /192.168.158.1:55426 dest: /192.168.158.4:9866 2025-07-17 15:30:06,102 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55426, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1872720798_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753103_12279, duration(ns): 24366031 2025-07-17 15:30:06,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753103_12279, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-17 15:30:13,100 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753103_12279 replica FinalizedReplica, blk_1073753103_12279, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753103 for deletion 2025-07-17 15:30:13,101 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753103_12279 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753103 2025-07-17 15:31:06,082 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753104_12280 src: /192.168.158.8:58448 dest: /192.168.158.4:9866 2025-07-17 15:31:06,101 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.8:58448, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-275066055_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753104_12280, duration(ns): 17300799 2025-07-17 15:31:06,102 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753104_12280, type=LAST_IN_PIPELINE terminating 2025-07-17 15:31:10,101 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753104_12280 replica FinalizedReplica, blk_1073753104_12280, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753104 for deletion 2025-07-17 15:31:10,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753104_12280 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753104 2025-07-17 15:34:06,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753107_12283 src: /192.168.158.1:45810 dest: /192.168.158.4:9866 2025-07-17 15:34:06,111 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45810, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2008525796_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753107_12283, duration(ns): 22499271 2025-07-17 15:34:06,111 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753107_12283, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-17 15:34:13,107 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753107_12283 replica FinalizedReplica, blk_1073753107_12283, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753107 for deletion 2025-07-17 15:34:13,108 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753107_12283 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753107 2025-07-17 15:36:16,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753109_12285 src: /192.168.158.5:55324 dest: /192.168.158.4:9866 2025-07-17 15:36:16,113 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55324, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1538493459_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753109_12285, duration(ns): 16883835 2025-07-17 15:36:16,114 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753109_12285, type=LAST_IN_PIPELINE terminating 2025-07-17 15:36:19,111 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753109_12285 replica FinalizedReplica, blk_1073753109_12285, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753109 for deletion 2025-07-17 15:36:19,112 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753109_12285 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753109 2025-07-17 15:37:21,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753110_12286 src: /192.168.158.6:46954 dest: /192.168.158.4:9866 2025-07-17 15:37:21,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46954, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2002251349_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753110_12286, duration(ns): 21935223 2025-07-17 15:37:21,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753110_12286, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-17 15:37:25,113 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753110_12286 replica FinalizedReplica, blk_1073753110_12286, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753110 for deletion 2025-07-17 15:37:25,114 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753110_12286 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753110 2025-07-17 15:42:31,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753115_12291 src: /192.168.158.1:46974 dest: /192.168.158.4:9866 2025-07-17 15:42:31,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46974, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-282730775_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753115_12291, duration(ns): 27032166 2025-07-17 15:42:31,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753115_12291, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-17 15:42:37,123 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753115_12291 replica FinalizedReplica, blk_1073753115_12291, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753115 for deletion 2025-07-17 15:42:37,124 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753115_12291 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753115 2025-07-17 15:43:31,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753116_12292 src: /192.168.158.6:39482 dest: /192.168.158.4:9866 2025-07-17 15:43:31,110 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.6:39482, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1304053456_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753116_12292, duration(ns): 18775922 2025-07-17 15:43:31,111 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753116_12292, type=LAST_IN_PIPELINE terminating 2025-07-17 15:43:34,125 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753116_12292 replica FinalizedReplica, blk_1073753116_12292, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753116 for deletion 2025-07-17 15:43:34,126 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753116_12292 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753116 2025-07-17 15:44:31,086 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753117_12293 src: /192.168.158.5:44696 dest: /192.168.158.4:9866 2025-07-17 15:44:31,112 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44696, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_304714352_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753117_12293, duration(ns): 21292943 2025-07-17 15:44:31,113 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753117_12293, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=1:[192.168.158.6:9866] terminating 2025-07-17 15:44:37,126 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753117_12293 replica FinalizedReplica, blk_1073753117_12293, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753117 for deletion 2025-07-17 15:44:37,128 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753117_12293 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753117 2025-07-17 15:45:31,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753118_12294 src: /192.168.158.8:37692 dest: /192.168.158.4:9866 2025-07-17 15:45:31,114 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37692, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1760200745_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753118_12294, duration(ns): 20439399 2025-07-17 15:45:31,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753118_12294, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-17 15:45:34,127 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753118_12294 replica FinalizedReplica, blk_1073753118_12294, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753118 for deletion 2025-07-17 15:45:34,128 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753118_12294 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753118 2025-07-17 15:48:36,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753121_12297 src: /192.168.158.8:54104 dest: /192.168.158.4:9866 2025-07-17 15:48:36,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54104, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1984235903_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753121_12297, duration(ns): 21435542 2025-07-17 15:48:36,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753121_12297, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-17 15:48:40,135 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753121_12297 replica FinalizedReplica, blk_1073753121_12297, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753121 for deletion 2025-07-17 15:48:40,136 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753121_12297 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753121 2025-07-17 15:50:36,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753123_12299 src: /192.168.158.1:34572 dest: /192.168.158.4:9866 2025-07-17 15:50:36,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34572, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-552260556_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753123_12299, duration(ns): 26632322 2025-07-17 15:50:36,128 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753123_12299, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-17 15:50:40,140 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753123_12299 replica FinalizedReplica, blk_1073753123_12299, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753123 for deletion 2025-07-17 15:50:40,141 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753123_12299 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753123 2025-07-17 15:51:36,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753124_12300 src: /192.168.158.1:46674 dest: /192.168.158.4:9866 2025-07-17 15:51:36,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:46674, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_922424608_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753124_12300, duration(ns): 24592627 2025-07-17 15:51:36,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753124_12300, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-17 15:51:40,216 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753124_12300 replica FinalizedReplica, blk_1073753124_12300, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753124 for deletion 2025-07-17 15:51:40,218 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753124_12300 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753124 2025-07-17 15:53:36,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753126_12302 src: /192.168.158.7:49828 dest: /192.168.158.4:9866 2025-07-17 15:53:36,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49828, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-195321743_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753126_12302, duration(ns): 15155877 2025-07-17 15:53:36,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073753126_12302, type=LAST_IN_PIPELINE terminating 2025-07-17 15:53:40,153 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753126_12302 replica FinalizedReplica, blk_1073753126_12302, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753126 for deletion 2025-07-17 15:53:40,154 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753126_12302 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753126 2025-07-17 15:54:36,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753127_12303 src: /192.168.158.9:60376 dest: /192.168.158.4:9866 2025-07-17 15:54:36,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60376, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1371270965_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753127_12303, duration(ns): 16250945 2025-07-17 15:54:36,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753127_12303, type=LAST_IN_PIPELINE terminating 2025-07-17 15:54:43,155 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753127_12303 replica FinalizedReplica, blk_1073753127_12303, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753127 for deletion
2025-07-17 15:54:43,156 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753127_12303 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753127
2025-07-17 15:57:36,123 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753130_12306 src: /192.168.158.7:40076 dest: /192.168.158.4:9866
2025-07-17 15:57:36,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40076, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2025520800_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753130_12306, duration(ns): 17563401
2025-07-17 15:57:36,143 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753130_12306, type=LAST_IN_PIPELINE terminating
2025-07-17 15:57:40,159 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753130_12306 replica FinalizedReplica, blk_1073753130_12306, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753130 for deletion
2025-07-17 15:57:40,160 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753130_12306 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753130
2025-07-17 15:59:19,166 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f46, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 1 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-17 15:59:19,166 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-17 15:59:36,111 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753132_12308 src: /192.168.158.1:50046 dest: /192.168.158.4:9866
2025-07-17 15:59:36,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50046, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2100028495_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753132_12308, duration(ns): 24057585
2025-07-17 15:59:36,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753132_12308, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-17 15:59:40,161 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753132_12308 replica FinalizedReplica, blk_1073753132_12308, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753132 for deletion
2025-07-17 15:59:40,162 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753132_12308 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753132
2025-07-17 16:02:36,128 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753135_12311 src: /192.168.158.1:58848 dest: /192.168.158.4:9866
2025-07-17 16:02:36,163 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58848, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1731877121_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753135_12311, duration(ns): 25961318
2025-07-17 16:02:36,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753135_12311, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-17 16:02:40,164 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753135_12311 replica FinalizedReplica, blk_1073753135_12311, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753135 for deletion
2025-07-17 16:02:40,165 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753135_12311 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753135
2025-07-17 16:03:36,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753136_12312 src: /192.168.158.5:43486 dest: /192.168.158.4:9866
2025-07-17 16:03:36,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:43486, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1725172706_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753136_12312, duration(ns): 18923043
2025-07-17 16:03:36,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753136_12312, type=LAST_IN_PIPELINE terminating
2025-07-17 16:03:40,167 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753136_12312 replica FinalizedReplica, blk_1073753136_12312, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753136 for deletion
2025-07-17 16:03:40,168 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753136_12312 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753136
2025-07-17 16:04:41,123 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753137_12313 src: /192.168.158.1:56724 dest: /192.168.158.4:9866
2025-07-17 16:04:41,159 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56724, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-295538613_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753137_12313, duration(ns): 26821313
2025-07-17 16:04:41,159 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753137_12313, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-17 16:04:43,168 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753137_12313 replica FinalizedReplica, blk_1073753137_12313, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753137 for deletion
2025-07-17 16:04:43,169 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753137_12313 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753137
2025-07-17 16:08:51,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753141_12317 src: /192.168.158.9:59538 dest: /192.168.158.4:9866
2025-07-17 16:08:51,153 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59538, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-338437302_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753141_12317, duration(ns): 21153467
2025-07-17 16:08:51,153 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753141_12317, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 16:08:55,182 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753141_12317 replica FinalizedReplica, blk_1073753141_12317, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753141 for deletion
2025-07-17 16:08:55,183 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753141_12317 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753141
2025-07-17 16:09:51,132 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753142_12318 src: /192.168.158.8:50864 dest: /192.168.158.4:9866
2025-07-17 16:09:51,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50864, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2143646995_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753142_12318, duration(ns): 17195060
2025-07-17 16:09:51,152 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753142_12318, type=LAST_IN_PIPELINE terminating
2025-07-17 16:09:55,185 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753142_12318 replica FinalizedReplica, blk_1073753142_12318, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753142 for deletion
2025-07-17 16:09:55,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753142_12318 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753142
2025-07-17 16:10:51,132 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753143_12319 src: /192.168.158.7:39992 dest: /192.168.158.4:9866
2025-07-17 16:10:51,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39992, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1779426715_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753143_12319, duration(ns): 19791632
2025-07-17 16:10:51,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753143_12319, type=LAST_IN_PIPELINE terminating
2025-07-17 16:10:55,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753143_12319 replica FinalizedReplica, blk_1073753143_12319, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753143 for deletion
2025-07-17 16:10:55,188 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753143_12319 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753143
2025-07-17 16:11:56,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753144_12320 src: /192.168.158.8:39868 dest: /192.168.158.4:9866
2025-07-17 16:11:56,172 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39868, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1526854121_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753144_12320, duration(ns): 19866890
2025-07-17 16:11:56,173 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753144_12320, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-17 16:12:01,188 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753144_12320 replica FinalizedReplica, blk_1073753144_12320, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753144 for deletion
2025-07-17 16:12:01,190 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753144_12320 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753144
2025-07-17 16:12:56,135 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753145_12321 src: /192.168.158.6:52884 dest: /192.168.158.4:9866
2025-07-17 16:12:56,163 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52884, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1950301069_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753145_12321, duration(ns): 21955087
2025-07-17 16:12:56,163 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753145_12321, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 16:12:58,190 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753145_12321 replica FinalizedReplica, blk_1073753145_12321, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753145 for deletion
2025-07-17 16:12:58,191 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753145_12321 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753145
2025-07-17 16:13:56,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753146_12322 src: /192.168.158.1:48922 dest: /192.168.158.4:9866
2025-07-17 16:13:56,170 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48922, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_180826888_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753146_12322, duration(ns): 24448198
2025-07-17 16:13:56,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753146_12322, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-17 16:14:01,193 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753146_12322 replica FinalizedReplica, blk_1073753146_12322, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753146 for deletion
2025-07-17 16:14:01,194 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753146_12322 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753146
2025-07-17 16:17:01,139 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753149_12325 src: /192.168.158.7:39360 dest: /192.168.158.4:9866
2025-07-17 16:17:01,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39360, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1416554627_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753149_12325, duration(ns): 21413385
2025-07-17 16:17:01,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753149_12325, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-17 16:17:07,198 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753149_12325 replica FinalizedReplica, blk_1073753149_12325, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753149 for deletion
2025-07-17 16:17:07,199 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753149_12325 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753149
2025-07-17 16:19:06,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753151_12327 src: /192.168.158.5:36016 dest: /192.168.158.4:9866
2025-07-17 16:19:06,168 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36016, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1391742845_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753151_12327, duration(ns): 20795338
2025-07-17 16:19:06,169 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753151_12327, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-17 16:19:13,201 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753151_12327 replica FinalizedReplica, blk_1073753151_12327, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753151 for deletion
2025-07-17 16:19:13,202 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753151_12327 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753151
2025-07-17 16:20:06,150 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753152_12328 src: /192.168.158.8:59412 dest: /192.168.158.4:9866
2025-07-17 16:20:06,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59412, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_492087394_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753152_12328, duration(ns): 18596074
2025-07-17 16:20:06,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753152_12328, type=LAST_IN_PIPELINE terminating
2025-07-17 16:20:10,201 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753152_12328 replica FinalizedReplica, blk_1073753152_12328, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753152 for deletion
2025-07-17 16:20:10,202 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753152_12328 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753152
2025-07-17 16:21:11,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753153_12329 src: /192.168.158.9:44596 dest: /192.168.158.4:9866
2025-07-17 16:21:11,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44596, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1210875160_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753153_12329, duration(ns): 20595290
2025-07-17 16:21:11,183 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753153_12329, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-17 16:21:13,202 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753153_12329 replica FinalizedReplica, blk_1073753153_12329, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753153 for deletion
2025-07-17 16:21:13,203 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753153_12329 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753153
2025-07-17 16:23:21,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753155_12331 src: /192.168.158.9:42516 dest: /192.168.158.4:9866
2025-07-17 16:23:21,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42516, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-406060076_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753155_12331, duration(ns): 17548895
2025-07-17 16:23:21,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753155_12331, type=LAST_IN_PIPELINE terminating
2025-07-17 16:23:28,206 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753155_12331 replica FinalizedReplica, blk_1073753155_12331, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753155 for deletion
2025-07-17 16:23:28,208 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753155_12331 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753155
2025-07-17 16:24:26,158 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753156_12332 src: /192.168.158.5:43894 dest: /192.168.158.4:9866
2025-07-17 16:24:26,176 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:43894, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1326315806_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753156_12332, duration(ns): 16561463
2025-07-17 16:24:26,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753156_12332, type=LAST_IN_PIPELINE terminating
2025-07-17 16:24:28,209 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753156_12332 replica FinalizedReplica, blk_1073753156_12332, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753156 for deletion
2025-07-17 16:24:28,210 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753156_12332 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753156
2025-07-17 16:25:26,165 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753157_12333 src: /192.168.158.6:60404 dest: /192.168.158.4:9866
2025-07-17 16:25:26,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60404, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_68366846_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753157_12333, duration(ns): 16971340
2025-07-17 16:25:26,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753157_12333, type=LAST_IN_PIPELINE terminating
2025-07-17 16:25:28,212 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753157_12333 replica FinalizedReplica, blk_1073753157_12333, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753157 for deletion
2025-07-17 16:25:28,213 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753157_12333 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753157
2025-07-17 16:29:31,165 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753161_12337 src: /192.168.158.9:56434 dest: /192.168.158.4:9866
2025-07-17 16:29:31,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56434, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1221463908_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753161_12337, duration(ns): 16848922
2025-07-17 16:29:31,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753161_12337, type=LAST_IN_PIPELINE terminating
2025-07-17 16:29:34,220 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753161_12337 replica FinalizedReplica, blk_1073753161_12337, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753161 for deletion
2025-07-17 16:29:34,222 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753161_12337 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753161
2025-07-17 16:32:36,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753164_12340 src: /192.168.158.8:45206 dest: /192.168.158.4:9866
2025-07-17 16:32:36,222 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45206, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1743757578_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753164_12340, duration(ns): 19189487
2025-07-17 16:32:36,223 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753164_12340, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-17 16:32:40,226 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753164_12340 replica FinalizedReplica, blk_1073753164_12340, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753164 for deletion
2025-07-17 16:32:40,227 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753164_12340 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753164
2025-07-17 16:34:36,173 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753166_12342 src: /192.168.158.7:35174 dest: /192.168.158.4:9866
2025-07-17 16:34:36,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35174, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1965817513_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753166_12342, duration(ns): 21557833
2025-07-17 16:34:36,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753166_12342, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-17 16:34:43,229 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753166_12342 replica FinalizedReplica, blk_1073753166_12342, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753166 for deletion
2025-07-17 16:34:43,230 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753166_12342 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753166
2025-07-17 16:35:41,173 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753167_12343 src: /192.168.158.5:50580 dest: /192.168.158.4:9866
2025-07-17 16:35:41,191 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50580, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_304617031_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753167_12343, duration(ns): 16118386
2025-07-17 16:35:41,191 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753167_12343, type=LAST_IN_PIPELINE terminating
2025-07-17 16:35:43,233 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753167_12343 replica FinalizedReplica, blk_1073753167_12343, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753167 for deletion
2025-07-17 16:35:43,234 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753167_12343 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753167
2025-07-17 16:36:41,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753168_12344 src: /192.168.158.8:37278 dest: /192.168.158.4:9866
2025-07-17 16:36:41,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37278, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2113348878_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753168_12344, duration(ns): 20910005
2025-07-17 16:36:41,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753168_12344, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 16:36:43,235 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753168_12344 replica FinalizedReplica, blk_1073753168_12344, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753168 for deletion
2025-07-17 16:36:43,237 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753168_12344 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753168
2025-07-17 16:38:46,172 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753170_12346 src: /192.168.158.6:37338 dest: /192.168.158.4:9866
2025-07-17 16:38:46,202 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37338, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1359498076_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753170_12346, duration(ns): 23379626
2025-07-17 16:38:46,202 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753170_12346, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 16:38:49,238 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753170_12346 replica FinalizedReplica, blk_1073753170_12346, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753170 for deletion
2025-07-17 16:38:49,239 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753170_12346 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753170
2025-07-17 16:41:51,176 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753173_12349 src: /192.168.158.1:33950 dest: /192.168.158.4:9866
2025-07-17 16:41:51,208 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33950, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1953314498_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753173_12349, duration(ns): 23440305
2025-07-17 16:41:51,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753173_12349, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-17 16:41:58,245 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753173_12349 replica FinalizedReplica, blk_1073753173_12349, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753173 for deletion
2025-07-17 16:41:58,246 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753173_12349 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753173
2025-07-17 16:42:51,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753174_12350 src: /192.168.158.6:54240 dest: /192.168.158.4:9866
2025-07-17 16:42:51,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54240, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1704075131_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753174_12350, duration(ns): 15372972
2025-07-17 16:42:51,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753174_12350, type=LAST_IN_PIPELINE terminating
2025-07-17 16:42:55,248 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753174_12350 replica FinalizedReplica, blk_1073753174_12350, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753174 for deletion
2025-07-17 16:42:55,249 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753174_12350 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753174
2025-07-17 16:43:56,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753175_12351 src: /192.168.158.1:52898 dest: /192.168.158.4:9866
2025-07-17 16:43:56,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52898, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_634429751_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753175_12351, duration(ns): 23238409 2025-07-17 16:43:56,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753175_12351, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-17 16:43:58,249 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753175_12351 replica FinalizedReplica, blk_1073753175_12351, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753175 for deletion 2025-07-17 16:43:58,250 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753175_12351 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753175 2025-07-17 16:45:01,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753176_12352 src: /192.168.158.7:37560 dest: /192.168.158.4:9866 2025-07-17 16:45:01,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37560, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-881793819_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753176_12352, duration(ns): 21418115 2025-07-17 16:45:01,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753176_12352, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-17 16:45:07,251 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753176_12352 replica FinalizedReplica, blk_1073753176_12352, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753176 for deletion 2025-07-17 16:45:07,252 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753176_12352 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753176 2025-07-17 16:50:06,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753181_12357 src: /192.168.158.1:36130 dest: /192.168.158.4:9866 2025-07-17 16:50:06,235 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36130, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1204335910_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753181_12357, duration(ns): 25138463 2025-07-17 16:50:06,235 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753181_12357, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-17 16:50:10,257 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753181_12357 replica FinalizedReplica, blk_1073753181_12357, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753181 for deletion 2025-07-17 16:50:10,258 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753181_12357 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753181 2025-07-17 16:51:06,199 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753182_12358 src: /192.168.158.1:51640 dest: /192.168.158.4:9866 2025-07-17 16:51:06,232 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51640, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1954859716_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753182_12358, duration(ns): 23965186 2025-07-17 16:51:06,232 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753182_12358, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-17 16:51:10,259 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753182_12358 replica FinalizedReplica, blk_1073753182_12358, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753182 for deletion 2025-07-17 16:51:10,260 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753182_12358 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753182 2025-07-17 16:54:06,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753185_12361 src: /192.168.158.5:47152 dest: /192.168.158.4:9866 2025-07-17 16:54:06,228 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47152, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1593799004_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753185_12361, duration(ns): 15930734 2025-07-17 16:54:06,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753185_12361, type=LAST_IN_PIPELINE terminating 2025-07-17 16:54:13,265 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753185_12361 replica FinalizedReplica, blk_1073753185_12361, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753185 for deletion 2025-07-17 16:54:13,266 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753185_12361 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753185 2025-07-17 16:55:06,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753186_12362 src: /192.168.158.9:34840 dest: /192.168.158.4:9866 2025-07-17 16:55:06,241 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34840, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-8825704_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753186_12362, duration(ns): 23504567 2025-07-17 16:55:06,242 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753186_12362, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-17 16:55:10,269 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753186_12362 replica FinalizedReplica, blk_1073753186_12362, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753186 for deletion 2025-07-17 16:55:10,270 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753186_12362 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753186 2025-07-17 16:57:06,216 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753188_12364 src: /192.168.158.1:50628 dest: /192.168.158.4:9866 2025-07-17 16:57:06,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50628, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1721625297_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753188_12364, duration(ns): 26357505 2025-07-17 16:57:06,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753188_12364, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 
192.168.158.5:9866] terminating 2025-07-17 16:57:13,277 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753188_12364 replica FinalizedReplica, blk_1073753188_12364, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753188 for deletion 2025-07-17 16:57:13,278 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753188_12364 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753188 2025-07-17 17:02:11,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753193_12369 src: /192.168.158.8:42218 dest: /192.168.158.4:9866 2025-07-17 17:02:11,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42218, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1962690356_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753193_12369, duration(ns): 17542789 2025-07-17 17:02:11,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753193_12369, type=LAST_IN_PIPELINE terminating 2025-07-17 17:02:13,284 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753193_12369 replica FinalizedReplica, blk_1073753193_12369, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753193 for deletion 
2025-07-17 17:02:13,285 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753193_12369 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753193 2025-07-17 17:04:11,248 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753195_12371 src: /192.168.158.1:32912 dest: /192.168.158.4:9866 2025-07-17 17:04:11,283 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:32912, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1820286076_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753195_12371, duration(ns): 24142155 2025-07-17 17:04:11,283 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753195_12371, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-17 17:04:16,287 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753195_12371 replica FinalizedReplica, blk_1073753195_12371, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753195 for deletion 2025-07-17 17:04:16,288 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753195_12371 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753195 2025-07-17 17:07:16,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073753198_12374 src: /192.168.158.8:42568 dest: /192.168.158.4:9866 2025-07-17 17:07:16,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42568, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-546881953_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753198_12374, duration(ns): 15583393 2025-07-17 17:07:16,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753198_12374, type=LAST_IN_PIPELINE terminating 2025-07-17 17:07:19,291 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753198_12374 replica FinalizedReplica, blk_1073753198_12374, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753198 for deletion 2025-07-17 17:07:19,292 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753198_12374 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753198 2025-07-17 17:09:16,263 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753200_12376 src: /192.168.158.9:34364 dest: /192.168.158.4:9866 2025-07-17 17:09:16,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34364, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1061191546_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753200_12376, duration(ns): 15569235 
2025-07-17 17:09:16,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753200_12376, type=LAST_IN_PIPELINE terminating 2025-07-17 17:09:19,294 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753200_12376 replica FinalizedReplica, blk_1073753200_12376, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753200 for deletion 2025-07-17 17:09:19,295 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753200_12376 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753200 2025-07-17 17:15:21,284 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753206_12382 src: /192.168.158.1:57562 dest: /192.168.158.4:9866 2025-07-17 17:15:21,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57562, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1040314627_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753206_12382, duration(ns): 22253671 2025-07-17 17:15:21,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753206_12382, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-17 17:15:28,302 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753206_12382 replica FinalizedReplica, blk_1073753206_12382, FINALIZED getNumBytes() = 56 
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753206 for deletion 2025-07-17 17:15:28,303 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753206_12382 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753206 2025-07-17 17:17:26,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753208_12384 src: /192.168.158.5:52762 dest: /192.168.158.4:9866 2025-07-17 17:17:26,297 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52762, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1808144392_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753208_12384, duration(ns): 17064092 2025-07-17 17:17:26,297 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753208_12384, type=LAST_IN_PIPELINE terminating 2025-07-17 17:17:31,304 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753208_12384 replica FinalizedReplica, blk_1073753208_12384, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753208 for deletion 2025-07-17 17:17:31,305 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753208_12384 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753208 2025-07-17 17:18:26,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753209_12385 src: /192.168.158.8:54662 dest: /192.168.158.4:9866 2025-07-17 17:18:26,301 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54662, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1609166883_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753209_12385, duration(ns): 18968448 2025-07-17 17:18:26,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753209_12385, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-17 17:18:28,304 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753209_12385 replica FinalizedReplica, blk_1073753209_12385, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753209 for deletion 2025-07-17 17:18:28,305 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753209_12385 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753209 2025-07-17 17:29:41,288 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753220_12396 src: /192.168.158.6:57328 dest: /192.168.158.4:9866 2025-07-17 17:29:41,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.6:57328, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1752248928_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753220_12396, duration(ns): 17815871 2025-07-17 17:29:41,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753220_12396, type=LAST_IN_PIPELINE terminating 2025-07-17 17:29:46,326 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753220_12396 replica FinalizedReplica, blk_1073753220_12396, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753220 for deletion 2025-07-17 17:29:46,327 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753220_12396 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753220 2025-07-17 17:31:41,284 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753222_12398 src: /192.168.158.1:56646 dest: /192.168.158.4:9866 2025-07-17 17:31:41,317 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56646, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_406104542_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753222_12398, duration(ns): 24899428 2025-07-17 17:31:41,318 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753222_12398, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-17 17:31:46,331 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753222_12398 replica FinalizedReplica, blk_1073753222_12398, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753222 for deletion 2025-07-17 17:31:46,332 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753222_12398 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753222 2025-07-17 17:32:46,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753223_12399 src: /192.168.158.1:38222 dest: /192.168.158.4:9866 2025-07-17 17:32:46,325 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38222, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-209836377_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753223_12399, duration(ns): 24724920 2025-07-17 17:32:46,326 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753223_12399, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-17 17:32:52,332 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753223_12399 replica FinalizedReplica, blk_1073753223_12399, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753223 for deletion
2025-07-17 17:32:52,334 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753223_12399 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753223
2025-07-17 17:33:46,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753224_12400 src: /192.168.158.1:59650 dest: /192.168.158.4:9866
2025-07-17 17:33:46,326 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59650, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_643676435_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753224_12400, duration(ns): 23335440
2025-07-17 17:33:46,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753224_12400, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-17 17:33:49,333 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753224_12400 replica FinalizedReplica, blk_1073753224_12400, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753224 for deletion
2025-07-17 17:33:49,334 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753224_12400 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753224
2025-07-17 17:35:46,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753226_12402 src: /192.168.158.9:55210 dest: /192.168.158.4:9866
2025-07-17 17:35:46,328 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55210, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2018602610_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753226_12402, duration(ns): 16346228
2025-07-17 17:35:46,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753226_12402, type=LAST_IN_PIPELINE terminating
2025-07-17 17:35:52,337 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753226_12402 replica FinalizedReplica, blk_1073753226_12402, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753226 for deletion
2025-07-17 17:35:52,338 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753226_12402 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753226
2025-07-17 17:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-17 17:37:56,312 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753228_12404 src: /192.168.158.9:38668 dest: /192.168.158.4:9866
2025-07-17 17:37:56,337 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38668, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-409188849_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753228_12404, duration(ns): 19073482
2025-07-17 17:37:56,337 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753228_12404, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 17:37:58,337 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753228_12404 replica FinalizedReplica, blk_1073753228_12404, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753228 for deletion
2025-07-17 17:37:58,338 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753228_12404 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753228
2025-07-17 17:40:01,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753230_12406 src: /192.168.158.1:52232 dest: /192.168.158.4:9866
2025-07-17 17:40:01,343 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52232, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2037232680_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753230_12406, duration(ns): 27673822
2025-07-17 17:40:01,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753230_12406, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-17 17:40:04,338 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753230_12406 replica FinalizedReplica, blk_1073753230_12406, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753230 for deletion
2025-07-17 17:40:04,339 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753230_12406 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753230
2025-07-17 17:42:01,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753232_12408 src: /192.168.158.6:36526 dest: /192.168.158.4:9866
2025-07-17 17:42:01,350 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36526, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1163599223_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753232_12408, duration(ns): 22294291
2025-07-17 17:42:01,350 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753232_12408, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-17 17:42:04,344 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753232_12408 replica FinalizedReplica, blk_1073753232_12408, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753232 for deletion
2025-07-17 17:42:04,345 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753232_12408 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753232
2025-07-17 17:43:01,317 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753233_12409 src: /192.168.158.1:51218 dest: /192.168.158.4:9866
2025-07-17 17:43:01,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51218, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_256516637_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753233_12409, duration(ns): 22773418
2025-07-17 17:43:01,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753233_12409, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-17 17:43:04,347 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753233_12409 replica FinalizedReplica, blk_1073753233_12409, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753233 for deletion
2025-07-17 17:43:04,348 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753233_12409 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753233
2025-07-17 17:44:01,323 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753234_12410 src: /192.168.158.9:33472 dest: /192.168.158.4:9866
2025-07-17 17:44:01,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1397083770_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753234_12410, duration(ns): 22141564
2025-07-17 17:44:01,353 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753234_12410, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 17:44:04,352 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753234_12410 replica FinalizedReplica, blk_1073753234_12410, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753234 for deletion
2025-07-17 17:44:04,353 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753234_12410 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753234
2025-07-17 17:45:01,323 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753235_12411 src: /192.168.158.9:49358 dest: /192.168.158.4:9866
2025-07-17 17:45:01,348 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49358, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_927730985_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753235_12411, duration(ns): 18649559
2025-07-17 17:45:01,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753235_12411, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-17 17:45:04,355 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753235_12411 replica FinalizedReplica, blk_1073753235_12411, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753235 for deletion
2025-07-17 17:45:04,356 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753235_12411 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753235
2025-07-17 17:46:01,323 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753236_12412 src: /192.168.158.1:45228 dest: /192.168.158.4:9866
2025-07-17 17:46:01,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45228, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_584679217_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753236_12412, duration(ns): 25964425
2025-07-17 17:46:01,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753236_12412, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-17 17:46:04,357 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753236_12412 replica FinalizedReplica, blk_1073753236_12412, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753236 for deletion
2025-07-17 17:46:04,359 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753236_12412 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753236
2025-07-17 17:48:11,324 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753238_12414 src: /192.168.158.9:37750 dest: /192.168.158.4:9866
2025-07-17 17:48:11,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37750, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1305041292_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753238_12414, duration(ns): 22541314
2025-07-17 17:48:11,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753238_12414, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 17:48:16,361 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753238_12414 replica FinalizedReplica, blk_1073753238_12414, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753238 for deletion
2025-07-17 17:48:16,362 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753238_12414 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753238
2025-07-17 17:50:11,341 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753240_12416 src: /192.168.158.9:45078 dest: /192.168.158.4:9866
2025-07-17 17:50:11,361 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45078, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1719225224_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753240_12416, duration(ns): 17458821
2025-07-17 17:50:11,361 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753240_12416, type=LAST_IN_PIPELINE terminating
2025-07-17 17:50:13,362 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753240_12416 replica FinalizedReplica, blk_1073753240_12416, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753240 for deletion
2025-07-17 17:50:13,363 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753240_12416 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753240
2025-07-17 17:53:21,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753243_12419 src: /192.168.158.6:56182 dest: /192.168.158.4:9866
2025-07-17 17:53:21,353 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56182, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1581020209_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753243_12419, duration(ns): 20384651
2025-07-17 17:53:21,353 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753243_12419, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-17 17:53:25,369 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753243_12419 replica FinalizedReplica, blk_1073753243_12419, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753243 for deletion
2025-07-17 17:53:25,370 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753243_12419 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753243
2025-07-17 17:56:21,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753246_12422 src: /192.168.158.6:45468 dest: /192.168.158.4:9866
2025-07-17 17:56:21,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45468, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1917832820_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753246_12422, duration(ns): 18942018
2025-07-17 17:56:21,367 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753246_12422, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-17 17:56:25,375 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753246_12422 replica FinalizedReplica, blk_1073753246_12422, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753246 for deletion
2025-07-17 17:56:25,376 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753246_12422 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753246
2025-07-17 17:57:21,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753247_12423 src: /192.168.158.9:34070 dest: /192.168.158.4:9866
2025-07-17 17:57:21,373 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34070, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1076379475_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753247_12423, duration(ns): 20600366
2025-07-17 17:57:21,373 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753247_12423, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-17 17:57:25,376 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753247_12423 replica FinalizedReplica, blk_1073753247_12423, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753247 for deletion
2025-07-17 17:57:25,378 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753247_12423 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753247
2025-07-17 17:58:26,338 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753248_12424 src: /192.168.158.1:54288 dest: /192.168.158.4:9866
2025-07-17 17:58:26,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54288, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-993235866_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753248_12424, duration(ns): 23869135
2025-07-17 17:58:26,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753248_12424, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-17 17:58:28,378 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753248_12424 replica FinalizedReplica, blk_1073753248_12424, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753248 for deletion
2025-07-17 17:58:28,379 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753248_12424 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753248
2025-07-17 17:59:26,341 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753249_12425 src: /192.168.158.1:48638 dest: /192.168.158.4:9866
2025-07-17 17:59:26,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48638, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1392089225_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753249_12425, duration(ns): 24689374
2025-07-17 17:59:26,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753249_12425, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-17 17:59:28,382 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753249_12425 replica FinalizedReplica, blk_1073753249_12425, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753249 for deletion
2025-07-17 17:59:28,384 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753249_12425 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753249
2025-07-17 18:00:26,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753250_12426 src: /192.168.158.5:47042 dest: /192.168.158.4:9866
2025-07-17 18:00:26,376 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47042, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-876703141_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753250_12426, duration(ns): 21449097
2025-07-17 18:00:26,376 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753250_12426, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-17 18:00:31,385 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753250_12426 replica FinalizedReplica, blk_1073753250_12426, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753250 for deletion
2025-07-17 18:00:31,386 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753250_12426 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753250
2025-07-17 18:02:31,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753252_12428 src: /192.168.158.7:58720 dest: /192.168.158.4:9866
2025-07-17 18:02:31,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58720, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-886044077_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753252_12428, duration(ns): 16413793
2025-07-17 18:02:31,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753252_12428, type=LAST_IN_PIPELINE terminating
2025-07-17 18:02:34,390 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753252_12428 replica FinalizedReplica, blk_1073753252_12428, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753252 for deletion
2025-07-17 18:02:34,391 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753252_12428 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753252
2025-07-17 18:05:36,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753255_12431 src: /192.168.158.7:52328 dest: /192.168.158.4:9866
2025-07-17 18:05:36,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52328, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1912007401_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753255_12431, duration(ns): 21292862
2025-07-17 18:05:36,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753255_12431, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-17 18:05:40,398 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753255_12431 replica FinalizedReplica, blk_1073753255_12431, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753255 for deletion
2025-07-17 18:05:40,399 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753255_12431 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753255
2025-07-17 18:12:41,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753262_12438 src: /192.168.158.1:37078 dest: /192.168.158.4:9866
2025-07-17 18:12:41,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37078, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1177280605_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753262_12438, duration(ns): 27407179
2025-07-17 18:12:41,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753262_12438, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-17 18:12:43,408 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753262_12438 replica FinalizedReplica, blk_1073753262_12438, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753262 for deletion
2025-07-17 18:12:43,409 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753262_12438 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753262
2025-07-17 18:15:51,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753265_12441 src: /192.168.158.5:58904 dest: /192.168.158.4:9866
2025-07-17 18:15:51,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58904, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-575632233_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753265_12441, duration(ns): 20651253
2025-07-17 18:15:51,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753265_12441, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-17 18:15:55,414 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753265_12441 replica FinalizedReplica, blk_1073753265_12441, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753265 for deletion
2025-07-17 18:15:55,415 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753265_12441 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753265
2025-07-17 18:17:51,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753267_12443 src: /192.168.158.9:49806 dest: /192.168.158.4:9866
2025-07-17 18:17:51,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49806, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1923718038_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753267_12443, duration(ns): 21303137
2025-07-17 18:17:51,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753267_12443, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 18:17:55,415 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753267_12443 replica FinalizedReplica, blk_1073753267_12443, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753267 for deletion
2025-07-17 18:17:55,416 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753267_12443 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753267
2025-07-17 18:21:51,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753271_12447 src: /192.168.158.1:36046 dest: /192.168.158.4:9866
2025-07-17 18:21:51,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36046, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_939710486_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753271_12447, duration(ns): 24283366
2025-07-17 18:21:51,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753271_12447, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-17 18:21:55,422 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753271_12447 replica FinalizedReplica, blk_1073753271_12447, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753271 for deletion
2025-07-17 18:21:55,423 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753271_12447 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753271
2025-07-17 18:26:01,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753275_12451 src: /192.168.158.6:56296 dest: /192.168.158.4:9866
2025-07-17 18:26:01,407 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56296, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-866807159_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753275_12451, duration(ns): 19554176
2025-07-17 18:26:01,407 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753275_12451, type=LAST_IN_PIPELINE terminating
2025-07-17 18:26:07,427 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753275_12451 replica FinalizedReplica, blk_1073753275_12451, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753275 for deletion
2025-07-17 18:26:07,428 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753275_12451 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753275
2025-07-17 18:28:01,380 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753277_12453 src: /192.168.158.9:34654 dest: /192.168.158.4:9866
2025-07-17 18:28:01,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34654, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_831323202_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753277_12453, duration(ns): 19885573
2025-07-17 18:28:01,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753277_12453, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-17 18:28:07,435 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753277_12453 replica FinalizedReplica, blk_1073753277_12453, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753277 for deletion
2025-07-17 18:28:07,436 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753277_12453 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753277
2025-07-17 18:30:06,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753279_12455 src: /192.168.158.6:46048 dest: /192.168.158.4:9866
2025-07-17 18:30:06,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46048, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_962858085_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753279_12455, duration(ns): 20119091
2025-07-17 18:30:06,412 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753279_12455, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 18:30:13,439 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753279_12455 replica FinalizedReplica, blk_1073753279_12455, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753279 for deletion
2025-07-17 18:30:13,440 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753279_12455 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753279
2025-07-17 18:32:06,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753281_12457 src: /192.168.158.9:46058 dest: /192.168.158.4:9866
2025-07-17 18:32:06,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46058, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_232343895_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753281_12457, duration(ns): 16038737
2025-07-17 18:32:06,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753281_12457, type=LAST_IN_PIPELINE terminating
2025-07-17 18:32:13,444 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753281_12457 replica FinalizedReplica, blk_1073753281_12457, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753281 for deletion
2025-07-17 18:32:13,446 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753281_12457 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753281
2025-07-17 18:35:11,390 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753284_12460 src: /192.168.158.1:51636 dest: /192.168.158.4:9866
2025-07-17 18:35:11,424 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51636, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1115015061_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753284_12460, duration(ns): 23764585
2025-07-17 18:35:11,424 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753284_12460, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-17 18:35:13,450 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753284_12460 replica FinalizedReplica, blk_1073753284_12460, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753284 for deletion
2025-07-17 18:35:13,452 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753284_12460 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753284
2025-07-17 18:39:21,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753288_12464 src: /192.168.158.9:38630 dest: /192.168.158.4:9866
2025-07-17 18:39:21,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38630, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-656560268_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753288_12464, duration(ns): 16436554
2025-07-17 18:39:21,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753288_12464, type=LAST_IN_PIPELINE terminating
2025-07-17 18:39:25,456 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753288_12464 replica FinalizedReplica, blk_1073753288_12464, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753288 for deletion
2025-07-17 18:39:25,457 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753288_12464 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753288
2025-07-17 18:40:21,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753289_12465 src: /192.168.158.8:52846 dest: /192.168.158.4:9866
2025-07-17 18:40:21,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52846, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1073813219_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753289_12465, duration(ns): 17208380
2025-07-17 18:40:21,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753289_12465, type=LAST_IN_PIPELINE terminating
2025-07-17 18:40:25,459 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753289_12465 replica FinalizedReplica, blk_1073753289_12465, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753289 for deletion
2025-07-17 18:40:25,460 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753289_12465 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753289
2025-07-17 18:42:21,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753291_12467 src: /192.168.158.1:59290 dest: /192.168.158.4:9866
2025-07-17 18:42:21,431 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59290, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-562681300_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753291_12467, duration(ns): 24872306
2025-07-17 18:42:21,432 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753291_12467, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-17 18:42:25,459 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753291_12467 replica FinalizedReplica, blk_1073753291_12467, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753291 for deletion
2025-07-17 18:42:25,461 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753291_12467 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753291
2025-07-17 18:43:26,408 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753292_12468 src: /192.168.158.1:34816 dest: /192.168.158.4:9866
2025-07-17 18:43:26,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34816, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_186412246_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753292_12468, duration(ns): 25737299
2025-07-17 18:43:26,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753292_12468, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-17 18:43:28,460 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753292_12468 replica FinalizedReplica, blk_1073753292_12468, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753292 for deletion
2025-07-17 18:43:28,462 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753292_12468 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753292
2025-07-17 18:45:26,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753294_12470 src: /192.168.158.8:38408 dest: /192.168.158.4:9866
2025-07-17 18:45:26,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38408, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1468686275_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753294_12470, duration(ns): 20657512
2025-07-17 18:45:26,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753294_12470, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 18:45:28,462 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753294_12470 replica FinalizedReplica, blk_1073753294_12470, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753294 for deletion
2025-07-17 18:45:28,464 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753294_12470 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753294
2025-07-17 18:48:26,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753297_12473 src: /192.168.158.9:55784 dest: /192.168.158.4:9866
2025-07-17 18:48:26,432 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55784, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1825390001_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753297_12473, duration(ns): 20006309
2025-07-17 18:48:26,432 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753297_12473, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 18:48:28,470 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753297_12473 replica FinalizedReplica, blk_1073753297_12473, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753297 for deletion
2025-07-17 18:48:28,472 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753297_12473 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753297
2025-07-17 18:49:26,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753298_12474 src: /192.168.158.5:41944 dest: /192.168.158.4:9866
2025-07-17 18:49:26,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41944, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-861642267_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753298_12474, duration(ns): 16586630
2025-07-17 18:49:26,430 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753298_12474, type=LAST_IN_PIPELINE terminating
2025-07-17 18:49:28,472 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753298_12474 replica FinalizedReplica, blk_1073753298_12474, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753298 for deletion
2025-07-17 18:49:28,474 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753298_12474 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753298
2025-07-17 18:51:31,420 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753300_12476 src: /192.168.158.1:36310 dest: /192.168.158.4:9866
2025-07-17 18:51:31,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36310, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-221982716_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753300_12476, duration(ns): 23082749
2025-07-17 18:51:31,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753300_12476, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-17 18:51:37,477 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753300_12476 replica FinalizedReplica, blk_1073753300_12476, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753300 for deletion
2025-07-17 18:51:37,478 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753300_12476 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753300
2025-07-17 18:52:31,427 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753301_12477 src: /192.168.158.8:53580 dest: /192.168.158.4:9866
2025-07-17 18:52:31,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53580, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_962726443_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753301_12477, duration(ns): 19598099
2025-07-17 18:52:31,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753301_12477, type=LAST_IN_PIPELINE terminating
2025-07-17 18:52:34,479 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753301_12477 replica FinalizedReplica, blk_1073753301_12477, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753301 for deletion
2025-07-17 18:52:34,480 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753301_12477 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753301
2025-07-17 18:55:31,426 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753304_12480 src: /192.168.158.1:44630 dest: /192.168.158.4:9866
2025-07-17 18:55:31,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44630, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1963527788_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753304_12480, duration(ns): 29672348
2025-07-17 18:55:31,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753304_12480, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-17 18:55:34,485 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753304_12480 replica FinalizedReplica, blk_1073753304_12480, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753304 for deletion
2025-07-17 18:55:34,486 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753304_12480 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753304
2025-07-17 18:56:31,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753305_12481 src: /192.168.158.8:33424 dest: /192.168.158.4:9866
2025-07-17 18:56:31,427 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33424, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-813165694_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753305_12481, duration(ns): 19991990
2025-07-17 18:56:31,428 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753305_12481, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 18:56:34,488 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753305_12481 replica FinalizedReplica, blk_1073753305_12481, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753305 for deletion
2025-07-17 18:56:34,489 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753305_12481 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753305
2025-07-17 18:58:31,432 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753307_12483 src: /192.168.158.1:47744 dest: /192.168.158.4:9866
2025-07-17 18:58:31,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47744, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1710121329_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753307_12483, duration(ns): 23291249
2025-07-17 18:58:31,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753307_12483, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-17 18:58:37,492 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753307_12483 replica FinalizedReplica, blk_1073753307_12483, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753307 for deletion
2025-07-17 18:58:37,493 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753307_12483 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753307
2025-07-17 19:01:31,434 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753310_12486 src: /192.168.158.9:54750 dest: /192.168.158.4:9866
2025-07-17 19:01:31,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54750, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_376153087_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753310_12486, duration(ns): 22302584
2025-07-17 19:01:31,463 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753310_12486, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 19:01:34,493 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753310_12486 replica FinalizedReplica, blk_1073753310_12486, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753310 for deletion
2025-07-17 19:01:34,494 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753310_12486 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753310
2025-07-17 19:03:31,440 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753312_12488 src: /192.168.158.5:59346 dest: /192.168.158.4:9866
2025-07-17 19:03:31,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59346, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-906421575_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753312_12488, duration(ns): 19693263
2025-07-17 19:03:31,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753312_12488, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-17 19:03:34,495 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753312_12488 replica FinalizedReplica, blk_1073753312_12488, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753312 for deletion
2025-07-17 19:03:34,496 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753312_12488 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753312
2025-07-17 19:05:31,433 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753314_12490 src: /192.168.158.1:39052 dest: /192.168.158.4:9866
2025-07-17 19:05:31,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39052, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-202204296_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753314_12490, duration(ns): 24004966
2025-07-17 19:05:31,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753314_12490, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-17 19:05:37,498 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753314_12490 replica FinalizedReplica, blk_1073753314_12490, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753314 for deletion
2025-07-17 19:05:37,499 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753314_12490 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753314
2025-07-17 19:06:31,447 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753315_12491 src: /192.168.158.1:37664 dest: /192.168.158.4:9866
2025-07-17 19:06:31,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37664, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_888856170_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753315_12491, duration(ns): 26708787
2025-07-17 19:06:31,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753315_12491, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-17 19:06:37,502 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753315_12491 replica FinalizedReplica, blk_1073753315_12491, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753315 for deletion
2025-07-17 19:06:37,503 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753315_12491 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753315
2025-07-17 19:09:31,446 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753318_12494 src: /192.168.158.6:36472 dest: /192.168.158.4:9866
2025-07-17 19:09:31,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1867070304_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753318_12494, duration(ns): 16882589
2025-07-17 19:09:31,466 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753318_12494, type=LAST_IN_PIPELINE terminating
2025-07-17 19:09:37,504 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753318_12494 replica FinalizedReplica, blk_1073753318_12494, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753318 for deletion
2025-07-17 19:09:37,505 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753318_12494 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753318
2025-07-17 19:13:31,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753322_12498 src: /192.168.158.9:38126 dest: /192.168.158.4:9866
2025-07-17 19:13:31,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38126, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1665142909_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753322_12498, duration(ns): 20179652
2025-07-17 19:13:31,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753322_12498, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 19:13:37,508 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753322_12498 replica FinalizedReplica, blk_1073753322_12498, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753322 for deletion
2025-07-17 19:13:37,509 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753322_12498 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753322
2025-07-17 19:17:41,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753326_12502 src: /192.168.158.6:47578 dest: /192.168.158.4:9866
2025-07-17 19:17:41,469 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47578, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-676391466_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753326_12502, duration(ns): 15554714
2025-07-17 19:17:41,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753326_12502, type=LAST_IN_PIPELINE terminating
2025-07-17 19:17:43,519 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753326_12502 replica FinalizedReplica, blk_1073753326_12502, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753326 for deletion
2025-07-17 19:17:43,520 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753326_12502 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753326
2025-07-17 19:22:46,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753331_12507 src: /192.168.158.9:55070 dest: /192.168.158.4:9866
2025-07-17 19:22:46,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55070, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_868052934_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753331_12507, duration(ns): 16376024
2025-07-17 19:22:46,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753331_12507, type=LAST_IN_PIPELINE terminating
2025-07-17 19:22:52,534 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753331_12507 replica FinalizedReplica, blk_1073753331_12507, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753331 for deletion 2025-07-17 19:22:52,535 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753331_12507 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753331 2025-07-17 19:26:46,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753335_12511 src: /192.168.158.1:39750 dest: /192.168.158.4:9866 2025-07-17 19:26:46,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39750, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-438381238_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753335_12511, duration(ns): 24426164 2025-07-17 19:26:46,506 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753335_12511, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-17 19:26:49,543 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753335_12511 replica FinalizedReplica, blk_1073753335_12511, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753335 for deletion 2025-07-17 19:26:49,544 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753335_12511 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753335 2025-07-17 19:30:51,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753339_12515 src: /192.168.158.1:33692 dest: /192.168.158.4:9866 2025-07-17 19:30:51,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33692, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-541924245_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753339_12515, duration(ns): 26463951 2025-07-17 19:30:51,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753339_12515, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-17 19:30:52,551 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753339_12515 replica FinalizedReplica, blk_1073753339_12515, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753339 for deletion 2025-07-17 19:30:52,552 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753339_12515 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753339 2025-07-17 19:31:51,481 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753340_12516 src: /192.168.158.1:41228 dest: /192.168.158.4:9866 2025-07-17 19:31:51,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:41228, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1216230193_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753340_12516, duration(ns): 24906993 2025-07-17 19:31:51,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753340_12516, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-17 19:31:55,552 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753340_12516 replica FinalizedReplica, blk_1073753340_12516, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753340 for deletion 2025-07-17 19:31:55,553 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753340_12516 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753340 2025-07-17 19:32:51,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753341_12517 src: /192.168.158.5:50988 dest: /192.168.158.4:9866 2025-07-17 19:32:51,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50988, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1412655173_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753341_12517, duration(ns): 15670417 2025-07-17 19:32:51,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073753341_12517, type=LAST_IN_PIPELINE terminating 2025-07-17 19:32:52,555 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753341_12517 replica FinalizedReplica, blk_1073753341_12517, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753341 for deletion 2025-07-17 19:32:52,556 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753341_12517 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753341 2025-07-17 19:33:51,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753342_12518 src: /192.168.158.8:39300 dest: /192.168.158.4:9866 2025-07-17 19:33:51,513 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39300, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-287246595_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753342_12518, duration(ns): 16595561 2025-07-17 19:33:51,513 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753342_12518, type=LAST_IN_PIPELINE terminating 2025-07-17 19:33:52,556 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753342_12518 replica FinalizedReplica, blk_1073753342_12518, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753342 for deletion 2025-07-17 19:33:52,558 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753342_12518 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir12/blk_1073753342 2025-07-17 19:35:51,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753344_12520 src: /192.168.158.8:45938 dest: /192.168.158.4:9866 2025-07-17 19:35:51,509 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45938, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1724294414_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753344_12520, duration(ns): 15819857 2025-07-17 19:35:51,509 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753344_12520, type=LAST_IN_PIPELINE terminating 2025-07-17 19:35:55,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753344_12520 replica FinalizedReplica, blk_1073753344_12520, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753344 for deletion 2025-07-17 19:35:55,564 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753344_12520 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753344 2025-07-17 19:38:56,478 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753347_12523 src: /192.168.158.1:59290 dest: /192.168.158.4:9866 2025-07-17 19:38:56,511 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59290, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_366497619_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753347_12523, duration(ns): 23569725 2025-07-17 19:38:56,511 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753347_12523, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-17 19:38:58,567 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753347_12523 replica FinalizedReplica, blk_1073753347_12523, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753347 for deletion 2025-07-17 19:38:58,568 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753347_12523 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753347 2025-07-17 19:39:56,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753348_12524 src: /192.168.158.1:37182 dest: /192.168.158.4:9866 2025-07-17 19:39:56,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37182, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2037182561_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753348_12524, duration(ns): 23957506 2025-07-17 19:39:56,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753348_12524, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-17 19:40:01,569 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753348_12524 replica FinalizedReplica, blk_1073753348_12524, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753348 for deletion 2025-07-17 19:40:01,569 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753348_12524 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753348 2025-07-17 19:41:56,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753350_12526 src: /192.168.158.5:40188 dest: /192.168.158.4:9866 2025-07-17 19:41:56,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40188, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_277186775_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753350_12526, duration(ns): 23095900 2025-07-17 19:41:56,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753350_12526, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-17 19:41:58,571 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753350_12526 replica FinalizedReplica, blk_1073753350_12526, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753350 for deletion 2025-07-17 19:41:58,572 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753350_12526 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753350 2025-07-17 19:44:01,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753352_12528 src: /192.168.158.6:40850 dest: /192.168.158.4:9866 2025-07-17 19:44:01,526 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40850, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_212759123_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753352_12528, duration(ns): 18551140 2025-07-17 19:44:01,526 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753352_12528, type=LAST_IN_PIPELINE terminating 2025-07-17 19:44:04,575 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753352_12528 replica FinalizedReplica, blk_1073753352_12528, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753352 for deletion 2025-07-17 19:44:04,576 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753352_12528 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753352 2025-07-17 19:45:01,496 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753353_12529 src: /192.168.158.1:44354 dest: /192.168.158.4:9866 2025-07-17 19:45:01,529 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44354, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1370917613_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753353_12529, duration(ns): 22440755 2025-07-17 19:45:01,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753353_12529, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-17 19:45:04,576 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753353_12529 replica FinalizedReplica, blk_1073753353_12529, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753353 for deletion 2025-07-17 19:45:04,579 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753353_12529 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753353 2025-07-17 19:53:21,506 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073753361_12537 src: /192.168.158.1:55466 dest: /192.168.158.4:9866 2025-07-17 19:53:21,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55466, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1945369517_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753361_12537, duration(ns): 25221503 2025-07-17 19:53:21,546 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753361_12537, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-17 19:53:25,598 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753361_12537 replica FinalizedReplica, blk_1073753361_12537, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753361 for deletion 2025-07-17 19:53:25,599 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753361_12537 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753361 2025-07-17 19:54:21,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753362_12538 src: /192.168.158.1:50964 dest: /192.168.158.4:9866 2025-07-17 19:54:21,544 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50964, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1470125047_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073753362_12538, duration(ns): 22997338 2025-07-17 19:54:21,544 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753362_12538, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-17 19:54:25,602 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753362_12538 replica FinalizedReplica, blk_1073753362_12538, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753362 for deletion 2025-07-17 19:54:25,603 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753362_12538 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753362 2025-07-17 19:56:21,528 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753364_12540 src: /192.168.158.8:42154 dest: /192.168.158.4:9866 2025-07-17 19:56:21,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42154, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_870568021_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753364_12540, duration(ns): 21205576 2025-07-17 19:56:21,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753364_12540, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-17 19:56:22,607 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753364_12540 replica FinalizedReplica, blk_1073753364_12540, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753364 for deletion 2025-07-17 19:56:22,608 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753364_12540 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753364 2025-07-17 19:57:21,523 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753365_12541 src: /192.168.158.5:34916 dest: /192.168.158.4:9866 2025-07-17 19:57:21,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34916, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1625238206_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753365_12541, duration(ns): 17081489 2025-07-17 19:57:21,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753365_12541, type=LAST_IN_PIPELINE terminating 2025-07-17 19:57:22,608 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753365_12541 replica FinalizedReplica, blk_1073753365_12541, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753365 for deletion 2025-07-17 19:57:22,609 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753365_12541 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753365 2025-07-17 19:58:21,522 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753366_12542 src: /192.168.158.8:37636 dest: /192.168.158.4:9866 2025-07-17 19:58:21,548 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37636, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1878825001_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753366_12542, duration(ns): 19576527 2025-07-17 19:58:21,548 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753366_12542, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-17 19:58:25,609 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753366_12542 replica FinalizedReplica, blk_1073753366_12542, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753366 for deletion 2025-07-17 19:58:25,611 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753366_12542 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753366 2025-07-17 20:00:31,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753368_12544 src: 
/192.168.158.5:38086 dest: /192.168.158.4:9866
2025-07-17 20:00:31,546 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38086, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_874197922_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753368_12544, duration(ns): 18690786
2025-07-17 20:00:31,546 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753368_12544, type=LAST_IN_PIPELINE terminating
2025-07-17 20:00:34,614 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753368_12544 replica FinalizedReplica, blk_1073753368_12544, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753368 for deletion
2025-07-17 20:00:34,615 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753368_12544 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753368
2025-07-17 20:01:36,523 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753369_12545 src: /192.168.158.5:34898 dest: /192.168.158.4:9866
2025-07-17 20:01:36,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1526701899_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753369_12545, duration(ns): 21315307
2025-07-17 20:01:36,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753369_12545, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-17 20:01:37,614 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753369_12545 replica FinalizedReplica, blk_1073753369_12545, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753369 for deletion
2025-07-17 20:01:37,616 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753369_12545 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753369
2025-07-17 20:02:36,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753370_12546 src: /192.168.158.9:57678 dest: /192.168.158.4:9866
2025-07-17 20:02:36,557 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57678, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1017654038_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753370_12546, duration(ns): 18009678
2025-07-17 20:02:36,557 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753370_12546, type=LAST_IN_PIPELINE terminating
2025-07-17 20:02:37,617 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753370_12546 replica FinalizedReplica, blk_1073753370_12546, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753370 for deletion
2025-07-17 20:02:37,618 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753370_12546 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753370
2025-07-17 20:07:36,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753375_12551 src: /192.168.158.8:46952 dest: /192.168.158.4:9866
2025-07-17 20:07:36,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46952, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1284860763_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753375_12551, duration(ns): 19630863
2025-07-17 20:07:36,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753375_12551, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 20:07:37,622 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753375_12551 replica FinalizedReplica, blk_1073753375_12551, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753375 for deletion
2025-07-17 20:07:37,623 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753375_12551 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753375
2025-07-17 20:13:36,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753381_12557 src: /192.168.158.1:55372 dest: /192.168.158.4:9866
2025-07-17 20:13:36,587 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55372, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2007775635_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753381_12557, duration(ns): 26894281
2025-07-17 20:13:36,587 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753381_12557, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-17 20:13:40,630 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753381_12557 replica FinalizedReplica, blk_1073753381_12557, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753381 for deletion
2025-07-17 20:13:40,631 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753381_12557 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753381
2025-07-17 20:14:36,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753382_12558 src: /192.168.158.1:40224 dest: /192.168.158.4:9866
2025-07-17 20:14:36,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40224, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1628877824_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753382_12558, duration(ns): 26702608
2025-07-17 20:14:36,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753382_12558, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-17 20:14:37,632 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753382_12558 replica FinalizedReplica, blk_1073753382_12558, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753382 for deletion
2025-07-17 20:14:37,633 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753382_12558 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753382
2025-07-17 20:15:36,551 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753383_12559 src: /192.168.158.6:37158 dest: /192.168.158.4:9866
2025-07-17 20:15:36,577 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37158, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-222368098_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753383_12559, duration(ns): 20259490
2025-07-17 20:15:36,577 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753383_12559, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-17 20:15:40,634 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753383_12559 replica FinalizedReplica, blk_1073753383_12559, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753383 for deletion
2025-07-17 20:15:40,635 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753383_12559 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753383
2025-07-17 20:17:36,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753385_12561 src: /192.168.158.8:40678 dest: /192.168.158.4:9866
2025-07-17 20:17:36,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40678, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-996993475_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753385_12561, duration(ns): 20372278
2025-07-17 20:17:36,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753385_12561, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-17 20:17:40,637 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753385_12561 replica FinalizedReplica, blk_1073753385_12561, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753385 for deletion
2025-07-17 20:17:40,638 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753385_12561 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753385
2025-07-17 20:19:36,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753387_12563 src: /192.168.158.5:43326 dest: /192.168.158.4:9866
2025-07-17 20:19:36,576 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:43326, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_944366681_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753387_12563, duration(ns): 20569655
2025-07-17 20:19:36,577 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753387_12563, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 20:19:37,640 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753387_12563 replica FinalizedReplica, blk_1073753387_12563, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753387 for deletion
2025-07-17 20:19:37,642 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753387_12563 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753387
2025-07-17 20:20:41,547 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753388_12564 src: /192.168.158.6:35388 dest: /192.168.158.4:9866
2025-07-17 20:20:41,574 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35388, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_242763924_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753388_12564, duration(ns): 20603657
2025-07-17 20:20:41,574 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753388_12564, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-17 20:20:46,642 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753388_12564 replica FinalizedReplica, blk_1073753388_12564, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753388 for deletion
2025-07-17 20:20:46,643 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753388_12564 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753388
2025-07-17 20:21:41,551 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753389_12565 src: /192.168.158.8:33852 dest: /192.168.158.4:9866
2025-07-17 20:21:41,577 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33852, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1355598666_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753389_12565, duration(ns): 20284584
2025-07-17 20:21:41,577 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753389_12565, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 20:21:43,643 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753389_12565 replica FinalizedReplica, blk_1073753389_12565, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753389 for deletion
2025-07-17 20:21:43,644 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753389_12565 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753389
2025-07-17 20:22:41,551 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753390_12566 src: /192.168.158.8:34760 dest: /192.168.158.4:9866
2025-07-17 20:22:41,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34760, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_534510714_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753390_12566, duration(ns): 17513748
2025-07-17 20:22:41,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753390_12566, type=LAST_IN_PIPELINE terminating
2025-07-17 20:22:43,646 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753390_12566 replica FinalizedReplica, blk_1073753390_12566, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753390 for deletion
2025-07-17 20:22:43,647 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753390_12566 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753390
2025-07-17 20:23:46,552 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753391_12567 src: /192.168.158.1:40618 dest: /192.168.158.4:9866
2025-07-17 20:23:46,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40618, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1402826046_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753391_12567, duration(ns): 26811611
2025-07-17 20:23:46,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753391_12567, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-17 20:23:52,649 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753391_12567 replica FinalizedReplica, blk_1073753391_12567, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753391 for deletion
2025-07-17 20:23:52,650 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753391_12567 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753391
2025-07-17 20:24:46,547 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753392_12568 src: /192.168.158.1:52888 dest: /192.168.158.4:9866
2025-07-17 20:24:46,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52888, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-822059432_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753392_12568, duration(ns): 26420304
2025-07-17 20:24:46,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753392_12568, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-17 20:24:49,651 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753392_12568 replica FinalizedReplica, blk_1073753392_12568, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753392 for deletion
2025-07-17 20:24:49,652 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753392_12568 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753392
2025-07-17 20:27:51,574 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753395_12571 src: /192.168.158.9:48556 dest: /192.168.158.4:9866
2025-07-17 20:27:51,594 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48556, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1072558456_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753395_12571, duration(ns): 17040707
2025-07-17 20:27:51,594 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753395_12571, type=LAST_IN_PIPELINE terminating
2025-07-17 20:27:52,656 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753395_12571 replica FinalizedReplica, blk_1073753395_12571, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753395 for deletion
2025-07-17 20:27:52,657 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753395_12571 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753395
2025-07-17 20:29:56,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753397_12573 src: /192.168.158.1:43370 dest: /192.168.158.4:9866
2025-07-17 20:29:56,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43370, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1696832274_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753397_12573, duration(ns): 32024463
2025-07-17 20:29:56,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753397_12573, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-17 20:30:01,662 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753397_12573 replica FinalizedReplica, blk_1073753397_12573, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753397 for deletion
2025-07-17 20:30:01,663 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753397_12573 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753397
2025-07-17 20:31:56,558 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753399_12575 src: /192.168.158.1:34546 dest: /192.168.158.4:9866
2025-07-17 20:31:56,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34546, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_576459283_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753399_12575, duration(ns): 22996281
2025-07-17 20:31:56,591 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753399_12575, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-17 20:31:58,667 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753399_12575 replica FinalizedReplica, blk_1073753399_12575, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753399 for deletion
2025-07-17 20:31:58,668 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753399_12575 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753399
2025-07-17 20:34:06,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753401_12577 src: /192.168.158.9:38206 dest: /192.168.158.4:9866
2025-07-17 20:34:06,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38206, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_212333974_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753401_12577, duration(ns): 16990390
2025-07-17 20:34:06,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753401_12577, type=LAST_IN_PIPELINE terminating
2025-07-17 20:34:07,671 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753401_12577 replica FinalizedReplica, blk_1073753401_12577, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753401 for deletion
2025-07-17 20:34:07,672 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753401_12577 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753401
2025-07-17 20:35:06,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753402_12578 src: /192.168.158.5:45680 dest: /192.168.158.4:9866
2025-07-17 20:35:06,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45680, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-146095765_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753402_12578, duration(ns): 18808431
2025-07-17 20:35:06,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753402_12578, type=LAST_IN_PIPELINE terminating
2025-07-17 20:35:07,674 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753402_12578 replica FinalizedReplica, blk_1073753402_12578, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753402 for deletion
2025-07-17 20:35:07,675 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753402_12578 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753402
2025-07-17 20:36:06,575 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753403_12579 src: /192.168.158.1:57854 dest: /192.168.158.4:9866
2025-07-17 20:36:06,611 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57854, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1031248738_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753403_12579, duration(ns): 27485463
2025-07-17 20:36:06,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753403_12579, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-17 20:36:07,673 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753403_12579 replica FinalizedReplica, blk_1073753403_12579, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753403 for deletion
2025-07-17 20:36:07,675 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753403_12579 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753403
2025-07-17 20:38:06,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753405_12581 src: /192.168.158.7:38192 dest: /192.168.158.4:9866
2025-07-17 20:38:06,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38192, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1361203001_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753405_12581, duration(ns): 17389971
2025-07-17 20:38:06,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753405_12581, type=LAST_IN_PIPELINE terminating
2025-07-17 20:38:07,679 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753405_12581 replica FinalizedReplica, blk_1073753405_12581, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753405 for deletion
2025-07-17 20:38:07,680 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753405_12581 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753405
2025-07-17 20:39:06,577 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753406_12582 src: /192.168.158.6:48804 dest: /192.168.158.4:9866
2025-07-17 20:39:06,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48804, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1531624293_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753406_12582, duration(ns): 16798022
2025-07-17 20:39:06,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753406_12582, type=LAST_IN_PIPELINE terminating
2025-07-17 20:39:07,683 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753406_12582 replica FinalizedReplica, blk_1073753406_12582, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753406 for deletion
2025-07-17 20:39:07,684 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753406_12582 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753406
2025-07-17 20:40:06,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753407_12583 src: /192.168.158.5:54944 dest: /192.168.158.4:9866
2025-07-17 20:40:06,610 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54944, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-246329082_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753407_12583, duration(ns): 22405865
2025-07-17 20:40:06,610 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753407_12583, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-17 20:40:10,686 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753407_12583 replica FinalizedReplica, blk_1073753407_12583, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753407 for deletion
2025-07-17 20:40:10,687 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753407_12583 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753407
2025-07-17 20:41:11,577 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753408_12584 src: /192.168.158.8:50656 dest: /192.168.158.4:9866
2025-07-17 20:41:11,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50656, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1291134706_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753408_12584, duration(ns): 15708359
2025-07-17 20:41:11,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753408_12584, type=LAST_IN_PIPELINE terminating
2025-07-17 20:41:13,689 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753408_12584 replica FinalizedReplica, blk_1073753408_12584, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753408 for deletion
2025-07-17 20:41:13,690 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753408_12584 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753408
2025-07-17 20:43:16,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753410_12586 src: /192.168.158.5:60494 dest: /192.168.158.4:9866
2025-07-17 20:43:16,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60494, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2068633707_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753410_12586, duration(ns): 18222251
2025-07-17 20:43:16,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753410_12586, type=LAST_IN_PIPELINE terminating
2025-07-17 20:43:22,697 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753410_12586 replica FinalizedReplica, blk_1073753410_12586, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753410 for deletion
2025-07-17 20:43:22,698 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753410_12586 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753410
2025-07-17 20:44:16,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753411_12587 src: /192.168.158.9:34786 dest: /192.168.158.4:9866
2025-07-17 20:44:16,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34786, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_475237814_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753411_12587, duration(ns): 20338806
2025-07-17 20:44:16,603 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753411_12587, type=LAST_IN_PIPELINE terminating
2025-07-17 20:44:19,699 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753411_12587 replica FinalizedReplica, blk_1073753411_12587, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753411 for deletion
2025-07-17 20:44:19,701 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753411_12587 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753411
2025-07-17 20:45:16,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753412_12588 src: /192.168.158.7:53618 dest: /192.168.158.4:9866
2025-07-17 20:45:16,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53618, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-28620275_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753412_12588, duration(ns): 16432242
2025-07-17 20:45:16,609 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753412_12588, type=LAST_IN_PIPELINE terminating
2025-07-17 20:45:19,701 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753412_12588 replica FinalizedReplica, blk_1073753412_12588, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753412 for deletion
2025-07-17 20:45:19,703 INFO
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753412_12588 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753412 2025-07-17 20:48:26,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753415_12591 src: /192.168.158.1:33746 dest: /192.168.158.4:9866 2025-07-17 20:48:26,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33746, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-228664163_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753415_12591, duration(ns): 25026121 2025-07-17 20:48:26,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753415_12591, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-17 20:48:31,712 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753415_12591 replica FinalizedReplica, blk_1073753415_12591, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753415 for deletion 2025-07-17 20:48:31,713 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753415_12591 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753415 2025-07-17 20:49:26,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073753416_12592 src: /192.168.158.8:42194 dest: /192.168.158.4:9866 2025-07-17 20:49:26,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42194, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1032278756_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753416_12592, duration(ns): 21078762 2025-07-17 20:49:26,619 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753416_12592, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-17 20:49:28,713 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753416_12592 replica FinalizedReplica, blk_1073753416_12592, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753416 for deletion 2025-07-17 20:49:28,715 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753416_12592 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753416 2025-07-17 20:50:31,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753417_12593 src: /192.168.158.1:41154 dest: /192.168.158.4:9866 2025-07-17 20:50:31,619 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41154, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_797052720_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073753417_12593, duration(ns): 22226617 2025-07-17 20:50:31,620 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753417_12593, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-17 20:50:34,714 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753417_12593 replica FinalizedReplica, blk_1073753417_12593, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753417 for deletion 2025-07-17 20:50:34,716 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753417_12593 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753417 2025-07-17 20:52:36,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753419_12595 src: /192.168.158.1:48820 dest: /192.168.158.4:9866 2025-07-17 20:52:36,632 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48820, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1521173017_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753419_12595, duration(ns): 24912604 2025-07-17 20:52:36,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753419_12595, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-17 20:52:37,718 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753419_12595 replica FinalizedReplica, blk_1073753419_12595, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753419 for deletion 2025-07-17 20:52:37,719 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753419_12595 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753419 2025-07-17 20:54:36,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753421_12597 src: /192.168.158.7:50072 dest: /192.168.158.4:9866 2025-07-17 20:54:36,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50072, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-542535359_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753421_12597, duration(ns): 16961367 2025-07-17 20:54:36,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753421_12597, type=LAST_IN_PIPELINE terminating 2025-07-17 20:54:37,722 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753421_12597 replica FinalizedReplica, blk_1073753421_12597, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753421 for deletion 2025-07-17 20:54:37,723 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753421_12597 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753421 2025-07-17 20:55:41,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753422_12598 src: /192.168.158.7:50780 dest: /192.168.158.4:9866 2025-07-17 20:55:41,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-899584159_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753422_12598, duration(ns): 16391138 2025-07-17 20:55:41,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753422_12598, type=LAST_IN_PIPELINE terminating 2025-07-17 20:55:46,723 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753422_12598 replica FinalizedReplica, blk_1073753422_12598, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753422 for deletion 2025-07-17 20:55:46,724 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753422_12598 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753422 2025-07-17 20:58:46,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753425_12601 src: /192.168.158.6:51110 dest: /192.168.158.4:9866 
2025-07-17 20:58:46,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51110, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1219797987_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753425_12601, duration(ns): 23630202
2025-07-17 20:58:46,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753425_12601, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 20:58:49,732 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753425_12601 replica FinalizedReplica, blk_1073753425_12601, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753425 for deletion
2025-07-17 20:58:49,733 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753425_12601 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753425
2025-07-17 20:59:46,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753426_12602 src: /192.168.158.9:55026 dest: /192.168.158.4:9866
2025-07-17 20:59:46,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55026, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_426323906_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753426_12602, duration(ns): 20285721
2025-07-17 20:59:46,628 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753426_12602, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-17 20:59:52,734 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753426_12602 replica FinalizedReplica, blk_1073753426_12602, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753426 for deletion
2025-07-17 20:59:52,735 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753426_12602 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753426
2025-07-17 21:00:46,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753427_12603 src: /192.168.158.1:36200 dest: /192.168.158.4:9866
2025-07-17 21:00:46,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36200, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1324431445_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753427_12603, duration(ns): 29247042
2025-07-17 21:00:46,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753427_12603, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-17 21:00:49,739 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753427_12603 replica FinalizedReplica, blk_1073753427_12603, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753427 for deletion
2025-07-17 21:00:49,740 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753427_12603 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753427
2025-07-17 21:01:46,603 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753428_12604 src: /192.168.158.7:58084 dest: /192.168.158.4:9866
2025-07-17 21:01:46,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58084, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_62397508_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753428_12604, duration(ns): 20751128
2025-07-17 21:01:46,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753428_12604, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-17 21:01:52,743 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753428_12604 replica FinalizedReplica, blk_1073753428_12604, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753428 for deletion
2025-07-17 21:01:52,744 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753428_12604 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753428
2025-07-17 21:03:51,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753430_12606 src: /192.168.158.6:40800 dest: /192.168.158.4:9866
2025-07-17 21:03:51,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40800, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1035097543_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753430_12606, duration(ns): 20161644
2025-07-17 21:03:51,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753430_12606, type=LAST_IN_PIPELINE terminating
2025-07-17 21:03:52,749 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753430_12606 replica FinalizedReplica, blk_1073753430_12606, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753430 for deletion
2025-07-17 21:03:52,750 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753430_12606 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753430
2025-07-17 21:04:51,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753431_12607 src: /192.168.158.6:59050 dest: /192.168.158.4:9866
2025-07-17 21:04:51,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59050, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1333212381_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753431_12607, duration(ns): 16409886
2025-07-17 21:04:51,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753431_12607, type=LAST_IN_PIPELINE terminating
2025-07-17 21:04:55,754 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753431_12607 replica FinalizedReplica, blk_1073753431_12607, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753431 for deletion
2025-07-17 21:04:55,755 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753431_12607 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753431
2025-07-17 21:05:51,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753432_12608 src: /192.168.158.5:48590 dest: /192.168.158.4:9866
2025-07-17 21:05:51,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48590, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1222008393_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753432_12608, duration(ns): 18940443
2025-07-17 21:05:51,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753432_12608, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-17 21:05:52,758 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753432_12608 replica FinalizedReplica, blk_1073753432_12608, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753432 for deletion
2025-07-17 21:05:52,759 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753432_12608 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753432
2025-07-17 21:06:51,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753433_12609 src: /192.168.158.1:53060 dest: /192.168.158.4:9866
2025-07-17 21:06:51,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53060, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_397558932_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753433_12609, duration(ns): 28050580
2025-07-17 21:06:51,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753433_12609, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-17 21:06:55,761 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753433_12609 replica FinalizedReplica, blk_1073753433_12609, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753433 for deletion
2025-07-17 21:06:55,762 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753433_12609 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753433
2025-07-17 21:07:51,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753434_12610 src: /192.168.158.9:54716 dest: /192.168.158.4:9866
2025-07-17 21:07:51,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54716, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_886041367_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753434_12610, duration(ns): 19754045
2025-07-17 21:07:51,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753434_12610, type=LAST_IN_PIPELINE terminating
2025-07-17 21:07:52,763 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753434_12610 replica FinalizedReplica, blk_1073753434_12610, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753434 for deletion
2025-07-17 21:07:52,764 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753434_12610 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753434
2025-07-17 21:09:56,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753436_12612 src: /192.168.158.8:59016 dest: /192.168.158.4:9866
2025-07-17 21:09:56,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59016, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1880002147_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753436_12612, duration(ns): 20107112
2025-07-17 21:09:56,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753436_12612, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-17 21:10:01,767 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753436_12612 replica FinalizedReplica, blk_1073753436_12612, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753436 for deletion
2025-07-17 21:10:01,768 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753436_12612 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753436
2025-07-17 21:11:56,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753438_12614 src: /192.168.158.1:55496 dest: /192.168.158.4:9866
2025-07-17 21:11:56,676 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55496, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1664879351_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753438_12614, duration(ns): 24301083
2025-07-17 21:11:56,676 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753438_12614, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-17 21:11:58,771 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753438_12614 replica FinalizedReplica, blk_1073753438_12614, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753438 for deletion
2025-07-17 21:11:58,773 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753438_12614 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753438
2025-07-17 21:12:56,640 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753439_12615 src: /192.168.158.8:35010 dest: /192.168.158.4:9866
2025-07-17 21:12:56,672 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35010, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1572276256_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753439_12615, duration(ns): 25621636
2025-07-17 21:12:56,673 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753439_12615, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-17 21:12:58,772 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753439_12615 replica FinalizedReplica, blk_1073753439_12615, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753439 for deletion
2025-07-17 21:12:58,773 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753439_12615 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753439
2025-07-17 21:14:56,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753441_12617 src: /192.168.158.1:32896 dest: /192.168.158.4:9866
2025-07-17 21:14:56,685 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:32896, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_545553291_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753441_12617, duration(ns): 25912869
2025-07-17 21:14:56,685 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753441_12617, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-17 21:14:58,776 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753441_12617 replica FinalizedReplica, blk_1073753441_12617, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753441 for deletion
2025-07-17 21:14:58,777 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753441_12617 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753441
2025-07-17 21:15:56,647 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753442_12618 src: /192.168.158.1:54204 dest: /192.168.158.4:9866
2025-07-17 21:15:56,681 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54204, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1035436568_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753442_12618, duration(ns): 25072411
2025-07-17 21:15:56,681 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753442_12618, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-17 21:16:01,780 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753442_12618 replica FinalizedReplica, blk_1073753442_12618, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753442 for deletion
2025-07-17 21:16:01,781 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753442_12618 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753442
2025-07-17 21:16:56,657 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753443_12619 src: /192.168.158.8:57898 dest: /192.168.158.4:9866
2025-07-17 21:16:56,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-180562591_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753443_12619, duration(ns): 19822330
2025-07-17 21:16:56,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753443_12619, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 21:17:01,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753443_12619 replica FinalizedReplica, blk_1073753443_12619, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753443 for deletion
2025-07-17 21:17:01,784 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753443_12619 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753443
2025-07-17 21:17:56,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753444_12620 src: /192.168.158.6:53182 dest: /192.168.158.4:9866
2025-07-17 21:17:56,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53182, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1771701495_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753444_12620, duration(ns): 16296103
2025-07-17 21:17:56,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder:
BP-1059995147-192.168.158.1-1752101929360:blk_1073753444_12620, type=LAST_IN_PIPELINE terminating 2025-07-17 21:18:01,786 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753444_12620 replica FinalizedReplica, blk_1073753444_12620, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753444 for deletion 2025-07-17 21:18:01,787 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753444_12620 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753444 2025-07-17 21:21:01,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753447_12623 src: /192.168.158.6:60368 dest: /192.168.158.4:9866 2025-07-17 21:21:01,678 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60368, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1259495308_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753447_12623, duration(ns): 16145193 2025-07-17 21:21:01,678 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753447_12623, type=LAST_IN_PIPELINE terminating 2025-07-17 21:21:04,790 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753447_12623 replica FinalizedReplica, blk_1073753447_12623, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753447 for deletion 2025-07-17 21:21:04,791 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753447_12623 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753447 2025-07-17 21:23:06,673 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753449_12625 src: /192.168.158.5:56028 dest: /192.168.158.4:9866 2025-07-17 21:23:06,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56028, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1480058642_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753449_12625, duration(ns): 16207915 2025-07-17 21:23:06,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753449_12625, type=LAST_IN_PIPELINE terminating 2025-07-17 21:23:07,791 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753449_12625 replica FinalizedReplica, blk_1073753449_12625, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753449 for deletion 2025-07-17 21:23:07,792 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753449_12625 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753449 2025-07-17 21:28:16,679 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753454_12630 src: /192.168.158.7:46676 dest: /192.168.158.4:9866 2025-07-17 21:28:16,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46676, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2059669871_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753454_12630, duration(ns): 15319550 2025-07-17 21:28:16,697 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753454_12630, type=LAST_IN_PIPELINE terminating 2025-07-17 21:28:19,804 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753454_12630 replica FinalizedReplica, blk_1073753454_12630, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753454 for deletion 2025-07-17 21:28:19,806 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753454_12630 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753454 2025-07-17 21:29:16,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753455_12631 src: /192.168.158.8:43698 dest: /192.168.158.4:9866 2025-07-17 21:29:16,713 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43698, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1286026574_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073753455_12631, duration(ns): 19038626 2025-07-17 21:29:16,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753455_12631, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-17 21:29:19,807 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753455_12631 replica FinalizedReplica, blk_1073753455_12631, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753455 for deletion 2025-07-17 21:29:19,808 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753455_12631 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753455 2025-07-17 21:31:16,678 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753457_12633 src: /192.168.158.5:33262 dest: /192.168.158.4:9866 2025-07-17 21:31:16,698 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33262, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-478631420_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753457_12633, duration(ns): 17320388 2025-07-17 21:31:16,698 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753457_12633, type=LAST_IN_PIPELINE terminating 2025-07-17 21:31:19,810 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753457_12633 replica 
FinalizedReplica, blk_1073753457_12633, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753457 for deletion 2025-07-17 21:31:19,811 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753457_12633 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753457 2025-07-17 21:33:16,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753459_12635 src: /192.168.158.6:34264 dest: /192.168.158.4:9866 2025-07-17 21:33:16,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34264, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1257908139_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753459_12635, duration(ns): 18844399 2025-07-17 21:33:16,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753459_12635, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-17 21:33:19,813 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753459_12635 replica FinalizedReplica, blk_1073753459_12635, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753459 for deletion 2025-07-17 21:33:19,814 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073753459_12635 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753459 2025-07-17 21:34:16,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753460_12636 src: /192.168.158.8:40702 dest: /192.168.158.4:9866 2025-07-17 21:34:16,698 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40702, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1569337513_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753460_12636, duration(ns): 18302936 2025-07-17 21:34:16,698 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753460_12636, type=LAST_IN_PIPELINE terminating 2025-07-17 21:34:22,814 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753460_12636 replica FinalizedReplica, blk_1073753460_12636, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753460 for deletion 2025-07-17 21:34:22,815 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753460_12636 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753460 2025-07-17 21:36:16,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753462_12638 src: /192.168.158.9:53838 dest: /192.168.158.4:9866 2025-07-17 21:36:16,705 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: 
src: /192.168.158.9:53838, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1177196701_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753462_12638, duration(ns): 15778282 2025-07-17 21:36:16,705 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753462_12638, type=LAST_IN_PIPELINE terminating 2025-07-17 21:36:19,817 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753462_12638 replica FinalizedReplica, blk_1073753462_12638, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753462 for deletion 2025-07-17 21:36:19,818 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753462_12638 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753462 2025-07-17 21:37:16,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753463_12639 src: /192.168.158.6:35394 dest: /192.168.158.4:9866 2025-07-17 21:37:16,699 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35394, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-407943639_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753463_12639, duration(ns): 14690246 2025-07-17 21:37:16,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753463_12639, type=LAST_IN_PIPELINE terminating 2025-07-17 
21:37:19,821 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753463_12639 replica FinalizedReplica, blk_1073753463_12639, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753463 for deletion 2025-07-17 21:37:19,822 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753463_12639 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753463 2025-07-17 21:38:16,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753464_12640 src: /192.168.158.9:45802 dest: /192.168.158.4:9866 2025-07-17 21:38:16,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45802, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-997741495_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753464_12640, duration(ns): 19707354 2025-07-17 21:38:16,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753464_12640, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-17 21:38:19,822 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753464_12640 replica FinalizedReplica, blk_1073753464_12640, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753464 for deletion 
2025-07-17 21:38:19,823 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753464_12640 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753464 2025-07-17 21:41:21,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753467_12643 src: /192.168.158.6:34002 dest: /192.168.158.4:9866 2025-07-17 21:41:21,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34002, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1147734242_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753467_12643, duration(ns): 18063869 2025-07-17 21:41:21,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753467_12643, type=LAST_IN_PIPELINE terminating 2025-07-17 21:41:22,829 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753467_12643 replica FinalizedReplica, blk_1073753467_12643, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753467 for deletion 2025-07-17 21:41:22,830 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753467_12643 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753467 2025-07-17 21:44:31,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753470_12646 src: /192.168.158.6:50466 
dest: /192.168.158.4:9866 2025-07-17 21:44:31,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50466, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_989565160_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753470_12646, duration(ns): 20352527 2025-07-17 21:44:31,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753470_12646, type=LAST_IN_PIPELINE terminating 2025-07-17 21:44:34,834 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753470_12646 replica FinalizedReplica, blk_1073753470_12646, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753470 for deletion 2025-07-17 21:44:34,835 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753470_12646 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753470 2025-07-17 21:45:31,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753471_12647 src: /192.168.158.1:60574 dest: /192.168.158.4:9866 2025-07-17 21:45:31,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60574, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-234959818_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753471_12647, duration(ns): 26077895 2025-07-17 21:45:31,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753471_12647, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-17 21:45:34,837 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753471_12647 replica FinalizedReplica, blk_1073753471_12647, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753471 for deletion 2025-07-17 21:45:34,838 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753471_12647 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753471 2025-07-17 21:48:31,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753474_12650 src: /192.168.158.1:45210 dest: /192.168.158.4:9866 2025-07-17 21:48:31,736 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45210, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1018111926_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753474_12650, duration(ns): 24432495 2025-07-17 21:48:31,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753474_12650, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-17 21:48:34,844 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753474_12650 replica FinalizedReplica, blk_1073753474_12650, FINALIZED getNumBytes() = 56 getBytesOnDisk() 
= 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753474 for deletion 2025-07-17 21:48:34,845 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753474_12650 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753474 2025-07-17 21:50:36,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753476_12652 src: /192.168.158.5:39684 dest: /192.168.158.4:9866 2025-07-17 21:50:36,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39684, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1148765867_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753476_12652, duration(ns): 15604810 2025-07-17 21:50:36,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753476_12652, type=LAST_IN_PIPELINE terminating 2025-07-17 21:50:37,849 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753476_12652 replica FinalizedReplica, blk_1073753476_12652, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753476 for deletion 2025-07-17 21:50:37,850 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753476_12652 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753476 2025-07-17 21:51:36,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753477_12653 src: /192.168.158.1:46104 dest: /192.168.158.4:9866 2025-07-17 21:51:36,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46104, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_274409900_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753477_12653, duration(ns): 26264875 2025-07-17 21:51:36,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753477_12653, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-17 21:51:37,853 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753477_12653 replica FinalizedReplica, blk_1073753477_12653, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753477 for deletion 2025-07-17 21:51:37,854 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753477_12653 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753477 2025-07-17 21:54:41,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753480_12656 src: /192.168.158.5:49544 dest: /192.168.158.4:9866 2025-07-17 21:54:41,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.5:49544, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1848350005_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753480_12656, duration(ns): 22286507
2025-07-17 21:54:41,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753480_12656, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 21:54:43,857 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753480_12656 replica FinalizedReplica, blk_1073753480_12656, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753480 for deletion
2025-07-17 21:54:43,858 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753480_12656 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753480
2025-07-17 21:56:41,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753482_12658 src: /192.168.158.1:52020 dest: /192.168.158.4:9866
2025-07-17 21:56:41,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52020, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1101301381_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753482_12658, duration(ns): 25891194
2025-07-17 21:56:41,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753482_12658, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-17 21:56:43,861 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753482_12658 replica FinalizedReplica, blk_1073753482_12658, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753482 for deletion
2025-07-17 21:56:43,862 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753482_12658 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753482
2025-07-17 21:57:41,719 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753483_12659 src: /192.168.158.9:45998 dest: /192.168.158.4:9866
2025-07-17 21:57:41,746 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45998, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1367446500_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753483_12659, duration(ns): 21027548
2025-07-17 21:57:41,746 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753483_12659, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 21:57:46,862 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753483_12659 replica FinalizedReplica, blk_1073753483_12659, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753483 for deletion
2025-07-17 21:57:46,863 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753483_12659 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753483
2025-07-17 21:58:46,718 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753484_12660 src: /192.168.158.1:46086 dest: /192.168.158.4:9866
2025-07-17 21:58:46,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46086, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2329835_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753484_12660, duration(ns): 24318920
2025-07-17 21:58:46,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753484_12660, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-17 21:58:49,864 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753484_12660 replica FinalizedReplica, blk_1073753484_12660, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753484 for deletion
2025-07-17 21:58:49,865 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753484_12660 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753484
2025-07-17 21:59:19,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f47, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-17 21:59:19,872 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-17 21:59:46,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753485_12661 src: /192.168.158.1:52438 dest: /192.168.158.4:9866
2025-07-17 21:59:46,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52438, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1450200117_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753485_12661, duration(ns): 23042893
2025-07-17 21:59:46,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753485_12661, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-17 21:59:49,868 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753485_12661 replica FinalizedReplica, blk_1073753485_12661, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753485 for deletion
2025-07-17 21:59:49,869 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753485_12661 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753485
2025-07-17 22:00:46,718 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753486_12662 src: /192.168.158.8:35692 dest: /192.168.158.4:9866
2025-07-17 22:00:46,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35692, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-992367777_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753486_12662, duration(ns): 20103281
2025-07-17 22:00:46,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753486_12662, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 22:00:49,871 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753486_12662 replica FinalizedReplica, blk_1073753486_12662, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753486 for deletion
2025-07-17 22:00:49,872 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753486_12662 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753486
2025-07-17 22:01:46,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753487_12663 src: /192.168.158.1:45322 dest: /192.168.158.4:9866
2025-07-17 22:01:46,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45322, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1870442691_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753487_12663, duration(ns): 28888615
2025-07-17 22:01:46,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753487_12663, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-17 22:01:52,871 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753487_12663 replica FinalizedReplica, blk_1073753487_12663, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753487 for deletion
2025-07-17 22:01:52,872 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753487_12663 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753487
2025-07-17 22:05:51,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753491_12667 src: /192.168.158.5:33482 dest: /192.168.158.4:9866
2025-07-17 22:05:51,806 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33482, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1375263632_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753491_12667, duration(ns): 21047190
2025-07-17 22:05:51,806 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753491_12667, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-17 22:05:52,882 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753491_12667 replica FinalizedReplica, blk_1073753491_12667, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753491 for deletion
2025-07-17 22:05:52,883 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753491_12667 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753491
2025-07-17 22:08:51,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753494_12670 src: /192.168.158.5:42216 dest: /192.168.158.4:9866
2025-07-17 22:08:51,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42216, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1936327381_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753494_12670, duration(ns): 20792254
2025-07-17 22:08:51,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753494_12670, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 22:08:52,889 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753494_12670 replica FinalizedReplica, blk_1073753494_12670, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753494 for deletion
2025-07-17 22:08:52,890 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753494_12670 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753494
2025-07-17 22:09:51,718 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753495_12671 src: /192.168.158.1:55734 dest: /192.168.158.4:9866
2025-07-17 22:09:51,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55734, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-141717451_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753495_12671, duration(ns): 24218255
2025-07-17 22:09:51,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753495_12671, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-17 22:09:52,892 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753495_12671 replica FinalizedReplica, blk_1073753495_12671, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753495 for deletion
2025-07-17 22:09:52,893 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753495_12671 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753495
2025-07-17 22:10:51,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753496_12672 src: /192.168.158.8:47838 dest: /192.168.158.4:9866
2025-07-17 22:10:51,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47838, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_698831653_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753496_12672, duration(ns): 21901426
2025-07-17 22:10:51,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753496_12672, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-17 22:10:52,891 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753496_12672 replica FinalizedReplica, blk_1073753496_12672, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753496 for deletion
2025-07-17 22:10:52,892 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753496_12672 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753496
2025-07-17 22:11:51,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753497_12673 src: /192.168.158.1:49960 dest: /192.168.158.4:9866
2025-07-17 22:11:51,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49960, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1103264167_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753497_12673, duration(ns): 22765735
2025-07-17 22:11:51,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753497_12673, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-17 22:11:52,892 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753497_12673 replica FinalizedReplica, blk_1073753497_12673, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753497 for deletion
2025-07-17 22:11:52,896 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753497_12673 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753497
2025-07-17 22:12:51,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753498_12674 src: /192.168.158.8:57934 dest: /192.168.158.4:9866
2025-07-17 22:12:51,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57934, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1237691921_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753498_12674, duration(ns): 16248759
2025-07-17 22:12:51,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753498_12674, type=LAST_IN_PIPELINE terminating
2025-07-17 22:12:52,897 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753498_12674 replica FinalizedReplica, blk_1073753498_12674, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753498 for deletion
2025-07-17 22:12:52,899 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753498_12674 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753498
2025-07-17 22:18:56,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753504_12680 src: /192.168.158.6:35544 dest: /192.168.158.4:9866
2025-07-17 22:18:56,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35544, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_273501062_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753504_12680, duration(ns): 22283321
2025-07-17 22:18:56,772 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753504_12680, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 22:19:01,905 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753504_12680 replica FinalizedReplica, blk_1073753504_12680, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753504 for deletion
2025-07-17 22:19:01,906 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753504_12680 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753504
2025-07-17 22:20:01,759 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753505_12681 src: /192.168.158.1:39408 dest: /192.168.158.4:9866
2025-07-17 22:20:01,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39408, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_231256492_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753505_12681, duration(ns): 24471171
2025-07-17 22:20:01,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753505_12681, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-17 22:20:07,908 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753505_12681 replica FinalizedReplica, blk_1073753505_12681, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753505 for deletion
2025-07-17 22:20:07,909 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753505_12681 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753505
2025-07-17 22:25:06,759 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753510_12686 src: /192.168.158.1:52538 dest: /192.168.158.4:9866
2025-07-17 22:25:06,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52538, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1907528977_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753510_12686, duration(ns): 27849546
2025-07-17 22:25:06,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753510_12686, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-17 22:25:07,919 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753510_12686 replica FinalizedReplica, blk_1073753510_12686, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753510 for deletion
2025-07-17 22:25:07,920 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753510_12686 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753510
2025-07-17 22:27:11,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753512_12688 src: /192.168.158.9:44998 dest: /192.168.158.4:9866
2025-07-17 22:27:11,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44998, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_64290097_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753512_12688, duration(ns): 19220655
2025-07-17 22:27:11,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753512_12688, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-17 22:27:16,921 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753512_12688 replica FinalizedReplica, blk_1073753512_12688, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753512 for deletion
2025-07-17 22:27:16,922 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753512_12688 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753512
2025-07-17 22:29:16,770 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753514_12690 src: /192.168.158.1:60960 dest: /192.168.158.4:9866
2025-07-17 22:29:16,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60960, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1721242946_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753514_12690, duration(ns): 22914938
2025-07-17 22:29:16,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753514_12690, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-17 22:29:16,926 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753514_12690 replica FinalizedReplica, blk_1073753514_12690, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753514 for deletion
2025-07-17 22:29:16,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753514_12690 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753514
2025-07-17 22:30:16,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753515_12691 src: /192.168.158.6:43478 dest: /192.168.158.4:9866
2025-07-17 22:30:16,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43478, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1191182574_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753515_12691, duration(ns): 15429840
2025-07-17 22:30:16,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753515_12691, type=LAST_IN_PIPELINE terminating
2025-07-17 22:30:16,931 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753515_12691 replica FinalizedReplica, blk_1073753515_12691, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753515 for deletion
2025-07-17 22:30:16,932 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753515_12691 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753515
2025-07-17 22:31:16,781 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753516_12692 src: /192.168.158.9:60550 dest: /192.168.158.4:9866
2025-07-17 22:31:16,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60550, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1145372742_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753516_12692, duration(ns): 18412631
2025-07-17 22:31:16,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753516_12692, type=LAST_IN_PIPELINE terminating
2025-07-17 22:31:16,934 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753516_12692 replica FinalizedReplica, blk_1073753516_12692, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753516 for deletion
2025-07-17 22:31:16,935 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753516_12692 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753516
2025-07-17 22:33:16,778 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753518_12694 src: /192.168.158.9:34060 dest: /192.168.158.4:9866
2025-07-17 22:33:16,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34060, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-480065274_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753518_12694, duration(ns): 17906168
2025-07-17 22:33:16,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753518_12694, type=LAST_IN_PIPELINE terminating
2025-07-17 22:33:16,935 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753518_12694 replica FinalizedReplica, blk_1073753518_12694, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753518 for deletion
2025-07-17 22:33:16,936 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753518_12694 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753518
2025-07-17 22:35:16,770 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753520_12696 src: /192.168.158.1:43734 dest: /192.168.158.4:9866
2025-07-17 22:35:16,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43734, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2118097653_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753520_12696, duration(ns): 24560862
2025-07-17 22:35:16,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753520_12696, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-17 22:35:19,936 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753520_12696 replica FinalizedReplica, blk_1073753520_12696, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753520 for deletion
2025-07-17 22:35:19,937 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753520_12696 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753520
2025-07-17 22:39:26,786 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753524_12700 src: /192.168.158.7:50710 dest: /192.168.158.4:9866
2025-07-17 22:39:26,807 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50710, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1983165642_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753524_12700, duration(ns): 16911598
2025-07-17 22:39:26,807 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753524_12700, type=LAST_IN_PIPELINE terminating
2025-07-17 22:39:28,948 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753524_12700 replica FinalizedReplica, blk_1073753524_12700, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753524 for deletion
2025-07-17 22:39:28,949 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753524_12700 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753524
2025-07-17 22:41:26,790 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753526_12702 src: /192.168.158.1:57252 dest: /192.168.158.4:9866
2025-07-17 22:41:26,823 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57252, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1281176139_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753526_12702, duration(ns): 23371850
2025-07-17 22:41:26,823 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753526_12702, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-17 22:41:28,951 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753526_12702 replica FinalizedReplica, blk_1073753526_12702, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753526 for deletion
2025-07-17 22:41:28,952 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753526_12702 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753526
2025-07-17 22:42:26,810 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753527_12703 src: /192.168.158.7:48362 dest: /192.168.158.4:9866
2025-07-17 22:42:26,834 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48362, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1096987574_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753527_12703, duration(ns): 18788244
2025-07-17 22:42:26,834 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753527_12703, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-17 22:42:28,955 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753527_12703 replica FinalizedReplica, blk_1073753527_12703, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753527 for deletion
2025-07-17 22:42:28,956 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753527_12703 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753527
2025-07-17 22:43:26,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753528_12704 src: /192.168.158.7:54768 dest: /192.168.158.4:9866
2025-07-17 22:43:26,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54768, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1730215571_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753528_12704, duration(ns): 16468923
2025-07-17 22:43:26,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753528_12704, type=LAST_IN_PIPELINE terminating
2025-07-17 22:43:28,956 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753528_12704 replica FinalizedReplica, blk_1073753528_12704, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753528 for deletion
2025-07-17 22:43:28,958 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753528_12704 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753528
2025-07-17 22:48:26,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753533_12709 src: /192.168.158.7:49266 dest: /192.168.158.4:9866
2025-07-17 22:48:26,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49266, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-422550288_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753533_12709, duration(ns): 19269984
2025-07-17 22:48:26,829 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753533_12709, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-17 22:48:31,967 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753533_12709 replica FinalizedReplica, blk_1073753533_12709, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753533 for deletion 2025-07-17 22:48:31,968 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753533_12709 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753533 2025-07-17 22:50:26,805 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753535_12711 src: /192.168.158.1:46040 dest: /192.168.158.4:9866 2025-07-17 22:50:26,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46040, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_900068272_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753535_12711, duration(ns): 25123420 2025-07-17 22:50:26,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753535_12711, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-17 22:50:31,972 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753535_12711 replica FinalizedReplica, blk_1073753535_12711, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753535 for deletion 2025-07-17 22:50:31,973 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753535_12711 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753535 2025-07-17 22:51:26,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753536_12712 src: /192.168.158.1:40554 dest: /192.168.158.4:9866 2025-07-17 22:51:26,835 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40554, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1756229810_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753536_12712, duration(ns): 23663700 2025-07-17 22:51:26,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753536_12712, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-17 22:51:31,973 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753536_12712 replica FinalizedReplica, blk_1073753536_12712, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753536 for deletion 2025-07-17 22:51:31,975 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 
blk_1073753536_12712 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753536 2025-07-17 22:52:31,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753537_12713 src: /192.168.158.9:44514 dest: /192.168.158.4:9866 2025-07-17 22:52:31,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44514, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1415966053_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753537_12713, duration(ns): 20626762 2025-07-17 22:52:31,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753537_12713, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-17 22:52:37,977 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753537_12713 replica FinalizedReplica, blk_1073753537_12713, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753537 for deletion 2025-07-17 22:52:37,978 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753537_12713 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753537 2025-07-17 22:57:31,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753542_12718 src: /192.168.158.9:36018 dest: /192.168.158.4:9866 2025-07-17 22:57:31,857 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36018, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1335973566_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753542_12718, duration(ns): 18480114 2025-07-17 22:57:31,857 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753542_12718, type=LAST_IN_PIPELINE terminating 2025-07-17 22:57:34,986 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753542_12718 replica FinalizedReplica, blk_1073753542_12718, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753542 for deletion 2025-07-17 22:57:34,987 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753542_12718 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753542 2025-07-17 22:58:31,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753543_12719 src: /192.168.158.9:33816 dest: /192.168.158.4:9866 2025-07-17 22:58:31,857 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33816, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1067987858_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753543_12719, duration(ns): 20800915 2025-07-17 22:58:31,857 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073753543_12719, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-17 22:58:37,988 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753543_12719 replica FinalizedReplica, blk_1073753543_12719, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753543 for deletion 2025-07-17 22:58:37,989 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753543_12719 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753543 2025-07-17 22:59:31,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753544_12720 src: /192.168.158.1:40770 dest: /192.168.158.4:9866 2025-07-17 22:59:31,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40770, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-208545513_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753544_12720, duration(ns): 25189800 2025-07-17 22:59:31,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753544_12720, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-17 22:59:34,988 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753544_12720 replica FinalizedReplica, blk_1073753544_12720, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 
getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753544 for deletion 2025-07-17 22:59:34,989 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753544_12720 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753544 2025-07-17 23:02:31,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753547_12723 src: /192.168.158.1:35500 dest: /192.168.158.4:9866 2025-07-17 23:02:31,887 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35500, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_291430010_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753547_12723, duration(ns): 23724760 2025-07-17 23:02:31,887 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753547_12723, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-17 23:02:37,990 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753547_12723 replica FinalizedReplica, blk_1073753547_12723, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753547 for deletion 2025-07-17 23:02:37,991 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753547_12723 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753547 2025-07-17 23:04:31,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753549_12725 src: /192.168.158.7:33426 dest: /192.168.158.4:9866 2025-07-17 23:04:31,862 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33426, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1467559839_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753549_12725, duration(ns): 17679640 2025-07-17 23:04:31,863 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753549_12725, type=LAST_IN_PIPELINE terminating 2025-07-17 23:04:34,996 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753549_12725 replica FinalizedReplica, blk_1073753549_12725, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753549 for deletion 2025-07-17 23:04:34,997 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753549_12725 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753549 2025-07-17 23:05:31,856 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753550_12726 src: /192.168.158.8:49616 dest: /192.168.158.4:9866 2025-07-17 23:05:31,889 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49616, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2141808188_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753550_12726, duration(ns): 30783388 2025-07-17 23:05:31,889 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753550_12726, type=LAST_IN_PIPELINE terminating 2025-07-17 23:05:37,998 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753550_12726 replica FinalizedReplica, blk_1073753550_12726, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753550 for deletion 2025-07-17 23:05:37,999 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753550_12726 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753550 2025-07-17 23:09:31,867 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753554_12730 src: /192.168.158.9:40632 dest: /192.168.158.4:9866 2025-07-17 23:09:31,885 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40632, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1465282607_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753554_12730, duration(ns): 15336128 2025-07-17 23:09:31,885 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753554_12730, type=LAST_IN_PIPELINE terminating 2025-07-17 23:09:35,008 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753554_12730 replica FinalizedReplica, blk_1073753554_12730, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753554 for deletion 2025-07-17 23:09:35,009 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753554_12730 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753554 2025-07-17 23:10:31,870 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753555_12731 src: /192.168.158.9:47412 dest: /192.168.158.4:9866 2025-07-17 23:10:31,897 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47412, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2012686370_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753555_12731, duration(ns): 21373128 2025-07-17 23:10:31,897 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753555_12731, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-17 23:10:38,011 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753555_12731 replica FinalizedReplica, blk_1073753555_12731, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753555 for deletion 2025-07-17 23:10:38,012 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753555_12731 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753555 2025-07-17 23:13:36,889 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753558_12734 src: /192.168.158.9:35602 dest: /192.168.158.4:9866 2025-07-17 23:13:36,907 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35602, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1057018870_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753558_12734, duration(ns): 16119889 2025-07-17 23:13:36,908 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753558_12734, type=LAST_IN_PIPELINE terminating 2025-07-17 23:13:41,017 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753558_12734 replica FinalizedReplica, blk_1073753558_12734, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753558 for deletion 2025-07-17 23:13:41,018 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753558_12734 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753558 2025-07-17 23:18:51,913 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753563_12739 src: /192.168.158.8:41562 dest: 
/192.168.158.4:9866 2025-07-17 23:18:51,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41562, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_704042307_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753563_12739, duration(ns): 24118668 2025-07-17 23:18:51,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753563_12739, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-17 23:18:56,028 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753563_12739 replica FinalizedReplica, blk_1073753563_12739, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753563 for deletion 2025-07-17 23:18:56,029 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753563_12739 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753563 2025-07-17 23:21:56,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753566_12742 src: /192.168.158.5:54968 dest: /192.168.158.4:9866 2025-07-17 23:21:56,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54968, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_703583822_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753566_12742, duration(ns): 23600111 2025-07-17 23:21:56,922 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753566_12742, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-17 23:22:02,036 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753566_12742 replica FinalizedReplica, blk_1073753566_12742, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753566 for deletion 2025-07-17 23:22:02,037 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753566_12742 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753566 2025-07-17 23:24:01,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753568_12744 src: /192.168.158.8:48236 dest: /192.168.158.4:9866 2025-07-17 23:24:01,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48236, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-347845181_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753568_12744, duration(ns): 22549105 2025-07-17 23:24:01,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753568_12744, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-17 23:24:05,041 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753568_12744 replica FinalizedReplica, blk_1073753568_12744, FINALIZED getNumBytes() = 56 
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753568 for deletion 2025-07-17 23:24:05,042 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753568_12744 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753568 2025-07-17 23:28:11,891 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753572_12748 src: /192.168.158.1:41642 dest: /192.168.158.4:9866 2025-07-17 23:28:11,924 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41642, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-272304003_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753572_12748, duration(ns): 24315390 2025-07-17 23:28:11,924 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753572_12748, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-17 23:28:17,049 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753572_12748 replica FinalizedReplica, blk_1073753572_12748, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753572 for deletion 2025-07-17 23:28:17,050 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753572_12748 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753572
2025-07-17 23:29:11,883 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753573_12749 src: /192.168.158.1:59494 dest: /192.168.158.4:9866
2025-07-17 23:29:11,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59494, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_356025258_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753573_12749, duration(ns): 23603593
2025-07-17 23:29:11,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753573_12749, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-17 23:29:17,049 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753573_12749 replica FinalizedReplica, blk_1073753573_12749, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753573 for deletion
2025-07-17 23:29:17,050 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753573_12749 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753573
2025-07-17 23:31:11,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753575_12751 src: /192.168.158.1:53954 dest: /192.168.158.4:9866
2025-07-17 23:31:11,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53954, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_761893405_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753575_12751, duration(ns): 23524086
2025-07-17 23:31:11,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753575_12751, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-17 23:31:17,054 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753575_12751 replica FinalizedReplica, blk_1073753575_12751, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753575 for deletion
2025-07-17 23:31:17,055 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753575_12751 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753575
2025-07-17 23:36:11,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753580_12756 src: /192.168.158.7:54230 dest: /192.168.158.4:9866
2025-07-17 23:36:11,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54230, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_664401085_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753580_12756, duration(ns): 21552197
2025-07-17 23:36:11,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753580_12756, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-17 23:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 9, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-17 23:36:20,063 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753580_12756 replica FinalizedReplica, blk_1073753580_12756, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753580 for deletion
2025-07-17 23:36:20,064 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753580_12756 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753580
2025-07-17 23:37:11,906 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753581_12757 src: /192.168.158.1:38382 dest: /192.168.158.4:9866
2025-07-17 23:37:11,940 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38382, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-880939269_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753581_12757, duration(ns): 25995748
2025-07-17 23:37:11,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753581_12757, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-17 23:37:17,064 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753581_12757 replica FinalizedReplica, blk_1073753581_12757, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753581 for deletion
2025-07-17 23:37:17,065 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753581_12757 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753581
2025-07-17 23:38:16,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753582_12758 src: /192.168.158.8:40690 dest: /192.168.158.4:9866
2025-07-17 23:38:16,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40690, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-253340076_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753582_12758, duration(ns): 21016045
2025-07-17 23:38:16,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753582_12758, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 23:38:23,068 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753582_12758 replica FinalizedReplica, blk_1073753582_12758, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753582 for deletion
2025-07-17 23:38:23,069 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753582_12758 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753582
2025-07-17 23:40:16,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753584_12760 src: /192.168.158.9:43760 dest: /192.168.158.4:9866
2025-07-17 23:40:16,939 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43760, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-18790080_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753584_12760, duration(ns): 15980443
2025-07-17 23:40:16,939 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753584_12760, type=LAST_IN_PIPELINE terminating
2025-07-17 23:40:23,070 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753584_12760 replica FinalizedReplica, blk_1073753584_12760, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753584 for deletion
2025-07-17 23:40:23,071 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753584_12760 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753584
2025-07-17 23:41:16,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753585_12761 src: /192.168.158.1:53370 dest: /192.168.158.4:9866
2025-07-17 23:41:16,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53370, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1910845669_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753585_12761, duration(ns): 24150575
2025-07-17 23:41:16,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753585_12761, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-17 23:41:20,072 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753585_12761 replica FinalizedReplica, blk_1073753585_12761, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753585 for deletion
2025-07-17 23:41:20,073 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753585_12761 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753585
2025-07-17 23:42:16,906 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753586_12762 src: /192.168.158.7:55562 dest: /192.168.158.4:9866
2025-07-17 23:42:16,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55562, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1772480909_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753586_12762, duration(ns): 18853966
2025-07-17 23:42:16,928 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753586_12762, type=LAST_IN_PIPELINE terminating
2025-07-17 23:42:23,074 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753586_12762 replica FinalizedReplica, blk_1073753586_12762, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753586 for deletion
2025-07-17 23:42:23,075 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753586_12762 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753586
2025-07-17 23:43:16,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753587_12763 src: /192.168.158.1:35626 dest: /192.168.158.4:9866
2025-07-17 23:43:16,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35626, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1087693202_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753587_12763, duration(ns): 24725254
2025-07-17 23:43:16,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753587_12763, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-17 23:43:20,077 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753587_12763 replica FinalizedReplica, blk_1073753587_12763, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753587 for deletion
2025-07-17 23:43:20,078 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753587_12763 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753587
2025-07-17 23:44:16,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753588_12764 src: /192.168.158.1:59174 dest: /192.168.158.4:9866
2025-07-17 23:44:16,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59174, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1100183623_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753588_12764, duration(ns): 23361385
2025-07-17 23:44:16,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753588_12764, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-17 23:44:20,078 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753588_12764 replica FinalizedReplica, blk_1073753588_12764, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753588 for deletion
2025-07-17 23:44:20,080 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753588_12764 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753588
2025-07-17 23:48:26,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753592_12768 src: /192.168.158.5:59070 dest: /192.168.158.4:9866
2025-07-17 23:48:26,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59070, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_724775694_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753592_12768, duration(ns): 20681027
2025-07-17 23:48:26,952 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753592_12768, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 23:48:35,089 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753592_12768 replica FinalizedReplica, blk_1073753592_12768, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753592 for deletion
2025-07-17 23:48:35,090 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753592_12768 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753592
2025-07-17 23:49:26,928 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753593_12769 src: /192.168.158.1:44232 dest: /192.168.158.4:9866
2025-07-17 23:49:26,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44232, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_186443441_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753593_12769, duration(ns): 24324644
2025-07-17 23:49:26,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753593_12769, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-17 23:49:32,091 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753593_12769 replica FinalizedReplica, blk_1073753593_12769, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753593 for deletion
2025-07-17 23:49:32,092 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753593_12769 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753593
2025-07-17 23:50:26,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753594_12770 src: /192.168.158.6:39696 dest: /192.168.158.4:9866
2025-07-17 23:50:26,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39696, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_902272807_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753594_12770, duration(ns): 20539217
2025-07-17 23:50:26,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753594_12770, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-17 23:50:32,094 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753594_12770 replica FinalizedReplica, blk_1073753594_12770, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753594 for deletion
2025-07-17 23:50:32,095 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753594_12770 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753594
2025-07-17 23:52:26,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753596_12772 src: /192.168.158.8:33996 dest: /192.168.158.4:9866
2025-07-17 23:52:26,981 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33996, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_288310807_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753596_12772, duration(ns): 20250372
2025-07-17 23:52:26,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753596_12772, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-17 23:52:35,101 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753596_12772 replica FinalizedReplica, blk_1073753596_12772, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753596 for deletion
2025-07-17 23:52:35,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753596_12772 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753596
2025-07-17 23:53:31,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753597_12773 src: /192.168.158.1:36428 dest: /192.168.158.4:9866
2025-07-17 23:53:31,970 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36428, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1273195909_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753597_12773, duration(ns): 24206883
2025-07-17 23:53:31,970 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753597_12773, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-17 23:53:38,101 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753597_12773 replica FinalizedReplica, blk_1073753597_12773, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753597 for deletion
2025-07-17 23:53:38,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753597_12773 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753597
2025-07-17 23:55:31,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753599_12775 src: /192.168.158.7:56340 dest: /192.168.158.4:9866
2025-07-17 23:55:31,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56340, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1061344267_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753599_12775, duration(ns): 16381991
2025-07-17 23:55:31,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753599_12775, type=LAST_IN_PIPELINE terminating
2025-07-17 23:55:38,104 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753599_12775 replica FinalizedReplica, blk_1073753599_12775, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753599 for deletion
2025-07-17 23:55:38,105 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753599_12775 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir13/blk_1073753599
2025-07-18 00:00:41,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753604_12780 src: /192.168.158.1:50886 dest: /192.168.158.4:9866
2025-07-18 00:00:41,975 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50886, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_557669065_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753604_12780, duration(ns): 26501838
2025-07-18 00:00:41,975 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753604_12780, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-18 00:00:47,113 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753604_12780 replica FinalizedReplica, blk_1073753604_12780, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753604 for deletion
2025-07-18 00:00:47,114 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753604_12780 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753604
2025-07-18 00:04:46,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753608_12784 src: /192.168.158.7:53248 dest: /192.168.158.4:9866
2025-07-18 00:04:46,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53248, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1747020151_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753608_12784, duration(ns): 19206798
2025-07-18 00:04:46,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753608_12784, type=LAST_IN_PIPELINE terminating
2025-07-18 00:04:50,120 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753608_12784 replica FinalizedReplica, blk_1073753608_12784, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753608 for deletion
2025-07-18 00:04:50,121 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753608_12784 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753608
2025-07-18 00:05:46,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753609_12785 src: /192.168.158.9:50422 dest: /192.168.158.4:9866
2025-07-18 00:05:46,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50422, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1807083485_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753609_12785, duration(ns): 17387006
2025-07-18 00:05:46,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753609_12785, type=LAST_IN_PIPELINE terminating
2025-07-18 00:05:50,120 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753609_12785 replica FinalizedReplica, blk_1073753609_12785, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753609 for deletion
2025-07-18 00:05:50,121 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753609_12785 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753609
2025-07-18 00:07:46,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753611_12787 src: /192.168.158.8:51156 dest: /192.168.158.4:9866
2025-07-18 00:07:46,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51156, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1264743734_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753611_12787, duration(ns): 21260564
2025-07-18 00:07:46,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753611_12787, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 00:07:50,127 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753611_12787 replica FinalizedReplica, blk_1073753611_12787, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753611 for deletion
2025-07-18 00:07:50,129 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753611_12787 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753611
2025-07-18 00:08:46,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753612_12788 src: /192.168.158.7:40904 dest: /192.168.158.4:9866
2025-07-18 00:08:46,970 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40904, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-816540038_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753612_12788, duration(ns): 17578922
2025-07-18 00:08:46,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753612_12788, type=LAST_IN_PIPELINE terminating
2025-07-18 00:08:50,128 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753612_12788 replica FinalizedReplica, blk_1073753612_12788, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753612 for deletion
2025-07-18 00:08:50,132 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753612_12788 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753612
2025-07-18 00:14:01,949 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753617_12793 src: /192.168.158.9:46438 dest: /192.168.158.4:9866
2025-07-18 00:14:01,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46438, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1991020152_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753617_12793, duration(ns): 19599279
2025-07-18 00:14:01,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753617_12793, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 00:14:05,139 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753617_12793 replica FinalizedReplica, blk_1073753617_12793, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753617 for deletion
2025-07-18 00:14:05,140 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753617_12793 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753617
2025-07-18 00:17:01,942 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753620_12796 src: /192.168.158.1:52072 dest: /192.168.158.4:9866
2025-07-18 00:17:01,975 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52072, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1278861400_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753620_12796, duration(ns): 23513932
2025-07-18 00:17:01,975 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753620_12796, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-18 00:17:05,146 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753620_12796 replica FinalizedReplica, blk_1073753620_12796, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753620 for deletion
2025-07-18 00:17:05,147 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753620_12796 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753620
2025-07-18 00:18:01,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753621_12797 src: /192.168.158.6:40106 dest: /192.168.158.4:9866
2025-07-18 00:18:01,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40106, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-680169401_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753621_12797, duration(ns): 19804041
2025-07-18 00:18:01,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753621_12797, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 00:18:05,148 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753621_12797 replica FinalizedReplica, blk_1073753621_12797, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753621 for deletion
2025-07-18 00:18:05,150 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753621_12797 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753621
2025-07-18 00:19:01,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753622_12798 src: /192.168.158.1:38126 dest: /192.168.158.4:9866
2025-07-18 00:19:01,981 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38126, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_511620590_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753622_12798, duration(ns): 25097224
2025-07-18 00:19:01,981 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753622_12798, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-18 00:19:05,148 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753622_12798 replica FinalizedReplica, blk_1073753622_12798, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753622 for deletion
2025-07-18 00:19:05,149 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753622_12798 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753622
2025-07-18 00:20:01,949 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753623_12799 src: /192.168.158.6:40116 dest: /192.168.158.4:9866
2025-07-18 00:20:01,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.6:40116, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1263361818_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753623_12799, duration(ns): 16384701 2025-07-18 00:20:01,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753623_12799, type=LAST_IN_PIPELINE terminating 2025-07-18 00:20:08,153 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753623_12799 replica FinalizedReplica, blk_1073753623_12799, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753623 for deletion 2025-07-18 00:20:08,154 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753623_12799 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753623 2025-07-18 00:22:11,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753625_12801 src: /192.168.158.6:50430 dest: /192.168.158.4:9866 2025-07-18 00:22:11,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50430, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_636402000_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753625_12801, duration(ns): 17409199 2025-07-18 00:22:11,979 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753625_12801, type=LAST_IN_PIPELINE terminating 2025-07-18 
00:22:17,155 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753625_12801 replica FinalizedReplica, blk_1073753625_12801, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753625 for deletion 2025-07-18 00:22:17,157 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753625_12801 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753625 2025-07-18 00:26:21,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753629_12805 src: /192.168.158.6:44618 dest: /192.168.158.4:9866 2025-07-18 00:26:21,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44618, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-315056587_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753629_12805, duration(ns): 19203544 2025-07-18 00:26:21,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753629_12805, type=LAST_IN_PIPELINE terminating 2025-07-18 00:26:26,163 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753629_12805 replica FinalizedReplica, blk_1073753629_12805, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753629 for deletion 2025-07-18 00:26:26,164 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753629_12805 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753629 2025-07-18 00:27:21,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753630_12806 src: /192.168.158.1:47074 dest: /192.168.158.4:9866 2025-07-18 00:27:21,996 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47074, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1550062237_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753630_12806, duration(ns): 24880662 2025-07-18 00:27:21,996 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753630_12806, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-18 00:27:26,165 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753630_12806 replica FinalizedReplica, blk_1073753630_12806, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753630 for deletion 2025-07-18 00:27:26,166 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753630_12806 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753630 2025-07-18 00:28:21,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073753631_12807 src: /192.168.158.1:33454 dest: /192.168.158.4:9866 2025-07-18 00:28:21,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33454, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-174560051_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753631_12807, duration(ns): 23195292 2025-07-18 00:28:21,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753631_12807, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-18 00:28:29,166 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753631_12807 replica FinalizedReplica, blk_1073753631_12807, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753631 for deletion 2025-07-18 00:28:29,168 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753631_12807 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753631 2025-07-18 00:30:21,969 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753633_12809 src: /192.168.158.1:51162 dest: /192.168.158.4:9866 2025-07-18 00:30:22,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51162, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-399933431_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073753633_12809, duration(ns): 24442900 2025-07-18 00:30:22,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753633_12809, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-18 00:30:29,174 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753633_12809 replica FinalizedReplica, blk_1073753633_12809, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753633 for deletion 2025-07-18 00:30:29,175 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753633_12809 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753633 2025-07-18 00:31:26,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753634_12810 src: /192.168.158.5:59064 dest: /192.168.158.4:9866 2025-07-18 00:31:27,001 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59064, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2119780337_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753634_12810, duration(ns): 18968659 2025-07-18 00:31:27,001 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753634_12810, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-18 00:31:32,174 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753634_12810 replica FinalizedReplica, blk_1073753634_12810, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753634 for deletion 2025-07-18 00:31:32,175 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753634_12810 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753634 2025-07-18 00:33:26,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753636_12812 src: /192.168.158.8:49754 dest: /192.168.158.4:9866 2025-07-18 00:33:27,005 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49754, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_500243903_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753636_12812, duration(ns): 20198466 2025-07-18 00:33:27,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753636_12812, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-18 00:33:35,177 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753636_12812 replica FinalizedReplica, blk_1073753636_12812, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753636 for deletion 2025-07-18 00:33:35,178 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753636_12812 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753636 2025-07-18 00:35:26,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753638_12814 src: /192.168.158.6:48174 dest: /192.168.158.4:9866 2025-07-18 00:35:27,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48174, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-956046809_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753638_12814, duration(ns): 19524311 2025-07-18 00:35:27,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753638_12814, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-18 00:35:32,182 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753638_12814 replica FinalizedReplica, blk_1073753638_12814, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753638 for deletion 2025-07-18 00:35:32,183 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753638_12814 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753638 2025-07-18 00:36:26,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753639_12815 src: 
/192.168.158.8:35750 dest: /192.168.158.4:9866 2025-07-18 00:36:27,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35750, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-584541361_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753639_12815, duration(ns): 26089932 2025-07-18 00:36:27,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753639_12815, type=LAST_IN_PIPELINE terminating 2025-07-18 00:36:32,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753639_12815 replica FinalizedReplica, blk_1073753639_12815, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753639 for deletion 2025-07-18 00:36:32,187 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753639_12815 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753639 2025-07-18 00:38:26,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753641_12817 src: /192.168.158.7:49452 dest: /192.168.158.4:9866 2025-07-18 00:38:27,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49452, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1940285259_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753641_12817, duration(ns): 20916831 2025-07-18 00:38:27,020 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753641_12817, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-18 00:38:32,191 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753641_12817 replica FinalizedReplica, blk_1073753641_12817, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753641 for deletion 2025-07-18 00:38:32,192 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753641_12817 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753641 2025-07-18 00:39:31,988 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753642_12818 src: /192.168.158.8:35946 dest: /192.168.158.4:9866 2025-07-18 00:39:32,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35946, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_598288817_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753642_12818, duration(ns): 21896106 2025-07-18 00:39:32,017 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753642_12818, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-18 00:39:38,192 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753642_12818 replica FinalizedReplica, blk_1073753642_12818, FINALIZED getNumBytes() = 56 
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753642 for deletion 2025-07-18 00:39:38,193 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753642_12818 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753642 2025-07-18 00:41:32,000 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753644_12820 src: /192.168.158.6:45382 dest: /192.168.158.4:9866 2025-07-18 00:41:32,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45382, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1295347547_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753644_12820, duration(ns): 18605074 2025-07-18 00:41:32,021 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753644_12820, type=LAST_IN_PIPELINE terminating 2025-07-18 00:41:35,194 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753644_12820 replica FinalizedReplica, blk_1073753644_12820, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753644 for deletion 2025-07-18 00:41:35,195 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753644_12820 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753644 2025-07-18 00:43:37,004 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753646_12822 src: /192.168.158.6:39576 dest: /192.168.158.4:9866 2025-07-18 00:43:37,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39576, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1436516510_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753646_12822, duration(ns): 17323816 2025-07-18 00:43:37,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753646_12822, type=LAST_IN_PIPELINE terminating 2025-07-18 00:43:44,198 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753646_12822 replica FinalizedReplica, blk_1073753646_12822, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753646 for deletion 2025-07-18 00:43:44,199 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753646_12822 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753646 2025-07-18 00:46:42,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753649_12825 src: /192.168.158.9:36856 dest: /192.168.158.4:9866 2025-07-18 00:46:42,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36856, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1315811211_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753649_12825, duration(ns): 22282872 2025-07-18 00:46:42,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753649_12825, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-18 00:46:47,204 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753649_12825 replica FinalizedReplica, blk_1073753649_12825, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753649 for deletion 2025-07-18 00:46:47,205 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753649_12825 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753649 2025-07-18 00:47:42,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753650_12826 src: /192.168.158.6:40116 dest: /192.168.158.4:9866 2025-07-18 00:47:42,054 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40116, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1737530030_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753650_12826, duration(ns): 18962221 2025-07-18 00:47:42,055 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753650_12826, type=LAST_IN_PIPELINE terminating 2025-07-18 00:47:50,205 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753650_12826 replica FinalizedReplica, blk_1073753650_12826, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753650 for deletion 2025-07-18 00:47:50,206 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753650_12826 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753650 2025-07-18 00:48:42,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753651_12827 src: /192.168.158.1:53518 dest: /192.168.158.4:9866 2025-07-18 00:48:42,057 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53518, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_89467951_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753651_12827, duration(ns): 23454022 2025-07-18 00:48:42,057 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753651_12827, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-18 00:48:47,209 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753651_12827 replica FinalizedReplica, blk_1073753651_12827, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753651 for deletion 
2025-07-18 00:48:47,210 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753651_12827 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753651 2025-07-18 00:49:42,029 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753652_12828 src: /192.168.158.8:54782 dest: /192.168.158.4:9866 2025-07-18 00:49:42,056 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54782, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1921180748_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753652_12828, duration(ns): 21100842 2025-07-18 00:49:42,056 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753652_12828, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-18 00:49:50,212 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753652_12828 replica FinalizedReplica, blk_1073753652_12828, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753652 for deletion 2025-07-18 00:49:50,213 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753652_12828 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753652 2025-07-18 00:52:42,037 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073753655_12831 src: /192.168.158.9:34638 dest: /192.168.158.4:9866 2025-07-18 00:52:42,058 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34638, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1954641515_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753655_12831, duration(ns): 18157498 2025-07-18 00:52:42,058 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753655_12831, type=LAST_IN_PIPELINE terminating 2025-07-18 00:52:47,217 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753655_12831 replica FinalizedReplica, blk_1073753655_12831, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753655 for deletion 2025-07-18 00:52:47,218 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753655_12831 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753655 2025-07-18 00:53:47,055 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753656_12832 src: /192.168.158.8:54886 dest: /192.168.158.4:9866 2025-07-18 00:53:47,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54886, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_324424938_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753656_12832, duration(ns): 19608583 
2025-07-18 00:53:47,081 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753656_12832, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 00:53:50,221 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753656_12832 replica FinalizedReplica, blk_1073753656_12832, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753656 for deletion
2025-07-18 00:53:50,222 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753656_12832 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753656
2025-07-18 00:56:52,038 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753659_12835 src: /192.168.158.9:57938 dest: /192.168.158.4:9866
2025-07-18 00:56:52,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57938, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-332556607_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753659_12835, duration(ns): 22927947
2025-07-18 00:56:52,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753659_12835, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 00:56:56,232 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753659_12835 replica FinalizedReplica, blk_1073753659_12835, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753659 for deletion
2025-07-18 00:56:56,233 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753659_12835 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753659
2025-07-18 00:59:57,057 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753662_12838 src: /192.168.158.1:53478 dest: /192.168.158.4:9866
2025-07-18 00:59:57,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53478, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_285078137_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753662_12838, duration(ns): 22915069
2025-07-18 00:59:57,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753662_12838, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-18 01:00:02,239 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753662_12838 replica FinalizedReplica, blk_1073753662_12838, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753662 for deletion
2025-07-18 01:00:02,240 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753662_12838 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753662
2025-07-18 01:00:57,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753663_12839 src: /192.168.158.9:50010 dest: /192.168.158.4:9866
2025-07-18 01:00:57,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50010, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-142293932_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753663_12839, duration(ns): 22193710
2025-07-18 01:00:57,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753663_12839, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-18 01:01:02,243 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753663_12839 replica FinalizedReplica, blk_1073753663_12839, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753663 for deletion
2025-07-18 01:01:02,244 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753663_12839 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753663
2025-07-18 01:01:57,055 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753664_12840 src: /192.168.158.7:34306 dest: /192.168.158.4:9866
2025-07-18 01:01:57,074 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34306, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1192137040_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753664_12840, duration(ns): 16801437
2025-07-18 01:01:57,074 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753664_12840, type=LAST_IN_PIPELINE terminating
2025-07-18 01:02:05,244 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753664_12840 replica FinalizedReplica, blk_1073753664_12840, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753664 for deletion
2025-07-18 01:02:05,245 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753664_12840 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753664
2025-07-18 01:03:02,048 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753665_12841 src: /192.168.158.5:33442 dest: /192.168.158.4:9866
2025-07-18 01:03:02,073 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33442, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1107547621_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753665_12841, duration(ns): 19523368
2025-07-18 01:03:02,074 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753665_12841, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 01:03:08,246 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753665_12841 replica FinalizedReplica, blk_1073753665_12841, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753665 for deletion
2025-07-18 01:03:08,247 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753665_12841 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753665
2025-07-18 01:08:12,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753670_12846 src: /192.168.158.7:54612 dest: /192.168.158.4:9866
2025-07-18 01:08:12,051 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54612, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_888656265_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753670_12846, duration(ns): 17848009
2025-07-18 01:08:12,051 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753670_12846, type=LAST_IN_PIPELINE terminating
2025-07-18 01:08:17,250 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753670_12846 replica FinalizedReplica, blk_1073753670_12846, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753670 for deletion
2025-07-18 01:08:17,251 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753670_12846 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753670
2025-07-18 01:10:12,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753672_12848 src: /192.168.158.6:46354 dest: /192.168.158.4:9866
2025-07-18 01:10:12,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46354, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-316700439_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753672_12848, duration(ns): 18462438
2025-07-18 01:10:12,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753672_12848, type=LAST_IN_PIPELINE terminating
2025-07-18 01:10:20,252 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753672_12848 replica FinalizedReplica, blk_1073753672_12848, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753672 for deletion
2025-07-18 01:10:20,253 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753672_12848 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753672
2025-07-18 01:12:12,058 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753674_12850 src: /192.168.158.5:42470 dest: /192.168.158.4:9866
2025-07-18 01:12:12,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42470, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1885066174_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753674_12850, duration(ns): 21512106
2025-07-18 01:12:12,086 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753674_12850, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 01:12:17,255 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753674_12850 replica FinalizedReplica, blk_1073753674_12850, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753674 for deletion
2025-07-18 01:12:17,256 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753674_12850 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753674
2025-07-18 01:13:12,048 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753675_12851 src: /192.168.158.9:51874 dest: /192.168.158.4:9866
2025-07-18 01:13:12,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51874, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-95027250_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753675_12851, duration(ns): 20974293
2025-07-18 01:13:12,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753675_12851, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-18 01:13:17,258 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753675_12851 replica FinalizedReplica, blk_1073753675_12851, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753675 for deletion
2025-07-18 01:13:17,259 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753675_12851 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753675
2025-07-18 01:16:12,054 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753678_12854 src: /192.168.158.9:46334 dest: /192.168.158.4:9866
2025-07-18 01:16:12,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46334, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1085059563_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753678_12854, duration(ns): 19215811
2025-07-18 01:16:12,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753678_12854, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 01:16:17,263 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753678_12854 replica FinalizedReplica, blk_1073753678_12854, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753678 for deletion
2025-07-18 01:16:17,264 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753678_12854 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753678
2025-07-18 01:21:12,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753683_12859 src: /192.168.158.8:45084 dest: /192.168.158.4:9866
2025-07-18 01:21:12,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45084, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1438203946_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753683_12859, duration(ns): 19092136
2025-07-18 01:21:12,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753683_12859, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 01:21:17,273 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753683_12859 replica FinalizedReplica, blk_1073753683_12859, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753683 for deletion
2025-07-18 01:21:17,274 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753683_12859 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753683
2025-07-18 01:23:12,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753685_12861 src: /192.168.158.7:33846 dest: /192.168.158.4:9866
2025-07-18 01:23:12,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33846, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1422834279_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753685_12861, duration(ns): 17073129
2025-07-18 01:23:12,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753685_12861, type=LAST_IN_PIPELINE terminating
2025-07-18 01:23:17,277 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753685_12861 replica FinalizedReplica, blk_1073753685_12861, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753685 for deletion
2025-07-18 01:23:17,278 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753685_12861 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753685
2025-07-18 01:24:12,055 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753686_12862 src: /192.168.158.5:47934 dest: /192.168.158.4:9866
2025-07-18 01:24:12,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47934, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1692682630_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753686_12862, duration(ns): 19078463
2025-07-18 01:24:12,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753686_12862, type=LAST_IN_PIPELINE terminating
2025-07-18 01:24:20,280 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753686_12862 replica FinalizedReplica, blk_1073753686_12862, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753686 for deletion
2025-07-18 01:24:20,281 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753686_12862 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753686
2025-07-18 01:27:17,047 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753689_12865 src: /192.168.158.1:35964 dest: /192.168.158.4:9866
2025-07-18 01:27:17,081 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35964, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_958737413_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753689_12865, duration(ns): 23555391
2025-07-18 01:27:17,082 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753689_12865, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-18 01:27:20,286 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753689_12865 replica FinalizedReplica, blk_1073753689_12865, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753689 for deletion
2025-07-18 01:27:20,288 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753689_12865 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753689
2025-07-18 01:28:22,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753690_12866 src: /192.168.158.5:34308 dest: /192.168.158.4:9866
2025-07-18 01:28:22,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34308, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1683399537_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753690_12866, duration(ns): 21394600
2025-07-18 01:28:22,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753690_12866, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 01:28:26,288 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753690_12866 replica FinalizedReplica, blk_1073753690_12866, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753690 for deletion
2025-07-18 01:28:26,289 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753690_12866 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753690
2025-07-18 01:29:22,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753691_12867 src: /192.168.158.9:44896 dest: /192.168.158.4:9866
2025-07-18 01:29:22,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44896, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2019618109_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753691_12867, duration(ns): 21329205
2025-07-18 01:29:22,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753691_12867, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 01:29:26,289 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753691_12867 replica FinalizedReplica, blk_1073753691_12867, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753691 for deletion
2025-07-18 01:29:26,291 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753691_12867 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753691
2025-07-18 01:31:22,059 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753693_12869 src: /192.168.158.7:56934 dest: /192.168.158.4:9866
2025-07-18 01:31:22,078 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56934, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1982285328_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753693_12869, duration(ns): 16848380
2025-07-18 01:31:22,078 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753693_12869, type=LAST_IN_PIPELINE terminating
2025-07-18 01:31:26,294 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753693_12869 replica FinalizedReplica, blk_1073753693_12869, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753693 for deletion
2025-07-18 01:31:26,295 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753693_12869 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753693
2025-07-18 01:34:22,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753696_12872 src: /192.168.158.9:49190 dest: /192.168.158.4:9866
2025-07-18 01:34:22,086 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49190, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_630429252_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753696_12872, duration(ns): 17388856
2025-07-18 01:34:22,086 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753696_12872, type=LAST_IN_PIPELINE terminating
2025-07-18 01:34:26,300 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753696_12872 replica FinalizedReplica, blk_1073753696_12872, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753696 for deletion
2025-07-18 01:34:26,301 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753696_12872 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753696
2025-07-18 01:36:22,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753698_12874 src: /192.168.158.7:33112 dest: /192.168.158.4:9866
2025-07-18 01:36:22,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33112, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_816298413_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753698_12874, duration(ns): 17334133
2025-07-18 01:36:22,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753698_12874, type=LAST_IN_PIPELINE terminating
2025-07-18 01:36:29,303 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753698_12874 replica FinalizedReplica, blk_1073753698_12874, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753698 for deletion
2025-07-18 01:36:29,304 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753698_12874 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753698
2025-07-18 01:37:22,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753699_12875 src: /192.168.158.6:34738 dest: /192.168.158.4:9866
2025-07-18 01:37:22,098 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34738, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1762045195_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753699_12875, duration(ns): 21707465
2025-07-18 01:37:22,098 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753699_12875, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 01:37:26,305 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753699_12875 replica FinalizedReplica, blk_1073753699_12875, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753699 for deletion
2025-07-18 01:37:26,306 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753699_12875 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753699
2025-07-18 01:38:22,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753700_12876 src: /192.168.158.8:35822 dest: /192.168.158.4:9866
2025-07-18 01:38:22,113 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35822, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_668505594_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753700_12876, duration(ns): 20690559
2025-07-18 01:38:22,114 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753700_12876, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 01:38:29,308 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753700_12876 replica FinalizedReplica, blk_1073753700_12876, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753700 for deletion
2025-07-18 01:38:29,309 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753700_12876 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753700
2025-07-18 01:39:22,086 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753701_12877 src: /192.168.158.7:40460 dest: /192.168.158.4:9866
2025-07-18 01:39:22,112 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40460, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-100934193_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753701_12877, duration(ns): 21302316
2025-07-18 01:39:22,113 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753701_12877, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 01:39:29,312 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753701_12877 replica FinalizedReplica, blk_1073753701_12877, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753701 for deletion
2025-07-18 01:39:29,313 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753701_12877 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753701
2025-07-18 01:40:22,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753702_12878 src: /192.168.158.9:60154 dest: /192.168.158.4:9866
2025-07-18 01:40:22,119 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60154, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2108987105_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753702_12878, duration(ns): 21988312
2025-07-18 01:40:22,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753702_12878, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-18 01:40:29,315 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753702_12878 replica FinalizedReplica, blk_1073753702_12878, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753702 for deletion
2025-07-18 01:40:29,316 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753702_12878 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753702
2025-07-18 01:46:32,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753708_12884 src: /192.168.158.8:56456 dest: /192.168.158.4:9866
2025-07-18 01:46:32,119 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56456, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1950338915_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753708_12884, duration(ns): 19073772
2025-07-18 01:46:32,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753708_12884, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 01:46:38,322 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753708_12884 replica FinalizedReplica, blk_1073753708_12884, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753708 for deletion
2025-07-18 01:46:38,323 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753708_12884 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753708
2025-07-18 01:48:32,104 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753710_12886 src: /192.168.158.6:51386 dest: /192.168.158.4:9866
2025-07-18 01:48:32,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51386, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1843788785_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753710_12886, duration(ns): 20396643
2025-07-18 01:48:32,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753710_12886, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 01:48:35,327 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753710_12886 replica FinalizedReplica, blk_1073753710_12886, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753710 for deletion
2025-07-18 01:48:35,328 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753710_12886 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753710 2025-07-18 01:57:37,136 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753719_12895 src: /192.168.158.1:53224 dest: /192.168.158.4:9866 2025-07-18 01:57:37,172 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53224, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1978906129_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753719_12895, duration(ns): 25981763 2025-07-18 01:57:37,173 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753719_12895, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-18 01:57:44,349 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753719_12895 replica FinalizedReplica, blk_1073753719_12895, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753719 for deletion 2025-07-18 01:57:44,350 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753719_12895 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753719 2025-07-18 01:58:42,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753720_12896 src: /192.168.158.5:43968 dest: /192.168.158.4:9866 2025-07-18 01:58:42,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.5:43968, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1399232162_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753720_12896, duration(ns): 16991324 2025-07-18 01:58:42,162 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753720_12896, type=LAST_IN_PIPELINE terminating 2025-07-18 01:58:47,352 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753720_12896 replica FinalizedReplica, blk_1073753720_12896, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753720 for deletion 2025-07-18 01:58:47,354 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753720_12896 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753720 2025-07-18 01:59:42,132 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753721_12897 src: /192.168.158.7:36486 dest: /192.168.158.4:9866 2025-07-18 01:59:42,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36486, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1076349518_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753721_12897, duration(ns): 22470506 2025-07-18 01:59:42,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753721_12897, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=1:[192.168.158.9:9866] terminating 2025-07-18 01:59:50,355 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753721_12897 replica FinalizedReplica, blk_1073753721_12897, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753721 for deletion 2025-07-18 01:59:50,356 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753721_12897 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753721 2025-07-18 02:02:42,132 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753724_12900 src: /192.168.158.9:46866 dest: /192.168.158.4:9866 2025-07-18 02:02:42,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46866, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1133458140_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753724_12900, duration(ns): 16267819 2025-07-18 02:02:42,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753724_12900, type=LAST_IN_PIPELINE terminating 2025-07-18 02:02:47,358 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753724_12900 replica FinalizedReplica, blk_1073753724_12900, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753724 for 
deletion 2025-07-18 02:02:47,359 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753724_12900 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753724 2025-07-18 02:03:42,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753725_12901 src: /192.168.158.1:54654 dest: /192.168.158.4:9866 2025-07-18 02:03:42,159 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54654, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_407078300_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753725_12901, duration(ns): 22697478 2025-07-18 02:03:42,159 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753725_12901, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-18 02:03:50,360 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753725_12901 replica FinalizedReplica, blk_1073753725_12901, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753725 for deletion 2025-07-18 02:03:50,361 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753725_12901 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753725 2025-07-18 02:04:42,135 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073753726_12902 src: /192.168.158.8:54706 dest: /192.168.158.4:9866 2025-07-18 02:04:42,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54706, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-577611494_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753726_12902, duration(ns): 18843050 2025-07-18 02:04:42,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753726_12902, type=LAST_IN_PIPELINE terminating 2025-07-18 02:04:50,364 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753726_12902 replica FinalizedReplica, blk_1073753726_12902, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753726 for deletion 2025-07-18 02:04:50,365 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753726_12902 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753726 2025-07-18 02:07:42,148 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753729_12905 src: /192.168.158.9:33982 dest: /192.168.158.4:9866 2025-07-18 02:07:42,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33982, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-903738805_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753729_12905, duration(ns): 23226231 
2025-07-18 02:07:42,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753729_12905, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-18 02:07:50,368 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753729_12905 replica FinalizedReplica, blk_1073753729_12905, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753729 for deletion 2025-07-18 02:07:50,369 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753729_12905 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753729 2025-07-18 02:08:42,148 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753730_12906 src: /192.168.158.5:56902 dest: /192.168.158.4:9866 2025-07-18 02:08:42,172 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56902, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1590916922_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753730_12906, duration(ns): 22733795 2025-07-18 02:08:42,173 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753730_12906, type=LAST_IN_PIPELINE terminating 2025-07-18 02:08:50,370 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753730_12906 replica FinalizedReplica, blk_1073753730_12906, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753730 for deletion 2025-07-18 02:08:50,372 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753730_12906 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753730 2025-07-18 02:09:42,152 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753731_12907 src: /192.168.158.5:44268 dest: /192.168.158.4:9866 2025-07-18 02:09:42,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44268, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_29116550_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753731_12907, duration(ns): 20618263 2025-07-18 02:09:42,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753731_12907, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-18 02:09:47,371 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753731_12907 replica FinalizedReplica, blk_1073753731_12907, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753731 for deletion 2025-07-18 02:09:47,372 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753731_12907 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753731 2025-07-18 02:10:42,152 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753732_12908 src: /192.168.158.7:36838 dest: /192.168.158.4:9866 2025-07-18 02:10:42,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36838, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-697453439_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753732_12908, duration(ns): 16940825 2025-07-18 02:10:42,172 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753732_12908, type=LAST_IN_PIPELINE terminating 2025-07-18 02:10:47,373 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753732_12908 replica FinalizedReplica, blk_1073753732_12908, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753732 for deletion 2025-07-18 02:10:47,374 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753732_12908 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753732 2025-07-18 02:12:42,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753734_12910 src: /192.168.158.1:58788 dest: /192.168.158.4:9866 2025-07-18 02:12:42,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58788, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-496039947_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753734_12910, duration(ns): 23640617 2025-07-18 02:12:42,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753734_12910, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-18 02:12:47,376 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753734_12910 replica FinalizedReplica, blk_1073753734_12910, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753734 for deletion 2025-07-18 02:12:47,377 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753734_12910 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753734 2025-07-18 02:16:42,170 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753738_12914 src: /192.168.158.6:35642 dest: /192.168.158.4:9866 2025-07-18 02:16:42,195 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35642, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1530975193_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753738_12914, duration(ns): 20124447 2025-07-18 02:16:42,196 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753738_12914, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=1:[192.168.158.5:9866] terminating 2025-07-18 02:16:50,388 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753738_12914 replica FinalizedReplica, blk_1073753738_12914, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753738 for deletion 2025-07-18 02:16:50,390 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753738_12914 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753738 2025-07-18 02:18:42,166 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753740_12916 src: /192.168.158.1:35340 dest: /192.168.158.4:9866 2025-07-18 02:18:42,199 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35340, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-469639593_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753740_12916, duration(ns): 24113140 2025-07-18 02:18:42,199 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753740_12916, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-18 02:18:44,392 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753740_12916 replica FinalizedReplica, blk_1073753740_12916, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753740 for deletion 2025-07-18 02:18:44,393 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753740_12916 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753740 2025-07-18 02:19:47,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753741_12917 src: /192.168.158.1:48366 dest: /192.168.158.4:9866 2025-07-18 02:19:47,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48366, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1687600897_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753741_12917, duration(ns): 24703489 2025-07-18 02:19:47,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753741_12917, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-18 02:19:53,396 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753741_12917 replica FinalizedReplica, blk_1073753741_12917, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753741 for deletion 2025-07-18 02:19:53,397 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753741_12917 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753741 2025-07-18 02:22:47,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753744_12920 src: /192.168.158.5:49658 dest: /192.168.158.4:9866 2025-07-18 02:22:47,204 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49658, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-367255544_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753744_12920, duration(ns): 21246872 2025-07-18 02:22:47,204 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753744_12920, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-18 02:22:53,402 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753744_12920 replica FinalizedReplica, blk_1073753744_12920, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753744 for deletion 2025-07-18 02:22:53,403 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753744_12920 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753744 2025-07-18 02:23:47,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753745_12921 src: /192.168.158.9:51396 dest: /192.168.158.4:9866 2025-07-18 02:23:47,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.9:51396, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_36379267_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753745_12921, duration(ns): 16421908 2025-07-18 02:23:47,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753745_12921, type=LAST_IN_PIPELINE terminating 2025-07-18 02:23:50,403 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753745_12921 replica FinalizedReplica, blk_1073753745_12921, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753745 for deletion 2025-07-18 02:23:50,404 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753745_12921 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753745 2025-07-18 02:24:47,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753746_12922 src: /192.168.158.1:41762 dest: /192.168.158.4:9866 2025-07-18 02:24:47,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41762, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1317319603_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753746_12922, duration(ns): 25026122 2025-07-18 02:24:47,212 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753746_12922, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-18 02:24:53,403 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753746_12922 replica FinalizedReplica, blk_1073753746_12922, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753746 for deletion 2025-07-18 02:24:53,404 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753746_12922 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753746 2025-07-18 02:29:52,195 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753751_12927 src: /192.168.158.5:43830 dest: /192.168.158.4:9866 2025-07-18 02:29:52,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:43830, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-892103796_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753751_12927, duration(ns): 17869311 2025-07-18 02:29:52,215 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753751_12927, type=LAST_IN_PIPELINE terminating 2025-07-18 02:29:56,410 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753751_12927 replica FinalizedReplica, blk_1073753751_12927, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753751 for deletion 2025-07-18 02:29:56,412 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753751_12927 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753751 2025-07-18 02:30:57,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753752_12928 src: /192.168.158.1:58634 dest: /192.168.158.4:9866 2025-07-18 02:30:57,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58634, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_327882066_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753752_12928, duration(ns): 25292037 2025-07-18 02:30:57,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753752_12928, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-18 02:31:02,413 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753752_12928 replica FinalizedReplica, blk_1073753752_12928, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753752 for deletion 2025-07-18 02:31:02,414 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753752_12928 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753752 2025-07-18 02:32:57,199 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753754_12930 src: /192.168.158.5:36052 dest: /192.168.158.4:9866 2025-07-18 02:32:57,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36052, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1998376742_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753754_12930, duration(ns): 17563909 2025-07-18 02:32:57,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753754_12930, type=LAST_IN_PIPELINE terminating 2025-07-18 02:32:59,418 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753754_12930 replica FinalizedReplica, blk_1073753754_12930, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753754 for deletion 2025-07-18 02:32:59,419 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753754_12930 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753754 2025-07-18 02:37:02,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753758_12934 src: /192.168.158.1:50458 dest: /192.168.158.4:9866 2025-07-18 02:37:02,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50458, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-448880641_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753758_12934, duration(ns): 23550695 2025-07-18 02:37:02,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753758_12934, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-18 02:37:08,425 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753758_12934 replica FinalizedReplica, blk_1073753758_12934, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753758 for deletion 2025-07-18 02:37:08,426 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753758_12934 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753758 2025-07-18 02:38:02,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753759_12935 src: /192.168.158.9:44238 dest: /192.168.158.4:9866 2025-07-18 02:38:02,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44238, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1011711643_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753759_12935, duration(ns): 18192683 2025-07-18 02:38:02,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753759_12935, type=LAST_IN_PIPELINE terminating 2025-07-18 
02:38:05,428 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753759_12935 replica FinalizedReplica, blk_1073753759_12935, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753759 for deletion 2025-07-18 02:38:05,429 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753759_12935 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753759 2025-07-18 02:39:02,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753760_12936 src: /192.168.158.8:57394 dest: /192.168.158.4:9866 2025-07-18 02:39:02,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57394, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-823335882_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753760_12936, duration(ns): 18665808 2025-07-18 02:39:02,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753760_12936, type=LAST_IN_PIPELINE terminating 2025-07-18 02:39:05,432 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753760_12936 replica FinalizedReplica, blk_1073753760_12936, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753760 for deletion 2025-07-18 02:39:05,433 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753760_12936 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753760 2025-07-18 02:41:07,213 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753762_12938 src: /192.168.158.5:52820 dest: /192.168.158.4:9866 2025-07-18 02:41:07,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52820, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-657020169_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753762_12938, duration(ns): 20081184 2025-07-18 02:41:07,241 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753762_12938, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-18 02:41:11,437 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753762_12938 replica FinalizedReplica, blk_1073753762_12938, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753762 for deletion 2025-07-18 02:41:11,438 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753762_12938 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753762 2025-07-18 02:44:07,221 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753765_12941 src: 
/192.168.158.9:43742 dest: /192.168.158.4:9866 2025-07-18 02:44:07,247 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43742, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1950766056_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753765_12941, duration(ns): 20234391 2025-07-18 02:44:07,247 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753765_12941, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-18 02:44:11,442 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753765_12941 replica FinalizedReplica, blk_1073753765_12941, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753765 for deletion 2025-07-18 02:44:11,443 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753765_12941 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753765 2025-07-18 02:46:07,222 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753767_12943 src: /192.168.158.7:38008 dest: /192.168.158.4:9866 2025-07-18 02:46:07,248 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38008, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1190099186_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753767_12943, duration(ns): 20618367 2025-07-18 
02:46:07,248 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753767_12943, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-18 02:46:14,447 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753767_12943 replica FinalizedReplica, blk_1073753767_12943, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753767 for deletion 2025-07-18 02:46:14,448 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753767_12943 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753767 2025-07-18 02:48:07,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753769_12945 src: /192.168.158.1:33966 dest: /192.168.158.4:9866 2025-07-18 02:48:07,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33966, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-80791492_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753769_12945, duration(ns): 30118530 2025-07-18 02:48:07,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753769_12945, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-18 02:48:11,452 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753769_12945 replica FinalizedReplica, blk_1073753769_12945, 
FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753769 for deletion 2025-07-18 02:48:11,454 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753769_12945 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753769 2025-07-18 02:50:07,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753771_12947 src: /192.168.158.5:39068 dest: /192.168.158.4:9866 2025-07-18 02:50:07,280 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39068, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_394578151_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753771_12947, duration(ns): 22148903 2025-07-18 02:50:07,280 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753771_12947, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-18 02:50:11,458 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753771_12947 replica FinalizedReplica, blk_1073753771_12947, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753771 for deletion 2025-07-18 02:50:11,459 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753771_12947 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753771 2025-07-18 02:51:07,238 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753772_12948 src: /192.168.158.5:50604 dest: /192.168.158.4:9866 2025-07-18 02:51:07,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50604, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_364577884_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753772_12948, duration(ns): 16439844 2025-07-18 02:51:07,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753772_12948, type=LAST_IN_PIPELINE terminating 2025-07-18 02:51:14,460 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753772_12948 replica FinalizedReplica, blk_1073753772_12948, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753772 for deletion 2025-07-18 02:51:14,462 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753772_12948 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753772 2025-07-18 02:52:07,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753773_12949 src: /192.168.158.9:47130 dest: /192.168.158.4:9866 2025-07-18 02:52:07,255 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47130, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-543714245_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753773_12949, duration(ns): 15815938 2025-07-18 02:52:07,255 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753773_12949, type=LAST_IN_PIPELINE terminating 2025-07-18 02:52:11,463 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753773_12949 replica FinalizedReplica, blk_1073753773_12949, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753773 for deletion 2025-07-18 02:52:11,464 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753773_12949 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753773 2025-07-18 02:54:12,242 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753775_12951 src: /192.168.158.8:45968 dest: /192.168.158.4:9866 2025-07-18 02:54:12,263 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45968, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-689743705_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753775_12951, duration(ns): 18545727 2025-07-18 02:54:12,263 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753775_12951, type=LAST_IN_PIPELINE terminating 2025-07-18 02:54:17,465 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753775_12951 replica FinalizedReplica, blk_1073753775_12951, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753775 for deletion 2025-07-18 02:54:17,466 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753775_12951 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753775 2025-07-18 02:56:12,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753777_12953 src: /192.168.158.1:56458 dest: /192.168.158.4:9866 2025-07-18 02:56:12,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56458, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1804278099_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753777_12953, duration(ns): 22927524 2025-07-18 02:56:12,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753777_12953, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-18 02:56:14,468 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753777_12953 replica FinalizedReplica, blk_1073753777_12953, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753777 for deletion 
2025-07-18 02:56:14,470 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753777_12953 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753777 2025-07-18 02:57:12,248 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753778_12954 src: /192.168.158.7:54820 dest: /192.168.158.4:9866 2025-07-18 02:57:12,275 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54820, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1797820867_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753778_12954, duration(ns): 20764159 2025-07-18 02:57:12,275 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753778_12954, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-18 02:57:14,473 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753778_12954 replica FinalizedReplica, blk_1073753778_12954, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753778 for deletion 2025-07-18 02:57:14,474 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753778_12954 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753778 2025-07-18 02:59:12,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073753780_12956 src: /192.168.158.8:47762 dest: /192.168.158.4:9866 2025-07-18 02:59:12,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47762, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1895531175_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753780_12956, duration(ns): 16795827 2025-07-18 02:59:12,270 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753780_12956, type=LAST_IN_PIPELINE terminating 2025-07-18 02:59:17,479 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753780_12956 replica FinalizedReplica, blk_1073753780_12956, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753780 for deletion 2025-07-18 02:59:17,481 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753780_12956 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753780 2025-07-18 03:03:12,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753784_12960 src: /192.168.158.5:41986 dest: /192.168.158.4:9866 2025-07-18 03:03:12,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41986, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-916824217_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753784_12960, duration(ns): 21668992 
2025-07-18 03:03:12,278 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753784_12960, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-18 03:03:14,489 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753784_12960 replica FinalizedReplica, blk_1073753784_12960, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753784 for deletion 2025-07-18 03:03:14,490 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753784_12960 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753784 2025-07-18 03:04:12,248 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753785_12961 src: /192.168.158.1:37432 dest: /192.168.158.4:9866 2025-07-18 03:04:12,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37432, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-782842513_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753785_12961, duration(ns): 24155554 2025-07-18 03:04:12,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753785_12961, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-18 03:04:14,492 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753785_12961 replica FinalizedReplica, 
blk_1073753785_12961, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753785 for deletion 2025-07-18 03:04:14,493 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753785_12961 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753785 2025-07-18 03:05:12,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753786_12962 src: /192.168.158.6:36766 dest: /192.168.158.4:9866 2025-07-18 03:05:12,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36766, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2007888090_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753786_12962, duration(ns): 17697177 2025-07-18 03:05:12,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753786_12962, type=LAST_IN_PIPELINE terminating 2025-07-18 03:05:14,493 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753786_12962 replica FinalizedReplica, blk_1073753786_12962, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753786 for deletion 2025-07-18 03:05:14,495 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753786_12962 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753786 2025-07-18 03:06:12,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753787_12963 src: /192.168.158.1:46144 dest: /192.168.158.4:9866 2025-07-18 03:06:12,314 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46144, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1303030791_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753787_12963, duration(ns): 23847681 2025-07-18 03:06:12,315 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753787_12963, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-18 03:06:14,494 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753787_12963 replica FinalizedReplica, blk_1073753787_12963, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753787 for deletion 2025-07-18 03:06:14,495 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753787_12963 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753787 2025-07-18 03:07:12,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753788_12964 src: /192.168.158.5:34340 dest: /192.168.158.4:9866 2025-07-18 03:07:12,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: 
src: /192.168.158.5:34340, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_46170232_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753788_12964, duration(ns): 17613451
2025-07-18 03:07:12,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753788_12964, type=LAST_IN_PIPELINE terminating
2025-07-18 03:07:17,495 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753788_12964 replica FinalizedReplica, blk_1073753788_12964, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753788 for deletion
2025-07-18 03:07:17,496 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753788_12964 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753788
2025-07-18 03:09:12,265 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753790_12966 src: /192.168.158.5:35274 dest: /192.168.158.4:9866
2025-07-18 03:09:12,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35274, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_129397381_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753790_12966, duration(ns): 21469913
2025-07-18 03:09:12,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753790_12966, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 03:09:17,498 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753790_12966 replica FinalizedReplica, blk_1073753790_12966, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753790 for deletion
2025-07-18 03:09:17,499 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753790_12966 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753790
2025-07-18 03:10:12,261 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753791_12967 src: /192.168.158.1:34804 dest: /192.168.158.4:9866
2025-07-18 03:10:12,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34804, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_12420562_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753791_12967, duration(ns): 22753683
2025-07-18 03:10:12,293 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753791_12967, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-18 03:10:17,498 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753791_12967 replica FinalizedReplica, blk_1073753791_12967, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753791 for deletion
2025-07-18 03:10:17,500 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753791_12967 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753791
2025-07-18 03:11:12,263 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753792_12968 src: /192.168.158.6:36800 dest: /192.168.158.4:9866
2025-07-18 03:11:12,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36800, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-307470521_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753792_12968, duration(ns): 21843592
2025-07-18 03:11:12,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753792_12968, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 03:11:14,499 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753792_12968 replica FinalizedReplica, blk_1073753792_12968, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753792 for deletion
2025-07-18 03:11:14,500 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753792_12968 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753792
2025-07-18 03:13:12,268 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753794_12970 src: /192.168.158.5:32920 dest: /192.168.158.4:9866
2025-07-18 03:13:12,288 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:32920, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-200726774_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753794_12970, duration(ns): 18194006
2025-07-18 03:13:12,289 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753794_12970, type=LAST_IN_PIPELINE terminating
2025-07-18 03:13:14,504 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753794_12970 replica FinalizedReplica, blk_1073753794_12970, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753794 for deletion
2025-07-18 03:13:14,505 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753794_12970 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753794
2025-07-18 03:15:12,265 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753796_12972 src: /192.168.158.8:44448 dest: /192.168.158.4:9866
2025-07-18 03:15:12,290 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44448, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_452378447_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753796_12972, duration(ns): 19102135
2025-07-18 03:15:12,290 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753796_12972, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 03:15:14,507 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753796_12972 replica FinalizedReplica, blk_1073753796_12972, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753796 for deletion
2025-07-18 03:15:14,508 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753796_12972 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753796
2025-07-18 03:17:12,268 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753798_12974 src: /192.168.158.1:41880 dest: /192.168.158.4:9866
2025-07-18 03:17:12,304 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41880, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_48025165_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753798_12974, duration(ns): 26830234
2025-07-18 03:17:12,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753798_12974, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-18 03:17:14,510 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753798_12974 replica FinalizedReplica, blk_1073753798_12974, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753798 for deletion
2025-07-18 03:17:14,512 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753798_12974 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753798
2025-07-18 03:19:17,275 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753800_12976 src: /192.168.158.9:38564 dest: /192.168.158.4:9866
2025-07-18 03:19:17,301 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38564, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2142240351_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753800_12976, duration(ns): 20180370
2025-07-18 03:19:17,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753800_12976, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 03:19:23,516 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753800_12976 replica FinalizedReplica, blk_1073753800_12976, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753800 for deletion
2025-07-18 03:19:23,517 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753800_12976 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753800
2025-07-18 03:20:17,272 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753801_12977 src: /192.168.158.1:51144 dest: /192.168.158.4:9866
2025-07-18 03:20:17,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51144, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-448692584_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753801_12977, duration(ns): 27716997
2025-07-18 03:20:17,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753801_12977, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-18 03:20:20,518 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753801_12977 replica FinalizedReplica, blk_1073753801_12977, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753801 for deletion
2025-07-18 03:20:20,520 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753801_12977 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753801
2025-07-18 03:23:17,283 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753804_12980 src: /192.168.158.9:46472 dest: /192.168.158.4:9866
2025-07-18 03:23:17,312 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-587578214_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753804_12980, duration(ns): 23084508
2025-07-18 03:23:17,312 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753804_12980, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 03:23:20,525 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753804_12980 replica FinalizedReplica, blk_1073753804_12980, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753804 for deletion
2025-07-18 03:23:20,526 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753804_12980 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753804
2025-07-18 03:24:17,279 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753805_12981 src: /192.168.158.1:57524 dest: /192.168.158.4:9866
2025-07-18 03:24:17,312 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57524, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1499210352_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753805_12981, duration(ns): 23524872
2025-07-18 03:24:17,312 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753805_12981, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-18 03:24:20,529 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753805_12981 replica FinalizedReplica, blk_1073753805_12981, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753805 for deletion
2025-07-18 03:24:20,530 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753805_12981 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753805
2025-07-18 03:28:17,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753809_12985 src: /192.168.158.5:49600 dest: /192.168.158.4:9866
2025-07-18 03:28:17,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49600, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-66753219_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753809_12985, duration(ns): 16258163
2025-07-18 03:28:17,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753809_12985, type=LAST_IN_PIPELINE terminating
2025-07-18 03:28:23,530 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753809_12985 replica FinalizedReplica, blk_1073753809_12985, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753809 for deletion
2025-07-18 03:28:23,531 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753809_12985 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753809
2025-07-18 03:31:17,297 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753812_12988 src: /192.168.158.6:60030 dest: /192.168.158.4:9866
2025-07-18 03:31:17,325 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60030, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-348690722_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753812_12988, duration(ns): 22730434
2025-07-18 03:31:17,328 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753812_12988, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-18 03:31:23,533 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753812_12988 replica FinalizedReplica, blk_1073753812_12988, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753812 for deletion
2025-07-18 03:31:23,535 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753812_12988 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753812
2025-07-18 03:32:17,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753813_12989 src: /192.168.158.7:35144 dest: /192.168.158.4:9866
2025-07-18 03:32:17,328 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35144, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1265767895_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753813_12989, duration(ns): 19409435
2025-07-18 03:32:17,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753813_12989, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-18 03:32:20,534 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753813_12989 replica FinalizedReplica, blk_1073753813_12989, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753813 for deletion
2025-07-18 03:32:20,535 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753813_12989 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753813
2025-07-18 03:34:17,297 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753815_12991 src: /192.168.158.1:38434 dest: /192.168.158.4:9866
2025-07-18 03:34:17,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38434, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_343000342_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753815_12991, duration(ns): 22883834
2025-07-18 03:34:17,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753815_12991, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-18 03:34:20,535 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753815_12991 replica FinalizedReplica, blk_1073753815_12991, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753815 for deletion
2025-07-18 03:34:20,537 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753815_12991 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753815
2025-07-18 03:36:17,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753817_12993 src: /192.168.158.8:48030 dest: /192.168.158.4:9866
2025-07-18 03:36:17,324 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48030, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-467092712_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753817_12993, duration(ns): 16925434
2025-07-18 03:36:17,325 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753817_12993, type=LAST_IN_PIPELINE terminating
2025-07-18 03:36:20,543 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753817_12993 replica FinalizedReplica, blk_1073753817_12993, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753817 for deletion
2025-07-18 03:36:20,544 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753817_12993 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753817
2025-07-18 03:37:22,300 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753818_12994 src: /192.168.158.1:53268 dest: /192.168.158.4:9866
2025-07-18 03:37:22,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53268, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2105735170_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753818_12994, duration(ns): 25501804
2025-07-18 03:37:22,337 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753818_12994, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-18 03:37:26,544 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753818_12994 replica FinalizedReplica, blk_1073753818_12994, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753818 for deletion
2025-07-18 03:37:26,546 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753818_12994 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753818
2025-07-18 03:39:27,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753820_12996 src: /192.168.158.1:53426 dest: /192.168.158.4:9866
2025-07-18 03:39:27,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53426, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-650704546_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753820_12996, duration(ns): 25045908
2025-07-18 03:39:27,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753820_12996, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-18 03:39:29,547 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753820_12996 replica FinalizedReplica, blk_1073753820_12996, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753820 for deletion
2025-07-18 03:39:29,548 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753820_12996 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753820
2025-07-18 03:40:27,315 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753821_12997 src: /192.168.158.6:53088 dest: /192.168.158.4:9866
2025-07-18 03:40:27,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53088, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-595044942_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753821_12997, duration(ns): 21668225
2025-07-18 03:40:27,343 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753821_12997, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-18 03:40:32,546 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753821_12997 replica FinalizedReplica, blk_1073753821_12997, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753821 for deletion
2025-07-18 03:40:32,547 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753821_12997 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753821
2025-07-18 03:41:32,318 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753822_12998 src: /192.168.158.6:51950 dest: /192.168.158.4:9866
2025-07-18 03:41:32,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51950, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-390123103_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753822_12998, duration(ns): 18454991
2025-07-18 03:41:32,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753822_12998, type=LAST_IN_PIPELINE terminating
2025-07-18 03:41:35,549 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753822_12998 replica FinalizedReplica, blk_1073753822_12998, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753822 for deletion
2025-07-18 03:41:35,550 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753822_12998 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753822
2025-07-18 03:42:32,326 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753823_12999 src: /192.168.158.1:47086 dest: /192.168.158.4:9866
2025-07-18 03:42:32,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47086, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1164318443_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753823_12999, duration(ns): 23202477
2025-07-18 03:42:32,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753823_12999, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-18 03:42:38,551 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753823_12999 replica FinalizedReplica, blk_1073753823_12999, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753823 for deletion
2025-07-18 03:42:38,552 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753823_12999 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753823
2025-07-18 03:43:32,319 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753824_13000 src: /192.168.158.1:48218 dest: /192.168.158.4:9866
2025-07-18 03:43:32,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48218, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-816303582_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753824_13000, duration(ns): 23590067
2025-07-18 03:43:32,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753824_13000, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-18 03:43:38,555 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753824_13000 replica FinalizedReplica, blk_1073753824_13000, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753824 for deletion
2025-07-18 03:43:38,557 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753824_13000 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753824
2025-07-18 03:44:32,328 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753825_13001 src: /192.168.158.9:48236 dest: /192.168.158.4:9866
2025-07-18 03:44:32,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48236, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_208505486_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753825_13001, duration(ns): 15520166
2025-07-18 03:44:32,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753825_13001, type=LAST_IN_PIPELINE terminating
2025-07-18 03:44:35,559 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753825_13001 replica FinalizedReplica, blk_1073753825_13001, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753825 for deletion
2025-07-18 03:44:35,560 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753825_13001 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753825
2025-07-18 03:46:37,335 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753827_13003 src: /192.168.158.8:33948 dest: /192.168.158.4:9866
2025-07-18 03:46:37,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33948, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1489134896_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753827_13003, duration(ns): 17621739
2025-07-18 03:46:37,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753827_13003, type=LAST_IN_PIPELINE terminating
2025-07-18 03:46:44,560 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753827_13003 replica FinalizedReplica, blk_1073753827_13003, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753827 for deletion
2025-07-18 03:46:44,561 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753827_13003 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753827
2025-07-18 03:50:37,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753831_13007 src: /192.168.158.6:57574 dest: /192.168.158.4:9866
2025-07-18 03:50:37,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57574, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1942383836_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753831_13007, duration(ns): 17943836
2025-07-18 03:50:37,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753831_13007, type=LAST_IN_PIPELINE terminating
2025-07-18 03:50:44,567 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753831_13007 replica FinalizedReplica, blk_1073753831_13007, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753831 for deletion
2025-07-18 03:50:44,568 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753831_13007 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753831
2025-07-18 03:51:37,347 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753832_13008 src: /192.168.158.9:53746 dest: /192.168.158.4:9866
2025-07-18 03:51:37,367 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53746, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1624473640_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753832_13008, duration(ns): 18043016
2025-07-18 03:51:37,367 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753832_13008, type=LAST_IN_PIPELINE terminating
2025-07-18 03:51:41,568 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753832_13008 replica FinalizedReplica, blk_1073753832_13008, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753832 for deletion
2025-07-18 03:51:41,569 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753832_13008 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753832
2025-07-18 03:54:37,338 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753835_13011 src: /192.168.158.9:41736 dest: /192.168.158.4:9866
2025-07-18 03:54:37,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41736, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1767523545_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753835_13011, duration(ns): 20083754
2025-07-18 03:54:37,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753835_13011, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 03:54:41,571 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753835_13011 replica 
FinalizedReplica, blk_1073753835_13011, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753835 for deletion 2025-07-18 03:54:41,572 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753835_13011 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753835 2025-07-18 03:55:37,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753836_13012 src: /192.168.158.8:47118 dest: /192.168.158.4:9866 2025-07-18 03:55:37,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47118, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1898218310_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753836_13012, duration(ns): 16701406 2025-07-18 03:55:37,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753836_13012, type=LAST_IN_PIPELINE terminating 2025-07-18 03:55:44,572 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753836_13012 replica FinalizedReplica, blk_1073753836_13012, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753836 for deletion 2025-07-18 03:55:44,573 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753836_13012 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753836 2025-07-18 03:56:37,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753837_13013 src: /192.168.158.9:45632 dest: /192.168.158.4:9866 2025-07-18 03:56:37,373 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45632, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1820565183_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753837_13013, duration(ns): 24162778 2025-07-18 03:56:37,373 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753837_13013, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-18 03:56:41,575 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753837_13013 replica FinalizedReplica, blk_1073753837_13013, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753837 for deletion 2025-07-18 03:56:41,576 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753837_13013 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753837 2025-07-18 03:57:37,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753838_13014 src: /192.168.158.9:51230 dest: /192.168.158.4:9866 2025-07-18 03:57:37,380 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.9:51230, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1172169004_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753838_13014, duration(ns): 17814689 2025-07-18 03:57:37,380 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753838_13014, type=LAST_IN_PIPELINE terminating 2025-07-18 03:57:41,579 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753838_13014 replica FinalizedReplica, blk_1073753838_13014, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753838 for deletion 2025-07-18 03:57:41,580 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753838_13014 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753838 2025-07-18 03:59:17,587 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f48, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 5 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5. 
2025-07-18 03:59:17,587 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360 2025-07-18 04:02:42,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753843_13019 src: /192.168.158.6:38844 dest: /192.168.158.4:9866 2025-07-18 04:02:42,374 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38844, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1556411280_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753843_13019, duration(ns): 17499603 2025-07-18 04:02:42,374 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753843_13019, type=LAST_IN_PIPELINE terminating 2025-07-18 04:02:47,590 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753843_13019 replica FinalizedReplica, blk_1073753843_13019, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753843 for deletion 2025-07-18 04:02:47,591 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753843_13019 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753843 2025-07-18 04:03:47,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753844_13020 src: /192.168.158.1:38772 dest: /192.168.158.4:9866 2025-07-18 04:03:47,377 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38772, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2113769590_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753844_13020, duration(ns): 22048004 2025-07-18 04:03:47,377 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753844_13020, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-18 04:03:50,591 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753844_13020 replica FinalizedReplica, blk_1073753844_13020, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753844 for deletion 2025-07-18 04:03:50,592 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753844_13020 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753844 2025-07-18 04:04:47,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753845_13021 src: /192.168.158.1:59064 dest: /192.168.158.4:9866 2025-07-18 04:04:47,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59064, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-108013678_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753845_13021, duration(ns): 26100132 2025-07-18 04:04:47,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753845_13021, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-18 04:04:53,593 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753845_13021 replica FinalizedReplica, blk_1073753845_13021, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753845 for deletion 2025-07-18 04:04:53,595 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753845_13021 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753845 2025-07-18 04:07:47,341 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753848_13024 src: /192.168.158.1:51510 dest: /192.168.158.4:9866 2025-07-18 04:07:47,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51510, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1407848374_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753848_13024, duration(ns): 28727847 2025-07-18 04:07:47,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753848_13024, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-18 04:07:53,602 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753848_13024 replica FinalizedReplica, blk_1073753848_13024, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753848 for deletion 2025-07-18 04:07:53,603 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753848_13024 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753848 2025-07-18 04:08:47,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753849_13025 src: /192.168.158.9:38220 dest: /192.168.158.4:9866 2025-07-18 04:08:47,376 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38220, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1762263906_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753849_13025, duration(ns): 19650261 2025-07-18 04:08:47,376 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753849_13025, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-18 04:08:53,604 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753849_13025 replica FinalizedReplica, blk_1073753849_13025, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753849 for deletion 2025-07-18 04:08:53,606 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753849_13025 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753849 2025-07-18 04:11:57,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753852_13028 src: /192.168.158.1:41986 dest: /192.168.158.4:9866 2025-07-18 04:11:57,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41986, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2050844219_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753852_13028, duration(ns): 24358857 2025-07-18 04:11:57,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753852_13028, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-18 04:11:59,609 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753852_13028 replica FinalizedReplica, blk_1073753852_13028, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753852 for deletion 2025-07-18 04:11:59,610 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753852_13028 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753852 2025-07-18 04:12:57,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753853_13029 src: /192.168.158.6:43236 dest: /192.168.158.4:9866 2025-07-18 04:12:57,421 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.6:43236, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_512117626_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753853_13029, duration(ns): 20678511 2025-07-18 04:12:57,421 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753853_13029, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-18 04:12:59,612 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753853_13029 replica FinalizedReplica, blk_1073753853_13029, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753853 for deletion 2025-07-18 04:12:59,613 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753853_13029 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753853 2025-07-18 04:14:57,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753855_13031 src: /192.168.158.8:40514 dest: /192.168.158.4:9866 2025-07-18 04:14:57,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40514, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-343478332_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753855_13031, duration(ns): 19973851 2025-07-18 04:14:57,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753855_13031, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-18 04:14:59,619 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753855_13031 replica FinalizedReplica, blk_1073753855_13031, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753855 for deletion 2025-07-18 04:14:59,621 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753855_13031 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir14/blk_1073753855 2025-07-18 04:15:57,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753856_13032 src: /192.168.158.5:49162 dest: /192.168.158.4:9866 2025-07-18 04:15:57,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49162, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-825637718_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753856_13032, duration(ns): 15722517 2025-07-18 04:15:57,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753856_13032, type=LAST_IN_PIPELINE terminating 2025-07-18 04:15:59,623 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753856_13032 replica FinalizedReplica, blk_1073753856_13032, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753856 for deletion 2025-07-18 04:15:59,624 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753856_13032 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753856 2025-07-18 04:16:57,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753857_13033 src: /192.168.158.1:41196 dest: /192.168.158.4:9866 2025-07-18 04:16:57,389 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41196, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_140806196_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753857_13033, duration(ns): 24483570 2025-07-18 04:16:57,389 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753857_13033, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-18 04:17:02,624 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753857_13033 replica FinalizedReplica, blk_1073753857_13033, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753857 for deletion 2025-07-18 04:17:02,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753857_13033 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753857 2025-07-18 04:17:57,361 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753858_13034 src: /192.168.158.7:44442 dest: /192.168.158.4:9866 2025-07-18 04:17:57,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44442, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1985875164_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753858_13034, duration(ns): 21134173 2025-07-18 04:17:57,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753858_13034, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-18 04:18:02,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753858_13034 replica FinalizedReplica, blk_1073753858_13034, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753858 for deletion 2025-07-18 04:18:02,627 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753858_13034 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753858 2025-07-18 04:18:57,361 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753859_13035 src: /192.168.158.1:42554 dest: /192.168.158.4:9866 2025-07-18 04:18:57,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:42554, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_22644515_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753859_13035, duration(ns): 25807276 2025-07-18 04:18:57,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753859_13035, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-18 04:18:59,626 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753859_13035 replica FinalizedReplica, blk_1073753859_13035, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753859 for deletion 2025-07-18 04:18:59,627 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753859_13035 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753859 2025-07-18 04:19:57,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753860_13036 src: /192.168.158.7:51474 dest: /192.168.158.4:9866 2025-07-18 04:19:57,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51474, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_334857424_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753860_13036, duration(ns): 20041983 2025-07-18 04:19:57,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073753860_13036, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 04:20:02,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753860_13036 replica FinalizedReplica, blk_1073753860_13036, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753860 for deletion
2025-07-18 04:20:02,627 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753860_13036 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753860
2025-07-18 04:21:57,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753862_13038 src: /192.168.158.7:44720 dest: /192.168.158.4:9866
2025-07-18 04:21:57,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44720, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1652334431_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753862_13038, duration(ns): 20128117
2025-07-18 04:21:57,404 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753862_13038, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 04:21:59,627 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753862_13038 replica FinalizedReplica, blk_1073753862_13038, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753862 for deletion
2025-07-18 04:21:59,628 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753862_13038 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753862
2025-07-18 04:22:57,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753863_13039 src: /192.168.158.1:49400 dest: /192.168.158.4:9866
2025-07-18 04:22:57,408 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49400, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1059602042_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753863_13039, duration(ns): 26195299
2025-07-18 04:22:57,408 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753863_13039, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-18 04:22:59,630 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753863_13039 replica FinalizedReplica, blk_1073753863_13039, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753863 for deletion
2025-07-18 04:22:59,631 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753863_13039 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753863
2025-07-18 04:23:57,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753864_13040 src: /192.168.158.5:37204 dest: /192.168.158.4:9866
2025-07-18 04:23:57,435 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37204, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_764869170_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753864_13040, duration(ns): 16853418
2025-07-18 04:23:57,436 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753864_13040, type=LAST_IN_PIPELINE terminating
2025-07-18 04:23:59,630 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753864_13040 replica FinalizedReplica, blk_1073753864_13040, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753864 for deletion
2025-07-18 04:23:59,632 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753864_13040 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753864
2025-07-18 04:24:57,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753865_13041 src: /192.168.158.8:54724 dest: /192.168.158.4:9866
2025-07-18 04:24:57,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54724, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-722029979_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753865_13041, duration(ns): 17231891
2025-07-18 04:24:57,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753865_13041, type=LAST_IN_PIPELINE terminating
2025-07-18 04:24:59,632 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753865_13041 replica FinalizedReplica, blk_1073753865_13041, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753865 for deletion
2025-07-18 04:24:59,634 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753865_13041 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753865
2025-07-18 04:27:02,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753867_13043 src: /192.168.158.1:36468 dest: /192.168.158.4:9866
2025-07-18 04:27:02,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36468, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1501184873_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753867_13043, duration(ns): 26100174
2025-07-18 04:27:02,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753867_13043, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-18 04:27:05,637 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753867_13043 replica FinalizedReplica, blk_1073753867_13043, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753867 for deletion
2025-07-18 04:27:05,638 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753867_13043 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753867
2025-07-18 04:28:02,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753868_13044 src: /192.168.158.1:48624 dest: /192.168.158.4:9866
2025-07-18 04:28:02,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48624, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-788428816_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753868_13044, duration(ns): 27324921
2025-07-18 04:28:02,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753868_13044, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-18 04:28:05,638 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753868_13044 replica FinalizedReplica, blk_1073753868_13044, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753868 for deletion
2025-07-18 04:28:05,640 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753868_13044 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753868
2025-07-18 04:29:02,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753869_13045 src: /192.168.158.5:40198 dest: /192.168.158.4:9866
2025-07-18 04:29:02,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40198, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1711244758_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753869_13045, duration(ns): 22155441
2025-07-18 04:29:02,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753869_13045, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 04:29:08,640 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753869_13045 replica FinalizedReplica, blk_1073753869_13045, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753869 for deletion
2025-07-18 04:29:08,641 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753869_13045 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753869
2025-07-18 04:32:07,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753872_13048 src: /192.168.158.9:33612 dest: /192.168.158.4:9866
2025-07-18 04:32:07,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33612, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1086040460_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753872_13048, duration(ns): 17681658
2025-07-18 04:32:07,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753872_13048, type=LAST_IN_PIPELINE terminating
2025-07-18 04:32:11,649 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753872_13048 replica FinalizedReplica, blk_1073753872_13048, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753872 for deletion
2025-07-18 04:32:11,650 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753872_13048 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753872
2025-07-18 04:34:07,392 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753874_13050 src: /192.168.158.5:54584 dest: /192.168.158.4:9866
2025-07-18 04:34:07,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54584, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_988092035_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753874_13050, duration(ns): 16856273
2025-07-18 04:34:07,412 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753874_13050, type=LAST_IN_PIPELINE terminating
2025-07-18 04:34:11,653 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753874_13050 replica FinalizedReplica, blk_1073753874_13050, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753874 for deletion
2025-07-18 04:34:11,655 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753874_13050 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753874
2025-07-18 04:35:07,392 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753875_13051 src: /192.168.158.9:54854 dest: /192.168.158.4:9866
2025-07-18 04:35:07,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54854, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1524414612_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753875_13051, duration(ns): 16566179
2025-07-18 04:35:07,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753875_13051, type=LAST_IN_PIPELINE terminating
2025-07-18 04:35:14,657 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753875_13051 replica FinalizedReplica, blk_1073753875_13051, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753875 for deletion
2025-07-18 04:35:14,658 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753875_13051 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753875
2025-07-18 04:38:12,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753878_13054 src: /192.168.158.5:48010 dest: /192.168.158.4:9866
2025-07-18 04:38:12,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48010, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2124257645_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753878_13054, duration(ns): 18762703
2025-07-18 04:38:12,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753878_13054, type=LAST_IN_PIPELINE terminating
2025-07-18 04:38:14,663 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753878_13054 replica FinalizedReplica, blk_1073753878_13054, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753878 for deletion
2025-07-18 04:38:14,664 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753878_13054 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753878
2025-07-18 04:40:12,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753880_13056 src: /192.168.158.9:59472 dest: /192.168.158.4:9866
2025-07-18 04:40:12,431 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1981934828_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753880_13056, duration(ns): 24332687
2025-07-18 04:40:12,431 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753880_13056, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 04:40:14,665 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753880_13056 replica FinalizedReplica, blk_1073753880_13056, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753880 for deletion
2025-07-18 04:40:14,667 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753880_13056 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753880
2025-07-18 04:43:17,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753883_13059 src: /192.168.158.1:36948 dest: /192.168.158.4:9866
2025-07-18 04:43:17,431 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36948, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2048091571_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753883_13059, duration(ns): 23956953
2025-07-18 04:43:17,431 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753883_13059, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-18 04:43:23,670 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753883_13059 replica FinalizedReplica, blk_1073753883_13059, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753883 for deletion
2025-07-18 04:43:23,671 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753883_13059 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753883
2025-07-18 04:44:17,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753884_13060 src: /192.168.158.1:43616 dest: /192.168.158.4:9866
2025-07-18 04:44:17,433 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43616, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-823303317_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753884_13060, duration(ns): 23448828
2025-07-18 04:44:17,433 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753884_13060, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-18 04:44:20,672 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753884_13060 replica FinalizedReplica, blk_1073753884_13060, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753884 for deletion
2025-07-18 04:44:20,673 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753884_13060 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753884
2025-07-18 04:47:27,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753887_13063 src: /192.168.158.6:36852 dest: /192.168.158.4:9866
2025-07-18 04:47:27,435 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36852, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1995742704_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753887_13063, duration(ns): 18470490
2025-07-18 04:47:27,435 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753887_13063, type=LAST_IN_PIPELINE terminating
2025-07-18 04:47:32,681 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753887_13063 replica FinalizedReplica, blk_1073753887_13063, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753887 for deletion
2025-07-18 04:47:32,682 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753887_13063 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753887
2025-07-18 04:50:37,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753890_13066 src: /192.168.158.1:50474 dest: /192.168.158.4:9866
2025-07-18 04:50:37,448 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50474, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_175756896_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753890_13066, duration(ns): 29900393
2025-07-18 04:50:37,448 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753890_13066, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-18 04:50:44,688 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753890_13066 replica FinalizedReplica, blk_1073753890_13066, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753890 for deletion
2025-07-18 04:50:44,689 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753890_13066 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753890
2025-07-18 04:51:37,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753891_13067 src: /192.168.158.1:41782 dest: /192.168.158.4:9866
2025-07-18 04:51:37,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41782, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1293338361_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753891_13067, duration(ns): 24311851
2025-07-18 04:51:37,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753891_13067, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-18 04:51:44,689 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753891_13067 replica FinalizedReplica, blk_1073753891_13067, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753891 for deletion
2025-07-18 04:51:44,691 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753891_13067 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753891
2025-07-18 04:52:37,426 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753892_13068 src: /192.168.158.6:51680 dest: /192.168.158.4:9866
2025-07-18 04:52:37,453 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51680, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1471576487_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753892_13068, duration(ns): 21208063
2025-07-18 04:52:37,453 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753892_13068, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 04:52:41,689 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753892_13068 replica FinalizedReplica, blk_1073753892_13068, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753892 for deletion
2025-07-18 04:52:41,690 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753892_13068 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753892
2025-07-18 04:56:52,430 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753896_13072 src: /192.168.158.1:44634 dest: /192.168.158.4:9866
2025-07-18 04:56:52,468 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44634, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2066439959_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753896_13072, duration(ns): 25250474
2025-07-18 04:56:52,468 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753896_13072, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-18 04:56:56,696 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753896_13072 replica FinalizedReplica, blk_1073753896_13072, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753896 for deletion
2025-07-18 04:56:56,697 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753896_13072 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753896
2025-07-18 04:58:52,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753898_13074 src: /192.168.158.1:53098 dest: /192.168.158.4:9866
2025-07-18 04:58:52,453 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53098, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1653392778_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753898_13074, duration(ns): 26317294
2025-07-18 04:58:52,454 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753898_13074, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-18 04:58:56,701 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753898_13074 replica FinalizedReplica, blk_1073753898_13074, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753898 for deletion
2025-07-18 04:58:56,703 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753898_13074 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753898
2025-07-18 04:59:52,424 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753899_13075 src: /192.168.158.5:37348 dest: /192.168.158.4:9866
2025-07-18 04:59:52,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37348, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1730582758_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753899_13075, duration(ns): 17481061
2025-07-18 04:59:52,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753899_13075, type=LAST_IN_PIPELINE terminating
2025-07-18 04:59:56,702 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753899_13075 replica FinalizedReplica, blk_1073753899_13075, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753899 for deletion
2025-07-18 04:59:56,705 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753899_13075 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753899
2025-07-18 05:01:52,441 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753901_13077 src: /192.168.158.9:46140 dest: /192.168.158.4:9866
2025-07-18 05:01:52,469 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46140, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_93327649_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753901_13077, duration(ns): 22395938
2025-07-18 05:01:52,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753901_13077, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 05:01:59,708 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753901_13077 replica FinalizedReplica, blk_1073753901_13077, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753901 for deletion
2025-07-18 05:01:59,709 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753901_13077 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753901
2025-07-18 05:02:52,427 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753902_13078 src: /192.168.158.5:39446 dest: /192.168.158.4:9866
2025-07-18 05:02:52,454 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39446, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1353590567_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753902_13078, duration(ns): 21540224
2025-07-18 05:02:52,454 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753902_13078, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 05:02:56,710 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753902_13078 replica FinalizedReplica, blk_1073753902_13078, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753902 for deletion
2025-07-18 05:02:56,711 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753902_13078 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753902
2025-07-18 05:03:52,426 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753903_13079 src: /192.168.158.1:43222 dest: /192.168.158.4:9866
2025-07-18 05:03:52,459 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43222, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-696760092_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753903_13079, duration(ns): 24074767
2025-07-18 05:03:52,459 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753903_13079, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-18 05:03:56,711 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753903_13079 replica FinalizedReplica, blk_1073753903_13079, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753903 for deletion
2025-07-18 05:03:56,713 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753903_13079 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753903
2025-07-18 05:07:02,447 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753906_13082 src: /192.168.158.1:60658 dest: /192.168.158.4:9866
2025-07-18 05:07:02,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60658, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-530496418_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753906_13082, duration(ns): 26388571
2025-07-18 05:07:02,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753906_13082, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-18 05:07:05,716 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753906_13082 replica FinalizedReplica, blk_1073753906_13082, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753906 for deletion
2025-07-18 05:07:05,718 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753906_13082 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753906
2025-07-18 05:09:07,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753908_13084 src: /192.168.158.8:58910 dest: /192.168.158.4:9866
2025-07-18 05:09:07,501 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58910, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1066929655_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753908_13084, duration(ns): 23787968
2025-07-18 05:09:07,501 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753908_13084, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-18 05:09:11,718 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753908_13084 replica FinalizedReplica, blk_1073753908_13084, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753908 for deletion
2025-07-18 05:09:11,719 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753908_13084 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753908
2025-07-18 05:11:07,446 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving
BP-1059995147-192.168.158.1-1752101929360:blk_1073753910_13086 src: /192.168.158.1:53874 dest: /192.168.158.4:9866 2025-07-18 05:11:07,481 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53874, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-254511209_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753910_13086, duration(ns): 26127756 2025-07-18 05:11:07,481 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753910_13086, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-18 05:11:11,728 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753910_13086 replica FinalizedReplica, blk_1073753910_13086, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753910 for deletion 2025-07-18 05:11:11,729 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753910_13086 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753910 2025-07-18 05:15:12,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753914_13090 src: /192.168.158.8:40702 dest: /192.168.158.4:9866 2025-07-18 05:15:12,479 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40702, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-613320678_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073753914_13090, duration(ns): 21648822 2025-07-18 05:15:12,479 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753914_13090, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-18 05:15:14,742 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753914_13090 replica FinalizedReplica, blk_1073753914_13090, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753914 for deletion 2025-07-18 05:15:14,743 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753914_13090 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753914 2025-07-18 05:17:17,460 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753916_13092 src: /192.168.158.1:48882 dest: /192.168.158.4:9866 2025-07-18 05:17:17,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48882, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-753730242_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753916_13092, duration(ns): 24628809 2025-07-18 05:17:17,496 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753916_13092, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-18 05:17:23,747 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753916_13092 replica FinalizedReplica, blk_1073753916_13092, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753916 for deletion 2025-07-18 05:17:23,748 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753916_13092 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753916 2025-07-18 05:20:17,460 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753919_13095 src: /192.168.158.1:54438 dest: /192.168.158.4:9866 2025-07-18 05:20:17,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54438, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-257424165_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753919_13095, duration(ns): 24070829 2025-07-18 05:20:17,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753919_13095, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-18 05:20:20,755 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753919_13095 replica FinalizedReplica, blk_1073753919_13095, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753919 for deletion 
2025-07-18 05:20:20,756 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753919_13095 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753919 2025-07-18 05:22:17,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753921_13097 src: /192.168.158.8:60558 dest: /192.168.158.4:9866 2025-07-18 05:22:17,496 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60558, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1612058684_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753921_13097, duration(ns): 21413314 2025-07-18 05:22:17,496 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753921_13097, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-18 05:22:23,757 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753921_13097 replica FinalizedReplica, blk_1073753921_13097, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753921 for deletion 2025-07-18 05:22:23,758 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753921_13097 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753921 2025-07-18 05:24:22,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073753923_13099 src: /192.168.158.6:33296 dest: /192.168.158.4:9866 2025-07-18 05:24:22,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33296, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-342723140_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753923_13099, duration(ns): 18029782 2025-07-18 05:24:22,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753923_13099, type=LAST_IN_PIPELINE terminating 2025-07-18 05:24:26,764 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753923_13099 replica FinalizedReplica, blk_1073753923_13099, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753923 for deletion 2025-07-18 05:24:26,765 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753923_13099 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753923 2025-07-18 05:26:22,480 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753925_13101 src: /192.168.158.9:32894 dest: /192.168.158.4:9866 2025-07-18 05:26:22,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:32894, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-834121347_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753925_13101, duration(ns): 17191389 
2025-07-18 05:26:22,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753925_13101, type=LAST_IN_PIPELINE terminating 2025-07-18 05:26:26,767 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753925_13101 replica FinalizedReplica, blk_1073753925_13101, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753925 for deletion 2025-07-18 05:26:26,768 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753925_13101 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753925 2025-07-18 05:28:27,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753927_13103 src: /192.168.158.8:57192 dest: /192.168.158.4:9866 2025-07-18 05:28:27,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57192, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1724272510_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753927_13103, duration(ns): 16585690 2025-07-18 05:28:27,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753927_13103, type=LAST_IN_PIPELINE terminating 2025-07-18 05:28:29,771 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753927_13103 replica FinalizedReplica, blk_1073753927_13103, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753927 for deletion 2025-07-18 05:28:29,772 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753927_13103 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753927 2025-07-18 05:29:32,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753928_13104 src: /192.168.158.5:39974 dest: /192.168.158.4:9866 2025-07-18 05:29:32,501 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39974, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1462546488_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753928_13104, duration(ns): 16042465 2025-07-18 05:29:32,501 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753928_13104, type=LAST_IN_PIPELINE terminating 2025-07-18 05:29:38,772 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753928_13104 replica FinalizedReplica, blk_1073753928_13104, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753928 for deletion 2025-07-18 05:29:38,773 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753928_13104 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753928 2025-07-18 
05:31:37,479 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753930_13106 src: /192.168.158.1:38958 dest: /192.168.158.4:9866 2025-07-18 05:31:37,515 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38958, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1164873571_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753930_13106, duration(ns): 26207613 2025-07-18 05:31:37,515 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753930_13106, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-18 05:31:38,780 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753930_13106 replica FinalizedReplica, blk_1073753930_13106, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753930 for deletion 2025-07-18 05:31:38,781 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753930_13106 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753930 2025-07-18 05:32:37,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753931_13107 src: /192.168.158.9:44754 dest: /192.168.158.4:9866 2025-07-18 05:32:37,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44754, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2138292252_236, offset: 
0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753931_13107, duration(ns): 17740139 2025-07-18 05:32:37,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753931_13107, type=LAST_IN_PIPELINE terminating 2025-07-18 05:32:38,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753931_13107 replica FinalizedReplica, blk_1073753931_13107, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753931 for deletion 2025-07-18 05:32:38,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753931_13107 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753931 2025-07-18 05:33:37,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753932_13108 src: /192.168.158.9:41044 dest: /192.168.158.4:9866 2025-07-18 05:33:37,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41044, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679163954_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753932_13108, duration(ns): 20682960 2025-07-18 05:33:37,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753932_13108, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-18 05:33:38,787 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753932_13108 replica FinalizedReplica, blk_1073753932_13108, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753932 for deletion 2025-07-18 05:33:38,788 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753932_13108 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753932 2025-07-18 05:35:42,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753934_13110 src: /192.168.158.1:42368 dest: /192.168.158.4:9866 2025-07-18 05:35:42,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42368, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1239286974_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753934_13110, duration(ns): 23500582 2025-07-18 05:35:42,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753934_13110, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-18 05:35:44,792 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753934_13110 replica FinalizedReplica, blk_1073753934_13110, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753934 for deletion 
2025-07-18 05:35:44,794 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753934_13110 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753934 2025-07-18 05:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0 2025-07-18 05:36:42,489 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753935_13111 src: /192.168.158.1:50028 dest: /192.168.158.4:9866 2025-07-18 05:36:42,523 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50028, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_909809542_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753935_13111, duration(ns): 24904796 2025-07-18 05:36:42,523 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753935_13111, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-18 05:36:44,792 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753935_13111 replica FinalizedReplica, blk_1073753935_13111, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753935 for deletion 2025-07-18 05:36:44,794 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 
blk_1073753935_13111 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753935 2025-07-18 05:37:47,472 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753936_13112 src: /192.168.158.1:41152 dest: /192.168.158.4:9866 2025-07-18 05:37:47,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41152, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_219473094_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753936_13112, duration(ns): 23922364 2025-07-18 05:37:47,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753936_13112, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-18 05:37:50,795 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753936_13112 replica FinalizedReplica, blk_1073753936_13112, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753936 for deletion 2025-07-18 05:37:50,796 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753936_13112 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753936 2025-07-18 05:38:47,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753937_13113 src: /192.168.158.8:40908 dest: /192.168.158.4:9866 2025-07-18 05:38:47,541 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40908, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-905912665_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753937_13113, duration(ns): 18273765 2025-07-18 05:38:47,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753937_13113, type=LAST_IN_PIPELINE terminating 2025-07-18 05:38:50,798 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753937_13113 replica FinalizedReplica, blk_1073753937_13113, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753937 for deletion 2025-07-18 05:38:50,799 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753937_13113 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753937 2025-07-18 05:42:47,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753941_13117 src: /192.168.158.7:46386 dest: /192.168.158.4:9866 2025-07-18 05:42:47,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46386, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1689424607_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753941_13117, duration(ns): 21936709 2025-07-18 05:42:47,528 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073753941_13117, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 05:42:50,806 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753941_13117 replica FinalizedReplica, blk_1073753941_13117, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753941 for deletion
2025-07-18 05:42:50,807 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753941_13117 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753941
2025-07-18 05:43:52,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753942_13118 src: /192.168.158.6:46250 dest: /192.168.158.4:9866
2025-07-18 05:43:52,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46250, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-143908011_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753942_13118, duration(ns): 16718740
2025-07-18 05:43:52,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753942_13118, type=LAST_IN_PIPELINE terminating
2025-07-18 05:43:53,809 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753942_13118 replica FinalizedReplica, blk_1073753942_13118, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753942 for deletion
2025-07-18 05:43:53,810 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753942_13118 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753942
2025-07-18 05:45:52,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753944_13120 src: /192.168.158.1:55898 dest: /192.168.158.4:9866
2025-07-18 05:45:52,538 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-281577113_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753944_13120, duration(ns): 25747602
2025-07-18 05:45:52,538 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753944_13120, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-18 05:45:53,812 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753944_13120 replica FinalizedReplica, blk_1073753944_13120, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753944 for deletion
2025-07-18 05:45:53,813 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753944_13120 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753944
2025-07-18 05:48:52,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753947_13123 src: /192.168.158.1:47488 dest: /192.168.158.4:9866
2025-07-18 05:48:52,523 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47488, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2062930840_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753947_13123, duration(ns): 27737983
2025-07-18 05:48:52,523 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753947_13123, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-18 05:48:53,822 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753947_13123 replica FinalizedReplica, blk_1073753947_13123, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753947 for deletion
2025-07-18 05:48:53,824 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753947_13123 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753947
2025-07-18 05:49:57,496 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753948_13124 src: /192.168.158.5:58608 dest: /192.168.158.4:9866
2025-07-18 05:49:57,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58608, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-683986421_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753948_13124, duration(ns): 17926777
2025-07-18 05:49:57,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753948_13124, type=LAST_IN_PIPELINE terminating
2025-07-18 05:50:02,826 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753948_13124 replica FinalizedReplica, blk_1073753948_13124, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753948 for deletion
2025-07-18 05:50:02,827 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753948_13124 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753948
2025-07-18 05:50:57,499 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753949_13125 src: /192.168.158.8:42032 dest: /192.168.158.4:9866
2025-07-18 05:50:57,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42032, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-747276701_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753949_13125, duration(ns): 16201716
2025-07-18 05:50:57,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753949_13125, type=LAST_IN_PIPELINE terminating
2025-07-18 05:50:59,827 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753949_13125 replica FinalizedReplica, blk_1073753949_13125, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753949 for deletion
2025-07-18 05:50:59,828 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753949_13125 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753949
2025-07-18 05:51:57,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753950_13126 src: /192.168.158.9:40000 dest: /192.168.158.4:9866
2025-07-18 05:51:57,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40000, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1829760621_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753950_13126, duration(ns): 20698832
2025-07-18 05:51:57,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753950_13126, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 05:52:02,830 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753950_13126 replica FinalizedReplica, blk_1073753950_13126, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753950 for deletion
2025-07-18 05:52:02,831 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753950_13126 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753950
2025-07-18 05:52:57,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753951_13127 src: /192.168.158.1:51062 dest: /192.168.158.4:9866
2025-07-18 05:52:57,531 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51062, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1822723519_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753951_13127, duration(ns): 24947840
2025-07-18 05:52:57,531 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753951_13127, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-18 05:52:59,832 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753951_13127 replica FinalizedReplica, blk_1073753951_13127, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753951 for deletion
2025-07-18 05:52:59,833 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753951_13127 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753951
2025-07-18 05:54:57,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753953_13129 src: /192.168.158.8:60592 dest: /192.168.158.4:9866
2025-07-18 05:54:57,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60592, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1321239652_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753953_13129, duration(ns): 19420620
2025-07-18 05:54:57,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753953_13129, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 05:55:02,835 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753953_13129 replica FinalizedReplica, blk_1073753953_13129, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753953 for deletion
2025-07-18 05:55:02,836 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753953_13129 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753953
2025-07-18 05:56:02,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753954_13130 src: /192.168.158.7:40298 dest: /192.168.158.4:9866
2025-07-18 05:56:02,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40298, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1111173061_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753954_13130, duration(ns): 16169606
2025-07-18 05:56:02,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753954_13130, type=LAST_IN_PIPELINE terminating
2025-07-18 05:56:05,836 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753954_13130 replica FinalizedReplica, blk_1073753954_13130, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753954 for deletion
2025-07-18 05:56:05,837 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753954_13130 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753954
2025-07-18 05:57:02,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753955_13131 src: /192.168.158.5:36418 dest: /192.168.158.4:9866
2025-07-18 05:57:02,528 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36418, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1775948082_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753955_13131, duration(ns): 20045818
2025-07-18 05:57:02,528 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753955_13131, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 05:57:08,838 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753955_13131 replica FinalizedReplica, blk_1073753955_13131, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753955 for deletion
2025-07-18 05:57:08,839 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753955_13131 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753955
2025-07-18 05:58:02,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753956_13132 src: /192.168.158.6:43456 dest: /192.168.158.4:9866
2025-07-18 05:58:02,530 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43456, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1238402294_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753956_13132, duration(ns): 20375682
2025-07-18 05:58:02,531 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753956_13132, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 05:58:05,840 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753956_13132 replica FinalizedReplica, blk_1073753956_13132, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753956 for deletion
2025-07-18 05:58:05,842 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753956_13132 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753956
2025-07-18 06:00:02,508 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753958_13134 src: /192.168.158.1:60588 dest: /192.168.158.4:9866
2025-07-18 06:00:02,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60588, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_3036921_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753958_13134, duration(ns): 24836774
2025-07-18 06:00:02,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753958_13134, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-18 06:00:08,849 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753958_13134 replica FinalizedReplica, blk_1073753958_13134, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753958 for deletion
2025-07-18 06:00:08,851 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753958_13134 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753958
2025-07-18 06:04:02,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753962_13138 src: /192.168.158.7:58898 dest: /192.168.158.4:9866
2025-07-18 06:04:02,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-698338005_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753962_13138, duration(ns): 22921295
2025-07-18 06:04:02,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753962_13138, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-18 06:04:08,865 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753962_13138 replica FinalizedReplica, blk_1073753962_13138, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753962 for deletion
2025-07-18 06:04:08,866 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753962_13138 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753962
2025-07-18 06:05:02,513 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753963_13139 src: /192.168.158.1:56072 dest: /192.168.158.4:9866
2025-07-18 06:05:02,547 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56072, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1383124616_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753963_13139, duration(ns): 24450279
2025-07-18 06:05:02,547 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753963_13139, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-18 06:05:08,865 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753963_13139 replica FinalizedReplica, blk_1073753963_13139, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753963 for deletion
2025-07-18 06:05:08,867 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753963_13139 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753963
2025-07-18 06:07:02,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753965_13141 src: /192.168.158.1:39568 dest: /192.168.158.4:9866
2025-07-18 06:07:02,552 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39568, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_975594465_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753965_13141, duration(ns): 22787151
2025-07-18 06:07:02,553 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753965_13141, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-18 06:07:05,868 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753965_13141 replica FinalizedReplica, blk_1073753965_13141, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753965 for deletion
2025-07-18 06:07:05,869 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753965_13141 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753965
2025-07-18 06:09:02,522 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753967_13143 src: /192.168.158.7:36554 dest: /192.168.158.4:9866
2025-07-18 06:09:02,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36554, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-774498888_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753967_13143, duration(ns): 20867106
2025-07-18 06:09:02,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753967_13143, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 06:09:05,873 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753967_13143 replica FinalizedReplica, blk_1073753967_13143, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753967 for deletion
2025-07-18 06:09:05,874 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753967_13143 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753967
2025-07-18 06:10:02,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753968_13144 src: /192.168.158.1:42474 dest: /192.168.158.4:9866
2025-07-18 06:10:02,575 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42474, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2067443124_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753968_13144, duration(ns): 24593247
2025-07-18 06:10:02,575 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753968_13144, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-18 06:10:05,876 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753968_13144 replica FinalizedReplica, blk_1073753968_13144, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753968 for deletion
2025-07-18 06:10:05,877 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753968_13144 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753968
2025-07-18 06:15:07,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753973_13149 src: /192.168.158.1:38402 dest: /192.168.158.4:9866
2025-07-18 06:15:07,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38402, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1249044272_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753973_13149, duration(ns): 22622855
2025-07-18 06:15:07,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753973_13149, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-18 06:15:08,892 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753973_13149 replica FinalizedReplica, blk_1073753973_13149, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753973 for deletion
2025-07-18 06:15:08,893 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753973_13149 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753973
2025-07-18 06:16:07,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753974_13150 src: /192.168.158.6:51350 dest: /192.168.158.4:9866
2025-07-18 06:16:07,569 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51350, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1712163966_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753974_13150, duration(ns): 21668057
2025-07-18 06:16:07,569 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753974_13150, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 06:16:08,892 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753974_13150 replica FinalizedReplica, blk_1073753974_13150, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753974 for deletion
2025-07-18 06:16:08,894 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753974_13150 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753974
2025-07-18 06:18:07,546 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753976_13152 src: /192.168.158.8:59876 dest: /192.168.158.4:9866
2025-07-18 06:18:07,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59876, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2095268848_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753976_13152, duration(ns): 15656109
2025-07-18 06:18:07,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753976_13152, type=LAST_IN_PIPELINE terminating
2025-07-18 06:18:08,899 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753976_13152 replica FinalizedReplica, blk_1073753976_13152, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753976 for deletion
2025-07-18 06:18:08,900 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753976_13152 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753976
2025-07-18 06:22:12,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753980_13156 src: /192.168.158.6:57908 dest: /192.168.158.4:9866
2025-07-18 06:22:12,584 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57908, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1248473171_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753980_13156, duration(ns): 18309922
2025-07-18 06:22:12,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753980_13156, type=LAST_IN_PIPELINE terminating
2025-07-18 06:22:14,905 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753980_13156 replica FinalizedReplica, blk_1073753980_13156, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753980 for deletion
2025-07-18 06:22:14,906 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753980_13156 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753980
2025-07-18 06:23:12,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753981_13157 src: /192.168.158.1:40296 dest: /192.168.158.4:9866
2025-07-18 06:23:12,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40296, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1884310955_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753981_13157, duration(ns): 24268074
2025-07-18 06:23:12,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753981_13157, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-18 06:23:17,909 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753981_13157 replica FinalizedReplica, blk_1073753981_13157, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753981 for deletion
2025-07-18 06:23:17,910 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753981_13157 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753981
2025-07-18 06:26:22,570 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753984_13160 src: /192.168.158.6:55926 dest: /192.168.158.4:9866
2025-07-18 06:26:22,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55926, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_369749539_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753984_13160, duration(ns): 18914296
2025-07-18 06:26:22,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753984_13160, type=LAST_IN_PIPELINE terminating
2025-07-18 06:26:23,917 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753984_13160 replica FinalizedReplica, blk_1073753984_13160, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753984 for deletion
2025-07-18 06:26:23,918 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753984_13160 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753984
2025-07-18 06:28:27,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753986_13162 src: /192.168.158.1:50522 dest: /192.168.158.4:9866
2025-07-18 06:28:27,609 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50522, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1686889763_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753986_13162, duration(ns): 27981255
2025-07-18 06:28:27,609 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753986_13162, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-18 06:28:29,926 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753986_13162 replica FinalizedReplica, blk_1073753986_13162, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753986 for deletion
2025-07-18 06:28:29,927 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753986_13162 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753986
2025-07-18 06:29:27,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753987_13163 src: /192.168.158.9:60616 dest: /192.168.158.4:9866
2025-07-18 06:29:27,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60616, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1827726830_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753987_13163, duration(ns): 16008426
2025-07-18 06:29:27,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753987_13163, type=LAST_IN_PIPELINE terminating
2025-07-18 06:29:29,929 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753987_13163 replica FinalizedReplica, blk_1073753987_13163, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753987 for deletion
2025-07-18 06:29:29,930 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753987_13163 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753987
2025-07-18 06:31:32,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753989_13165 src: /192.168.158.9:34230 dest: /192.168.158.4:9866
2025-07-18 06:31:32,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34230, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1782817162_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753989_13165, duration(ns): 16666921
2025-07-18 06:31:32,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753989_13165, type=LAST_IN_PIPELINE terminating
2025-07-18 06:31:35,932 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753989_13165 replica FinalizedReplica, blk_1073753989_13165, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753989 for deletion
2025-07-18 06:31:35,933 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753989_13165 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753989
2025-07-18 06:35:32,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753993_13169 src: /192.168.158.1:51576 dest: /192.168.158.4:9866
2025-07-18 06:35:32,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51576, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-561551830_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753993_13169, duration(ns): 24503647 2025-07-18 06:35:32,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753993_13169, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-18 06:35:35,939 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753993_13169 replica FinalizedReplica, blk_1073753993_13169, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753993 for deletion 2025-07-18 06:35:35,940 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753993_13169 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753993 2025-07-18 06:40:37,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073753998_13174 src: /192.168.158.6:40484 dest: /192.168.158.4:9866 2025-07-18 06:40:37,599 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40484, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1269114333_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073753998_13174, duration(ns): 21245829 2025-07-18 06:40:37,599 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073753998_13174, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-18 06:40:41,951 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073753998_13174 replica FinalizedReplica, blk_1073753998_13174, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753998 for deletion 2025-07-18 06:40:41,952 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073753998_13174 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073753998 2025-07-18 06:42:37,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754000_13176 src: /192.168.158.1:41598 dest: /192.168.158.4:9866 2025-07-18 06:42:37,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41598, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1364312305_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754000_13176, duration(ns): 25083872 2025-07-18 06:42:37,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754000_13176, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-18 06:42:41,954 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754000_13176 replica FinalizedReplica, blk_1073754000_13176, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754000 for deletion 2025-07-18 06:42:41,955 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754000_13176 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754000 2025-07-18 06:43:37,570 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754001_13177 src: /192.168.158.1:51022 dest: /192.168.158.4:9866 2025-07-18 06:43:37,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51022, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_216390176_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754001_13177, duration(ns): 24570160 2025-07-18 06:43:37,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754001_13177, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-18 06:43:38,957 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754001_13177 replica FinalizedReplica, blk_1073754001_13177, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754001 for deletion 2025-07-18 06:43:38,958 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 
blk_1073754001_13177 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754001 2025-07-18 06:46:37,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754004_13180 src: /192.168.158.5:38460 dest: /192.168.158.4:9866 2025-07-18 06:46:37,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38460, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_116369614_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754004_13180, duration(ns): 17098153 2025-07-18 06:46:37,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754004_13180, type=LAST_IN_PIPELINE terminating 2025-07-18 06:46:41,961 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754004_13180 replica FinalizedReplica, blk_1073754004_13180, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754004 for deletion 2025-07-18 06:46:41,962 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754004_13180 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754004 2025-07-18 06:49:37,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754007_13183 src: /192.168.158.8:57108 dest: /192.168.158.4:9866 2025-07-18 06:49:37,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57108, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1642810472_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754007_13183, duration(ns): 18777873 2025-07-18 06:49:37,603 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754007_13183, type=LAST_IN_PIPELINE terminating 2025-07-18 06:49:38,963 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754007_13183 replica FinalizedReplica, blk_1073754007_13183, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754007 for deletion 2025-07-18 06:49:38,964 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754007_13183 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754007 2025-07-18 06:51:37,594 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754009_13185 src: /192.168.158.8:33336 dest: /192.168.158.4:9866 2025-07-18 06:51:37,620 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2069570738_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754009_13185, duration(ns): 20248284 2025-07-18 06:51:37,620 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754009_13185, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] 
terminating 2025-07-18 06:51:38,966 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754009_13185 replica FinalizedReplica, blk_1073754009_13185, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754009 for deletion 2025-07-18 06:51:38,967 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754009_13185 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754009 2025-07-18 06:52:37,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754010_13186 src: /192.168.158.1:42616 dest: /192.168.158.4:9866 2025-07-18 06:52:37,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42616, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-40124895_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754010_13186, duration(ns): 27278777 2025-07-18 06:52:37,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754010_13186, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-18 06:52:38,967 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754010_13186 replica FinalizedReplica, blk_1073754010_13186, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754010 for deletion 2025-07-18 06:52:38,969 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754010_13186 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754010 2025-07-18 06:54:37,604 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754012_13188 src: /192.168.158.5:34564 dest: /192.168.158.4:9866 2025-07-18 06:54:37,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34564, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_174545818_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754012_13188, duration(ns): 20577505 2025-07-18 06:54:37,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754012_13188, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-18 06:54:38,969 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754012_13188 replica FinalizedReplica, blk_1073754012_13188, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754012 for deletion 2025-07-18 06:54:38,970 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754012_13188 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754012 
2025-07-18 06:56:37,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754014_13190 src: /192.168.158.9:57156 dest: /192.168.158.4:9866 2025-07-18 06:56:37,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57156, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_489724684_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754014_13190, duration(ns): 20166168 2025-07-18 06:56:37,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754014_13190, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-18 06:56:41,976 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754014_13190 replica FinalizedReplica, blk_1073754014_13190, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754014 for deletion 2025-07-18 06:56:41,977 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754014_13190 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754014 2025-07-18 06:57:42,610 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754015_13191 src: /192.168.158.5:55650 dest: /192.168.158.4:9866 2025-07-18 06:57:42,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55650, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-319988274_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754015_13191, duration(ns): 17954678 2025-07-18 06:57:42,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754015_13191, type=LAST_IN_PIPELINE terminating 2025-07-18 06:57:47,978 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754015_13191 replica FinalizedReplica, blk_1073754015_13191, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754015 for deletion 2025-07-18 06:57:47,979 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754015_13191 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754015 2025-07-18 06:58:42,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754016_13192 src: /192.168.158.1:40452 dest: /192.168.158.4:9866 2025-07-18 06:58:42,641 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40452, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-373188074_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754016_13192, duration(ns): 24636123 2025-07-18 06:58:42,642 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754016_13192, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-18 06:58:47,979 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754016_13192 replica FinalizedReplica, blk_1073754016_13192, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754016 for deletion 2025-07-18 06:58:47,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754016_13192 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754016 2025-07-18 06:59:47,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754017_13193 src: /192.168.158.1:49914 dest: /192.168.158.4:9866 2025-07-18 06:59:47,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49914, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-388503063_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754017_13193, duration(ns): 31844655 2025-07-18 06:59:47,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754017_13193, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-18 06:59:53,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754017_13193 replica FinalizedReplica, blk_1073754017_13193, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754017 for deletion 
2025-07-18 06:59:53,983 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754017_13193 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754017 2025-07-18 07:01:47,609 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754019_13195 src: /192.168.158.5:50654 dest: /192.168.158.4:9866 2025-07-18 07:01:47,638 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50654, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1229993596_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754019_13195, duration(ns): 23022245 2025-07-18 07:01:47,638 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754019_13195, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-18 07:01:50,985 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754019_13195 replica FinalizedReplica, blk_1073754019_13195, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754019 for deletion 2025-07-18 07:01:50,986 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754019_13195 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754019 2025-07-18 07:04:47,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073754022_13198 src: /192.168.158.1:36268 dest: /192.168.158.4:9866 2025-07-18 07:04:47,640 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36268, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-759869831_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754022_13198, duration(ns): 25315840 2025-07-18 07:04:47,640 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754022_13198, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-18 07:04:50,988 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754022_13198 replica FinalizedReplica, blk_1073754022_13198, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754022 for deletion 2025-07-18 07:04:50,989 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754022_13198 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754022 2025-07-18 07:05:52,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754023_13199 src: /192.168.158.9:55420 dest: /192.168.158.4:9866 2025-07-18 07:05:52,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55420, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_370140390_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073754023_13199, duration(ns): 20777198 2025-07-18 07:05:52,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754023_13199, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-18 07:05:56,992 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754023_13199 replica FinalizedReplica, blk_1073754023_13199, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754023 for deletion 2025-07-18 07:05:56,993 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754023_13199 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754023 2025-07-18 07:08:52,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754026_13202 src: /192.168.158.7:57054 dest: /192.168.158.4:9866 2025-07-18 07:08:52,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57054, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1935578487_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754026_13202, duration(ns): 22083369 2025-07-18 07:08:52,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754026_13202, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-18 07:08:53,998 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073754026_13202 replica FinalizedReplica, blk_1073754026_13202, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754026 for deletion
2025-07-18 07:08:53,999 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754026_13202 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754026
2025-07-18 07:09:52,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754027_13203 src: /192.168.158.7:51342 dest: /192.168.158.4:9866
2025-07-18 07:09:52,632 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51342, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1257217689_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754027_13203, duration(ns): 17086258
2025-07-18 07:09:52,632 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754027_13203, type=LAST_IN_PIPELINE terminating
2025-07-18 07:09:57,002 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754027_13203 replica FinalizedReplica, blk_1073754027_13203, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754027 for deletion
2025-07-18 07:09:57,003 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754027_13203 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754027
2025-07-18 07:10:52,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754028_13204 src: /192.168.158.6:48550 dest: /192.168.158.4:9866
2025-07-18 07:10:52,641 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48550, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_106985128_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754028_13204, duration(ns): 21752213
2025-07-18 07:10:52,641 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754028_13204, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 07:10:54,004 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754028_13204 replica FinalizedReplica, blk_1073754028_13204, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754028 for deletion
2025-07-18 07:10:54,005 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754028_13204 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754028
2025-07-18 07:12:52,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754030_13206 src: /192.168.158.1:34546 dest: /192.168.158.4:9866
2025-07-18 07:12:52,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34546, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1839499295_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754030_13206, duration(ns): 26930336
2025-07-18 07:12:52,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754030_13206, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-18 07:12:54,007 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754030_13206 replica FinalizedReplica, blk_1073754030_13206, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754030 for deletion
2025-07-18 07:12:54,009 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754030_13206 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754030
2025-07-18 07:13:52,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754031_13207 src: /192.168.158.1:40350 dest: /192.168.158.4:9866
2025-07-18 07:13:52,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40350, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-599139905_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754031_13207, duration(ns): 23638921
2025-07-18 07:13:52,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754031_13207, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-18 07:13:54,012 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754031_13207 replica FinalizedReplica, blk_1073754031_13207, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754031 for deletion
2025-07-18 07:13:54,013 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754031_13207 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754031
2025-07-18 07:14:52,623 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754032_13208 src: /192.168.158.1:48388 dest: /192.168.158.4:9866
2025-07-18 07:14:52,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48388, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1126299665_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754032_13208, duration(ns): 26966264
2025-07-18 07:14:52,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754032_13208, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-18 07:14:54,015 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754032_13208 replica FinalizedReplica, blk_1073754032_13208, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754032 for deletion
2025-07-18 07:14:54,016 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754032_13208 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754032
2025-07-18 07:15:57,623 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754033_13209 src: /192.168.158.7:57932 dest: /192.168.158.4:9866
2025-07-18 07:15:57,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57932, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1028459799_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754033_13209, duration(ns): 20451873
2025-07-18 07:15:57,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754033_13209, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 07:16:00,017 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754033_13209 replica FinalizedReplica, blk_1073754033_13209, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754033 for deletion
2025-07-18 07:16:00,018 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754033_13209 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754033
2025-07-18 07:16:57,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754034_13210 src: /192.168.158.6:54784 dest: /192.168.158.4:9866
2025-07-18 07:16:57,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54784, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1574476657_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754034_13210, duration(ns): 16492163
2025-07-18 07:16:57,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754034_13210, type=LAST_IN_PIPELINE terminating
2025-07-18 07:17:00,020 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754034_13210 replica FinalizedReplica, blk_1073754034_13210, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754034 for deletion
2025-07-18 07:17:00,021 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754034_13210 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754034
2025-07-18 07:23:12,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754040_13216 src: /192.168.158.1:41870 dest: /192.168.158.4:9866
2025-07-18 07:23:12,688 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41870, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-317604082_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754040_13216, duration(ns): 26591603
2025-07-18 07:23:12,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754040_13216, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-18 07:23:15,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754040_13216 replica FinalizedReplica, blk_1073754040_13216, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754040 for deletion
2025-07-18 07:23:15,032 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754040_13216 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754040
2025-07-18 07:25:17,660 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754042_13218 src: /192.168.158.6:48242 dest: /192.168.158.4:9866
2025-07-18 07:25:17,680 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48242, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1185276631_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754042_13218, duration(ns): 18569880
2025-07-18 07:25:17,680 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754042_13218, type=LAST_IN_PIPELINE terminating
2025-07-18 07:25:21,042 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754042_13218 replica FinalizedReplica, blk_1073754042_13218, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754042 for deletion
2025-07-18 07:25:21,043 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754042_13218 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754042
2025-07-18 07:28:17,665 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754045_13221 src: /192.168.158.9:47480 dest: /192.168.158.4:9866
2025-07-18 07:28:17,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47480, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1520232633_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754045_13221, duration(ns): 16519966
2025-07-18 07:28:17,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754045_13221, type=LAST_IN_PIPELINE terminating
2025-07-18 07:28:21,048 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754045_13221 replica FinalizedReplica, blk_1073754045_13221, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754045 for deletion
2025-07-18 07:28:21,049 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754045_13221 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754045
2025-07-18 07:29:22,660 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754046_13222 src: /192.168.158.7:43824 dest: /192.168.158.4:9866
2025-07-18 07:29:22,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43824, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_436736447_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754046_13222, duration(ns): 23594566
2025-07-18 07:29:22,690 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754046_13222, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 07:29:27,049 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754046_13222 replica FinalizedReplica, blk_1073754046_13222, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754046 for deletion
2025-07-18 07:29:27,050 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754046_13222 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754046
2025-07-18 07:30:22,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754047_13223 src: /192.168.158.9:58104 dest: /192.168.158.4:9866
2025-07-18 07:30:22,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58104, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_432285976_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754047_13223, duration(ns): 17271396
2025-07-18 07:30:22,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754047_13223, type=LAST_IN_PIPELINE terminating
2025-07-18 07:30:24,053 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754047_13223 replica FinalizedReplica, blk_1073754047_13223, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754047 for deletion
2025-07-18 07:30:24,054 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754047_13223 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754047
2025-07-18 07:33:27,676 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754050_13226 src: /192.168.158.7:39058 dest: /192.168.158.4:9866
2025-07-18 07:33:27,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39058, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1367500067_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754050_13226, duration(ns): 25751635
2025-07-18 07:33:27,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754050_13226, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 07:33:33,060 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754050_13226 replica FinalizedReplica, blk_1073754050_13226, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754050 for deletion
2025-07-18 07:33:33,061 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754050_13226 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754050
2025-07-18 07:34:27,705 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754051_13227 src: /192.168.158.8:45218 dest: /192.168.158.4:9866
2025-07-18 07:34:27,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45218, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_556353215_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754051_13227, duration(ns): 20542266
2025-07-18 07:34:27,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754051_13227, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-18 07:34:30,064 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754051_13227 replica FinalizedReplica, blk_1073754051_13227, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754051 for deletion
2025-07-18 07:34:30,065 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754051_13227 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754051
2025-07-18 07:35:27,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754052_13228 src: /192.168.158.7:55592 dest: /192.168.158.4:9866
2025-07-18 07:35:27,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55592, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_669080764_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754052_13228, duration(ns): 21498538
2025-07-18 07:35:27,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754052_13228, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 07:35:30,065 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754052_13228 replica FinalizedReplica, blk_1073754052_13228, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754052 for deletion
2025-07-18 07:35:30,067 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754052_13228 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754052
2025-07-18 07:36:27,675 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754053_13229 src: /192.168.158.6:49636 dest: /192.168.158.4:9866
2025-07-18 07:36:27,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49636, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1176665808_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754053_13229, duration(ns): 20628673
2025-07-18 07:36:27,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754053_13229, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 07:36:30,068 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754053_13229 replica FinalizedReplica, blk_1073754053_13229, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754053 for deletion
2025-07-18 07:36:30,069 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754053_13229 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754053
2025-07-18 07:37:27,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754054_13230 src: /192.168.158.6:39074 dest: /192.168.158.4:9866
2025-07-18 07:37:27,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39074, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-140456240_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754054_13230, duration(ns): 17402637
2025-07-18 07:37:27,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754054_13230, type=LAST_IN_PIPELINE terminating
2025-07-18 07:37:33,070 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754054_13230 replica FinalizedReplica, blk_1073754054_13230, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754054 for deletion
2025-07-18 07:37:33,071 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754054_13230 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754054
2025-07-18 07:38:27,681 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754055_13231 src: /192.168.158.8:47966 dest: /192.168.158.4:9866
2025-07-18 07:38:27,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47966, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_288046137_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754055_13231, duration(ns): 18530710
2025-07-18 07:38:27,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754055_13231, type=LAST_IN_PIPELINE terminating
2025-07-18 07:38:30,073 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754055_13231 replica FinalizedReplica, blk_1073754055_13231, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754055 for deletion
2025-07-18 07:38:30,074 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754055_13231 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754055
2025-07-18 07:41:27,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754058_13234 src: /192.168.158.1:32874 dest: /192.168.158.4:9866
2025-07-18 07:41:27,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:32874, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2058907954_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754058_13234, duration(ns): 23227089
2025-07-18 07:41:27,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754058_13234, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-18 07:41:30,077 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754058_13234 replica FinalizedReplica, blk_1073754058_13234, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754058 for deletion
2025-07-18 07:41:30,078 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754058_13234 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754058
2025-07-18 07:43:32,688 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754060_13236 src: /192.168.158.1:44558 dest: /192.168.158.4:9866
2025-07-18 07:43:32,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44558, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1357867702_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754060_13236, duration(ns): 55736562
2025-07-18 07:43:32,755 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754060_13236, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-18 07:43:39,080 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754060_13236 replica FinalizedReplica, blk_1073754060_13236, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754060 for deletion
2025-07-18 07:43:39,081 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754060_13236 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754060
2025-07-18 07:44:32,688 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754061_13237 src: /192.168.158.1:52782 dest: /192.168.158.4:9866
2025-07-18 07:44:32,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52782, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1850074274_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754061_13237, duration(ns): 25079841
2025-07-18 07:44:32,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754061_13237, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-18 07:44:36,080 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754061_13237 replica FinalizedReplica, blk_1073754061_13237, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754061 for deletion
2025-07-18 07:44:36,081 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754061_13237 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754061
2025-07-18 07:47:37,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754064_13240 src: /192.168.158.5:55414 dest: /192.168.158.4:9866
2025-07-18 07:47:37,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55414, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2038199565_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754064_13240, duration(ns): 16893682
2025-07-18 07:47:37,716 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754064_13240, type=LAST_IN_PIPELINE terminating
2025-07-18 07:47:39,088 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754064_13240 replica FinalizedReplica, blk_1073754064_13240, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754064 for deletion
2025-07-18 07:47:39,089 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754064_13240 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754064
2025-07-18 07:51:37,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754068_13244 src: /192.168.158.8:43732 dest: /192.168.158.4:9866
2025-07-18 07:51:37,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43732, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1343210765_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754068_13244, duration(ns): 16800837
2025-07-18 07:51:37,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754068_13244, type=LAST_IN_PIPELINE terminating
2025-07-18 07:51:42,098 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754068_13244 replica FinalizedReplica, blk_1073754068_13244, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754068 for deletion
2025-07-18 07:51:42,099 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754068_13244 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754068
2025-07-18 07:53:47,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754070_13246 src: /192.168.158.1:34334 dest: /192.168.158.4:9866
2025-07-18 07:53:47,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34334, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_788026965_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754070_13246, duration(ns): 28153483
2025-07-18 07:53:47,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754070_13246, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-18 07:53:51,101 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754070_13246 replica FinalizedReplica, blk_1073754070_13246, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754070 for deletion
2025-07-18 07:53:51,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754070_13246 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754070
2025-07-18 07:54:47,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754071_13247 src: /192.168.158.1:49872 dest: /192.168.158.4:9866
2025-07-18 07:54:47,730 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49872, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_472411802_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754071_13247, duration(ns): 27349289
2025-07-18 07:54:47,730 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754071_13247, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-18 07:54:54,103 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754071_13247 replica FinalizedReplica, blk_1073754071_13247, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754071 for deletion
2025-07-18 07:54:54,105 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754071_13247 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754071
2025-07-18 07:56:47,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754073_13249 src: /192.168.158.5:41534 dest: /192.168.158.4:9866
2025-07-18 07:56:47,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41534, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_887382734_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754073_13249, duration(ns): 21490356
2025-07-18 07:56:47,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754073_13249, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 07:56:51,109 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754073_13249 replica FinalizedReplica, blk_1073754073_13249, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754073 for deletion
2025-07-18 07:56:51,110 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754073_13249 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754073
2025-07-18 07:57:47,704 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754074_13250 src: /192.168.158.1:39994 dest: /192.168.158.4:9866
2025-07-18 07:57:47,739 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39994, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2061511800_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid:
BP-1059995147-192.168.158.1-1752101929360:blk_1073754074_13250, duration(ns): 25784098 2025-07-18 07:57:47,739 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754074_13250, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-18 07:57:51,110 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754074_13250 replica FinalizedReplica, blk_1073754074_13250, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754074 for deletion 2025-07-18 07:57:51,112 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754074_13250 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754074 2025-07-18 07:59:47,710 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754076_13252 src: /192.168.158.1:34224 dest: /192.168.158.4:9866 2025-07-18 07:59:47,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34224, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-489229738_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754076_13252, duration(ns): 25422720 2025-07-18 07:59:47,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754076_13252, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-18 07:59:51,114 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754076_13252 replica FinalizedReplica, blk_1073754076_13252, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754076 for deletion 2025-07-18 07:59:51,115 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754076_13252 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754076 2025-07-18 08:01:47,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754078_13254 src: /192.168.158.1:53016 dest: /192.168.158.4:9866 2025-07-18 08:01:47,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53016, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_941918894_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754078_13254, duration(ns): 24173936 2025-07-18 08:01:47,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754078_13254, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-18 08:01:51,120 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754078_13254 replica FinalizedReplica, blk_1073754078_13254, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754078 for deletion 
2025-07-18 08:01:51,121 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754078_13254 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754078 2025-07-18 08:06:52,734 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754083_13259 src: /192.168.158.1:54074 dest: /192.168.158.4:9866 2025-07-18 08:06:52,770 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54074, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_517915892_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754083_13259, duration(ns): 25829666 2025-07-18 08:06:52,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754083_13259, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-18 08:06:54,125 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754083_13259 replica FinalizedReplica, blk_1073754083_13259, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754083 for deletion 2025-07-18 08:06:54,126 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754083_13259 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754083 2025-07-18 08:09:52,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073754086_13262 src: /192.168.158.7:33280 dest: /192.168.158.4:9866 2025-07-18 08:09:52,778 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33280, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1850466716_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754086_13262, duration(ns): 22571158 2025-07-18 08:09:52,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754086_13262, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-18 08:09:54,132 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754086_13262 replica FinalizedReplica, blk_1073754086_13262, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754086 for deletion 2025-07-18 08:09:54,133 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754086_13262 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754086 2025-07-18 08:10:52,760 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754087_13263 src: /192.168.158.7:57232 dest: /192.168.158.4:9866 2025-07-18 08:10:52,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57232, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_602397410_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073754087_13263, duration(ns): 16641589 2025-07-18 08:10:52,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754087_13263, type=LAST_IN_PIPELINE terminating 2025-07-18 08:10:57,133 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754087_13263 replica FinalizedReplica, blk_1073754087_13263, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754087 for deletion 2025-07-18 08:10:57,134 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754087_13263 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754087 2025-07-18 08:12:52,736 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754089_13265 src: /192.168.158.8:60360 dest: /192.168.158.4:9866 2025-07-18 08:12:52,756 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60360, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_414812118_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754089_13265, duration(ns): 17032275 2025-07-18 08:12:52,756 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754089_13265, type=LAST_IN_PIPELINE terminating 2025-07-18 08:12:54,137 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754089_13265 replica FinalizedReplica, blk_1073754089_13265, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754089 for deletion 2025-07-18 08:12:54,138 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754089_13265 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754089 2025-07-18 08:14:52,736 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754091_13267 src: /192.168.158.9:50628 dest: /192.168.158.4:9866 2025-07-18 08:14:52,762 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50628, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1535386403_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754091_13267, duration(ns): 20661889 2025-07-18 08:14:52,762 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754091_13267, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-18 08:14:54,140 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754091_13267 replica FinalizedReplica, blk_1073754091_13267, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754091 for deletion 2025-07-18 08:14:54,142 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754091_13267 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754091 2025-07-18 08:16:52,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754093_13269 src: /192.168.158.1:58160 dest: /192.168.158.4:9866 2025-07-18 08:16:52,772 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58160, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2071015045_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754093_13269, duration(ns): 21263564 2025-07-18 08:16:52,772 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754093_13269, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-18 08:16:57,144 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754093_13269 replica FinalizedReplica, blk_1073754093_13269, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754093 for deletion 2025-07-18 08:16:57,145 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754093_13269 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754093 2025-07-18 08:20:52,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754097_13273 src: /192.168.158.5:57268 dest: /192.168.158.4:9866 2025-07-18 08:20:52,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: 
src: /192.168.158.5:57268, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1532852777_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754097_13273, duration(ns): 22089153 2025-07-18 08:20:52,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754097_13273, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-18 08:20:57,155 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754097_13273 replica FinalizedReplica, blk_1073754097_13273, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754097 for deletion 2025-07-18 08:20:57,156 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754097_13273 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754097 2025-07-18 08:24:57,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754101_13277 src: /192.168.158.9:43366 dest: /192.168.158.4:9866 2025-07-18 08:24:57,773 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43366, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-906292718_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754101_13277, duration(ns): 17567593 2025-07-18 08:24:57,773 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754101_13277, 
type=LAST_IN_PIPELINE terminating 2025-07-18 08:25:03,163 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754101_13277 replica FinalizedReplica, blk_1073754101_13277, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754101 for deletion 2025-07-18 08:25:03,164 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754101_13277 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754101 2025-07-18 08:28:57,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754105_13281 src: /192.168.158.6:51694 dest: /192.168.158.4:9866 2025-07-18 08:28:57,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51694, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-215511415_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754105_13281, duration(ns): 16580394 2025-07-18 08:28:57,780 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754105_13281, type=LAST_IN_PIPELINE terminating 2025-07-18 08:29:00,168 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754105_13281 replica FinalizedReplica, blk_1073754105_13281, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754105 for deletion 
2025-07-18 08:29:00,170 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754105_13281 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754105 2025-07-18 08:29:57,762 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754106_13282 src: /192.168.158.1:35064 dest: /192.168.158.4:9866 2025-07-18 08:29:57,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35064, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-982239990_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754106_13282, duration(ns): 26178937 2025-07-18 08:29:57,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754106_13282, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-18 08:30:00,171 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754106_13282 replica FinalizedReplica, blk_1073754106_13282, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754106 for deletion 2025-07-18 08:30:00,172 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754106_13282 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754106 2025-07-18 08:30:57,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073754107_13283 src: /192.168.158.1:35834 dest: /192.168.158.4:9866 2025-07-18 08:30:57,800 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35834, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_326279288_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754107_13283, duration(ns): 25391643 2025-07-18 08:30:57,800 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754107_13283, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-18 08:31:03,172 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754107_13283 replica FinalizedReplica, blk_1073754107_13283, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754107 for deletion 2025-07-18 08:31:03,173 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754107_13283 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754107 2025-07-18 08:31:57,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754108_13284 src: /192.168.158.5:53820 dest: /192.168.158.4:9866 2025-07-18 08:31:57,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53820, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1753239730_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073754108_13284, duration(ns): 21536635 2025-07-18 08:31:57,795 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754108_13284, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-18 08:32:00,175 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754108_13284 replica FinalizedReplica, blk_1073754108_13284, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754108 for deletion 2025-07-18 08:32:00,176 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754108_13284 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754108 2025-07-18 08:32:57,770 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754109_13285 src: /192.168.158.8:43588 dest: /192.168.158.4:9866 2025-07-18 08:32:57,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43588, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1996007625_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754109_13285, duration(ns): 14840780 2025-07-18 08:32:57,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754109_13285, type=LAST_IN_PIPELINE terminating 2025-07-18 08:33:00,177 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754109_13285 replica 
FinalizedReplica, blk_1073754109_13285, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754109 for deletion 2025-07-18 08:33:00,178 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754109_13285 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir15/blk_1073754109 2025-07-18 08:35:57,759 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754112_13288 src: /192.168.158.8:36084 dest: /192.168.158.4:9866 2025-07-18 08:35:57,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36084, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2024611681_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754112_13288, duration(ns): 20458176 2025-07-18 08:35:57,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754112_13288, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-18 08:36:00,184 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754112_13288 replica FinalizedReplica, blk_1073754112_13288, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754112 for deletion 2025-07-18 08:36:00,185 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073754112_13288 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754112 2025-07-18 08:36:57,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754113_13289 src: /192.168.158.8:50984 dest: /192.168.158.4:9866 2025-07-18 08:36:57,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50984, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1884409084_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754113_13289, duration(ns): 21310244 2025-07-18 08:36:57,805 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754113_13289, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-18 08:37:03,188 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754113_13289 replica FinalizedReplica, blk_1073754113_13289, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754113 for deletion 2025-07-18 08:37:03,189 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754113_13289 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754113 2025-07-18 08:37:57,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754114_13290 src: /192.168.158.1:59504 dest: /192.168.158.4:9866 2025-07-18 08:37:57,792 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59504, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1688600374_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754114_13290, duration(ns): 24397358
2025-07-18 08:37:57,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754114_13290, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-18 08:38:03,191 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754114_13290 replica FinalizedReplica, blk_1073754114_13290, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754114 for deletion
2025-07-18 08:38:03,192 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754114_13290 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754114
2025-07-18 08:38:57,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754115_13291 src: /192.168.158.1:58138 dest: /192.168.158.4:9866
2025-07-18 08:38:57,795 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58138, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1985453955_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754115_13291, duration(ns): 24847189
2025-07-18 08:38:57,795 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754115_13291, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-18 08:39:00,194 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754115_13291 replica FinalizedReplica, blk_1073754115_13291, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754115 for deletion
2025-07-18 08:39:00,195 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754115_13291 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754115
2025-07-18 08:39:57,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754116_13292 src: /192.168.158.7:56240 dest: /192.168.158.4:9866
2025-07-18 08:39:57,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56240, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1043926852_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754116_13292, duration(ns): 22512461
2025-07-18 08:39:57,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754116_13292, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 08:40:03,194 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754116_13292 replica FinalizedReplica, blk_1073754116_13292, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754116 for deletion
2025-07-18 08:40:03,195 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754116_13292 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754116
2025-07-18 08:42:57,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754119_13295 src: /192.168.158.1:58310 dest: /192.168.158.4:9866
2025-07-18 08:42:57,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58310, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1391294778_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754119_13295, duration(ns): 23806490
2025-07-18 08:42:57,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754119_13295, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-18 08:43:03,202 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754119_13295 replica FinalizedReplica, blk_1073754119_13295, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754119 for deletion
2025-07-18 08:43:03,203 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754119_13295 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754119
2025-07-18 08:44:57,768 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754121_13297 src: /192.168.158.1:46030 dest: /192.168.158.4:9866
2025-07-18 08:44:57,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46030, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1439917030_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754121_13297, duration(ns): 24161447
2025-07-18 08:44:57,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754121_13297, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-18 08:45:00,206 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754121_13297 replica FinalizedReplica, blk_1073754121_13297, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754121 for deletion
2025-07-18 08:45:00,207 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754121_13297 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754121
2025-07-18 08:47:02,780 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754123_13299 src: /192.168.158.9:48892 dest: /192.168.158.4:9866
2025-07-18 08:47:02,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48892, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_665337199_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754123_13299, duration(ns): 15943116
2025-07-18 08:47:02,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754123_13299, type=LAST_IN_PIPELINE terminating
2025-07-18 08:47:03,209 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754123_13299 replica FinalizedReplica, blk_1073754123_13299, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754123 for deletion
2025-07-18 08:47:03,210 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754123_13299 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754123
2025-07-18 08:50:02,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754126_13302 src: /192.168.158.6:54110 dest: /192.168.158.4:9866
2025-07-18 08:50:02,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54110, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1075936998_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754126_13302, duration(ns): 21102498
2025-07-18 08:50:02,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754126_13302, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 08:50:03,214 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754126_13302 replica FinalizedReplica, blk_1073754126_13302, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754126 for deletion
2025-07-18 08:50:03,215 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754126_13302 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754126
2025-07-18 08:54:07,786 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754130_13306 src: /192.168.158.1:53602 dest: /192.168.158.4:9866
2025-07-18 08:54:07,821 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53602, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_456140731_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754130_13306, duration(ns): 26918010
2025-07-18 08:54:07,824 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754130_13306, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-18 08:54:09,224 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754130_13306 replica FinalizedReplica, blk_1073754130_13306, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754130 for deletion
2025-07-18 08:54:09,225 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754130_13306 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754130
2025-07-18 08:57:07,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754133_13309 src: /192.168.158.1:57436 dest: /192.168.158.4:9866
2025-07-18 08:57:07,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57436, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1189908255_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754133_13309, duration(ns): 26229237
2025-07-18 08:57:07,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754133_13309, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-18 08:57:09,231 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754133_13309 replica FinalizedReplica, blk_1073754133_13309, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754133 for deletion
2025-07-18 08:57:09,232 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754133_13309 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754133
2025-07-18 08:59:12,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754135_13311 src: /192.168.158.1:45852 dest: /192.168.158.4:9866
2025-07-18 08:59:12,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45852, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2809028_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754135_13311, duration(ns): 22278435
2025-07-18 08:59:12,820 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754135_13311, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-18 08:59:18,237 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754135_13311 replica FinalizedReplica, blk_1073754135_13311, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754135 for deletion
2025-07-18 08:59:18,238 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754135_13311 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754135
2025-07-18 09:03:17,810 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754139_13315 src: /192.168.158.5:48594 dest: /192.168.158.4:9866
2025-07-18 09:03:17,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48594, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1338961065_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754139_13315, duration(ns): 18036765
2025-07-18 09:03:17,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754139_13315, type=LAST_IN_PIPELINE terminating
2025-07-18 09:03:21,248 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754139_13315 replica FinalizedReplica, blk_1073754139_13315, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754139 for deletion
2025-07-18 09:03:21,249 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754139_13315 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754139
2025-07-18 09:05:22,806 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754141_13317 src: /192.168.158.1:43246 dest: /192.168.158.4:9866
2025-07-18 09:05:22,841 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43246, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-968555075_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754141_13317, duration(ns): 25976478
2025-07-18 09:05:22,841 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754141_13317, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-18 09:05:30,248 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754141_13317 replica FinalizedReplica, blk_1073754141_13317, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754141 for deletion
2025-07-18 09:05:30,249 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754141_13317 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754141
2025-07-18 09:07:27,810 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754143_13319 src: /192.168.158.5:58206 dest: /192.168.158.4:9866
2025-07-18 09:07:27,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58206, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1214636983_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754143_13319, duration(ns): 18761783
2025-07-18 09:07:27,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754143_13319, type=LAST_IN_PIPELINE terminating
2025-07-18 09:07:33,254 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754143_13319 replica FinalizedReplica, blk_1073754143_13319, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754143 for deletion
2025-07-18 09:07:33,255 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754143_13319 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754143
2025-07-18 09:11:27,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754147_13323 src: /192.168.158.8:49024 dest: /192.168.158.4:9866
2025-07-18 09:11:27,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49024, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1090929121_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754147_13323, duration(ns): 22145397
2025-07-18 09:11:27,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754147_13323, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 09:11:33,266 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754147_13323 replica FinalizedReplica, blk_1073754147_13323, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754147 for deletion
2025-07-18 09:11:33,267 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754147_13323 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754147
2025-07-18 09:12:32,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754148_13324 src: /192.168.158.1:48364 dest: /192.168.158.4:9866
2025-07-18 09:12:32,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48364, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-501044655_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754148_13324, duration(ns): 24654585
2025-07-18 09:12:32,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754148_13324, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-18 09:12:36,267 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754148_13324 replica FinalizedReplica, blk_1073754148_13324, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754148 for deletion
2025-07-18 09:12:36,268 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754148_13324 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754148
2025-07-18 09:15:32,812 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754151_13327 src: /192.168.158.6:45276 dest: /192.168.158.4:9866
2025-07-18 09:15:32,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45276, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1877650477_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754151_13327, duration(ns): 24432488
2025-07-18 09:15:32,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754151_13327, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 09:15:36,275 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754151_13327 replica FinalizedReplica, blk_1073754151_13327, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754151 for deletion
2025-07-18 09:15:36,276 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754151_13327 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754151
2025-07-18 09:16:32,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754152_13328 src: /192.168.158.6:47178 dest: /192.168.158.4:9866
2025-07-18 09:16:32,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47178, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1578342417_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754152_13328, duration(ns): 17942353
2025-07-18 09:16:32,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754152_13328, type=LAST_IN_PIPELINE terminating
2025-07-18 09:16:36,278 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754152_13328 replica FinalizedReplica, blk_1073754152_13328, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754152 for deletion
2025-07-18 09:16:36,279 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754152_13328 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754152
2025-07-18 09:17:32,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754153_13329 src: /192.168.158.5:44464 dest: /192.168.158.4:9866
2025-07-18 09:17:32,837 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44464, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_544494189_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754153_13329, duration(ns): 15673594
2025-07-18 09:17:32,838 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754153_13329, type=LAST_IN_PIPELINE terminating
2025-07-18 09:17:36,282 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754153_13329 replica FinalizedReplica, blk_1073754153_13329, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754153 for deletion
2025-07-18 09:17:36,283 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754153_13329 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754153
2025-07-18 09:19:37,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754155_13331 src: /192.168.158.9:58672 dest: /192.168.158.4:9866
2025-07-18 09:19:37,838 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1456012728_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754155_13331, duration(ns): 16676511
2025-07-18 09:19:37,838 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754155_13331, type=LAST_IN_PIPELINE terminating
2025-07-18 09:19:42,285 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754155_13331 replica FinalizedReplica, blk_1073754155_13331, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754155 for deletion
2025-07-18 09:19:42,287 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754155_13331 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754155
2025-07-18 09:21:42,817 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754157_13333 src: /192.168.158.6:59074 dest: /192.168.158.4:9866
2025-07-18 09:21:42,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59074, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-990993229_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754157_13333, duration(ns): 21514531
2025-07-18 09:21:42,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754157_13333, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 09:21:51,290 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754157_13333 replica FinalizedReplica, blk_1073754157_13333, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754157 for deletion
2025-07-18 09:21:51,291 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754157_13333 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754157
2025-07-18 09:24:42,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754160_13336 src: /192.168.158.1:42804 dest: /192.168.158.4:9866
2025-07-18 09:24:42,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42804, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_268360381_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754160_13336, duration(ns): 24153462
2025-07-18 09:24:42,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754160_13336, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-18 09:24:51,291 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754160_13336 replica FinalizedReplica, blk_1073754160_13336, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754160 for deletion
2025-07-18 09:24:51,292 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754160_13336 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754160
2025-07-18 09:25:47,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754161_13337 src: /192.168.158.1:38632 dest: /192.168.158.4:9866
2025-07-18 09:25:47,851 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38632, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1663933768_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754161_13337, duration(ns): 24215991
2025-07-18 09:25:47,851 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754161_13337, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-18 09:25:51,295 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754161_13337 replica FinalizedReplica, blk_1073754161_13337, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754161 for deletion
2025-07-18 09:25:51,296 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754161_13337 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754161
2025-07-18 09:26:52,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754162_13338 src: /192.168.158.1:51254 dest: /192.168.158.4:9866
2025-07-18 09:26:52,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51254, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1052106688_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754162_13338, duration(ns): 25711598
2025-07-18 09:26:52,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754162_13338, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-18 09:26:57,300 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754162_13338 replica FinalizedReplica, blk_1073754162_13338, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754162 for deletion
2025-07-18 09:26:57,301 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754162_13338 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754162
2025-07-18 09:30:52,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754166_13342 src: /192.168.158.1:55188 dest: /192.168.158.4:9866
2025-07-18 09:30:52,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55188, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_870400255_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754166_13342, duration(ns): 23391332
2025-07-18 09:30:52,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754166_13342, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-18 09:30:57,311 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754166_13342 replica FinalizedReplica, blk_1073754166_13342, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754166 for deletion
2025-07-18 09:30:57,312 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754166_13342 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754166
2025-07-18 09:34:02,835 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754169_13345 src: /192.168.158.1:48742 dest: /192.168.158.4:9866
2025-07-18 09:34:02,869 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48742, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1395821391_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754169_13345, duration(ns): 23959345 2025-07-18 09:34:02,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754169_13345, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-18 09:34:06,317 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754169_13345 replica FinalizedReplica, blk_1073754169_13345, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754169 for deletion 2025-07-18 09:34:06,318 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754169_13345 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754169 2025-07-18 09:35:02,835 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754170_13346 src: /192.168.158.8:57844 dest: /192.168.158.4:9866 2025-07-18 09:35:02,862 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57844, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1809628412_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754170_13346, duration(ns): 21272298 2025-07-18 09:35:02,862 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754170_13346, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-18 09:35:09,319 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754170_13346 replica FinalizedReplica, blk_1073754170_13346, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754170 for deletion 2025-07-18 09:35:09,321 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754170_13346 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754170 2025-07-18 09:36:02,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754171_13347 src: /192.168.158.5:52544 dest: /192.168.158.4:9866 2025-07-18 09:36:02,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52544, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_755880235_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754171_13347, duration(ns): 22345392 2025-07-18 09:36:02,867 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754171_13347, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-18 09:36:06,325 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754171_13347 replica FinalizedReplica, blk_1073754171_13347, FINALIZED getNumBytes() = 56 
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754171 for deletion 2025-07-18 09:36:06,326 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754171_13347 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754171 2025-07-18 09:38:07,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754173_13349 src: /192.168.158.1:55748 dest: /192.168.158.4:9866 2025-07-18 09:38:07,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55748, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_560523182_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754173_13349, duration(ns): 28710803 2025-07-18 09:38:07,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754173_13349, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-18 09:38:12,331 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754173_13349 replica FinalizedReplica, blk_1073754173_13349, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754173 for deletion 2025-07-18 09:38:12,332 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754173_13349 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754173 2025-07-18 09:39:07,848 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754174_13350 src: /192.168.158.7:52950 dest: /192.168.158.4:9866 2025-07-18 09:39:07,867 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52950, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1144981185_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754174_13350, duration(ns): 16776695 2025-07-18 09:39:07,867 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754174_13350, type=LAST_IN_PIPELINE terminating 2025-07-18 09:39:12,335 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754174_13350 replica FinalizedReplica, blk_1073754174_13350, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754174 for deletion 2025-07-18 09:39:12,337 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754174_13350 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754174 2025-07-18 09:40:07,851 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754175_13351 src: /192.168.158.9:34592 dest: /192.168.158.4:9866 2025-07-18 09:40:07,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34592, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1870513132_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754175_13351, duration(ns): 16194025 2025-07-18 09:40:07,870 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754175_13351, type=LAST_IN_PIPELINE terminating 2025-07-18 09:40:15,339 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754175_13351 replica FinalizedReplica, blk_1073754175_13351, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754175 for deletion 2025-07-18 09:40:15,340 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754175_13351 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754175 2025-07-18 09:42:07,857 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754177_13353 src: /192.168.158.9:54280 dest: /192.168.158.4:9866 2025-07-18 09:42:07,876 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54280, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1361957670_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754177_13353, duration(ns): 16870100 2025-07-18 09:42:07,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754177_13353, type=LAST_IN_PIPELINE terminating 2025-07-18 09:42:15,343 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754177_13353 replica FinalizedReplica, blk_1073754177_13353, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754177 for deletion 2025-07-18 09:42:15,344 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754177_13353 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754177 2025-07-18 09:45:07,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754180_13356 src: /192.168.158.1:51390 dest: /192.168.158.4:9866 2025-07-18 09:45:07,889 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51390, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1996674973_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754180_13356, duration(ns): 25518933 2025-07-18 09:45:07,889 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754180_13356, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-18 09:45:12,348 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754180_13356 replica FinalizedReplica, blk_1073754180_13356, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754180 for deletion 
2025-07-18 09:45:12,349 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754180_13356 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754180 2025-07-18 09:49:07,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754184_13360 src: /192.168.158.7:35838 dest: /192.168.158.4:9866 2025-07-18 09:49:07,896 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35838, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1544032557_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754184_13360, duration(ns): 16725752 2025-07-18 09:49:07,896 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754184_13360, type=LAST_IN_PIPELINE terminating 2025-07-18 09:49:15,363 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754184_13360 replica FinalizedReplica, blk_1073754184_13360, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754184 for deletion 2025-07-18 09:49:15,364 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754184_13360 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754184 2025-07-18 09:52:17,881 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754187_13363 src: /192.168.158.9:48382 
dest: /192.168.158.4:9866 2025-07-18 09:52:17,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48382, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1707220043_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754187_13363, duration(ns): 16060232 2025-07-18 09:52:17,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754187_13363, type=LAST_IN_PIPELINE terminating 2025-07-18 09:52:21,369 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754187_13363 replica FinalizedReplica, blk_1073754187_13363, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754187 for deletion 2025-07-18 09:52:21,370 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754187_13363 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754187 2025-07-18 09:53:17,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754188_13364 src: /192.168.158.1:38584 dest: /192.168.158.4:9866 2025-07-18 09:53:17,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38584, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_349315527_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754188_13364, duration(ns): 21491459 2025-07-18 09:53:17,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754188_13364, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-18 09:53:24,371 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754188_13364 replica FinalizedReplica, blk_1073754188_13364, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754188 for deletion 2025-07-18 09:53:24,372 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754188_13364 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754188 2025-07-18 09:56:22,881 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754191_13367 src: /192.168.158.5:55242 dest: /192.168.158.4:9866 2025-07-18 09:56:22,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55242, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_216909268_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754191_13367, duration(ns): 16179071 2025-07-18 09:56:22,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754191_13367, type=LAST_IN_PIPELINE terminating 2025-07-18 09:56:27,376 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754191_13367 replica FinalizedReplica, blk_1073754191_13367, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn 
getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754191 for deletion 2025-07-18 09:56:27,377 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754191_13367 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754191 2025-07-18 09:59:18,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f49, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5. 2025-07-18 09:59:18,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360 2025-07-18 10:00:27,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754195_13371 src: /192.168.158.6:54882 dest: /192.168.158.4:9866 2025-07-18 10:00:27,911 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54882, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_890300718_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754195_13371, duration(ns): 20440971 2025-07-18 10:00:27,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754195_13371, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-18 10:00:33,383 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754195_13371 replica FinalizedReplica, blk_1073754195_13371, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 
56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754195 for deletion 2025-07-18 10:00:33,384 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754195_13371 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754195 2025-07-18 10:01:27,889 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754196_13372 src: /192.168.158.7:53544 dest: /192.168.158.4:9866 2025-07-18 10:01:27,908 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53544, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-964828701_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754196_13372, duration(ns): 17371401 2025-07-18 10:01:27,908 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754196_13372, type=LAST_IN_PIPELINE terminating 2025-07-18 10:01:33,385 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754196_13372 replica FinalizedReplica, blk_1073754196_13372, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754196 for deletion 2025-07-18 10:01:33,386 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754196_13372 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754196 2025-07-18 10:04:32,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754199_13375 src: /192.168.158.6:34770 dest: /192.168.158.4:9866 2025-07-18 10:04:32,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34770, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1995751262_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754199_13375, duration(ns): 15992213 2025-07-18 10:04:32,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754199_13375, type=LAST_IN_PIPELINE terminating 2025-07-18 10:04:36,390 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754199_13375 replica FinalizedReplica, blk_1073754199_13375, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754199 for deletion 2025-07-18 10:04:36,391 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754199_13375 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754199 2025-07-18 10:08:37,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754203_13379 src: /192.168.158.5:59586 dest: /192.168.158.4:9866 2025-07-18 10:08:37,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59586, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_644596365_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754203_13379, duration(ns): 20332123 2025-07-18 10:08:37,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754203_13379, type=LAST_IN_PIPELINE terminating 2025-07-18 10:08:42,394 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754203_13379 replica FinalizedReplica, blk_1073754203_13379, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754203 for deletion 2025-07-18 10:08:42,396 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754203_13379 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754203 2025-07-18 10:09:37,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754204_13380 src: /192.168.158.1:48036 dest: /192.168.158.4:9866 2025-07-18 10:09:37,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48036, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_818405875_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754204_13380, duration(ns): 24629895 2025-07-18 10:09:37,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754204_13380, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-18 
10:09:42,397 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754204_13380 replica FinalizedReplica, blk_1073754204_13380, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754204 for deletion 2025-07-18 10:09:42,398 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754204_13380 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754204 2025-07-18 10:11:47,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754206_13382 src: /192.168.158.5:53418 dest: /192.168.158.4:9866 2025-07-18 10:11:47,922 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53418, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_26456819_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754206_13382, duration(ns): 15306868 2025-07-18 10:11:47,922 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754206_13382, type=LAST_IN_PIPELINE terminating 2025-07-18 10:11:51,401 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754206_13382 replica FinalizedReplica, blk_1073754206_13382, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754206 for deletion 2025-07-18 10:11:51,402 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754206_13382 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754206
2025-07-18 10:12:47,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754207_13383 src: /192.168.158.7:48198 dest: /192.168.158.4:9866
2025-07-18 10:12:47,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48198, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_735911713_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754207_13383, duration(ns): 20018762
2025-07-18 10:12:47,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754207_13383, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-18 10:12:51,402 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754207_13383 replica FinalizedReplica, blk_1073754207_13383, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d2/dfs/dn
  getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754207 for deletion
2025-07-18 10:12:51,404 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754207_13383 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754207
2025-07-18 10:13:47,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754208_13384 src: /192.168.158.1:60186 dest: /192.168.158.4:9866
2025-07-18 10:13:47,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60186, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1635194946_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754208_13384, duration(ns): 23189519
2025-07-18 10:13:47,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754208_13384, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-18 10:13:51,403 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754208_13384 replica FinalizedReplica, blk_1073754208_13384, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d3/dfs/dn
  getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754208 for deletion
2025-07-18 10:13:51,405 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754208_13384 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754208
2025-07-18 10:15:52,919 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754210_13386 src: /192.168.158.8:49874 dest: /192.168.158.4:9866
2025-07-18 10:15:52,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49874, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-920032658_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754210_13386, duration(ns): 17455099
2025-07-18 10:15:52,939 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754210_13386, type=LAST_IN_PIPELINE terminating
2025-07-18 10:15:57,405 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754210_13386 replica FinalizedReplica, blk_1073754210_13386, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d4/dfs/dn
  getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754210 for deletion
2025-07-18 10:15:57,406 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754210_13386 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754210
2025-07-18 10:16:57,908 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754211_13387 src: /192.168.158.1:37196 dest: /192.168.158.4:9866
2025-07-18 10:16:57,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37196, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_879971115_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754211_13387, duration(ns): 28039035
2025-07-18 10:16:57,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754211_13387, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-18 10:17:03,407 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754211_13387 replica FinalizedReplica, blk_1073754211_13387, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d1/dfs/dn
  getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754211 for deletion
2025-07-18 10:17:03,409 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754211_13387 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754211
2025-07-18 10:20:02,919 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754214_13390 src: /192.168.158.9:45640 dest: /192.168.158.4:9866
2025-07-18 10:20:02,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45640, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-54885480_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754214_13390, duration(ns): 21023486
2025-07-18 10:20:02,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754214_13390, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 10:20:06,412 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754214_13390 replica FinalizedReplica, blk_1073754214_13390, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d2/dfs/dn
  getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754214 for deletion
2025-07-18 10:20:06,414 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754214_13390 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754214
2025-07-18 10:21:02,923 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754215_13391 src: /192.168.158.5:58114 dest: /192.168.158.4:9866
2025-07-18 10:21:02,949 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58114, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-485802101_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754215_13391, duration(ns): 20582174
2025-07-18 10:21:02,949 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754215_13391, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 10:21:06,412 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754215_13391 replica FinalizedReplica, blk_1073754215_13391, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d3/dfs/dn
  getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754215 for deletion
2025-07-18 10:21:06,413 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754215_13391 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754215
2025-07-18 10:23:02,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754217_13393 src: /192.168.158.5:55718 dest: /192.168.158.4:9866
2025-07-18 10:23:02,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55718, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-515678530_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754217_13393, duration(ns): 16683468
2025-07-18 10:23:02,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754217_13393, type=LAST_IN_PIPELINE terminating
2025-07-18 10:23:06,417 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754217_13393 replica FinalizedReplica, blk_1073754217_13393, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d4/dfs/dn
  getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754217 for deletion
2025-07-18 10:23:06,418 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754217_13393 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754217
2025-07-18 10:28:12,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754222_13398 src: /192.168.158.9:43780 dest: /192.168.158.4:9866
2025-07-18 10:28:12,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_875738115_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754222_13398, duration(ns): 20648677
2025-07-18 10:28:12,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754222_13398, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 10:28:18,423 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754222_13398 replica FinalizedReplica, blk_1073754222_13398, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d1/dfs/dn
  getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754222 for deletion
2025-07-18 10:28:18,424 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754222_13398 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754222
2025-07-18 10:29:12,931 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754223_13399 src: /192.168.158.6:40880 dest: /192.168.158.4:9866
2025-07-18 10:29:12,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40880, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951975024_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754223_13399, duration(ns): 18156643
2025-07-18 10:29:12,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754223_13399, type=LAST_IN_PIPELINE terminating
2025-07-18 10:29:21,424 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754223_13399 replica FinalizedReplica, blk_1073754223_13399, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d2/dfs/dn
  getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754223 for deletion
2025-07-18 10:29:21,425 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754223_13399 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754223
2025-07-18 10:33:12,939 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754227_13403 src: /192.168.158.1:39942 dest: /192.168.158.4:9866
2025-07-18 10:33:12,975 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39942, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1597853514_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754227_13403, duration(ns): 27097188
2025-07-18 10:33:12,975 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754227_13403, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-18 10:33:18,431 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754227_13403 replica FinalizedReplica, blk_1073754227_13403, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d3/dfs/dn
  getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754227 for deletion
2025-07-18 10:33:18,432 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754227_13403 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754227
2025-07-18 10:34:12,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754228_13404 src: /192.168.158.6:44594 dest: /192.168.158.4:9866
2025-07-18 10:34:12,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44594, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2058115501_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754228_13404, duration(ns): 20224880
2025-07-18 10:34:12,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754228_13404, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-18 10:34:21,432 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754228_13404 replica FinalizedReplica, blk_1073754228_13404, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d4/dfs/dn
  getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754228 for deletion
2025-07-18 10:34:21,433 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754228_13404 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754228
2025-07-18 10:36:12,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754230_13406 src: /192.168.158.8:37866 dest: /192.168.158.4:9866
2025-07-18 10:36:12,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37866, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-63884198_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754230_13406, duration(ns): 19615397
2025-07-18 10:36:12,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754230_13406, type=LAST_IN_PIPELINE terminating
2025-07-18 10:36:21,438 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754230_13406 replica FinalizedReplica, blk_1073754230_13406, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d1/dfs/dn
  getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754230 for deletion
2025-07-18 10:36:21,439 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754230_13406 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754230
2025-07-18 10:37:17,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754231_13407 src: /192.168.158.8:36556 dest: /192.168.158.4:9866
2025-07-18 10:37:17,966 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36556, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1150831746_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754231_13407, duration(ns): 16297729
2025-07-18 10:37:17,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754231_13407, type=LAST_IN_PIPELINE terminating
2025-07-18 10:37:21,441 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754231_13407 replica FinalizedReplica, blk_1073754231_13407, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d2/dfs/dn
  getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754231 for deletion
2025-07-18 10:37:21,442 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754231_13407 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754231
2025-07-18 10:39:17,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754233_13409 src: /192.168.158.9:53628 dest: /192.168.158.4:9866
2025-07-18 10:39:17,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53628, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1898020603_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754233_13409, duration(ns): 17874261
2025-07-18 10:39:17,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754233_13409, type=LAST_IN_PIPELINE terminating
2025-07-18 10:39:21,446 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754233_13409 replica FinalizedReplica, blk_1073754233_13409, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d3/dfs/dn
  getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754233 for deletion
2025-07-18 10:39:21,447 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754233_13409 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754233
2025-07-18 10:40:17,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754234_13410 src: /192.168.158.5:32776 dest: /192.168.158.4:9866
2025-07-18 10:40:17,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:32776, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1940604119_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754234_13410, duration(ns): 20514084
2025-07-18 10:40:17,981 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754234_13410, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 10:40:21,447 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754234_13410 replica FinalizedReplica, blk_1073754234_13410, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d4/dfs/dn
  getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754234 for deletion
2025-07-18 10:40:21,450 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754234_13410 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754234
2025-07-18 10:42:22,949 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754236_13412 src: /192.168.158.9:40138 dest: /192.168.158.4:9866
2025-07-18 10:42:22,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40138, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-128337358_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754236_13412, duration(ns): 19369677
2025-07-18 10:42:22,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754236_13412, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 10:42:27,455 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754236_13412 replica FinalizedReplica, blk_1073754236_13412, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d1/dfs/dn
  getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754236 for deletion
2025-07-18 10:42:27,456 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754236_13412 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754236
2025-07-18 10:44:22,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754238_13414 src: /192.168.158.1:33898 dest: /192.168.158.4:9866
2025-07-18 10:44:22,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1365374209_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754238_13414, duration(ns): 22156173
2025-07-18 10:44:22,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754238_13414, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-18 10:44:30,461 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754238_13414 replica FinalizedReplica, blk_1073754238_13414, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d2/dfs/dn
  getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754238 for deletion
2025-07-18 10:44:30,462 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754238_13414 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754238
2025-07-18 10:47:27,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754241_13417 src: /192.168.158.7:35514 dest: /192.168.158.4:9866
2025-07-18 10:47:27,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35514, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2069372177_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754241_13417, duration(ns): 17185138
2025-07-18 10:47:27,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754241_13417, type=LAST_IN_PIPELINE terminating
2025-07-18 10:47:36,463 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754241_13417 replica FinalizedReplica, blk_1073754241_13417, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d3/dfs/dn
  getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754241 for deletion
2025-07-18 10:47:36,464 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754241_13417 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754241
2025-07-18 10:48:27,964 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754242_13418 src: /192.168.158.8:46184 dest: /192.168.158.4:9866
2025-07-18 10:48:27,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46184, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1446554546_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754242_13418, duration(ns): 21309215
2025-07-18 10:48:27,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754242_13418, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-18 10:48:33,465 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754242_13418 replica FinalizedReplica, blk_1073754242_13418, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d4/dfs/dn
  getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754242 for deletion
2025-07-18 10:48:33,466 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754242_13418 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754242
2025-07-18 10:56:32,990 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754250_13426 src: /192.168.158.5:50174 dest: /192.168.158.4:9866
2025-07-18 10:56:33,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50174, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-629731796_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754250_13426, duration(ns): 14862690
2025-07-18 10:56:33,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754250_13426, type=LAST_IN_PIPELINE terminating
2025-07-18 10:56:36,490 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754250_13426 replica FinalizedReplica, blk_1073754250_13426, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d1/dfs/dn
  getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754250 for deletion
2025-07-18 10:56:36,491 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754250_13426 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754250
2025-07-18 10:57:32,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754251_13427 src: /192.168.158.1:48528 dest: /192.168.158.4:9866
2025-07-18 10:57:33,018 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48528, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_727440542_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754251_13427, duration(ns): 24370029
2025-07-18 10:57:33,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754251_13427, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-18 10:57:39,492 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754251_13427 replica FinalizedReplica, blk_1073754251_13427, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d2/dfs/dn
  getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754251 for deletion
2025-07-18 10:57:39,493 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754251_13427 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754251
2025-07-18 10:58:32,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754252_13428 src: /192.168.158.9:35028 dest: /192.168.158.4:9866
2025-07-18 10:58:33,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35028, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1929188976_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754252_13428, duration(ns): 17796954
2025-07-18 10:58:33,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754252_13428, type=LAST_IN_PIPELINE terminating
2025-07-18 10:58:39,492 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754252_13428 replica FinalizedReplica, blk_1073754252_13428, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d3/dfs/dn
  getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754252 for deletion
2025-07-18 10:58:39,493 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754252_13428 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754252
2025-07-18 11:00:37,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754254_13430 src: /192.168.158.7:41860 dest: /192.168.158.4:9866
2025-07-18 11:00:38,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41860, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_787217070_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754254_13430, duration(ns): 15828412
2025-07-18 11:00:38,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754254_13430, type=LAST_IN_PIPELINE terminating
2025-07-18 11:00:42,496 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754254_13430 replica FinalizedReplica, blk_1073754254_13430, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d4/dfs/dn
  getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754254 for deletion
2025-07-18 11:00:42,497 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754254_13430 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754254
2025-07-18 11:01:37,981 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754255_13431 src: /192.168.158.5:57842 dest: /192.168.158.4:9866
2025-07-18 11:01:38,001 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57842, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_734560869_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754255_13431, duration(ns): 18201263
2025-07-18 11:01:38,001 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754255_13431, type=LAST_IN_PIPELINE terminating
2025-07-18 11:01:42,501 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754255_13431 replica FinalizedReplica, blk_1073754255_13431, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d1/dfs/dn
  getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754255 for deletion
2025-07-18 11:01:42,502 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754255_13431 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754255
2025-07-18 11:02:37,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754256_13432 src: /192.168.158.9:51880 dest: /192.168.158.4:9866
2025-07-18 11:02:38,010 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51880, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1617090812_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754256_13432, duration(ns): 19016461
2025-07-18 11:02:38,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754256_13432, type=LAST_IN_PIPELINE terminating
2025-07-18 11:02:45,503 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754256_13432 replica FinalizedReplica, blk_1073754256_13432, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d2/dfs/dn
  getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754256 for deletion
2025-07-18 11:02:45,505 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754256_13432 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754256
2025-07-18 11:03:37,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754257_13433 src: /192.168.158.6:47682 dest: /192.168.158.4:9866
2025-07-18 11:03:38,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47682, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1345672441_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754257_13433, duration(ns): 20379464
2025-07-18 11:03:38,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754257_13433, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 11:03:42,506 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754257_13433 replica FinalizedReplica, blk_1073754257_13433, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d3/dfs/dn
  getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754257 for deletion
2025-07-18 11:03:42,507 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754257_13433 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754257
2025-07-18 11:04:37,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754258_13434 src: /192.168.158.1:60822 dest: /192.168.158.4:9866
2025-07-18 11:04:38,017 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60822, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1251579953_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754258_13434, duration(ns): 25670485
2025-07-18 11:04:38,018 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754258_13434, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-18 11:04:42,509 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754258_13434 replica FinalizedReplica, blk_1073754258_13434, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754258 for deletion 2025-07-18 11:04:42,511 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754258_13434 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754258 2025-07-18 11:09:47,993 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754263_13439 src: /192.168.158.1:40930 dest: /192.168.158.4:9866 2025-07-18 11:09:48,027 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40930, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1478723632_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754263_13439, duration(ns): 24184203 2025-07-18 11:09:48,027 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754263_13439, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-18 11:09:51,519 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754263_13439 replica FinalizedReplica, blk_1073754263_13439, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754263 for deletion 2025-07-18 11:09:51,520 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754263_13439 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754263 2025-07-18 11:10:47,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754264_13440 src: /192.168.158.1:37252 dest: /192.168.158.4:9866 2025-07-18 11:10:48,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37252, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_708239915_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754264_13440, duration(ns): 25517710 2025-07-18 11:10:48,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754264_13440, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-18 11:10:51,521 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754264_13440 replica FinalizedReplica, blk_1073754264_13440, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754264 for deletion 2025-07-18 11:10:51,522 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754264_13440 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754264 2025-07-18 11:11:47,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754265_13441 src: /192.168.158.8:60746 dest: /192.168.158.4:9866 2025-07-18 11:11:48,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.8:60746, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-185158771_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754265_13441, duration(ns): 22637601 2025-07-18 11:11:48,026 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754265_13441, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-18 11:11:54,524 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754265_13441 replica FinalizedReplica, blk_1073754265_13441, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754265 for deletion 2025-07-18 11:11:54,525 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754265_13441 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754265 2025-07-18 11:12:47,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754266_13442 src: /192.168.158.7:41482 dest: /192.168.158.4:9866 2025-07-18 11:12:48,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41482, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-475717980_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754266_13442, duration(ns): 20362885 2025-07-18 11:12:48,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754266_13442, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-18 11:12:51,527 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754266_13442 replica FinalizedReplica, blk_1073754266_13442, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754266 for deletion 2025-07-18 11:12:51,528 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754266_13442 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754266 2025-07-18 11:13:53,001 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754267_13443 src: /192.168.158.5:54296 dest: /192.168.158.4:9866 2025-07-18 11:13:53,026 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54296, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1269139894_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754267_13443, duration(ns): 19752174 2025-07-18 11:13:53,026 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754267_13443, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-18 11:14:00,529 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754267_13443 replica FinalizedReplica, blk_1073754267_13443, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754267 for deletion 2025-07-18 11:14:00,530 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754267_13443 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754267 2025-07-18 11:14:53,009 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754268_13444 src: /192.168.158.5:60806 dest: /192.168.158.4:9866 2025-07-18 11:14:53,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60806, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1549844265_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754268_13444, duration(ns): 16798439 2025-07-18 11:14:53,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754268_13444, type=LAST_IN_PIPELINE terminating 2025-07-18 11:14:57,530 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754268_13444 replica FinalizedReplica, blk_1073754268_13444, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754268 for deletion 2025-07-18 11:14:57,531 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754268_13444 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754268 2025-07-18 11:16:58,007 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754270_13446 src: /192.168.158.8:53664 dest: /192.168.158.4:9866 2025-07-18 11:16:58,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53664, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_593356379_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754270_13446, duration(ns): 30766914 2025-07-18 11:16:58,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754270_13446, type=LAST_IN_PIPELINE terminating 2025-07-18 11:17:03,532 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754270_13446 replica FinalizedReplica, blk_1073754270_13446, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754270 for deletion 2025-07-18 11:17:03,534 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754270_13446 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754270 2025-07-18 11:19:58,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754273_13449 src: /192.168.158.1:33466 dest: /192.168.158.4:9866 2025-07-18 11:19:58,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33466, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2085409547_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073754273_13449, duration(ns): 26309429 2025-07-18 11:19:58,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754273_13449, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-18 11:20:03,537 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754273_13449 replica FinalizedReplica, blk_1073754273_13449, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754273 for deletion 2025-07-18 11:20:03,538 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754273_13449 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754273 2025-07-18 11:20:58,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754274_13450 src: /192.168.158.8:54700 dest: /192.168.158.4:9866 2025-07-18 11:20:58,032 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54700, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-151162894_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754274_13450, duration(ns): 20149346 2025-07-18 11:20:58,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754274_13450, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-18 11:21:03,538 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754274_13450 replica FinalizedReplica, blk_1073754274_13450, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754274 for deletion 2025-07-18 11:21:03,540 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754274_13450 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754274 2025-07-18 11:25:03,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754278_13454 src: /192.168.158.7:45896 dest: /192.168.158.4:9866 2025-07-18 11:25:03,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45896, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_328560122_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754278_13454, duration(ns): 20189210 2025-07-18 11:25:03,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754278_13454, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-18 11:25:09,546 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754278_13454 replica FinalizedReplica, blk_1073754278_13454, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754278 for deletion 2025-07-18 11:25:09,547 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754278_13454 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754278 2025-07-18 11:26:03,010 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754279_13455 src: /192.168.158.1:46492 dest: /192.168.158.4:9866 2025-07-18 11:26:03,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46492, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_939570028_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754279_13455, duration(ns): 22656934 2025-07-18 11:26:03,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754279_13455, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-18 11:26:06,546 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754279_13455 replica FinalizedReplica, blk_1073754279_13455, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754279 for deletion 2025-07-18 11:26:06,547 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754279_13455 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754279 2025-07-18 11:27:03,014 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073754280_13456 src: /192.168.158.7:56304 dest: /192.168.158.4:9866 2025-07-18 11:27:03,040 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56304, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1463405436_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754280_13456, duration(ns): 20775818 2025-07-18 11:27:03,040 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754280_13456, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-18 11:27:06,551 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754280_13456 replica FinalizedReplica, blk_1073754280_13456, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754280 for deletion 2025-07-18 11:27:06,552 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754280_13456 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754280 2025-07-18 11:29:03,021 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754282_13458 src: /192.168.158.9:36714 dest: /192.168.158.4:9866 2025-07-18 11:29:03,039 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36714, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-632375262_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073754282_13458, duration(ns): 16002241 2025-07-18 11:29:03,039 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754282_13458, type=LAST_IN_PIPELINE terminating 2025-07-18 11:29:06,552 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754282_13458 replica FinalizedReplica, blk_1073754282_13458, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754282 for deletion 2025-07-18 11:29:06,553 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754282_13458 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754282 2025-07-18 11:32:03,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754285_13461 src: /192.168.158.8:45836 dest: /192.168.158.4:9866 2025-07-18 11:32:03,043 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45836, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_535487271_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754285_13461, duration(ns): 16398932 2025-07-18 11:32:03,043 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754285_13461, type=LAST_IN_PIPELINE terminating 2025-07-18 11:32:06,556 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754285_13461 replica FinalizedReplica, blk_1073754285_13461, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754285 for deletion 2025-07-18 11:32:06,557 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754285_13461 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754285 2025-07-18 11:33:03,032 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754286_13462 src: /192.168.158.7:35126 dest: /192.168.158.4:9866 2025-07-18 11:33:03,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35126, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_866017627_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754286_13462, duration(ns): 16004922 2025-07-18 11:33:03,051 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754286_13462, type=LAST_IN_PIPELINE terminating 2025-07-18 11:33:06,557 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754286_13462 replica FinalizedReplica, blk_1073754286_13462, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754286 for deletion 2025-07-18 11:33:06,559 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754286_13462 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754286 2025-07-18 11:34:08,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754287_13463 src: /192.168.158.1:46210 dest: /192.168.158.4:9866 2025-07-18 11:34:08,053 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46210, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1141475278_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754287_13463, duration(ns): 23071832 2025-07-18 11:34:08,053 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754287_13463, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-18 11:34:15,560 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754287_13463 replica FinalizedReplica, blk_1073754287_13463, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754287 for deletion 2025-07-18 11:34:15,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754287_13463 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754287 2025-07-18 11:35:08,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754288_13464 src: /192.168.158.5:52700 dest: /192.168.158.4:9866 2025-07-18 11:35:08,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: 
src: /192.168.158.5:52700, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_885452267_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754288_13464, duration(ns): 21520506
2025-07-18 11:35:08,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754288_13464, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 11:35:15,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754288_13464 replica FinalizedReplica, blk_1073754288_13464, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754288 for deletion
2025-07-18 11:35:15,564 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754288_13464 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754288
2025-07-18 11:36:08,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754289_13465 src: /192.168.158.5:39278 dest: /192.168.158.4:9866
2025-07-18 11:36:08,058 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39278, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1934525100_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754289_13465, duration(ns): 21919158
2025-07-18 11:36:08,059 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754289_13465, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 11:36:12,564 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754289_13465 replica FinalizedReplica, blk_1073754289_13465, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754289 for deletion
2025-07-18 11:36:12,566 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754289_13465 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754289
2025-07-18 11:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-18 11:41:13,037 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754294_13470 src: /192.168.158.1:35746 dest: /192.168.158.4:9866
2025-07-18 11:41:13,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35746, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-471140883_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754294_13470, duration(ns): 24378766
2025-07-18 11:41:13,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754294_13470, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-18 11:41:15,576 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754294_13470 replica FinalizedReplica, blk_1073754294_13470, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754294 for deletion
2025-07-18 11:41:15,578 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754294_13470 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754294
2025-07-18 11:44:18,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754297_13473 src: /192.168.158.8:46350 dest: /192.168.158.4:9866
2025-07-18 11:44:18,065 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46350, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1451765604_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754297_13473, duration(ns): 16468538
2025-07-18 11:44:18,065 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754297_13473, type=LAST_IN_PIPELINE terminating
2025-07-18 11:44:21,581 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754297_13473 replica FinalizedReplica, blk_1073754297_13473, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754297 for deletion
2025-07-18 11:44:21,582 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754297_13473 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754297
2025-07-18 11:46:18,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754299_13475 src: /192.168.158.6:56534 dest: /192.168.158.4:9866
2025-07-18 11:46:18,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56534, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_307928142_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754299_13475, duration(ns): 19945571
2025-07-18 11:46:18,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754299_13475, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-18 11:46:21,585 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754299_13475 replica FinalizedReplica, blk_1073754299_13475, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754299 for deletion
2025-07-18 11:46:21,586 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754299_13475 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754299
2025-07-18 11:48:18,055 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754301_13477 src: /192.168.158.9:36368 dest: /192.168.158.4:9866
2025-07-18 11:48:18,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36368, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1749956710_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754301_13477, duration(ns): 18005456
2025-07-18 11:48:18,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754301_13477, type=LAST_IN_PIPELINE terminating
2025-07-18 11:48:21,587 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754301_13477 replica FinalizedReplica, blk_1073754301_13477, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754301 for deletion
2025-07-18 11:48:21,588 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754301_13477 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754301
2025-07-18 11:52:33,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754305_13481 src: /192.168.158.1:42672 dest: /192.168.158.4:9866
2025-07-18 11:52:33,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1920035259_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754305_13481, duration(ns): 22889399
2025-07-18 11:52:33,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754305_13481, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-18 11:52:36,598 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754305_13481 replica FinalizedReplica, blk_1073754305_13481, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754305 for deletion
2025-07-18 11:52:36,599 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754305_13481 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754305
2025-07-18 11:53:33,059 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754306_13482 src: /192.168.158.1:40176 dest: /192.168.158.4:9866
2025-07-18 11:53:33,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40176, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-645352131_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754306_13482, duration(ns): 24349442
2025-07-18 11:53:33,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754306_13482, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-18 11:53:36,601 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754306_13482 replica FinalizedReplica, blk_1073754306_13482, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754306 for deletion
2025-07-18 11:53:36,602 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754306_13482 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754306
2025-07-18 11:54:33,056 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754307_13483 src: /192.168.158.1:37708 dest: /192.168.158.4:9866
2025-07-18 11:54:33,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37708, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_286153927_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754307_13483, duration(ns): 23673425
2025-07-18 11:54:33,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754307_13483, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-18 11:54:39,603 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754307_13483 replica FinalizedReplica, blk_1073754307_13483, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754307 for deletion
2025-07-18 11:54:39,604 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754307_13483 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754307
2025-07-18 11:55:33,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754308_13484 src: /192.168.158.7:35590 dest: /192.168.158.4:9866
2025-07-18 11:55:33,100 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35590, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1982500838_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754308_13484, duration(ns): 19783271
2025-07-18 11:55:33,100 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754308_13484, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 11:55:39,604 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754308_13484 replica FinalizedReplica, blk_1073754308_13484, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754308 for deletion
2025-07-18 11:55:39,605 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754308_13484 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754308
2025-07-18 11:57:33,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754310_13486 src: /192.168.158.1:45152 dest: /192.168.158.4:9866
2025-07-18 11:57:33,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45152, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1441123797_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754310_13486, duration(ns): 28008894
2025-07-18 11:57:33,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754310_13486, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-18 11:57:36,608 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754310_13486 replica FinalizedReplica, blk_1073754310_13486, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754310 for deletion
2025-07-18 11:57:36,609 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754310_13486 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754310
2025-07-18 11:59:38,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754312_13488 src: /192.168.158.1:35608 dest: /192.168.158.4:9866
2025-07-18 11:59:38,101 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35608, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_268092152_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754312_13488, duration(ns): 24797653
2025-07-18 11:59:38,101 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754312_13488, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-18 11:59:45,609 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754312_13488 replica FinalizedReplica, blk_1073754312_13488, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754312 for deletion
2025-07-18 11:59:45,610 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754312_13488 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754312
2025-07-18 12:03:38,074 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754316_13492 src: /192.168.158.1:50940 dest: /192.168.158.4:9866
2025-07-18 12:03:38,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50940, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1908597333_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754316_13492, duration(ns): 23619285
2025-07-18 12:03:38,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754316_13492, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-18 12:03:42,614 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754316_13492 replica FinalizedReplica, blk_1073754316_13492, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754316 for deletion
2025-07-18 12:03:42,615 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754316_13492 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754316
2025-07-18 12:04:38,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754317_13493 src: /192.168.158.9:41102 dest: /192.168.158.4:9866
2025-07-18 12:04:38,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41102, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1433881279_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754317_13493, duration(ns): 20328260
2025-07-18 12:04:38,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754317_13493, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 12:04:45,616 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754317_13493 replica FinalizedReplica, blk_1073754317_13493, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754317 for deletion
2025-07-18 12:04:45,617 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754317_13493 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754317
2025-07-18 12:10:43,082 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754323_13499 src: /192.168.158.1:35718 dest: /192.168.158.4:9866
2025-07-18 12:10:43,114 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35718, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1626647327_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754323_13499, duration(ns): 23300108
2025-07-18 12:10:43,114 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754323_13499, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-18 12:10:45,630 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754323_13499 replica FinalizedReplica, blk_1073754323_13499, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754323 for deletion
2025-07-18 12:10:45,631 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754323_13499 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754323
2025-07-18 12:11:43,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754324_13500 src: /192.168.158.6:38464 dest: /192.168.158.4:9866
2025-07-18 12:11:43,150 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38464, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1477010141_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754324_13500, duration(ns): 17333095
2025-07-18 12:11:43,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754324_13500, type=LAST_IN_PIPELINE terminating
2025-07-18 12:11:48,631 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754324_13500 replica FinalizedReplica, blk_1073754324_13500, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754324 for deletion
2025-07-18 12:11:48,632 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754324_13500 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754324
2025-07-18 12:12:43,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754325_13501 src: /192.168.158.1:36410 dest: /192.168.158.4:9866
2025-07-18 12:12:43,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36410, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1057581782_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754325_13501, duration(ns): 24904352
2025-07-18 12:12:43,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754325_13501, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-18 12:12:48,634 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754325_13501 replica FinalizedReplica, blk_1073754325_13501, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754325 for deletion
2025-07-18 12:12:48,635 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754325_13501 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754325
2025-07-18 12:13:43,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754326_13502 src: /192.168.158.9:35432 dest: /192.168.158.4:9866
2025-07-18 12:13:43,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35432, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1939011915_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754326_13502, duration(ns): 20936824
2025-07-18 12:13:43,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754326_13502, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-18 12:13:45,635 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754326_13502 replica FinalizedReplica, blk_1073754326_13502, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754326 for deletion
2025-07-18 12:13:45,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754326_13502 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754326
2025-07-18 12:14:43,096 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754327_13503 src: /192.168.158.7:56376 dest: /192.168.158.4:9866
2025-07-18 12:14:43,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56376, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-80542589_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754327_13503, duration(ns): 16240247
2025-07-18 12:14:43,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754327_13503, type=LAST_IN_PIPELINE terminating
2025-07-18 12:14:45,635 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754327_13503 replica FinalizedReplica, blk_1073754327_13503, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754327 for deletion
2025-07-18 12:14:45,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754327_13503 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754327
2025-07-18 12:16:43,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754329_13505 src: /192.168.158.7:60490 dest: /192.168.158.4:9866
2025-07-18 12:16:43,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60490, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-773551108_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754329_13505, duration(ns): 18223360
2025-07-18 12:16:43,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754329_13505, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 12:16:48,637 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754329_13505 replica FinalizedReplica, blk_1073754329_13505, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754329 for deletion
2025-07-18 12:16:48,639 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754329_13505 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754329
2025-07-18 12:17:43,101 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754330_13506 src: /192.168.158.5:36962 dest: /192.168.158.4:9866
2025-07-18 12:17:43,128 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36962, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-498368485_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754330_13506, duration(ns): 21530902
2025-07-18 12:17:43,128 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754330_13506, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 12:17:48,638 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754330_13506 replica FinalizedReplica, blk_1073754330_13506, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754330 for deletion
2025-07-18 12:17:48,639 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754330_13506 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754330
2025-07-18 12:19:43,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754332_13508 src: /192.168.158.7:39792 dest: /192.168.158.4:9866
2025-07-18 12:19:43,113 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39792, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1401476539_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754332_13508, duration(ns): 19215888
2025-07-18 12:19:43,113 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754332_13508, type=LAST_IN_PIPELINE terminating
2025-07-18 12:19:45,642 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754332_13508 replica FinalizedReplica, blk_1073754332_13508, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754332 for deletion
2025-07-18 12:19:45,643 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754332_13508 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754332
2025-07-18 12:22:43,102 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754335_13511 src: /192.168.158.1:53802 dest: /192.168.158.4:9866
2025-07-18 12:22:43,136 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53802, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_193920269_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754335_13511, duration(ns): 25255518
2025-07-18 12:22:43,136 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754335_13511, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-18 12:22:45,645 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754335_13511 replica FinalizedReplica, blk_1073754335_13511, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754335 for deletion
2025-07-18 12:22:45,646 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754335_13511 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754335
2025-07-18 12:23:43,135 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754336_13512 src: /192.168.158.6:60354 dest: /192.168.158.4:9866
2025-07-18 12:23:43,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60354, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_842283269_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754336_13512, duration(ns): 19454816
2025-07-18 12:23:43,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754336_13512, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 12:23:45,648 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754336_13512 replica FinalizedReplica, blk_1073754336_13512, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754336 for deletion
2025-07-18 12:23:45,649 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754336_13512 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754336
2025-07-18 12:24:43,112 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754337_13513 src: /192.168.158.8:35844 dest: /192.168.158.4:9866
2025-07-18 12:24:43,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35844, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_456391061_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754337_13513, duration(ns): 19262109
2025-07-18 12:24:43,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754337_13513, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-18 12:24:45,649 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754337_13513 replica FinalizedReplica, blk_1073754337_13513, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754337 for deletion
2025-07-18 12:24:45,650 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754337_13513 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754337
2025-07-18 12:25:43,113 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754338_13514 src: /192.168.158.6:35154 dest: /192.168.158.4:9866
2025-07-18 12:25:43,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35154, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_73980929_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754338_13514, duration(ns): 17514818
2025-07-18 12:25:43,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754338_13514, type=LAST_IN_PIPELINE terminating
2025-07-18 12:25:48,651 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754338_13514 replica FinalizedReplica, blk_1073754338_13514, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754338 for deletion
2025-07-18 12:25:48,652 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754338_13514 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754338
2025-07-18 12:28:43,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754341_13517 src: /192.168.158.6:34102 dest: /192.168.158.4:9866
2025-07-18 12:28:43,143 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34102, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1630148464_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754341_13517, duration(ns): 21854626
2025-07-18 12:28:43,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754341_13517, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-18 12:28:45,659 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754341_13517 replica FinalizedReplica, blk_1073754341_13517, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754341 for deletion 2025-07-18 12:28:45,660 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754341_13517 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754341 2025-07-18 12:29:43,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754342_13518 src: /192.168.158.1:37926 dest: /192.168.158.4:9866 2025-07-18 12:29:43,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37926, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2101815452_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754342_13518, duration(ns): 23751740 2025-07-18 12:29:43,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754342_13518, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-18 12:29:45,662 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754342_13518 replica FinalizedReplica, blk_1073754342_13518, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754342 for deletion 2025-07-18 12:29:45,663 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754342_13518 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754342 2025-07-18 12:31:43,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754344_13520 src: /192.168.158.5:53118 dest: /192.168.158.4:9866 2025-07-18 12:31:43,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53118, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_433429036_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754344_13520, duration(ns): 18563070 2025-07-18 12:31:43,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754344_13520, type=LAST_IN_PIPELINE terminating 2025-07-18 12:31:48,667 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754344_13520 replica FinalizedReplica, blk_1073754344_13520, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754344 for deletion 2025-07-18 12:31:48,668 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754344_13520 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754344 2025-07-18 12:33:48,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754346_13522 src: /192.168.158.9:57146 dest: /192.168.158.4:9866 2025-07-18 12:33:48,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57146, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1270082746_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754346_13522, duration(ns): 15240921 2025-07-18 12:33:48,162 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754346_13522, type=LAST_IN_PIPELINE terminating 2025-07-18 12:33:51,667 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754346_13522 replica FinalizedReplica, blk_1073754346_13522, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754346 for deletion 2025-07-18 12:33:51,669 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754346_13522 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754346 2025-07-18 12:35:48,135 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754348_13524 src: /192.168.158.5:58404 dest: /192.168.158.4:9866 2025-07-18 12:35:48,165 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58404, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-177837490_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754348_13524, duration(ns): 23008301 2025-07-18 12:35:48,165 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754348_13524, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-18 12:35:51,669 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754348_13524 replica FinalizedReplica, blk_1073754348_13524, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754348 for deletion 2025-07-18 12:35:51,670 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754348_13524 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754348 2025-07-18 12:36:53,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754349_13525 src: /192.168.158.8:38988 dest: /192.168.158.4:9866 2025-07-18 12:36:53,157 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38988, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1925936928_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754349_13525, duration(ns): 18033886 2025-07-18 12:36:53,157 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754349_13525, type=LAST_IN_PIPELINE terminating 2025-07-18 12:36:57,673 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754349_13525 replica FinalizedReplica, blk_1073754349_13525, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754349 for deletion 2025-07-18 12:36:57,674 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754349_13525 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754349 2025-07-18 12:39:53,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754352_13528 src: /192.168.158.7:38896 dest: /192.168.158.4:9866 2025-07-18 12:39:53,168 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38896, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1299677731_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754352_13528, duration(ns): 21013159 2025-07-18 12:39:53,169 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754352_13528, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-18 12:39:57,680 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754352_13528 replica FinalizedReplica, blk_1073754352_13528, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754352 for deletion 2025-07-18 12:39:57,682 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754352_13528 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754352 2025-07-18 12:40:58,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754353_13529 src: 
/192.168.158.1:36896 dest: /192.168.158.4:9866 2025-07-18 12:40:58,173 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36896, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-166816976_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754353_13529, duration(ns): 24687376 2025-07-18 12:40:58,173 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754353_13529, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-18 12:41:00,683 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754353_13529 replica FinalizedReplica, blk_1073754353_13529, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754353 for deletion 2025-07-18 12:41:00,684 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754353_13529 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754353 2025-07-18 12:43:58,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754356_13532 src: /192.168.158.1:59724 dest: /192.168.158.4:9866 2025-07-18 12:43:58,176 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59724, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1461150741_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754356_13532, duration(ns): 22899634 
2025-07-18 12:43:58,176 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754356_13532, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-18 12:44:00,689 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754356_13532 replica FinalizedReplica, blk_1073754356_13532, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754356 for deletion 2025-07-18 12:44:00,690 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754356_13532 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754356 2025-07-18 12:45:58,165 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754358_13534 src: /192.168.158.7:38408 dest: /192.168.158.4:9866 2025-07-18 12:45:58,185 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38408, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-947068600_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754358_13534, duration(ns): 17136165 2025-07-18 12:45:58,185 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754358_13534, type=LAST_IN_PIPELINE terminating 2025-07-18 12:46:03,692 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754358_13534 replica FinalizedReplica, blk_1073754358_13534, FINALIZED getNumBytes() = 56 
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754358 for deletion 2025-07-18 12:46:03,693 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754358_13534 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754358 2025-07-18 12:48:03,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754360_13536 src: /192.168.158.1:34368 dest: /192.168.158.4:9866 2025-07-18 12:48:03,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34368, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-838192135_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754360_13536, duration(ns): 24819613 2025-07-18 12:48:03,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754360_13536, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-18 12:48:06,695 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754360_13536 replica FinalizedReplica, blk_1073754360_13536, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754360 for deletion 2025-07-18 12:48:06,696 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754360_13536 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754360 2025-07-18 12:50:03,150 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754362_13538 src: /192.168.158.1:55746 dest: /192.168.158.4:9866 2025-07-18 12:50:03,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55746, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1742828498_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754362_13538, duration(ns): 24818117 2025-07-18 12:50:03,185 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754362_13538, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-18 12:50:06,700 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754362_13538 replica FinalizedReplica, blk_1073754362_13538, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754362 for deletion 2025-07-18 12:50:06,701 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754362_13538 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754362 2025-07-18 12:51:03,170 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754363_13539 src: /192.168.158.9:47266 dest: /192.168.158.4:9866 2025-07-18 12:51:03,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.9:47266, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1897444728_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754363_13539, duration(ns): 17274278 2025-07-18 12:51:03,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754363_13539, type=LAST_IN_PIPELINE terminating 2025-07-18 12:51:06,704 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754363_13539 replica FinalizedReplica, blk_1073754363_13539, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754363 for deletion 2025-07-18 12:51:06,705 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754363_13539 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754363 2025-07-18 12:55:03,158 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754367_13543 src: /192.168.158.8:50240 dest: /192.168.158.4:9866 2025-07-18 12:55:03,176 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50240, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1277587781_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754367_13543, duration(ns): 16442719 2025-07-18 12:55:03,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754367_13543, type=LAST_IN_PIPELINE terminating 2025-07-18 
12:55:06,712 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754367_13543 replica FinalizedReplica, blk_1073754367_13543, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754367 for deletion 2025-07-18 12:55:06,713 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754367_13543 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir16/blk_1073754367 2025-07-18 12:56:03,163 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754368_13544 src: /192.168.158.7:58314 dest: /192.168.158.4:9866 2025-07-18 12:56:03,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58314, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-529144659_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754368_13544, duration(ns): 16379267 2025-07-18 12:56:03,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754368_13544, type=LAST_IN_PIPELINE terminating 2025-07-18 12:56:06,714 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754368_13544 replica FinalizedReplica, blk_1073754368_13544, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754368 for deletion 2025-07-18 12:56:06,716 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754368_13544 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754368 2025-07-18 12:57:08,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754369_13545 src: /192.168.158.7:54304 dest: /192.168.158.4:9866 2025-07-18 12:57:08,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54304, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-374364860_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754369_13545, duration(ns): 22146612 2025-07-18 12:57:08,199 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754369_13545, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-18 12:57:12,716 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754369_13545 replica FinalizedReplica, blk_1073754369_13545, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754369 for deletion 2025-07-18 12:57:12,717 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754369_13545 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754369 2025-07-18 12:58:08,169 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754370_13546 src: 
/192.168.158.7:37780 dest: /192.168.158.4:9866 2025-07-18 12:58:08,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_642531672_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754370_13546, duration(ns): 18445138 2025-07-18 12:58:08,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754370_13546, type=LAST_IN_PIPELINE terminating 2025-07-18 12:58:12,720 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754370_13546 replica FinalizedReplica, blk_1073754370_13546, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754370 for deletion 2025-07-18 12:58:12,721 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754370_13546 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754370 2025-07-18 12:59:08,166 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754371_13547 src: /192.168.158.5:45876 dest: /192.168.158.4:9866 2025-07-18 12:59:08,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45876, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2071806123_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754371_13547, duration(ns): 21240410 2025-07-18 12:59:08,193 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754371_13547, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 12:59:12,721 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754371_13547 replica FinalizedReplica, blk_1073754371_13547, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754371 for deletion
2025-07-18 12:59:12,723 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754371_13547 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754371
2025-07-18 13:01:08,176 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754373_13549 src: /192.168.158.5:49898 dest: /192.168.158.4:9866
2025-07-18 13:01:08,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1254039676_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754373_13549, duration(ns): 18099361
2025-07-18 13:01:08,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754373_13549, type=LAST_IN_PIPELINE terminating
2025-07-18 13:01:15,724 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754373_13549 replica FinalizedReplica, blk_1073754373_13549, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754373 for deletion
2025-07-18 13:01:15,726 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754373_13549 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754373
2025-07-18 13:02:13,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754374_13550 src: /192.168.158.7:50200 dest: /192.168.158.4:9866
2025-07-18 13:02:13,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50200, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1845683655_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754374_13550, duration(ns): 17255987
2025-07-18 13:02:13,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754374_13550, type=LAST_IN_PIPELINE terminating
2025-07-18 13:02:15,727 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754374_13550 replica FinalizedReplica, blk_1073754374_13550, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754374 for deletion
2025-07-18 13:02:15,728 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754374_13550 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754374
2025-07-18 13:04:13,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754376_13552 src: /192.168.158.1:42016 dest: /192.168.158.4:9866
2025-07-18 13:04:13,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42016, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1347397285_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754376_13552, duration(ns): 26678698
2025-07-18 13:04:13,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754376_13552, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-18 13:04:15,729 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754376_13552 replica FinalizedReplica, blk_1073754376_13552, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754376 for deletion
2025-07-18 13:04:15,730 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754376_13552 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754376
2025-07-18 13:05:13,179 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754377_13553 src: /192.168.158.8:33570 dest: /192.168.158.4:9866
2025-07-18 13:05:13,204 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33570, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1235255016_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754377_13553, duration(ns): 19903668
2025-07-18 13:05:13,205 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754377_13553, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 13:05:15,731 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754377_13553 replica FinalizedReplica, blk_1073754377_13553, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754377 for deletion
2025-07-18 13:05:15,732 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754377_13553 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754377
2025-07-18 13:08:13,179 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754380_13556 src: /192.168.158.1:35798 dest: /192.168.158.4:9866
2025-07-18 13:08:13,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35798, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1909529282_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754380_13556, duration(ns): 23369300
2025-07-18 13:08:13,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754380_13556, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-18 13:08:15,738 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754380_13556 replica FinalizedReplica, blk_1073754380_13556, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754380 for deletion
2025-07-18 13:08:15,739 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754380_13556 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754380
2025-07-18 13:09:13,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754381_13557 src: /192.168.158.1:54992 dest: /192.168.158.4:9866
2025-07-18 13:09:13,216 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54992, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1595013200_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754381_13557, duration(ns): 26202411
2025-07-18 13:09:13,216 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754381_13557, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-18 13:09:15,740 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754381_13557 replica FinalizedReplica, blk_1073754381_13557, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754381 for deletion
2025-07-18 13:09:15,742 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754381_13557 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754381
2025-07-18 13:15:13,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754387_13563 src: /192.168.158.1:38042 dest: /192.168.158.4:9866
2025-07-18 13:15:13,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38042, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1699241201_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754387_13563, duration(ns): 27997264
2025-07-18 13:15:13,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754387_13563, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-18 13:15:15,753 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754387_13563 replica FinalizedReplica, blk_1073754387_13563, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754387 for deletion
2025-07-18 13:15:15,754 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754387_13563 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754387
2025-07-18 13:16:13,202 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754388_13564 src: /192.168.158.6:52078 dest: /192.168.158.4:9866
2025-07-18 13:16:13,222 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52078, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2060258752_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754388_13564, duration(ns): 18771894
2025-07-18 13:16:13,223 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754388_13564, type=LAST_IN_PIPELINE terminating
2025-07-18 13:16:15,754 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754388_13564 replica FinalizedReplica, blk_1073754388_13564, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754388 for deletion
2025-07-18 13:16:15,756 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754388_13564 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754388
2025-07-18 13:21:23,225 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754393_13569 src: /192.168.158.1:60118 dest: /192.168.158.4:9866
2025-07-18 13:21:23,261 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60118, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_471897861_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754393_13569, duration(ns): 26694089
2025-07-18 13:21:23,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754393_13569, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-18 13:21:27,761 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754393_13569 replica FinalizedReplica, blk_1073754393_13569, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754393 for deletion
2025-07-18 13:21:27,762 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754393_13569 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754393
2025-07-18 13:24:23,204 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754396_13572 src: /192.168.158.1:44294 dest: /192.168.158.4:9866
2025-07-18 13:24:23,241 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44294, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2076098738_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754396_13572, duration(ns): 27484723
2025-07-18 13:24:23,242 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754396_13572, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-18 13:24:27,767 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754396_13572 replica FinalizedReplica, blk_1073754396_13572, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754396 for deletion
2025-07-18 13:24:27,768 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754396_13572 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754396
2025-07-18 13:25:23,208 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754397_13573 src: /192.168.158.8:57606 dest: /192.168.158.4:9866
2025-07-18 13:25:23,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57606, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-212954765_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754397_13573, duration(ns): 20131614
2025-07-18 13:25:23,235 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754397_13573, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 13:25:27,768 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754397_13573 replica FinalizedReplica, blk_1073754397_13573, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754397 for deletion
2025-07-18 13:25:27,772 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754397_13573 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754397
2025-07-18 13:26:28,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754398_13574 src: /192.168.158.6:36816 dest: /192.168.158.4:9866
2025-07-18 13:26:28,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36816, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1215698005_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754398_13574, duration(ns): 16711150
2025-07-18 13:26:28,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754398_13574, type=LAST_IN_PIPELINE terminating
2025-07-18 13:26:30,771 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754398_13574 replica FinalizedReplica, blk_1073754398_13574, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754398 for deletion
2025-07-18 13:26:30,772 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754398_13574 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754398
2025-07-18 13:27:28,215 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754399_13575 src: /192.168.158.5:59406 dest: /192.168.158.4:9866
2025-07-18 13:27:28,232 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59406, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1337974645_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754399_13575, duration(ns): 15578240
2025-07-18 13:27:28,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754399_13575, type=LAST_IN_PIPELINE terminating
2025-07-18 13:27:30,775 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754399_13575 replica FinalizedReplica, blk_1073754399_13575, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754399 for deletion
2025-07-18 13:27:30,776 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754399_13575 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754399
2025-07-18 13:28:33,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754400_13576 src: /192.168.158.5:46714 dest: /192.168.158.4:9866
2025-07-18 13:28:33,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46714, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1985972258_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754400_13576, duration(ns): 20182334
2025-07-18 13:28:33,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754400_13576, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 13:28:36,778 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754400_13576 replica FinalizedReplica, blk_1073754400_13576, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754400 for deletion
2025-07-18 13:28:36,779 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754400_13576 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754400
2025-07-18 13:30:38,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754402_13578 src: /192.168.158.1:53588 dest: /192.168.158.4:9866
2025-07-18 13:30:38,261 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53588, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_302277769_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754402_13578, duration(ns): 22606638
2025-07-18 13:30:38,261 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754402_13578, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-18 13:30:42,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754402_13578 replica FinalizedReplica, blk_1073754402_13578, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754402 for deletion
2025-07-18 13:30:42,784 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754402_13578 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754402
2025-07-18 13:31:38,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754403_13579 src: /192.168.158.9:40996 dest: /192.168.158.4:9866
2025-07-18 13:31:38,261 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40996, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_744495463_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754403_13579, duration(ns): 19453896
2025-07-18 13:31:38,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754403_13579, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 13:31:45,787 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754403_13579 replica FinalizedReplica, blk_1073754403_13579, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754403 for deletion
2025-07-18 13:31:45,788 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754403_13579 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754403
2025-07-18 13:32:38,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754404_13580 src: /192.168.158.7:43860 dest: /192.168.158.4:9866
2025-07-18 13:32:38,268 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43860, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1426040387_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754404_13580, duration(ns): 18399298
2025-07-18 13:32:38,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754404_13580, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 13:32:45,790 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754404_13580 replica FinalizedReplica, blk_1073754404_13580, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754404 for deletion
2025-07-18 13:32:45,791 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754404_13580 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754404
2025-07-18 13:36:48,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754408_13584 src: /192.168.158.7:47958 dest: /192.168.158.4:9866
2025-07-18 13:36:48,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47958, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-622247607_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754408_13584, duration(ns): 17426349
2025-07-18 13:36:48,246 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754408_13584, type=LAST_IN_PIPELINE terminating
2025-07-18 13:36:54,816 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754408_13584 replica FinalizedReplica, blk_1073754408_13584, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754408 for deletion
2025-07-18 13:36:54,817 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754408_13584 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754408
2025-07-18 13:39:48,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754411_13587 src: /192.168.158.9:51458 dest: /192.168.158.4:9866
2025-07-18 13:39:48,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51458, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1168955933_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754411_13587, duration(ns): 19312782
2025-07-18 13:39:48,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754411_13587, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-18 13:39:51,803 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754411_13587 replica FinalizedReplica, blk_1073754411_13587, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754411 for deletion
2025-07-18 13:39:51,804 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754411_13587 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754411
2025-07-18 13:42:53,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754414_13590 src: /192.168.158.9:37756 dest: /192.168.158.4:9866
2025-07-18 13:42:53,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37756, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_520196808_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754414_13590, duration(ns): 16513767
2025-07-18 13:42:53,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754414_13590, type=LAST_IN_PIPELINE terminating
2025-07-18 13:43:00,809 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754414_13590 replica FinalizedReplica, blk_1073754414_13590, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754414 for deletion
2025-07-18 13:43:00,810 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754414_13590 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754414
2025-07-18 13:44:53,232 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754416_13592 src: /192.168.158.5:49466 dest: /192.168.158.4:9866
2025-07-18 13:44:53,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49466, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-961172017_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754416_13592, duration(ns): 19867585
2025-07-18 13:44:53,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754416_13592, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 13:44:57,812 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754416_13592 replica FinalizedReplica, blk_1073754416_13592, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754416 for deletion
2025-07-18 13:44:57,814 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754416_13592 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754416
2025-07-18 13:46:53,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754418_13594 src: /192.168.158.8:38396 dest: /192.168.158.4:9866
2025-07-18 13:46:53,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38396, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_459785379_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754418_13594, duration(ns): 19003219
2025-07-18 13:46:53,259 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754418_13594, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 13:46:57,818 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754418_13594 replica FinalizedReplica, blk_1073754418_13594, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754418 for deletion
2025-07-18 13:46:57,819 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754418_13594 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754418
2025-07-18 13:49:58,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754421_13597 src: /192.168.158.1:43116 dest: /192.168.158.4:9866
2025-07-18 13:49:58,270 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43116, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_240463732_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754421_13597, duration(ns): 24541267
2025-07-18 13:49:58,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754421_13597, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-18 13:50:03,824 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754421_13597 replica FinalizedReplica, blk_1073754421_13597, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754421 for deletion
2025-07-18 13:50:03,825 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754421_13597 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754421
2025-07-18 13:50:58,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754422_13598 src: /192.168.158.6:38454 dest: /192.168.158.4:9866
2025-07-18 13:50:58,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38454, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1287697515_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754422_13598, duration(ns): 21508694
2025-07-18 13:50:58,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754422_13598, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 13:51:00,824 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754422_13598 replica FinalizedReplica, blk_1073754422_13598, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754422 for deletion
2025-07-18 13:51:00,826 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754422_13598 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754422
2025-07-18 13:54:03,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754425_13601 src: /192.168.158.7:47808 dest: /192.168.158.4:9866
2025-07-18 13:54:03,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47808, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_924187862_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754425_13601, duration(ns): 22096972
2025-07-18 13:54:03,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754425_13601, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 13:54:06,833 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754425_13601 replica FinalizedReplica, blk_1073754425_13601, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754425 for deletion
2025-07-18 13:54:06,834 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754425_13601 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754425
2025-07-18 13:55:03,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754426_13602 src: /192.168.158.1:54570 dest: /192.168.158.4:9866
2025-07-18 13:55:03,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54570, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1517383351_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754426_13602, duration(ns): 26145193
2025-07-18 13:55:03,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754426_13602, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-18 13:55:06,834 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754426_13602 replica FinalizedReplica, blk_1073754426_13602, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754426 for deletion
2025-07-18 13:55:06,835 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754426_13602 URI
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754426 2025-07-18 13:56:03,246 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754427_13603 src: /192.168.158.1:43074 dest: /192.168.158.4:9866 2025-07-18 13:56:03,283 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43074, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1913491019_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754427_13603, duration(ns): 26554488 2025-07-18 13:56:03,283 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754427_13603, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-18 13:56:09,837 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754427_13603 replica FinalizedReplica, blk_1073754427_13603, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754427 for deletion 2025-07-18 13:56:09,838 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754427_13603 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754427 2025-07-18 13:59:08,248 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754430_13606 src: /192.168.158.8:53398 dest: /192.168.158.4:9866 2025-07-18 13:59:08,275 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: 
src: /192.168.158.8:53398, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1945580288_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754430_13606, duration(ns): 21588501 2025-07-18 13:59:08,275 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754430_13606, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-18 13:59:15,845 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754430_13606 replica FinalizedReplica, blk_1073754430_13606, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754430 for deletion 2025-07-18 13:59:15,846 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754430_13606 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754430 2025-07-18 14:09:28,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754440_13616 src: /192.168.158.1:53080 dest: /192.168.158.4:9866 2025-07-18 14:09:28,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53080, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_663532598_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754440_13616, duration(ns): 24010255 2025-07-18 14:09:28,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754440_13616, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-18 14:09:30,867 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754440_13616 replica FinalizedReplica, blk_1073754440_13616, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754440 for deletion 2025-07-18 14:09:30,868 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754440_13616 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754440 2025-07-18 14:10:28,265 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754441_13617 src: /192.168.158.5:42446 dest: /192.168.158.4:9866 2025-07-18 14:10:28,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42446, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1838683826_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754441_13617, duration(ns): 15047499 2025-07-18 14:10:28,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754441_13617, type=LAST_IN_PIPELINE terminating 2025-07-18 14:10:30,868 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754441_13617 replica FinalizedReplica, blk_1073754441_13617, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754441 for deletion 2025-07-18 14:10:30,870 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754441_13617 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754441 2025-07-18 14:11:28,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754442_13618 src: /192.168.158.1:57024 dest: /192.168.158.4:9866 2025-07-18 14:11:28,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57024, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_428420718_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754442_13618, duration(ns): 24962275 2025-07-18 14:11:28,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754442_13618, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-18 14:11:30,870 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754442_13618 replica FinalizedReplica, blk_1073754442_13618, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754442 for deletion 2025-07-18 14:11:30,872 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754442_13618 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754442 2025-07-18 14:15:28,278 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754446_13622 src: /192.168.158.1:55180 dest: /192.168.158.4:9866 2025-07-18 14:15:28,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55180, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_28242629_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754446_13622, duration(ns): 24064885 2025-07-18 14:15:28,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754446_13622, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-18 14:15:33,879 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754446_13622 replica FinalizedReplica, blk_1073754446_13622, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754446 for deletion 2025-07-18 14:15:33,880 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754446_13622 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754446 2025-07-18 14:17:28,279 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754448_13624 src: /192.168.158.6:40636 dest: /192.168.158.4:9866 2025-07-18 14:17:28,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.6:40636, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-388767719_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754448_13624, duration(ns): 20985949 2025-07-18 14:17:28,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754448_13624, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-18 14:17:30,883 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754448_13624 replica FinalizedReplica, blk_1073754448_13624, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754448 for deletion 2025-07-18 14:17:30,884 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754448_13624 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754448 2025-07-18 14:18:28,296 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754449_13625 src: /192.168.158.6:33376 dest: /192.168.158.4:9866 2025-07-18 14:18:28,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33376, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1910535440_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754449_13625, duration(ns): 17745622 2025-07-18 14:18:28,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754449_13625, 
type=LAST_IN_PIPELINE terminating 2025-07-18 14:18:30,887 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754449_13625 replica FinalizedReplica, blk_1073754449_13625, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754449 for deletion 2025-07-18 14:18:30,888 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754449_13625 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754449 2025-07-18 14:21:28,283 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754452_13628 src: /192.168.158.5:54556 dest: /192.168.158.4:9866 2025-07-18 14:21:28,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54556, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1972983153_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754452_13628, duration(ns): 20217558 2025-07-18 14:21:28,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754452_13628, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-18 14:21:33,890 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754452_13628 replica FinalizedReplica, blk_1073754452_13628, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754452 for deletion 2025-07-18 14:21:33,892 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754452_13628 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754452 2025-07-18 14:24:28,278 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754455_13631 src: /192.168.158.9:41128 dest: /192.168.158.4:9866 2025-07-18 14:24:28,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41128, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1237429068_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754455_13631, duration(ns): 21495657 2025-07-18 14:24:28,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754455_13631, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-18 14:24:30,895 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754455_13631 replica FinalizedReplica, blk_1073754455_13631, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754455 for deletion 2025-07-18 14:24:30,897 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754455_13631 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754455 2025-07-18 14:25:28,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754456_13632 src: /192.168.158.5:45510 dest: /192.168.158.4:9866 2025-07-18 14:25:28,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45510, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_812716941_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754456_13632, duration(ns): 16400209 2025-07-18 14:25:28,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754456_13632, type=LAST_IN_PIPELINE terminating 2025-07-18 14:25:33,898 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754456_13632 replica FinalizedReplica, blk_1073754456_13632, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754456 for deletion 2025-07-18 14:25:33,899 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754456_13632 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754456 2025-07-18 14:26:33,288 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754457_13633 src: /192.168.158.7:46156 dest: /192.168.158.4:9866 2025-07-18 14:26:33,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46156, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-658019282_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754457_13633, duration(ns): 20278880 2025-07-18 14:26:33,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754457_13633, type=LAST_IN_PIPELINE terminating 2025-07-18 14:26:36,898 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754457_13633 replica FinalizedReplica, blk_1073754457_13633, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754457 for deletion 2025-07-18 14:26:36,899 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754457_13633 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754457 2025-07-18 14:27:38,304 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754458_13634 src: /192.168.158.7:56108 dest: /192.168.158.4:9866 2025-07-18 14:27:38,323 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56108, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-255548501_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754458_13634, duration(ns): 16775963 2025-07-18 14:27:38,323 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754458_13634, type=LAST_IN_PIPELINE terminating 2025-07-18 14:27:45,900 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754458_13634 replica FinalizedReplica, blk_1073754458_13634, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754458 for deletion 2025-07-18 14:27:45,901 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754458_13634 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754458 2025-07-18 14:28:38,300 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754459_13635 src: /192.168.158.6:51702 dest: /192.168.158.4:9866 2025-07-18 14:28:38,318 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51702, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1734653194_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754459_13635, duration(ns): 15716682 2025-07-18 14:28:38,318 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754459_13635, type=LAST_IN_PIPELINE terminating 2025-07-18 14:28:42,903 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754459_13635 replica FinalizedReplica, blk_1073754459_13635, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754459 for deletion 2025-07-18 14:28:42,905 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754459_13635 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754459 2025-07-18 14:29:38,296 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754460_13636 src: /192.168.158.7:37278 dest: /192.168.158.4:9866 2025-07-18 14:29:38,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37278, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1913212063_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754460_13636, duration(ns): 20258806 2025-07-18 14:29:38,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754460_13636, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-18 14:29:45,907 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754460_13636 replica FinalizedReplica, blk_1073754460_13636, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754460 for deletion 2025-07-18 14:29:45,908 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754460_13636 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754460 2025-07-18 14:31:38,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754462_13638 src: 
/192.168.158.1:57142 dest: /192.168.158.4:9866 2025-07-18 14:31:38,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57142, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_471388515_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754462_13638, duration(ns): 22944086 2025-07-18 14:31:38,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754462_13638, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-18 14:31:42,910 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754462_13638 replica FinalizedReplica, blk_1073754462_13638, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754462 for deletion 2025-07-18 14:31:42,911 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754462_13638 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754462 2025-07-18 14:34:43,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754465_13641 src: /192.168.158.1:47910 dest: /192.168.158.4:9866 2025-07-18 14:34:43,340 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47910, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1670171100_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754465_13641, duration(ns): 25346092 
2025-07-18 14:34:43,340 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754465_13641, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-18 14:34:48,916 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754465_13641 replica FinalizedReplica, blk_1073754465_13641, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754465 for deletion
2025-07-18 14:34:48,917 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754465_13641 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754465
2025-07-18 14:35:43,323 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754466_13642 src: /192.168.158.8:34976 dest: /192.168.158.4:9866
2025-07-18 14:35:43,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34976, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1701120935_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754466_13642, duration(ns): 18807173
2025-07-18 14:35:43,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754466_13642, type=LAST_IN_PIPELINE terminating
2025-07-18 14:35:45,916 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754466_13642 replica FinalizedReplica, blk_1073754466_13642, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754466 for deletion
2025-07-18 14:35:45,917 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754466_13642 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754466
2025-07-18 14:36:43,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754467_13643 src: /192.168.158.1:53616 dest: /192.168.158.4:9866
2025-07-18 14:36:43,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53616, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-730449505_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754467_13643, duration(ns): 24275785
2025-07-18 14:36:43,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754467_13643, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-18 14:36:45,916 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754467_13643 replica FinalizedReplica, blk_1073754467_13643, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754467 for deletion
2025-07-18 14:36:45,917 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754467_13643 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754467
2025-07-18 14:37:43,341 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754468_13644 src: /192.168.158.9:55164 dest: /192.168.158.4:9866
2025-07-18 14:37:43,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55164, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_483712379_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754468_13644, duration(ns): 16749948
2025-07-18 14:37:43,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754468_13644, type=LAST_IN_PIPELINE terminating
2025-07-18 14:37:45,919 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754468_13644 replica FinalizedReplica, blk_1073754468_13644, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754468 for deletion
2025-07-18 14:37:45,920 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754468_13644 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754468
2025-07-18 14:40:43,324 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754471_13647 src: /192.168.158.5:37646 dest: /192.168.158.4:9866
2025-07-18 14:40:43,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37646, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-723312781_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754471_13647, duration(ns): 16997171
2025-07-18 14:40:43,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754471_13647, type=LAST_IN_PIPELINE terminating
2025-07-18 14:40:45,923 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754471_13647 replica FinalizedReplica, blk_1073754471_13647, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754471 for deletion
2025-07-18 14:40:45,924 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754471_13647 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754471
2025-07-18 14:41:43,331 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754472_13648 src: /192.168.158.5:45300 dest: /192.168.158.4:9866
2025-07-18 14:41:43,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45300, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1523033428_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754472_13648, duration(ns): 23274389
2025-07-18 14:41:43,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754472_13648, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 14:41:45,924 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754472_13648 replica FinalizedReplica, blk_1073754472_13648, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754472 for deletion
2025-07-18 14:41:45,927 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754472_13648 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754472
2025-07-18 14:44:43,318 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754475_13651 src: /192.168.158.1:41986 dest: /192.168.158.4:9866
2025-07-18 14:44:43,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41986, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2102855538_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754475_13651, duration(ns): 24267846
2025-07-18 14:44:43,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754475_13651, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-18 14:44:45,930 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754475_13651 replica FinalizedReplica, blk_1073754475_13651, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754475 for deletion
2025-07-18 14:44:45,931 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754475_13651 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754475
2025-07-18 14:45:43,325 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754476_13652 src: /192.168.158.9:55136 dest: /192.168.158.4:9866
2025-07-18 14:45:43,353 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_775776647_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754476_13652, duration(ns): 20452915
2025-07-18 14:45:43,353 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754476_13652, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 14:45:45,930 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754476_13652 replica FinalizedReplica, blk_1073754476_13652, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754476 for deletion
2025-07-18 14:45:45,931 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754476_13652 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754476
2025-07-18 14:46:43,332 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754477_13653 src: /192.168.158.7:47584 dest: /192.168.158.4:9866
2025-07-18 14:46:43,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47584, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_38732340_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754477_13653, duration(ns): 20419461
2025-07-18 14:46:43,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754477_13653, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-18 14:46:45,934 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754477_13653 replica FinalizedReplica, blk_1073754477_13653, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754477 for deletion
2025-07-18 14:46:45,935 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754477_13653 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754477
2025-07-18 14:48:53,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754479_13655 src: /192.168.158.9:59442 dest: /192.168.158.4:9866
2025-07-18 14:48:53,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59442, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_41755755_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754479_13655, duration(ns): 15816161
2025-07-18 14:48:53,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754479_13655, type=LAST_IN_PIPELINE terminating
2025-07-18 14:48:57,941 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754479_13655 replica FinalizedReplica, blk_1073754479_13655, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754479 for deletion
2025-07-18 14:48:57,942 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754479_13655 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754479
2025-07-18 14:50:53,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754481_13657 src: /192.168.158.1:58106 dest: /192.168.158.4:9866
2025-07-18 14:50:53,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58106, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1926825624_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754481_13657, duration(ns): 15810268
2025-07-18 14:50:53,371 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754481_13657, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-18 14:50:57,943 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754481_13657 replica FinalizedReplica, blk_1073754481_13657, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754481 for deletion
2025-07-18 14:50:57,944 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754481_13657 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754481
2025-07-18 14:52:58,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754483_13659 src: /192.168.158.9:35990 dest: /192.168.158.4:9866
2025-07-18 14:52:58,361 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35990, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2014046237_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754483_13659, duration(ns): 19233356
2025-07-18 14:52:58,361 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754483_13659, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 14:53:00,949 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754483_13659 replica FinalizedReplica, blk_1073754483_13659, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754483 for deletion
2025-07-18 14:53:00,950 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754483_13659 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754483
2025-07-18 14:56:58,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754487_13663 src: /192.168.158.9:60380 dest: /192.168.158.4:9866
2025-07-18 14:56:58,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60380, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1138136632_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754487_13663, duration(ns): 17435751
2025-07-18 14:56:58,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754487_13663, type=LAST_IN_PIPELINE terminating
2025-07-18 14:57:00,958 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754487_13663 replica FinalizedReplica, blk_1073754487_13663, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754487 for deletion
2025-07-18 14:57:00,959 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754487_13663 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754487
2025-07-18 14:57:58,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754488_13664 src: /192.168.158.9:52948 dest: /192.168.158.4:9866
2025-07-18 14:57:58,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52948, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-676569151_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754488_13664, duration(ns): 20664271
2025-07-18 14:57:58,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754488_13664, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 14:58:03,961 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754488_13664 replica FinalizedReplica, blk_1073754488_13664, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754488 for deletion
2025-07-18 14:58:03,962 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754488_13664 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754488
2025-07-18 14:58:58,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754489_13665 src: /192.168.158.7:34178 dest: /192.168.158.4:9866
2025-07-18 14:58:58,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34178, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1692879260_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754489_13665, duration(ns): 15780297
2025-07-18 14:58:58,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754489_13665, type=LAST_IN_PIPELINE terminating
2025-07-18 14:59:03,964 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754489_13665 replica FinalizedReplica, blk_1073754489_13665, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754489 for deletion
2025-07-18 14:59:03,965 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754489_13665 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754489
2025-07-18 14:59:58,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754490_13666 src: /192.168.158.8:52608 dest: /192.168.158.4:9866
2025-07-18 14:59:58,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52608, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1615246051_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754490_13666, duration(ns): 20931619
2025-07-18 14:59:58,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754490_13666, type=LAST_IN_PIPELINE terminating
2025-07-18 15:00:00,966 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754490_13666 replica FinalizedReplica, blk_1073754490_13666, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754490 for deletion
2025-07-18 15:00:00,967 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754490_13666 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754490
2025-07-18 15:03:03,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754493_13669 src: /192.168.158.1:52360 dest: /192.168.158.4:9866
2025-07-18 15:03:03,389 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52360, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1570281038_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754493_13669, duration(ns): 28534274
2025-07-18 15:03:03,390 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754493_13669, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-18 15:03:06,969 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754493_13669 replica FinalizedReplica, blk_1073754493_13669, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754493 for deletion
2025-07-18 15:03:06,971 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754493_13669 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754493
2025-07-18 15:04:08,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754494_13670 src: /192.168.158.5:52358 dest: /192.168.158.4:9866
2025-07-18 15:04:08,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52358, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1523972623_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754494_13670, duration(ns): 21390168
2025-07-18 15:04:08,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754494_13670, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 15:04:12,972 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754494_13670 replica FinalizedReplica, blk_1073754494_13670, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754494 for deletion
2025-07-18 15:04:12,973 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754494_13670 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754494
2025-07-18 15:05:08,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754495_13671 src: /192.168.158.8:55172 dest: /192.168.158.4:9866
2025-07-18 15:05:08,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55172, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-168292865_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754495_13671, duration(ns): 15715206
2025-07-18 15:05:08,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754495_13671, type=LAST_IN_PIPELINE terminating
2025-07-18 15:05:15,973 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754495_13671 replica FinalizedReplica, blk_1073754495_13671, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754495 for deletion
2025-07-18 15:05:15,974 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754495_13671 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754495
2025-07-18 15:08:08,369 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754498_13674 src: /192.168.158.8:40604 dest: /192.168.158.4:9866
2025-07-18 15:08:08,389 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40604, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-348308796_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754498_13674, duration(ns): 17581435
2025-07-18 15:08:08,389 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754498_13674, type=LAST_IN_PIPELINE terminating
2025-07-18 15:08:15,978 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754498_13674 replica FinalizedReplica, blk_1073754498_13674, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754498 for deletion
2025-07-18 15:08:15,979 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754498_13674 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754498
2025-07-18 15:09:08,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754499_13675 src: /192.168.158.7:58576 dest: /192.168.158.4:9866
2025-07-18 15:09:08,422 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58576, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1436642356_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754499_13675, duration(ns): 18006319
2025-07-18 15:09:08,423 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754499_13675, type=LAST_IN_PIPELINE terminating
2025-07-18 15:09:15,979 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754499_13675 replica FinalizedReplica, blk_1073754499_13675, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754499 for deletion
2025-07-18 15:09:15,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754499_13675 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754499
2025-07-18 15:10:08,350 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754500_13676 src: /192.168.158.5:37816 dest: /192.168.158.4:9866
2025-07-18 15:10:08,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37816, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_466214630_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754500_13676, duration(ns): 16551829
2025-07-18 15:10:08,369 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754500_13676, type=LAST_IN_PIPELINE terminating
2025-07-18 15:10:12,980 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754500_13676 replica FinalizedReplica, blk_1073754500_13676, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754500 for deletion
2025-07-18 15:10:12,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754500_13676 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754500
2025-07-18 15:11:08,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754501_13677 src: /192.168.158.1:36224 dest: /192.168.158.4:9866
2025-07-18 15:11:08,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36224, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-542327648_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754501_13677, duration(ns): 27943507
2025-07-18 15:11:08,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754501_13677, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-18 15:11:12,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754501_13677 replica FinalizedReplica, blk_1073754501_13677, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754501 for deletion
2025-07-18 15:11:12,983 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754501_13677 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754501
2025-07-18 15:14:13,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754504_13680 src: /192.168.158.1:41276 dest: /192.168.158.4:9866
2025-07-18 15:14:13,392 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41276, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_771335186_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754504_13680, duration(ns): 22522175
2025-07-18 15:14:13,392 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754504_13680, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-18 15:14:18,986 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754504_13680 replica FinalizedReplica, blk_1073754504_13680, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754504 for deletion
2025-07-18 15:14:18,987 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754504_13680 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754504
2025-07-18 15:15:13,361 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754505_13681 src: /192.168.158.7:34642 dest: /192.168.158.4:9866
2025-07-18 15:15:13,391 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34642, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1464900676_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754505_13681, duration(ns): 23545554
2025-07-18 15:15:13,391 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754505_13681, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 15:15:15,989 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754505_13681 replica FinalizedReplica, blk_1073754505_13681, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754505 for deletion
2025-07-18 15:15:15,990 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754505_13681 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754505
2025-07-18 15:17:13,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754507_13683 src: /192.168.158.6:58368 dest: /192.168.158.4:9866
2025-07-18 15:17:13,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58368, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-748737106_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754507_13683, duration(ns): 22144521
2025-07-18 15:17:13,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754507_13683, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 15:17:18,992 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754507_13683 replica FinalizedReplica, blk_1073754507_13683, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754507 for deletion
2025-07-18 15:17:18,994 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754507_13683 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754507
2025-07-18 15:22:13,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754512_13688 src: /192.168.158.1:57438 dest: /192.168.158.4:9866
2025-07-18 15:22:13,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57438, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_43805701_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754512_13688, duration(ns): 25473380
2025-07-18 15:22:13,407 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754512_13688, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-18 15:22:16,001 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754512_13688 replica FinalizedReplica, blk_1073754512_13688, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754512 for deletion
2025-07-18 15:22:16,002 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754512_13688 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754512
2025-07-18 15:28:13,373 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754518_13694 src: /192.168.158.1:60034 dest: /192.168.158.4:9866
2025-07-18 15:28:13,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src:
/192.168.158.1:60034, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_783573349_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754518_13694, duration(ns): 22971110 2025-07-18 15:28:13,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754518_13694, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-18 15:28:19,011 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754518_13694 replica FinalizedReplica, blk_1073754518_13694, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754518 for deletion 2025-07-18 15:28:19,012 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754518_13694 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754518 2025-07-18 15:31:18,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754521_13697 src: /192.168.158.7:46040 dest: /192.168.158.4:9866 2025-07-18 15:31:18,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46040, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-979935643_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754521_13697, duration(ns): 19727910 2025-07-18 15:31:18,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073754521_13697, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-18 15:31:22,018 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754521_13697 replica FinalizedReplica, blk_1073754521_13697, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754521 for deletion 2025-07-18 15:31:22,019 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754521_13697 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754521 2025-07-18 15:33:18,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754523_13699 src: /192.168.158.7:54164 dest: /192.168.158.4:9866 2025-07-18 15:33:18,404 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54164, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-619451038_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754523_13699, duration(ns): 19688708 2025-07-18 15:33:18,404 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754523_13699, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-18 15:33:22,020 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754523_13699 replica FinalizedReplica, blk_1073754523_13699, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn 
getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754523 for deletion 2025-07-18 15:33:22,021 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754523_13699 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754523 2025-07-18 15:36:23,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754526_13702 src: /192.168.158.8:56344 dest: /192.168.158.4:9866 2025-07-18 15:36:23,437 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56344, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1153335548_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754526_13702, duration(ns): 37050345 2025-07-18 15:36:23,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754526_13702, type=LAST_IN_PIPELINE terminating 2025-07-18 15:36:25,028 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754526_13702 replica FinalizedReplica, blk_1073754526_13702, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754526 for deletion 2025-07-18 15:36:25,029 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754526_13702 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754526 2025-07-18 15:38:23,392 
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754528_13704 src: /192.168.158.1:40626 dest: /192.168.158.4:9866 2025-07-18 15:38:23,423 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40626, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_350131274_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754528_13704, duration(ns): 22706867 2025-07-18 15:38:23,424 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754528_13704, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-18 15:38:28,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754528_13704 replica FinalizedReplica, blk_1073754528_13704, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754528 for deletion 2025-07-18 15:38:28,033 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754528_13704 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754528 2025-07-18 15:40:23,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754530_13706 src: /192.168.158.5:43504 dest: /192.168.158.4:9866 2025-07-18 15:40:23,435 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:43504, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-925874940_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754530_13706, duration(ns): 32317712 2025-07-18 15:40:23,435 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754530_13706, type=LAST_IN_PIPELINE terminating 2025-07-18 15:40:25,036 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754530_13706 replica FinalizedReplica, blk_1073754530_13706, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754530 for deletion 2025-07-18 15:40:25,037 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754530_13706 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754530 2025-07-18 15:43:28,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754533_13709 src: /192.168.158.7:42732 dest: /192.168.158.4:9866 2025-07-18 15:43:28,435 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42732, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1504848083_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754533_13709, duration(ns): 16404105 2025-07-18 15:43:28,435 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754533_13709, type=LAST_IN_PIPELINE terminating 2025-07-18 15:43:31,043 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754533_13709 replica 
FinalizedReplica, blk_1073754533_13709, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754533 for deletion 2025-07-18 15:43:31,044 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754533_13709 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754533 2025-07-18 15:44:28,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754534_13710 src: /192.168.158.1:60102 dest: /192.168.158.4:9866 2025-07-18 15:44:28,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60102, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1550103713_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754534_13710, duration(ns): 24818706 2025-07-18 15:44:28,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754534_13710, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-18 15:44:31,044 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754534_13710 replica FinalizedReplica, blk_1073754534_13710, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754534 for deletion 2025-07-18 15:44:31,046 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073754534_13710 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754534 2025-07-18 15:48:28,423 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754538_13714 src: /192.168.158.8:36490 dest: /192.168.158.4:9866 2025-07-18 15:48:28,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36490, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1802855189_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754538_13714, duration(ns): 21420544 2025-07-18 15:48:28,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754538_13714, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-18 15:48:31,054 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754538_13714 replica FinalizedReplica, blk_1073754538_13714, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754538 for deletion 2025-07-18 15:48:31,055 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754538_13714 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754538 2025-07-18 15:50:28,436 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754540_13716 src: /192.168.158.9:41684 dest: /192.168.158.4:9866 2025-07-18 15:50:28,454 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41684, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1878704616_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754540_13716, duration(ns): 16289084 2025-07-18 15:50:28,455 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754540_13716, type=LAST_IN_PIPELINE terminating 2025-07-18 15:50:31,057 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754540_13716 replica FinalizedReplica, blk_1073754540_13716, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754540 for deletion 2025-07-18 15:50:31,058 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754540_13716 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754540 2025-07-18 15:53:28,434 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754543_13719 src: /192.168.158.9:32970 dest: /192.168.158.4:9866 2025-07-18 15:53:28,453 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:32970, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1832665806_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754543_13719, duration(ns): 17577068 2025-07-18 15:53:28,454 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073754543_13719, type=LAST_IN_PIPELINE terminating 2025-07-18 15:53:31,062 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754543_13719 replica FinalizedReplica, blk_1073754543_13719, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754543 for deletion 2025-07-18 15:53:31,063 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754543_13719 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754543 2025-07-18 15:54:28,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754544_13720 src: /192.168.158.8:37652 dest: /192.168.158.4:9866 2025-07-18 15:54:28,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37652, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1076623650_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754544_13720, duration(ns): 20472693 2025-07-18 15:54:28,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754544_13720, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-18 15:54:31,065 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754544_13720 replica FinalizedReplica, blk_1073754544_13720, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754544 for deletion 2025-07-18 15:54:31,066 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754544_13720 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754544 2025-07-18 15:55:28,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754545_13721 src: /192.168.158.5:38080 dest: /192.168.158.4:9866 2025-07-18 15:55:28,460 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38080, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-256627766_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754545_13721, duration(ns): 16394301 2025-07-18 15:55:28,461 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754545_13721, type=LAST_IN_PIPELINE terminating 2025-07-18 15:55:31,069 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754545_13721 replica FinalizedReplica, blk_1073754545_13721, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754545 for deletion 2025-07-18 15:55:31,070 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754545_13721 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754545 2025-07-18 15:58:28,443 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754548_13724 src: /192.168.158.7:57510 dest: /192.168.158.4:9866 2025-07-18 15:58:28,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57510, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_173088089_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754548_13724, duration(ns): 18800177 2025-07-18 15:58:28,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754548_13724, type=LAST_IN_PIPELINE terminating 2025-07-18 15:58:31,075 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754548_13724 replica FinalizedReplica, blk_1073754548_13724, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754548 for deletion 2025-07-18 15:58:31,076 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754548_13724 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754548 2025-07-18 15:59:19,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f4a, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5. 
2025-07-18 15:59:19,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360 2025-07-18 16:00:28,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754550_13726 src: /192.168.158.1:53780 dest: /192.168.158.4:9866 2025-07-18 16:00:28,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-554164600_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754550_13726, duration(ns): 24851644 2025-07-18 16:00:28,477 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754550_13726, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-18 16:00:31,078 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754550_13726 replica FinalizedReplica, blk_1073754550_13726, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754550 for deletion 2025-07-18 16:00:31,079 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754550_13726 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754550 2025-07-18 16:01:28,455 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754551_13727 src: /192.168.158.9:49384 dest: /192.168.158.4:9866 2025-07-18 16:01:28,474 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49384, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1896081136_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754551_13727, duration(ns): 16556986 2025-07-18 16:01:28,474 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754551_13727, type=LAST_IN_PIPELINE terminating 2025-07-18 16:01:31,080 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754551_13727 replica FinalizedReplica, blk_1073754551_13727, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754551 for deletion 2025-07-18 16:01:31,081 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754551_13727 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754551 2025-07-18 16:05:28,455 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754555_13731 src: /192.168.158.7:59852 dest: /192.168.158.4:9866 2025-07-18 16:05:28,481 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59852, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_523232027_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754555_13731, duration(ns): 20774371 2025-07-18 16:05:28,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073754555_13731, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 16:05:34,086 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754555_13731 replica FinalizedReplica, blk_1073754555_13731, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754555 for deletion
2025-07-18 16:05:34,088 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754555_13731 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754555
2025-07-18 16:07:38,458 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754557_13733 src: /192.168.158.7:34276 dest: /192.168.158.4:9866
2025-07-18 16:07:38,488 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34276, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1883840312_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754557_13733, duration(ns): 23781595
2025-07-18 16:07:38,488 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754557_13733, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 16:07:40,092 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754557_13733 replica FinalizedReplica, blk_1073754557_13733, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754557 for deletion
2025-07-18 16:07:40,093 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754557_13733 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754557
2025-07-18 16:08:38,458 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754558_13734 src: /192.168.158.7:46948 dest: /192.168.158.4:9866
2025-07-18 16:08:38,487 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46948, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_995182317_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754558_13734, duration(ns): 23234889
2025-07-18 16:08:38,487 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754558_13734, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 16:08:40,095 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754558_13734 replica FinalizedReplica, blk_1073754558_13734, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754558 for deletion
2025-07-18 16:08:40,096 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754558_13734 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754558
2025-07-18 16:09:38,459 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754559_13735 src: /192.168.158.1:55554 dest: /192.168.158.4:9866
2025-07-18 16:09:38,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55554, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1428357592_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754559_13735, duration(ns): 22655081
2025-07-18 16:09:38,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754559_13735, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-18 16:09:40,095 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754559_13735 replica FinalizedReplica, blk_1073754559_13735, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754559 for deletion
2025-07-18 16:09:40,096 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754559_13735 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754559
2025-07-18 16:10:38,453 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754560_13736 src: /192.168.158.5:47596 dest: /192.168.158.4:9866
2025-07-18 16:10:38,471 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47596, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-531763238_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754560_13736, duration(ns): 16519002
2025-07-18 16:10:38,472 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754560_13736, type=LAST_IN_PIPELINE terminating
2025-07-18 16:10:40,095 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754560_13736 replica FinalizedReplica, blk_1073754560_13736, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754560 for deletion
2025-07-18 16:10:40,096 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754560_13736 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754560
2025-07-18 16:11:38,456 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754561_13737 src: /192.168.158.1:54244 dest: /192.168.158.4:9866
2025-07-18 16:11:38,490 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54244, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-506594109_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754561_13737, duration(ns): 24797479
2025-07-18 16:11:38,490 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754561_13737, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-18 16:11:43,096 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754561_13737 replica FinalizedReplica, blk_1073754561_13737, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754561 for deletion
2025-07-18 16:11:43,097 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754561_13737 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754561
2025-07-18 16:12:43,453 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754562_13738 src: /192.168.158.1:49502 dest: /192.168.158.4:9866
2025-07-18 16:12:43,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49502, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1128924720_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754562_13738, duration(ns): 24013882
2025-07-18 16:12:43,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754562_13738, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-18 16:12:49,097 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754562_13738 replica FinalizedReplica, blk_1073754562_13738, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754562 for deletion
2025-07-18 16:12:49,098 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754562_13738 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754562
2025-07-18 16:13:43,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754563_13739 src: /192.168.158.8:42106 dest: /192.168.158.4:9866
2025-07-18 16:13:43,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42106, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-401678931_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754563_13739, duration(ns): 16795988
2025-07-18 16:13:43,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754563_13739, type=LAST_IN_PIPELINE terminating
2025-07-18 16:13:49,098 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754563_13739 replica FinalizedReplica, blk_1073754563_13739, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754563 for deletion
2025-07-18 16:13:49,100 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754563_13739 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754563
2025-07-18 16:14:43,453 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754564_13740 src: /192.168.158.1:40634 dest: /192.168.158.4:9866
2025-07-18 16:14:43,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40634, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-23392165_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754564_13740, duration(ns): 23597831
2025-07-18 16:14:43,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754564_13740, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-18 16:14:46,101 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754564_13740 replica FinalizedReplica, blk_1073754564_13740, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754564 for deletion
2025-07-18 16:14:46,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754564_13740 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754564
2025-07-18 16:15:43,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754565_13741 src: /192.168.158.8:56036 dest: /192.168.158.4:9866
2025-07-18 16:15:43,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56036, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1588620177_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754565_13741, duration(ns): 19128047
2025-07-18 16:15:43,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754565_13741, type=LAST_IN_PIPELINE terminating
2025-07-18 16:15:46,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754565_13741 replica FinalizedReplica, blk_1073754565_13741, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754565 for deletion
2025-07-18 16:15:46,103 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754565_13741 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754565
2025-07-18 16:23:08,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754572_13748 src: /192.168.158.7:54368 dest: /192.168.158.4:9866
2025-07-18 16:23:08,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54368, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_64746075_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754572_13748, duration(ns): 19316817
2025-07-18 16:23:08,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754572_13748, type=LAST_IN_PIPELINE terminating
2025-07-18 16:23:13,116 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754572_13748 replica FinalizedReplica, blk_1073754572_13748, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754572 for deletion
2025-07-18 16:23:13,117 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754572_13748 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754572
2025-07-18 16:25:08,477 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754574_13750 src: /192.168.158.9:45244 dest: /192.168.158.4:9866
2025-07-18 16:25:08,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45244, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-81779178_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754574_13750, duration(ns): 20653869
2025-07-18 16:25:08,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754574_13750, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 16:25:10,120 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754574_13750 replica FinalizedReplica, blk_1073754574_13750, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754574 for deletion
2025-07-18 16:25:10,121 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754574_13750 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754574
2025-07-18 16:26:08,479 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754575_13751 src: /192.168.158.7:41116 dest: /192.168.158.4:9866
2025-07-18 16:26:08,506 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41116, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_586319200_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754575_13751, duration(ns): 21190174
2025-07-18 16:26:08,506 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754575_13751, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 16:26:10,121 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754575_13751 replica FinalizedReplica, blk_1073754575_13751, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754575 for deletion
2025-07-18 16:26:10,123 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754575_13751 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754575
2025-07-18 16:31:13,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754580_13756 src: /192.168.158.1:42222 dest: /192.168.158.4:9866
2025-07-18 16:31:13,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42222, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_32773503_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754580_13756, duration(ns): 23920552
2025-07-18 16:31:13,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754580_13756, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-18 16:31:16,137 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754580_13756 replica FinalizedReplica, blk_1073754580_13756, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754580 for deletion
2025-07-18 16:31:16,138 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754580_13756 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754580
2025-07-18 16:32:13,478 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754581_13757 src: /192.168.158.1:41566 dest: /192.168.158.4:9866
2025-07-18 16:32:13,509 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41566, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2048680717_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754581_13757, duration(ns): 22682076
2025-07-18 16:32:13,510 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754581_13757, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-18 16:32:16,141 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754581_13757 replica FinalizedReplica, blk_1073754581_13757, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754581 for deletion
2025-07-18 16:32:16,142 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754581_13757 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754581
2025-07-18 16:35:13,481 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754584_13760 src: /192.168.158.7:57372 dest: /192.168.158.4:9866
2025-07-18 16:35:13,510 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57372, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1547033668_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754584_13760, duration(ns): 22407551
2025-07-18 16:35:13,511 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754584_13760, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-18 16:35:16,148 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754584_13760 replica FinalizedReplica, blk_1073754584_13760, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754584 for deletion
2025-07-18 16:35:16,149 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754584_13760 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754584
2025-07-18 16:36:18,481 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754585_13761 src: /192.168.158.8:37096 dest: /192.168.158.4:9866
2025-07-18 16:36:18,507 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37096, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-964754916_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754585_13761, duration(ns): 19827488
2025-07-18 16:36:18,507 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754585_13761, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 16:36:22,150 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754585_13761 replica FinalizedReplica, blk_1073754585_13761, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754585 for deletion
2025-07-18 16:36:22,151 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754585_13761 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754585
2025-07-18 16:40:28,488 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754589_13765 src: /192.168.158.9:43760 dest: /192.168.158.4:9866
2025-07-18 16:40:28,510 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43760, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-309695857_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754589_13765, duration(ns): 19037916
2025-07-18 16:40:28,510 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754589_13765, type=LAST_IN_PIPELINE terminating
2025-07-18 16:40:31,159 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754589_13765 replica FinalizedReplica, blk_1073754589_13765, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754589 for deletion
2025-07-18 16:40:31,160 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754589_13765 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754589
2025-07-18 16:41:28,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754590_13766 src: /192.168.158.1:59380 dest: /192.168.158.4:9866
2025-07-18 16:41:28,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59380, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_901506777_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754590_13766, duration(ns): 26507316
2025-07-18 16:41:28,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754590_13766, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-18 16:41:31,162 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754590_13766 replica FinalizedReplica, blk_1073754590_13766, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754590 for deletion
2025-07-18 16:41:31,163 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754590_13766 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754590
2025-07-18 16:43:38,496 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754592_13768 src: /192.168.158.1:41976 dest: /192.168.158.4:9866
2025-07-18 16:43:38,534 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41976, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1472391016_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754592_13768, duration(ns): 27913888
2025-07-18 16:43:38,534 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754592_13768, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-18 16:43:43,168 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754592_13768 replica FinalizedReplica, blk_1073754592_13768, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754592 for deletion
2025-07-18 16:43:43,169 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754592_13768 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754592
2025-07-18 16:44:38,499 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754593_13769 src: /192.168.158.1:47720 dest: /192.168.158.4:9866
2025-07-18 16:44:38,534 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47720, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-112138778_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754593_13769, duration(ns): 24969490
2025-07-18 16:44:38,534 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754593_13769, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-18 16:44:40,170 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754593_13769 replica FinalizedReplica, blk_1073754593_13769, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754593 for deletion
2025-07-18 16:44:40,171 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754593_13769 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754593
2025-07-18 16:46:38,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754595_13771 src: /192.168.158.9:46764 dest: /192.168.158.4:9866
2025-07-18 16:46:38,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46764, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_790074975_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754595_13771, duration(ns): 16813153
2025-07-18 16:46:38,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754595_13771, type=LAST_IN_PIPELINE terminating
2025-07-18 16:46:43,174 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754595_13771 replica FinalizedReplica, blk_1073754595_13771, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754595 for deletion
2025-07-18 16:46:43,175 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754595_13771 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754595
2025-07-18 16:47:38,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754596_13772 src: /192.168.158.9:58140 dest: /192.168.158.4:9866
2025-07-18 16:47:38,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58140, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1846702884_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754596_13772, duration(ns): 19867673
2025-07-18 16:47:38,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754596_13772, type=LAST_IN_PIPELINE terminating
2025-07-18 16:47:43,177 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754596_13772 replica FinalizedReplica, blk_1073754596_13772, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754596 for deletion
2025-07-18 16:47:43,178 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754596_13772 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754596
2025-07-18 16:49:43,508 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754598_13774 src: /192.168.158.1:52824 dest: /192.168.158.4:9866
2025-07-18 16:49:43,540 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52824, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2030804711_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754598_13774, duration(ns): 23206797
2025-07-18 16:49:43,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754598_13774, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-18 16:49:46,178 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754598_13774 replica FinalizedReplica, blk_1073754598_13774, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754598 for deletion
2025-07-18 16:49:46,179 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754598_13774 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754598
2025-07-18 16:50:43,510 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754599_13775 src: /192.168.158.1:36406 dest: /192.168.158.4:9866
2025-07-18 16:50:43,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36406, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_797850578_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754599_13775, duration(ns): 23687811
2025-07-18 16:50:43,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754599_13775, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-18 16:50:46,179 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754599_13775 replica FinalizedReplica, blk_1073754599_13775, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754599 for deletion
2025-07-18 16:50:46,181 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754599_13775 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754599
2025-07-18 16:52:43,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754601_13777 src: /192.168.158.8:51736 dest: /192.168.158.4:9866
2025-07-18 16:52:43,540 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51736, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_91802365_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754601_13777, duration(ns): 18895197
2025-07-18 16:52:43,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754601_13777, type=LAST_IN_PIPELINE terminating
2025-07-18 16:52:49,183 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754601_13777 replica FinalizedReplica, blk_1073754601_13777, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754601 for deletion
2025-07-18 16:52:49,184 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754601_13777 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754601
2025-07-18 16:54:48,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754603_13779 src: /192.168.158.7:36918 dest: /192.168.158.4:9866
2025-07-18 16:54:48,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36918, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1406026140_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754603_13779, duration(ns): 20508343
2025-07-18 16:54:48,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754603_13779, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 16:54:52,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754603_13779 replica FinalizedReplica, blk_1073754603_13779, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754603 for deletion
2025-07-18 16:54:52,187 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754603_13779 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754603
2025-07-18 16:56:48,523 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754605_13781 src: 
/192.168.158.7:49998 dest: /192.168.158.4:9866 2025-07-18 16:56:48,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49998, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_895868148_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754605_13781, duration(ns): 20592006 2025-07-18 16:56:48,551 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754605_13781, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-18 16:56:55,188 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754605_13781 replica FinalizedReplica, blk_1073754605_13781, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754605 for deletion 2025-07-18 16:56:55,189 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754605_13781 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754605 2025-07-18 16:57:48,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754606_13782 src: /192.168.158.1:49958 dest: /192.168.158.4:9866 2025-07-18 16:57:48,560 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49958, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1223008090_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754606_13782, duration(ns): 26402205 2025-07-18 
16:57:48,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754606_13782, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-18 16:57:52,189 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754606_13782 replica FinalizedReplica, blk_1073754606_13782, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754606 for deletion 2025-07-18 16:57:52,191 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754606_13782 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754606 2025-07-18 16:58:48,526 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754607_13783 src: /192.168.158.8:58334 dest: /192.168.158.4:9866 2025-07-18 16:58:48,544 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58334, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1073323090_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754607_13783, duration(ns): 16172094 2025-07-18 16:58:48,544 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754607_13783, type=LAST_IN_PIPELINE terminating 2025-07-18 16:58:52,192 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754607_13783 replica FinalizedReplica, blk_1073754607_13783, FINALIZED getNumBytes() = 56 getBytesOnDisk() 
= 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754607 for deletion 2025-07-18 16:58:52,193 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754607_13783 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754607 2025-07-18 17:00:53,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754609_13785 src: /192.168.158.6:42202 dest: /192.168.158.4:9866 2025-07-18 17:00:53,558 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42202, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1953231477_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754609_13785, duration(ns): 18566199 2025-07-18 17:00:53,558 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754609_13785, type=LAST_IN_PIPELINE terminating 2025-07-18 17:00:55,196 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754609_13785 replica FinalizedReplica, blk_1073754609_13785, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754609 for deletion 2025-07-18 17:00:55,197 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754609_13785 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754609 2025-07-18 17:01:53,558 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754610_13786 src: /192.168.158.7:43918 dest: /192.168.158.4:9866 2025-07-18 17:01:53,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43918, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_856460540_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754610_13786, duration(ns): 20712530 2025-07-18 17:01:53,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754610_13786, type=LAST_IN_PIPELINE terminating 2025-07-18 17:01:55,198 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754610_13786 replica FinalizedReplica, blk_1073754610_13786, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754610 for deletion 2025-07-18 17:01:55,201 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754610_13786 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754610 2025-07-18 17:04:53,529 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754613_13789 src: /192.168.158.7:46506 dest: /192.168.158.4:9866 2025-07-18 17:04:53,554 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46506, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1147046967_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754613_13789, duration(ns): 19676953 2025-07-18 17:04:53,554 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754613_13789, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-18 17:04:58,204 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754613_13789 replica FinalizedReplica, blk_1073754613_13789, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754613 for deletion 2025-07-18 17:04:58,206 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754613_13789 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754613 2025-07-18 17:06:53,535 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754615_13791 src: /192.168.158.8:41536 dest: /192.168.158.4:9866 2025-07-18 17:06:53,560 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41536, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1363770260_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754615_13791, duration(ns): 20276098 2025-07-18 17:06:53,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754615_13791, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] 
terminating 2025-07-18 17:06:55,204 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754615_13791 replica FinalizedReplica, blk_1073754615_13791, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754615 for deletion 2025-07-18 17:06:55,205 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754615_13791 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754615 2025-07-18 17:07:53,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754616_13792 src: /192.168.158.5:53656 dest: /192.168.158.4:9866 2025-07-18 17:07:53,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53656, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-719109532_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754616_13792, duration(ns): 17426774 2025-07-18 17:07:53,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754616_13792, type=LAST_IN_PIPELINE terminating 2025-07-18 17:07:55,208 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754616_13792 replica FinalizedReplica, blk_1073754616_13792, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754616 for deletion 2025-07-18 17:07:55,209 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754616_13792 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754616 2025-07-18 17:08:58,535 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754617_13793 src: /192.168.158.7:35398 dest: /192.168.158.4:9866 2025-07-18 17:08:58,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35398, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-121077791_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754617_13793, duration(ns): 18621297 2025-07-18 17:08:58,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754617_13793, type=LAST_IN_PIPELINE terminating 2025-07-18 17:09:04,210 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754617_13793 replica FinalizedReplica, blk_1073754617_13793, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754617 for deletion 2025-07-18 17:09:04,211 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754617_13793 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754617 2025-07-18 17:09:58,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754618_13794 src: /192.168.158.9:49958 dest: /192.168.158.4:9866 
2025-07-18 17:09:58,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49958, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_866852315_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754618_13794, duration(ns): 20269113 2025-07-18 17:09:58,569 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754618_13794, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-18 17:10:04,211 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754618_13794 replica FinalizedReplica, blk_1073754618_13794, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754618 for deletion 2025-07-18 17:10:04,213 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754618_13794 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754618 2025-07-18 17:14:03,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754622_13798 src: /192.168.158.9:38146 dest: /192.168.158.4:9866 2025-07-18 17:14:03,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38146, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1391072318_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754622_13798, duration(ns): 20506413 2025-07-18 17:14:03,569 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754622_13798, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-18 17:14:07,228 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754622_13798 replica FinalizedReplica, blk_1073754622_13798, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754622 for deletion 2025-07-18 17:14:07,229 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754622_13798 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir17/blk_1073754622 2025-07-18 17:16:03,551 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754624_13800 src: /192.168.158.7:40984 dest: /192.168.158.4:9866 2025-07-18 17:16:03,570 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40984, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-374110634_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754624_13800, duration(ns): 16230428 2025-07-18 17:16:03,570 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754624_13800, type=LAST_IN_PIPELINE terminating 2025-07-18 17:16:07,233 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754624_13800 replica FinalizedReplica, blk_1073754624_13800, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 
getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754624 for deletion 2025-07-18 17:16:07,234 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754624_13800 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754624 2025-07-18 17:17:03,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754625_13801 src: /192.168.158.8:57962 dest: /192.168.158.4:9866 2025-07-18 17:17:03,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57962, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_814496503_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754625_13801, duration(ns): 16675828 2025-07-18 17:17:03,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754625_13801, type=LAST_IN_PIPELINE terminating 2025-07-18 17:17:07,237 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754625_13801 replica FinalizedReplica, blk_1073754625_13801, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754625 for deletion 2025-07-18 17:17:07,238 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754625_13801 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754625 
2025-07-18 17:20:03,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754628_13804 src: /192.168.158.1:56126 dest: /192.168.158.4:9866 2025-07-18 17:20:03,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56126, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-693637299_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754628_13804, duration(ns): 31348935 2025-07-18 17:20:03,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754628_13804, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-18 17:20:07,242 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754628_13804 replica FinalizedReplica, blk_1073754628_13804, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754628 for deletion 2025-07-18 17:20:07,243 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754628_13804 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754628 2025-07-18 17:22:13,560 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754630_13806 src: /192.168.158.7:35498 dest: /192.168.158.4:9866 2025-07-18 17:22:13,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35498, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: 
DFSClient_NONMAPREDUCE_-1655180950_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754630_13806, duration(ns): 19106976 2025-07-18 17:22:13,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754630_13806, type=LAST_IN_PIPELINE terminating 2025-07-18 17:22:16,248 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754630_13806 replica FinalizedReplica, blk_1073754630_13806, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754630 for deletion 2025-07-18 17:22:16,249 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754630_13806 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754630 2025-07-18 17:23:13,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754631_13807 src: /192.168.158.1:60120 dest: /192.168.158.4:9866 2025-07-18 17:23:13,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60120, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1097322055_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754631_13807, duration(ns): 26770980 2025-07-18 17:23:13,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754631_13807, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-18 17:23:19,249 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754631_13807 replica FinalizedReplica, blk_1073754631_13807, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754631 for deletion 2025-07-18 17:23:19,250 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754631_13807 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754631 2025-07-18 17:30:18,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754638_13814 src: /192.168.158.9:42342 dest: /192.168.158.4:9866 2025-07-18 17:30:18,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42342, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_95718603_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754638_13814, duration(ns): 21107207 2025-07-18 17:30:18,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754638_13814, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-18 17:30:22,268 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754638_13814 replica FinalizedReplica, blk_1073754638_13814, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754638 for deletion 2025-07-18 17:30:22,269 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754638_13814 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754638
2025-07-18 17:31:18,570 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754639_13815 src: /192.168.158.8:43448 dest: /192.168.158.4:9866
2025-07-18 17:31:18,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43448, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_762054606_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754639_13815, duration(ns): 19748420
2025-07-18 17:31:18,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754639_13815, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 17:31:25,269 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754639_13815 replica FinalizedReplica, blk_1073754639_13815, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754639 for deletion
2025-07-18 17:31:25,270 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754639_13815 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754639
2025-07-18 17:32:18,574 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754640_13816 src: /192.168.158.1:53866 dest: /192.168.158.4:9866
2025-07-18 17:32:18,609 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53866, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1652232203_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754640_13816, duration(ns): 26261539
2025-07-18 17:32:18,609 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754640_13816, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-18 17:32:25,271 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754640_13816 replica FinalizedReplica, blk_1073754640_13816, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754640 for deletion
2025-07-18 17:32:25,272 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754640_13816 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754640
2025-07-18 17:34:18,569 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754642_13818 src: /192.168.158.1:40508 dest: /192.168.158.4:9866
2025-07-18 17:34:18,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40508, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1742043030_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754642_13818, duration(ns): 22262101
2025-07-18 17:34:18,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754642_13818, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-18 17:34:22,276 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754642_13818 replica FinalizedReplica, blk_1073754642_13818, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754642 for deletion
2025-07-18 17:34:22,277 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754642_13818 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754642
2025-07-18 17:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-18 17:36:18,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754644_13820 src: /192.168.158.9:36640 dest: /192.168.158.4:9866
2025-07-18 17:36:18,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36640, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-714565684_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754644_13820, duration(ns): 15747541
2025-07-18 17:36:18,603 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754644_13820, type=LAST_IN_PIPELINE terminating
2025-07-18 17:36:22,279 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754644_13820 replica FinalizedReplica, blk_1073754644_13820, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754644 for deletion
2025-07-18 17:36:22,280 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754644_13820 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754644
2025-07-18 17:37:18,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754645_13821 src: /192.168.158.1:42918 dest: /192.168.158.4:9866
2025-07-18 17:37:18,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42918, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1686028063_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754645_13821, duration(ns): 23312621
2025-07-18 17:37:18,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754645_13821, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-18 17:37:22,283 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754645_13821 replica FinalizedReplica, blk_1073754645_13821, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754645 for deletion
2025-07-18 17:37:22,284 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754645_13821 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754645
2025-07-18 17:40:18,586 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754648_13824 src: /192.168.158.1:42596 dest: /192.168.158.4:9866
2025-07-18 17:40:18,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42596, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_439276124_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754648_13824, duration(ns): 27045592
2025-07-18 17:40:18,623 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754648_13824, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-18 17:40:22,288 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754648_13824 replica FinalizedReplica, blk_1073754648_13824, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754648 for deletion
2025-07-18 17:40:22,289 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754648_13824 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754648
2025-07-18 17:44:18,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754652_13828 src: /192.168.158.7:35988 dest: /192.168.158.4:9866
2025-07-18 17:44:18,623 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35988, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1450198792_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754652_13828, duration(ns): 21561941
2025-07-18 17:44:18,624 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754652_13828, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 17:44:25,294 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754652_13828 replica FinalizedReplica, blk_1073754652_13828, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754652 for deletion
2025-07-18 17:44:25,295 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754652_13828 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754652
2025-07-18 17:47:28,603 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754655_13831 src: /192.168.158.7:45494 dest: /192.168.158.4:9866
2025-07-18 17:47:28,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45494, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1621137856_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754655_13831, duration(ns): 15889350
2025-07-18 17:47:28,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754655_13831, type=LAST_IN_PIPELINE terminating
2025-07-18 17:47:34,299 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754655_13831 replica FinalizedReplica, blk_1073754655_13831, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754655 for deletion
2025-07-18 17:47:34,301 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754655_13831 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754655
2025-07-18 17:48:33,660 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754656_13832 src: /192.168.158.7:57894 dest: /192.168.158.4:9866
2025-07-18 17:48:33,686 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57894, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1296027652_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754656_13832, duration(ns): 20761799
2025-07-18 17:48:33,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754656_13832, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 17:48:37,301 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754656_13832 replica FinalizedReplica, blk_1073754656_13832, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754656 for deletion
2025-07-18 17:48:37,302 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754656_13832 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754656
2025-07-18 17:49:33,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754657_13833 src: /192.168.158.8:53170 dest: /192.168.158.4:9866
2025-07-18 17:49:33,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53170, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-324250196_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754657_13833, duration(ns): 16461236
2025-07-18 17:49:33,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754657_13833, type=LAST_IN_PIPELINE terminating
2025-07-18 17:49:40,306 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754657_13833 replica FinalizedReplica, blk_1073754657_13833, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754657 for deletion
2025-07-18 17:49:40,307 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754657_13833 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754657
2025-07-18 17:53:43,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754661_13837 src: /192.168.158.7:37896 dest: /192.168.158.4:9866
2025-07-18 17:53:43,624 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37896, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1794092297_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754661_13837, duration(ns): 16130041
2025-07-18 17:53:43,624 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754661_13837, type=LAST_IN_PIPELINE terminating
2025-07-18 17:53:46,318 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754661_13837 replica FinalizedReplica, blk_1073754661_13837, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754661 for deletion
2025-07-18 17:53:46,319 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754661_13837 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754661
2025-07-18 17:54:43,603 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754662_13838 src: /192.168.158.1:42992 dest: /192.168.158.4:9866
2025-07-18 17:54:43,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42992, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1609679224_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754662_13838, duration(ns): 22928507
2025-07-18 17:54:43,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754662_13838, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-18 17:54:46,320 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754662_13838 replica FinalizedReplica, blk_1073754662_13838, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754662 for deletion
2025-07-18 17:54:46,322 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754662_13838 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754662
2025-07-18 17:55:48,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754663_13839 src: /192.168.158.1:39990 dest: /192.168.158.4:9866
2025-07-18 17:55:48,628 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39990, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1954801886_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754663_13839, duration(ns): 22186489
2025-07-18 17:55:48,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754663_13839, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-18 17:55:52,323 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754663_13839 replica FinalizedReplica, blk_1073754663_13839, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754663 for deletion
2025-07-18 17:55:52,324 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754663_13839 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754663
2025-07-18 17:56:48,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754664_13840 src: /192.168.158.7:53276 dest: /192.168.158.4:9866
2025-07-18 17:56:48,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53276, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2036667697_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754664_13840, duration(ns): 21981590
2025-07-18 17:56:48,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754664_13840, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 17:56:55,325 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754664_13840 replica FinalizedReplica, blk_1073754664_13840, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754664 for deletion
2025-07-18 17:56:55,326 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754664_13840 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754664
2025-07-18 17:58:53,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754666_13842 src: /192.168.158.1:44646 dest: /192.168.158.4:9866
2025-07-18 17:58:53,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44646, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_861643902_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754666_13842, duration(ns): 22419718
2025-07-18 17:58:53,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754666_13842, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-18 17:58:55,330 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754666_13842 replica FinalizedReplica, blk_1073754666_13842, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754666 for deletion
2025-07-18 17:58:55,331 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754666_13842 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754666
2025-07-18 18:02:58,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754670_13846 src: /192.168.158.9:57066 dest: /192.168.158.4:9866
2025-07-18 18:02:58,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57066, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1833940035_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754670_13846, duration(ns): 22317493
2025-07-18 18:02:58,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754670_13846, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 18:03:01,341 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754670_13846 replica FinalizedReplica, blk_1073754670_13846, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754670 for deletion
2025-07-18 18:03:01,342 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754670_13846 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754670
2025-07-18 18:03:58,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754671_13847 src: /192.168.158.7:51670 dest: /192.168.158.4:9866
2025-07-18 18:03:58,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51670, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2075369169_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754671_13847, duration(ns): 20164476
2025-07-18 18:03:58,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754671_13847, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 18:04:01,342 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754671_13847 replica FinalizedReplica, blk_1073754671_13847, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754671 for deletion
2025-07-18 18:04:01,344 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754671_13847 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754671
2025-07-18 18:04:58,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754672_13848 src: /192.168.158.6:60288 dest: /192.168.158.4:9866
2025-07-18 18:04:58,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60288, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_680989737_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754672_13848, duration(ns): 17634056
2025-07-18 18:04:58,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754672_13848, type=LAST_IN_PIPELINE terminating
2025-07-18 18:05:01,344 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754672_13848 replica FinalizedReplica, blk_1073754672_13848, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754672 for deletion
2025-07-18 18:05:01,346 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754672_13848 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754672
2025-07-18 18:07:08,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754674_13850 src: /192.168.158.8:34060 dest: /192.168.158.4:9866
2025-07-18 18:07:08,641 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34060, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1233177635_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754674_13850, duration(ns): 21498825
2025-07-18 18:07:08,641 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754674_13850, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 18:07:10,347 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754674_13850 replica FinalizedReplica, blk_1073754674_13850, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754674 for deletion
2025-07-18 18:07:10,348 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754674_13850 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754674
2025-07-18 18:08:08,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754675_13851 src: /192.168.158.5:45066 dest: /192.168.158.4:9866
2025-07-18 18:08:08,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45066, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-668767523_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754675_13851, duration(ns): 15967352
2025-07-18 18:08:08,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754675_13851, type=LAST_IN_PIPELINE terminating
2025-07-18 18:08:10,350 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754675_13851 replica FinalizedReplica, blk_1073754675_13851, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754675 for deletion
2025-07-18 18:08:10,351 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754675_13851 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754675
2025-07-18 18:10:13,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754677_13853 src: /192.168.158.7:37538 dest: /192.168.158.4:9866
2025-07-18 18:10:13,642 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37538, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1315167875_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754677_13853, duration(ns): 18603635
2025-07-18 18:10:13,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754677_13853, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 18:10:19,352 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754677_13853 replica FinalizedReplica, blk_1073754677_13853, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754677 for deletion
2025-07-18 18:10:19,353 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754677_13853 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754677
2025-07-18 18:13:23,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754680_13856 src: /192.168.158.8:41886 dest: /192.168.158.4:9866
2025-07-18 18:13:23,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41886, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2107322503_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754680_13856, duration(ns): 15998265
2025-07-18 18:13:23,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754680_13856, type=LAST_IN_PIPELINE terminating
2025-07-18 18:13:28,361 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754680_13856 replica FinalizedReplica, blk_1073754680_13856, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754680 for deletion
2025-07-18 18:13:28,362 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754680_13856 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754680
2025-07-18 18:15:28,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754682_13858 src: /192.168.158.5:48196 dest: /192.168.158.4:9866
2025-07-18 18:15:28,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48196, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-551321085_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754682_13858, duration(ns): 16758226
2025-07-18 18:15:28,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754682_13858, type=LAST_IN_PIPELINE terminating
2025-07-18 18:15:31,366 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754682_13858 replica FinalizedReplica, blk_1073754682_13858, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754682 for deletion
2025-07-18 18:15:31,367 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754682_13858 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754682
2025-07-18 18:16:28,628 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754683_13859 src: /192.168.158.1:38522 dest: /192.168.158.4:9866
2025-07-18 18:16:28,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38522, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-164185165_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754683_13859, duration(ns): 28634778
2025-07-18 18:16:28,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754683_13859, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-18 18:16:31,371 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754683_13859 replica FinalizedReplica, blk_1073754683_13859, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754683 for deletion
2025-07-18 18:16:31,372 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754683_13859 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754683
2025-07-18 18:17:28,641 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754684_13860 src: /192.168.158.8:54434 dest: /192.168.158.4:9866
2025-07-18 18:17:28,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54434, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_659846749_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754684_13860, duration(ns): 19119822
2025-07-18 18:17:28,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754684_13860, type=LAST_IN_PIPELINE terminating
2025-07-18 18:17:34,372 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754684_13860 replica FinalizedReplica, blk_1073754684_13860, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754684 for deletion
2025-07-18 18:17:34,373 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754684_13860 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754684
2025-07-18 18:18:28,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754685_13861 src: /192.168.158.5:34406 dest: /192.168.158.4:9866
2025-07-18 18:18:28,657 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.5:34406, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_867789423_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754685_13861, duration(ns): 20580759 2025-07-18 18:18:28,657 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754685_13861, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-18 18:18:31,373 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754685_13861 replica FinalizedReplica, blk_1073754685_13861, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754685 for deletion 2025-07-18 18:18:31,374 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754685_13861 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754685 2025-07-18 18:19:28,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754686_13862 src: /192.168.158.1:51620 dest: /192.168.158.4:9866 2025-07-18 18:19:28,658 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51620, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1034158864_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754686_13862, duration(ns): 25139258 2025-07-18 18:19:28,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754686_13862, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-18 18:19:34,375 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754686_13862 replica FinalizedReplica, blk_1073754686_13862, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754686 for deletion 2025-07-18 18:19:34,376 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754686_13862 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754686 2025-07-18 18:20:33,641 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754687_13863 src: /192.168.158.9:38802 dest: /192.168.158.4:9866 2025-07-18 18:20:33,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38802, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1102365750_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754687_13863, duration(ns): 15754456 2025-07-18 18:20:33,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754687_13863, type=LAST_IN_PIPELINE terminating 2025-07-18 18:20:40,378 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754687_13863 replica FinalizedReplica, blk_1073754687_13863, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754687 for deletion 2025-07-18 18:20:40,380 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754687_13863 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754687 2025-07-18 18:23:33,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754690_13866 src: /192.168.158.1:49162 dest: /192.168.158.4:9866 2025-07-18 18:23:33,672 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49162, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_596339635_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754690_13866, duration(ns): 28854809 2025-07-18 18:23:33,672 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754690_13866, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-18 18:23:37,387 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754690_13866 replica FinalizedReplica, blk_1073754690_13866, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754690 for deletion 2025-07-18 18:23:37,388 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754690_13866 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754690 2025-07-18 18:25:38,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754692_13868 src: /192.168.158.6:45864 dest: /192.168.158.4:9866 2025-07-18 18:25:38,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45864, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1482618122_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754692_13868, duration(ns): 20128122 2025-07-18 18:25:38,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754692_13868, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-18 18:25:40,387 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754692_13868 replica FinalizedReplica, blk_1073754692_13868, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754692 for deletion 2025-07-18 18:25:40,389 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754692_13868 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754692 2025-07-18 18:29:43,641 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754696_13872 src: /192.168.158.1:44522 dest: /192.168.158.4:9866 2025-07-18 18:29:43,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:44522, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_660884236_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754696_13872, duration(ns): 24068584 2025-07-18 18:29:43,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754696_13872, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-18 18:29:46,393 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754696_13872 replica FinalizedReplica, blk_1073754696_13872, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754696 for deletion 2025-07-18 18:29:46,394 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754696_13872 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754696 2025-07-18 18:31:43,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754698_13874 src: /192.168.158.9:41804 dest: /192.168.158.4:9866 2025-07-18 18:31:43,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41804, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1929092982_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754698_13874, duration(ns): 19529953 2025-07-18 18:31:43,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073754698_13874, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-18 18:31:49,399 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754698_13874 replica FinalizedReplica, blk_1073754698_13874, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754698 for deletion 2025-07-18 18:31:49,400 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754698_13874 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754698 2025-07-18 18:35:48,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754702_13878 src: /192.168.158.1:38264 dest: /192.168.158.4:9866 2025-07-18 18:35:48,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38264, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-680745216_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754702_13878, duration(ns): 24762678 2025-07-18 18:35:48,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754702_13878, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-18 18:35:52,409 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754702_13878 replica FinalizedReplica, blk_1073754702_13878, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 
getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754702 for deletion 2025-07-18 18:35:52,410 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754702_13878 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754702 2025-07-18 18:36:48,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754703_13879 src: /192.168.158.1:49548 dest: /192.168.158.4:9866 2025-07-18 18:36:48,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49548, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1560928780_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754703_13879, duration(ns): 24584634 2025-07-18 18:36:48,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754703_13879, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-18 18:36:52,412 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754703_13879 replica FinalizedReplica, blk_1073754703_13879, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754703 for deletion 2025-07-18 18:36:52,413 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754703_13879 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754703 2025-07-18 18:37:48,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754704_13880 src: /192.168.158.7:33076 dest: /192.168.158.4:9866 2025-07-18 18:37:48,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33076, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2046804879_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754704_13880, duration(ns): 15772181 2025-07-18 18:37:48,678 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754704_13880, type=LAST_IN_PIPELINE terminating 2025-07-18 18:37:55,417 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754704_13880 replica FinalizedReplica, blk_1073754704_13880, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754704 for deletion 2025-07-18 18:37:55,418 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754704_13880 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754704 2025-07-18 18:41:53,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754708_13884 src: /192.168.158.5:38224 dest: /192.168.158.4:9866 2025-07-18 18:41:53,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38224, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-936778323_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754708_13884, duration(ns): 22869280 2025-07-18 18:41:53,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754708_13884, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-18 18:41:55,429 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754708_13884 replica FinalizedReplica, blk_1073754708_13884, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754708 for deletion 2025-07-18 18:41:55,430 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754708_13884 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754708 2025-07-18 18:42:53,675 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754709_13885 src: /192.168.158.6:40538 dest: /192.168.158.4:9866 2025-07-18 18:42:53,697 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40538, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1687305900_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754709_13885, duration(ns): 20126987 2025-07-18 18:42:53,697 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754709_13885, type=LAST_IN_PIPELINE terminating 2025-07-18 18:42:55,432 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754709_13885 replica FinalizedReplica, blk_1073754709_13885, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754709 for deletion 2025-07-18 18:42:55,433 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754709_13885 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754709 2025-07-18 18:43:53,680 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754710_13886 src: /192.168.158.6:58692 dest: /192.168.158.4:9866 2025-07-18 18:43:53,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58692, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1534347966_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754710_13886, duration(ns): 18383404 2025-07-18 18:43:53,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754710_13886, type=LAST_IN_PIPELINE terminating 2025-07-18 18:43:58,435 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754710_13886 replica FinalizedReplica, blk_1073754710_13886, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754710 for deletion 2025-07-18 18:43:58,436 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754710_13886 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754710 2025-07-18 18:44:53,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754711_13887 src: /192.168.158.1:54364 dest: /192.168.158.4:9866 2025-07-18 18:44:53,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54364, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2119588876_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754711_13887, duration(ns): 25958989 2025-07-18 18:44:53,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754711_13887, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-18 18:44:55,437 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754711_13887 replica FinalizedReplica, blk_1073754711_13887, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754711 for deletion 2025-07-18 18:44:55,438 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754711_13887 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754711 2025-07-18 18:45:53,686 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073754712_13888 src: /192.168.158.8:39172 dest: /192.168.158.4:9866 2025-07-18 18:45:53,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39172, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-351564509_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754712_13888, duration(ns): 18168444 2025-07-18 18:45:53,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754712_13888, type=LAST_IN_PIPELINE terminating 2025-07-18 18:45:55,439 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754712_13888 replica FinalizedReplica, blk_1073754712_13888, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754712 for deletion 2025-07-18 18:45:55,440 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754712_13888 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754712 2025-07-18 18:48:53,672 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754715_13891 src: /192.168.158.6:52334 dest: /192.168.158.4:9866 2025-07-18 18:48:53,699 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52334, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1031791610_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754715_13891, duration(ns): 20941449 
2025-07-18 18:48:53,699 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754715_13891, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-18 18:48:55,445 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754715_13891 replica FinalizedReplica, blk_1073754715_13891, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754715 for deletion 2025-07-18 18:48:55,446 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754715_13891 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754715 2025-07-18 18:50:53,719 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754717_13893 src: /192.168.158.8:50472 dest: /192.168.158.4:9866 2025-07-18 18:50:53,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1841152606_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754717_13893, duration(ns): 19481254 2025-07-18 18:50:53,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754717_13893, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-18 18:50:55,450 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754717_13893 replica FinalizedReplica, blk_1073754717_13893, 
FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754717 for deletion 2025-07-18 18:50:55,452 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754717_13893 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754717 2025-07-18 18:52:53,686 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754719_13895 src: /192.168.158.1:54872 dest: /192.168.158.4:9866 2025-07-18 18:52:53,720 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54872, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1459434220_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754719_13895, duration(ns): 24517266 2025-07-18 18:52:53,720 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754719_13895, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-18 18:52:58,457 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754719_13895 replica FinalizedReplica, blk_1073754719_13895, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754719 for deletion 2025-07-18 18:52:58,458 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 
blk_1073754719_13895 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754719
2025-07-18 18:53:53,699 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754720_13896 src: /192.168.158.5:45824 dest: /192.168.158.4:9866
2025-07-18 18:53:53,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45824, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1411185926_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754720_13896, duration(ns): 20390551
2025-07-18 18:53:53,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754720_13896, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 18:53:55,457 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754720_13896 replica FinalizedReplica, blk_1073754720_13896, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754720 for deletion
2025-07-18 18:53:55,458 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754720_13896 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754720
2025-07-18 18:54:53,704 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754721_13897 src: /192.168.158.6:44054 dest: /192.168.158.4:9866
2025-07-18 18:54:53,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44054, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1161660423_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754721_13897, duration(ns): 15532500
2025-07-18 18:54:53,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754721_13897, type=LAST_IN_PIPELINE terminating
2025-07-18 18:54:55,460 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754721_13897 replica FinalizedReplica, blk_1073754721_13897, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754721 for deletion
2025-07-18 18:54:55,461 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754721_13897 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754721
2025-07-18 18:55:53,704 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754722_13898 src: /192.168.158.7:41992 dest: /192.168.158.4:9866
2025-07-18 18:55:53,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41992, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-771269899_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754722_13898, duration(ns): 16404099
2025-07-18 18:55:53,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754722_13898, type=LAST_IN_PIPELINE terminating
2025-07-18 18:55:58,463 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754722_13898 replica FinalizedReplica, blk_1073754722_13898, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754722 for deletion
2025-07-18 18:55:58,464 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754722_13898 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754722
2025-07-18 18:58:53,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754725_13901 src: /192.168.158.5:58528 dest: /192.168.158.4:9866
2025-07-18 18:58:53,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58528, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_112028208_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754725_13901, duration(ns): 18506718
2025-07-18 18:58:53,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754725_13901, type=LAST_IN_PIPELINE terminating
2025-07-18 18:58:55,469 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754725_13901 replica FinalizedReplica, blk_1073754725_13901, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754725 for deletion
2025-07-18 18:58:55,471 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754725_13901 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754725
2025-07-18 18:59:53,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754726_13902 src: /192.168.158.7:33250 dest: /192.168.158.4:9866
2025-07-18 18:59:53,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33250, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_21087670_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754726_13902, duration(ns): 16605092
2025-07-18 18:59:53,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754726_13902, type=LAST_IN_PIPELINE terminating
2025-07-18 18:59:55,471 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754726_13902 replica FinalizedReplica, blk_1073754726_13902, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754726 for deletion
2025-07-18 18:59:55,472 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754726_13902 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754726
2025-07-18 19:05:53,712 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754732_13908 src: /192.168.158.1:38252 dest: /192.168.158.4:9866
2025-07-18 19:05:53,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38252, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-428520824_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754732_13908, duration(ns): 24536474
2025-07-18 19:05:53,746 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754732_13908, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-18 19:05:55,482 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754732_13908 replica FinalizedReplica, blk_1073754732_13908, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754732 for deletion
2025-07-18 19:05:55,483 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754732_13908 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754732
2025-07-18 19:06:53,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754733_13909 src: /192.168.158.8:57980 dest: /192.168.158.4:9866
2025-07-18 19:06:53,736 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57980, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_501612887_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754733_13909, duration(ns): 16096364
2025-07-18 19:06:53,736 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754733_13909, type=LAST_IN_PIPELINE terminating
2025-07-18 19:06:55,485 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754733_13909 replica FinalizedReplica, blk_1073754733_13909, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754733 for deletion
2025-07-18 19:06:55,486 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754733_13909 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754733
2025-07-18 19:07:53,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754734_13910 src: /192.168.158.1:44096 dest: /192.168.158.4:9866
2025-07-18 19:07:53,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44096, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2000976946_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754734_13910, duration(ns): 23071125
2025-07-18 19:07:53,755 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754734_13910, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-18 19:07:55,485 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754734_13910 replica FinalizedReplica, blk_1073754734_13910, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754734 for deletion
2025-07-18 19:07:55,487 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754734_13910 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754734
2025-07-18 19:08:53,728 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754735_13911 src: /192.168.158.6:45052 dest: /192.168.158.4:9866
2025-07-18 19:08:53,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45052, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1403218149_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754735_13911, duration(ns): 20555431
2025-07-18 19:08:53,755 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754735_13911, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 19:08:55,489 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754735_13911 replica FinalizedReplica, blk_1073754735_13911, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754735 for deletion
2025-07-18 19:08:55,490 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754735_13911 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754735
2025-07-18 19:09:53,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754736_13912 src: /192.168.158.1:40626 dest: /192.168.158.4:9866
2025-07-18 19:09:53,759 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40626, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_678133455_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754736_13912, duration(ns): 23740200
2025-07-18 19:09:53,759 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754736_13912, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-18 19:09:55,494 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754736_13912 replica FinalizedReplica, blk_1073754736_13912, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754736 for deletion
2025-07-18 19:09:55,495 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754736_13912 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754736
2025-07-18 19:16:03,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754742_13918 src: /192.168.158.5:48108 dest: /192.168.158.4:9866
2025-07-18 19:16:03,773 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48108, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_546900282_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754742_13918, duration(ns): 22942052
2025-07-18 19:16:03,774 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754742_13918, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 19:16:10,502 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754742_13918 replica FinalizedReplica, blk_1073754742_13918, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754742 for deletion
2025-07-18 19:16:10,503 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754742_13918 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754742
2025-07-18 19:19:08,746 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754745_13921 src: /192.168.158.1:45588 dest: /192.168.158.4:9866
2025-07-18 19:19:08,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45588, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1848523803_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754745_13921, duration(ns): 27987868
2025-07-18 19:19:08,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754745_13921, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-18 19:19:10,506 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754745_13921 replica FinalizedReplica, blk_1073754745_13921, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754745 for deletion
2025-07-18 19:19:10,507 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754745_13921 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754745
2025-07-18 19:22:18,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754748_13924 src: /192.168.158.8:39708 dest: /192.168.158.4:9866
2025-07-18 19:22:18,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39708, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2103076442_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754748_13924, duration(ns): 18696103
2025-07-18 19:22:18,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754748_13924, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 19:22:22,513 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754748_13924 replica FinalizedReplica, blk_1073754748_13924, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754748 for deletion
2025-07-18 19:22:22,515 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754748_13924 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754748
2025-07-18 19:26:18,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754752_13928 src: /192.168.158.7:46792 dest: /192.168.158.4:9866
2025-07-18 19:26:18,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46792, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_430658981_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754752_13928, duration(ns): 22946702
2025-07-18 19:26:18,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754752_13928, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 19:26:22,523 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754752_13928 replica FinalizedReplica, blk_1073754752_13928, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754752 for deletion
2025-07-18 19:26:22,524 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754752_13928 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754752
2025-07-18 19:27:18,764 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754753_13929 src: /192.168.158.1:41408 dest: /192.168.158.4:9866
2025-07-18 19:27:18,800 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41408, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-821563235_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754753_13929, duration(ns): 26458058
2025-07-18 19:27:18,801 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754753_13929, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-18 19:27:22,523 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754753_13929 replica FinalizedReplica, blk_1073754753_13929, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754753 for deletion
2025-07-18 19:27:22,525 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754753_13929 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754753
2025-07-18 19:30:18,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754756_13932 src: /192.168.158.9:34868 dest: /192.168.158.4:9866
2025-07-18 19:30:18,795 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34868, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1667370413_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754756_13932, duration(ns): 16954662
2025-07-18 19:30:18,795 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754756_13932, type=LAST_IN_PIPELINE terminating
2025-07-18 19:30:22,528 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754756_13932 replica FinalizedReplica, blk_1073754756_13932, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754756 for deletion
2025-07-18 19:30:22,529 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754756_13932 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754756
2025-07-18 19:31:18,774 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754757_13933 src: /192.168.158.6:43222 dest: /192.168.158.4:9866
2025-07-18 19:31:18,793 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43222, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1880220959_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754757_13933, duration(ns): 17100672
2025-07-18 19:31:18,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754757_13933, type=LAST_IN_PIPELINE terminating
2025-07-18 19:31:22,531 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754757_13933 replica FinalizedReplica, blk_1073754757_13933, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754757 for deletion
2025-07-18 19:31:22,532 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754757_13933 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754757
2025-07-18 19:32:18,762 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754758_13934 src: /192.168.158.7:40196 dest: /192.168.158.4:9866
2025-07-18 19:32:18,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40196, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1078632344_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754758_13934, duration(ns): 20107969
2025-07-18 19:32:18,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754758_13934, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 19:32:19,534 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754758_13934 replica FinalizedReplica, blk_1073754758_13934, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754758 for deletion
2025-07-18 19:32:19,536 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754758_13934 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754758
2025-07-18 19:34:23,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754760_13936 src: /192.168.158.1:41112 dest: /192.168.158.4:9866
2025-07-18 19:34:23,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41112, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-374897696_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754760_13936, duration(ns): 23043964
2025-07-18 19:34:23,800 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754760_13936, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-18 19:34:25,537 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754760_13936 replica FinalizedReplica, blk_1073754760_13936, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754760 for deletion
2025-07-18 19:34:25,538 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754760_13936 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754760
2025-07-18 19:37:23,784 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754763_13939 src: /192.168.158.5:41432 dest: /192.168.158.4:9866
2025-07-18 19:37:23,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41432, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1907799204_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754763_13939, duration(ns): 17116784
2025-07-18 19:37:23,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754763_13939, type=LAST_IN_PIPELINE terminating
2025-07-18 19:37:28,541 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754763_13939 replica FinalizedReplica, blk_1073754763_13939, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754763 for deletion
2025-07-18 19:37:28,542 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754763_13939 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754763
2025-07-18 19:39:23,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754765_13941 src: /192.168.158.1:36760 dest: /192.168.158.4:9866
2025-07-18 19:39:23,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36760, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-79353867_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754765_13941, duration(ns): 23979093
2025-07-18 19:39:23,810 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754765_13941, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-18 19:39:25,544 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754765_13941 replica FinalizedReplica, blk_1073754765_13941, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754765 for deletion
2025-07-18 19:39:25,546 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754765_13941 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754765
2025-07-18 19:43:23,789 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754769_13945 src: /192.168.158.6:57126 dest: /192.168.158.4:9866
2025-07-18 19:43:23,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57126, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-342926314_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754769_13945, duration(ns): 18386796
2025-07-18 19:43:23,810 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754769_13945, type=LAST_IN_PIPELINE terminating
2025-07-18 19:43:25,551 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754769_13945 replica FinalizedReplica, blk_1073754769_13945, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754769 for deletion
2025-07-18 19:43:25,552 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754769_13945 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754769
2025-07-18 19:44:23,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754770_13946 src: /192.168.158.5:60232 dest: /192.168.158.4:9866
2025-07-18 19:44:23,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60232, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2037343153_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754770_13946, duration(ns): 22310070
2025-07-18 19:44:23,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754770_13946, type=LAST_IN_PIPELINE terminating
2025-07-18 19:44:28,552 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754770_13946 replica FinalizedReplica, blk_1073754770_13946, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754770 for deletion
2025-07-18 19:44:28,554 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754770_13946 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754770
2025-07-18 19:45:23,790 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754771_13947 src: /192.168.158.1:60758 dest: /192.168.158.4:9866
2025-07-18 19:45:23,823 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60758, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_330682744_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754771_13947, duration(ns): 23716562
2025-07-18 19:45:23,824 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754771_13947, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-18 19:45:28,556 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754771_13947 replica FinalizedReplica, blk_1073754771_13947, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754771 for deletion
2025-07-18 19:45:28,556 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754771_13947 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754771
2025-07-18 19:47:23,800 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754773_13949 src: /192.168.158.1:51390 dest: /192.168.158.4:9866
2025-07-18 19:47:23,834 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51390, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1454047963_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754773_13949, duration(ns): 24918043
2025-07-18 19:47:23,834 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754773_13949, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-18 19:47:28,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754773_13949 replica FinalizedReplica, blk_1073754773_13949, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754773 for deletion
2025-07-18 19:47:28,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754773_13949 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754773
2025-07-18 19:48:23,805 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754774_13950 src: /192.168.158.7:34962 dest: /192.168.158.4:9866
2025-07-18 19:48:23,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34962, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-577795273_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754774_13950, duration(ns): 17893744
2025-07-18 19:48:23,826 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754774_13950, type=LAST_IN_PIPELINE terminating
2025-07-18 19:48:25,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754774_13950 replica FinalizedReplica, blk_1073754774_13950, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754774 for deletion
2025-07-18 19:48:25,564 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754774_13950 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754774
2025-07-18 19:49:23,806 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754775_13951 src: /192.168.158.1:48558 dest: /192.168.158.4:9866
2025-07-18 19:49:23,840 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48558, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1513432828_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754775_13951, duration(ns): 23013199
2025-07-18 19:49:23,840 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754775_13951, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-18 19:49:25,567 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling
blk_1073754775_13951 replica FinalizedReplica, blk_1073754775_13951, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754775 for deletion 2025-07-18 19:49:25,568 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754775_13951 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754775 2025-07-18 19:56:23,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754782_13958 src: /192.168.158.8:53828 dest: /192.168.158.4:9866 2025-07-18 19:56:23,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53828, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-965703375_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754782_13958, duration(ns): 20224930 2025-07-18 19:56:23,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754782_13958, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-18 19:56:25,576 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754782_13958 replica FinalizedReplica, blk_1073754782_13958, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754782 for deletion 2025-07-18 19:56:25,577 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754782_13958 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754782 2025-07-18 19:57:23,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754783_13959 src: /192.168.158.1:52414 dest: /192.168.158.4:9866 2025-07-18 19:57:23,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52414, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-861246778_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754783_13959, duration(ns): 22025075 2025-07-18 19:57:23,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754783_13959, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-18 19:57:25,580 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754783_13959 replica FinalizedReplica, blk_1073754783_13959, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754783 for deletion 2025-07-18 19:57:25,581 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754783_13959 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754783 2025-07-18 20:03:28,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754789_13965 src: /192.168.158.8:39538 dest: /192.168.158.4:9866 2025-07-18 
20:03:28,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39538, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-339543405_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754789_13965, duration(ns): 21105086 2025-07-18 20:03:28,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754789_13965, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-18 20:03:31,591 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754789_13965 replica FinalizedReplica, blk_1073754789_13965, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754789 for deletion 2025-07-18 20:03:31,592 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754789_13965 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754789 2025-07-18 20:04:28,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754790_13966 src: /192.168.158.5:33408 dest: /192.168.158.4:9866 2025-07-18 20:04:28,859 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33408, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1564534781_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754790_13966, duration(ns): 22886050 2025-07-18 20:04:28,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754790_13966, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-18 20:04:34,593 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754790_13966 replica FinalizedReplica, blk_1073754790_13966, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754790 for deletion 2025-07-18 20:04:34,595 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754790_13966 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754790 2025-07-18 20:08:33,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754794_13970 src: /192.168.158.1:46496 dest: /192.168.158.4:9866 2025-07-18 20:08:33,865 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46496, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_289004614_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754794_13970, duration(ns): 25071116 2025-07-18 20:08:33,868 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754794_13970, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-18 20:08:34,604 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754794_13970 replica FinalizedReplica, blk_1073754794_13970, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754794 for deletion 2025-07-18 20:08:34,605 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754794_13970 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754794 2025-07-18 20:11:33,840 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754797_13973 src: /192.168.158.1:35422 dest: /192.168.158.4:9866 2025-07-18 20:11:33,875 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35422, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2101981303_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754797_13973, duration(ns): 26122012 2025-07-18 20:11:33,876 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754797_13973, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-18 20:11:37,607 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754797_13973 replica FinalizedReplica, blk_1073754797_13973, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754797 for deletion 2025-07-18 20:11:37,608 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754797_13973 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754797 2025-07-18 20:12:33,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754798_13974 src: /192.168.158.7:45900 dest: /192.168.158.4:9866 2025-07-18 20:12:33,867 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45900, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1267615625_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754798_13974, duration(ns): 18378260 2025-07-18 20:12:33,868 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754798_13974, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-18 20:12:34,609 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754798_13974 replica FinalizedReplica, blk_1073754798_13974, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754798 for deletion 2025-07-18 20:12:34,610 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754798_13974 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754798 2025-07-18 20:18:33,862 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754804_13980 src: /192.168.158.7:39884 dest: /192.168.158.4:9866 2025-07-18 20:18:33,881 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.7:39884, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1794298932_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754804_13980, duration(ns): 16210631 2025-07-18 20:18:33,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754804_13980, type=LAST_IN_PIPELINE terminating 2025-07-18 20:18:37,617 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754804_13980 replica FinalizedReplica, blk_1073754804_13980, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754804 for deletion 2025-07-18 20:18:37,618 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754804_13980 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754804 2025-07-18 20:23:33,868 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754809_13985 src: /192.168.158.6:59992 dest: /192.168.158.4:9866 2025-07-18 20:23:33,887 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59992, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1566419136_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754809_13985, duration(ns): 17449970 2025-07-18 20:23:33,888 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754809_13985, type=LAST_IN_PIPELINE terminating 2025-07-18 
20:23:34,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754809_13985 replica FinalizedReplica, blk_1073754809_13985, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754809 for deletion 2025-07-18 20:23:34,626 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754809_13985 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754809 2025-07-18 20:25:33,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754811_13987 src: /192.168.158.5:58970 dest: /192.168.158.4:9866 2025-07-18 20:25:33,891 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58970, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1511690146_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754811_13987, duration(ns): 19520352 2025-07-18 20:25:33,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754811_13987, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-18 20:25:37,629 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754811_13987 replica FinalizedReplica, blk_1073754811_13987, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754811 for deletion 
2025-07-18 20:25:37,631 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754811_13987 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754811 2025-07-18 20:26:33,876 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754812_13988 src: /192.168.158.7:33860 dest: /192.168.158.4:9866 2025-07-18 20:26:33,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33860, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_409817599_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754812_13988, duration(ns): 21063378 2025-07-18 20:26:33,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754812_13988, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-18 20:26:34,631 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754812_13988 replica FinalizedReplica, blk_1073754812_13988, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754812 for deletion 2025-07-18 20:26:34,632 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754812_13988 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754812 2025-07-18 20:30:33,878 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073754816_13992 src: /192.168.158.7:35832 dest: /192.168.158.4:9866 2025-07-18 20:30:33,897 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35832, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_67631608_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754816_13992, duration(ns): 16616705 2025-07-18 20:30:33,897 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754816_13992, type=LAST_IN_PIPELINE terminating 2025-07-18 20:30:37,640 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754816_13992 replica FinalizedReplica, blk_1073754816_13992, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754816 for deletion 2025-07-18 20:30:37,641 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754816_13992 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754816 2025-07-18 20:31:33,876 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754817_13993 src: /192.168.158.7:47218 dest: /192.168.158.4:9866 2025-07-18 20:31:33,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47218, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_158266362_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754817_13993, duration(ns): 21263327 
2025-07-18 20:31:33,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754817_13993, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-18 20:31:34,644 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754817_13993 replica FinalizedReplica, blk_1073754817_13993, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754817 for deletion 2025-07-18 20:31:34,645 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754817_13993 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754817 2025-07-18 20:32:33,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754818_13994 src: /192.168.158.6:48594 dest: /192.168.158.4:9866 2025-07-18 20:32:33,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48594, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-83809176_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754818_13994, duration(ns): 16962989 2025-07-18 20:32:33,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754818_13994, type=LAST_IN_PIPELINE terminating 2025-07-18 20:32:37,645 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754818_13994 replica FinalizedReplica, blk_1073754818_13994, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754818 for deletion 2025-07-18 20:32:37,646 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754818_13994 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754818 2025-07-18 20:33:33,881 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754819_13995 src: /192.168.158.5:42136 dest: /192.168.158.4:9866 2025-07-18 20:33:33,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1294802457_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754819_13995, duration(ns): 16254881 2025-07-18 20:33:33,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754819_13995, type=LAST_IN_PIPELINE terminating 2025-07-18 20:33:34,648 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754819_13995 replica FinalizedReplica, blk_1073754819_13995, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754819 for deletion 2025-07-18 20:33:34,650 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754819_13995 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754819 2025-07-18 20:35:33,875 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754821_13997 src: /192.168.158.1:45918 dest: /192.168.158.4:9866 2025-07-18 20:35:33,908 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45918, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1140031894_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754821_13997, duration(ns): 23829394 2025-07-18 20:35:33,909 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754821_13997, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-18 20:35:34,651 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754821_13997 replica FinalizedReplica, blk_1073754821_13997, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754821 for deletion 2025-07-18 20:35:34,652 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754821_13997 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754821 2025-07-18 20:36:33,881 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754822_13998 src: /192.168.158.1:40100 dest: /192.168.158.4:9866 2025-07-18 20:36:33,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: 
src: /192.168.158.1:40100, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2075738393_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754822_13998, duration(ns): 22947270 2025-07-18 20:36:33,915 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754822_13998, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-18 20:36:37,651 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754822_13998 replica FinalizedReplica, blk_1073754822_13998, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754822 for deletion 2025-07-18 20:36:37,653 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754822_13998 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754822 2025-07-18 20:38:33,883 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754824_14000 src: /192.168.158.1:45888 dest: /192.168.158.4:9866 2025-07-18 20:38:33,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45888, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1001764594_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754824_14000, duration(ns): 26116443 2025-07-18 20:38:33,919 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073754824_14000, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-18 20:38:34,659 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754824_14000 replica FinalizedReplica, blk_1073754824_14000, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754824 for deletion
2025-07-18 20:38:34,660 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754824_14000 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754824
2025-07-18 20:40:38,891 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754826_14002 src: /192.168.158.7:38332 dest: /192.168.158.4:9866
2025-07-18 20:40:38,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38332, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_958088311_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754826_14002, duration(ns): 20803365
2025-07-18 20:40:38,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754826_14002, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 20:40:40,664 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754826_14002 replica FinalizedReplica, blk_1073754826_14002, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754826 for deletion
2025-07-18 20:40:40,665 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754826_14002 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754826
2025-07-18 20:43:38,890 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754829_14005 src: /192.168.158.1:38154 dest: /192.168.158.4:9866
2025-07-18 20:43:38,924 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38154, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1401148540_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754829_14005, duration(ns): 25880361
2025-07-18 20:43:38,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754829_14005, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-18 20:43:40,677 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754829_14005 replica FinalizedReplica, blk_1073754829_14005, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754829 for deletion
2025-07-18 20:43:40,678 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754829_14005 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754829
2025-07-18 20:44:38,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754830_14006 src: /192.168.158.6:36136 dest: /192.168.158.4:9866
2025-07-18 20:44:38,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_816459411_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754830_14006, duration(ns): 17785398
2025-07-18 20:44:38,919 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754830_14006, type=LAST_IN_PIPELINE terminating
2025-07-18 20:44:40,681 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754830_14006 replica FinalizedReplica, blk_1073754830_14006, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754830 for deletion
2025-07-18 20:44:40,682 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754830_14006 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754830
2025-07-18 20:45:43,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754831_14007 src: /192.168.158.1:48020 dest: /192.168.158.4:9866
2025-07-18 20:45:43,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48020, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2009783900_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754831_14007, duration(ns): 22536473
2025-07-18 20:45:43,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754831_14007, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-18 20:45:46,684 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754831_14007 replica FinalizedReplica, blk_1073754831_14007, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754831 for deletion
2025-07-18 20:45:46,685 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754831_14007 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754831
2025-07-18 20:47:48,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754833_14009 src: /192.168.158.6:47198 dest: /192.168.158.4:9866
2025-07-18 20:47:48,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47198, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2101616176_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754833_14009, duration(ns): 16828827
2025-07-18 20:47:48,922 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754833_14009, type=LAST_IN_PIPELINE terminating
2025-07-18 20:47:49,688 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754833_14009 replica FinalizedReplica, blk_1073754833_14009, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754833 for deletion
2025-07-18 20:47:49,689 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754833_14009 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754833
2025-07-18 20:51:53,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754837_14013 src: /192.168.158.7:59048 dest: /192.168.158.4:9866
2025-07-18 20:51:53,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59048, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1297592000_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754837_14013, duration(ns): 18160662
2025-07-18 20:51:53,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754837_14013, type=LAST_IN_PIPELINE terminating
2025-07-18 20:51:55,700 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754837_14013 replica FinalizedReplica, blk_1073754837_14013, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754837 for deletion
2025-07-18 20:51:55,701 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754837_14013 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754837
2025-07-18 20:52:53,906 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754838_14014 src: /192.168.158.5:36356 dest: /192.168.158.4:9866
2025-07-18 20:52:53,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36356, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-767680310_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754838_14014, duration(ns): 21135340
2025-07-18 20:52:53,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754838_14014, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 20:52:55,702 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754838_14014 replica FinalizedReplica, blk_1073754838_14014, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754838 for deletion
2025-07-18 20:52:55,704 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754838_14014 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754838
2025-07-18 20:55:53,911 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754841_14017 src: /192.168.158.5:38344 dest: /192.168.158.4:9866
2025-07-18 20:55:53,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38344, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-493879214_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754841_14017, duration(ns): 20887362
2025-07-18 20:55:53,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754841_14017, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 20:55:55,706 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754841_14017 replica FinalizedReplica, blk_1073754841_14017, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754841 for deletion
2025-07-18 20:55:55,707 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754841_14017 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754841
2025-07-18 21:01:58,923 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754847_14023 src: /192.168.158.7:39960 dest: /192.168.158.4:9866
2025-07-18 21:01:58,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39960, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-678114554_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754847_14023, duration(ns): 18703496
2025-07-18 21:01:58,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754847_14023, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 21:02:01,724 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754847_14023 replica FinalizedReplica, blk_1073754847_14023, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754847 for deletion
2025-07-18 21:02:01,725 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754847_14023 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754847
2025-07-18 21:03:58,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754849_14025 src: /192.168.158.6:58022 dest: /192.168.158.4:9866
2025-07-18 21:03:58,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58022, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_861441667_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754849_14025, duration(ns): 16172041
2025-07-18 21:03:58,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754849_14025, type=LAST_IN_PIPELINE terminating
2025-07-18 21:04:04,728 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754849_14025 replica FinalizedReplica, blk_1073754849_14025, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754849 for deletion
2025-07-18 21:04:04,729 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754849_14025 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754849
2025-07-18 21:06:08,931 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754851_14027 src: /192.168.158.6:40256 dest: /192.168.158.4:9866
2025-07-18 21:06:08,950 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40256, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1547809196_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754851_14027, duration(ns): 16845431
2025-07-18 21:06:08,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754851_14027, type=LAST_IN_PIPELINE terminating
2025-07-18 21:06:10,732 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754851_14027 replica FinalizedReplica, blk_1073754851_14027, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754851 for deletion
2025-07-18 21:06:10,733 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754851_14027 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754851
2025-07-18 21:07:08,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754852_14028 src: /192.168.158.8:55474 dest: /192.168.158.4:9866
2025-07-18 21:07:08,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55474, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1646434441_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754852_14028, duration(ns): 17033688
2025-07-18 21:07:08,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754852_14028, type=LAST_IN_PIPELINE terminating
2025-07-18 21:07:13,734 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754852_14028 replica FinalizedReplica, blk_1073754852_14028, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754852 for deletion
2025-07-18 21:07:13,735 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754852_14028 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754852
2025-07-18 21:09:13,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754854_14030 src: /192.168.158.8:55620 dest: /192.168.158.4:9866
2025-07-18 21:09:13,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55620, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_721791597_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754854_14030, duration(ns): 21102523
2025-07-18 21:09:13,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754854_14030, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 21:09:16,739 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754854_14030 replica FinalizedReplica, blk_1073754854_14030, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754854 for deletion
2025-07-18 21:09:16,741 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754854_14030 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754854
2025-07-18 21:11:13,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754856_14032 src: /192.168.158.5:41550 dest: /192.168.158.4:9866
2025-07-18 21:11:13,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41550, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_375186820_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754856_14032, duration(ns): 16298773
2025-07-18 21:11:13,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754856_14032, type=LAST_IN_PIPELINE terminating
2025-07-18 21:11:19,743 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754856_14032 replica FinalizedReplica, blk_1073754856_14032, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754856 for deletion
2025-07-18 21:11:19,744 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754856_14032 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754856
2025-07-18 21:13:18,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754858_14034 src: /192.168.158.5:35978 dest: /192.168.158.4:9866
2025-07-18 21:13:18,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35978, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_595957320_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754858_14034, duration(ns): 16313705
2025-07-18 21:13:18,960 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754858_14034, type=LAST_IN_PIPELINE terminating
2025-07-18 21:13:22,748 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754858_14034 replica FinalizedReplica, blk_1073754858_14034, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754858 for deletion
2025-07-18 21:13:22,749 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754858_14034 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754858
2025-07-18 21:14:18,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754859_14035 src: /192.168.158.9:60804 dest: /192.168.158.4:9866
2025-07-18 21:14:18,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60804, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1517848122_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754859_14035, duration(ns): 17767561
2025-07-18 21:14:18,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754859_14035, type=LAST_IN_PIPELINE terminating
2025-07-18 21:14:22,751 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754859_14035 replica FinalizedReplica, blk_1073754859_14035, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754859 for deletion
2025-07-18 21:14:22,751 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754859_14035 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754859
2025-07-18 21:15:18,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754860_14036 src: /192.168.158.9:34158 dest: /192.168.158.4:9866
2025-07-18 21:15:18,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34158, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_163597314_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754860_14036, duration(ns): 16720790
2025-07-18 21:15:18,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754860_14036, type=LAST_IN_PIPELINE terminating
2025-07-18 21:15:19,751 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754860_14036 replica FinalizedReplica, blk_1073754860_14036, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754860 for deletion
2025-07-18 21:15:19,752 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754860_14036 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754860
2025-07-18 21:16:18,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754861_14037 src: /192.168.158.5:60234 dest: /192.168.158.4:9866
2025-07-18 21:16:18,964 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60234, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1155829396_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754861_14037, duration(ns): 17125064
2025-07-18 21:16:18,964 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754861_14037, type=LAST_IN_PIPELINE terminating
2025-07-18 21:16:19,752 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754861_14037 replica FinalizedReplica, blk_1073754861_14037, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754861 for deletion
2025-07-18 21:16:19,754 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754861_14037 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754861
2025-07-18 21:19:23,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754864_14040 src: /192.168.158.1:48362 dest: /192.168.158.4:9866
2025-07-18 21:19:23,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48362, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1634274690_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754864_14040, duration(ns): 23239217
2025-07-18 21:19:23,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754864_14040, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-18 21:19:25,763 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754864_14040 replica FinalizedReplica, blk_1073754864_14040, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754864 for deletion
2025-07-18 21:19:25,764 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754864_14040 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754864
2025-07-18 21:20:23,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754865_14041 src: /192.168.158.5:54716 dest: /192.168.158.4:9866
2025-07-18 21:20:23,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54716, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1849598043_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754865_14041, duration(ns): 19669962
2025-07-18 21:20:23,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754865_14041, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 21:20:25,764 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754865_14041 replica FinalizedReplica, blk_1073754865_14041, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754865 for deletion
2025-07-18 21:20:25,766 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754865_14041 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754865
2025-07-18 21:21:23,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754866_14042 src: /192.168.158.6:60412 dest: /192.168.158.4:9866
2025-07-18 21:21:23,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60412, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1367555398_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754866_14042, duration(ns): 21189113
2025-07-18 21:21:23,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754866_14042, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-18 21:21:25,766 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754866_14042 replica FinalizedReplica, blk_1073754866_14042, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754866 for deletion
2025-07-18 21:21:25,768 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754866_14042 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754866
2025-07-18 21:23:23,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754868_14044 src: /192.168.158.8:42408 dest: /192.168.158.4:9866
2025-07-18 21:23:23,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42408, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-647512660_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754868_14044, duration(ns): 22790523
2025-07-18 21:23:23,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754868_14044, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-18 21:23:25,770 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754868_14044 replica FinalizedReplica, blk_1073754868_14044, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754868 for deletion
2025-07-18 21:23:25,771 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754868_14044 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754868
2025-07-18 21:26:23,966 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754871_14047 src: /192.168.158.9:45720 dest: /192.168.158.4:9866
2025-07-18 21:26:23,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45720, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1323404752_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754871_14047, duration(ns): 15897394
2025-07-18 21:26:23,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754871_14047, type=LAST_IN_PIPELINE terminating
2025-07-18 21:26:28,781 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754871_14047 replica FinalizedReplica, blk_1073754871_14047, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754871 for deletion
2025-07-18 21:26:28,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754871_14047 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir18/blk_1073754871
2025-07-18 21:35:23,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754880_14056 src: /192.168.158.7:57004 dest: /192.168.158.4:9866
2025-07-18 21:35:23,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57004, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1732752418_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754880_14056, duration(ns): 16149365
2025-07-18 21:35:23,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754880_14056, type=LAST_IN_PIPELINE terminating
2025-07-18 21:35:25,807 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754880_14056 replica FinalizedReplica, blk_1073754880_14056, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754880 for deletion
2025-07-18 21:35:25,808 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754880_14056 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754880
2025-07-18 21:37:28,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754882_14058 src: /192.168.158.1:52784 dest: /192.168.158.4:9866
2025-07-18 21:37:29,012 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52784, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2140028760_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754882_14058, duration(ns): 23301781
2025-07-18 21:37:29,012 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754882_14058, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-18 21:37:34,810 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754882_14058 replica FinalizedReplica, blk_1073754882_14058, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754882 for deletion
2025-07-18 21:37:34,811 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754882_14058 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754882
2025-07-18 21:38:33,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving
BP-1059995147-192.168.158.1-1752101929360:blk_1073754883_14059 src: /192.168.158.1:56918 dest: /192.168.158.4:9866 2025-07-18 21:38:34,009 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56918, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1784481510_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754883_14059, duration(ns): 25809470 2025-07-18 21:38:34,009 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754883_14059, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-18 21:38:34,811 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754883_14059 replica FinalizedReplica, blk_1073754883_14059, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754883 for deletion 2025-07-18 21:38:34,813 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754883_14059 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754883 2025-07-18 21:40:33,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754885_14061 src: /192.168.158.7:59026 dest: /192.168.158.4:9866 2025-07-18 21:40:34,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59026, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1748869573_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073754885_14061, duration(ns): 17418460 2025-07-18 21:40:34,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754885_14061, type=LAST_IN_PIPELINE terminating 2025-07-18 21:40:34,813 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754885_14061 replica FinalizedReplica, blk_1073754885_14061, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754885 for deletion 2025-07-18 21:40:34,815 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754885_14061 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754885 2025-07-18 21:41:33,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754886_14062 src: /192.168.158.1:52852 dest: /192.168.158.4:9866 2025-07-18 21:41:34,012 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52852, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1051152813_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754886_14062, duration(ns): 23574109 2025-07-18 21:41:34,013 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754886_14062, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-18 21:41:34,816 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling 
blk_1073754886_14062 replica FinalizedReplica, blk_1073754886_14062, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754886 for deletion 2025-07-18 21:41:34,817 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754886_14062 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754886 2025-07-18 21:42:38,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754887_14063 src: /192.168.158.1:37466 dest: /192.168.158.4:9866 2025-07-18 21:42:39,013 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37466, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-785583297_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754887_14063, duration(ns): 21597916 2025-07-18 21:42:39,013 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754887_14063, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-18 21:42:43,820 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754887_14063 replica FinalizedReplica, blk_1073754887_14063, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754887 for deletion 2025-07-18 21:42:43,821 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754887_14063 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754887 2025-07-18 21:56:58,993 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754901_14077 src: /192.168.158.1:44620 dest: /192.168.158.4:9866 2025-07-18 21:56:59,029 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44620, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-352596986_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754901_14077, duration(ns): 25657539 2025-07-18 21:56:59,029 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754901_14077, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-18 21:57:04,854 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754901_14077 replica FinalizedReplica, blk_1073754901_14077, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754901 for deletion 2025-07-18 21:57:04,855 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754901_14077 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754901 2025-07-18 21:57:58,998 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073754902_14078 src: /192.168.158.9:53876 dest: /192.168.158.4:9866 2025-07-18 21:57:59,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53876, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1037410145_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754902_14078, duration(ns): 18283551 2025-07-18 21:57:59,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754902_14078, type=LAST_IN_PIPELINE terminating 2025-07-18 21:58:01,857 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754902_14078 replica FinalizedReplica, blk_1073754902_14078, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754902 for deletion 2025-07-18 21:58:01,858 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754902_14078 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754902 2025-07-18 21:59:19,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f4b, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5. 
2025-07-18 21:59:19,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360 2025-07-18 22:00:04,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754904_14080 src: /192.168.158.6:34760 dest: /192.168.158.4:9866 2025-07-18 22:00:04,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34760, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1477868123_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754904_14080, duration(ns): 16197148 2025-07-18 22:00:04,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754904_14080, type=LAST_IN_PIPELINE terminating 2025-07-18 22:00:04,860 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754904_14080 replica FinalizedReplica, blk_1073754904_14080, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754904 for deletion 2025-07-18 22:00:04,862 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754904_14080 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754904 2025-07-18 22:02:04,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754906_14082 src: /192.168.158.5:40818 dest: /192.168.158.4:9866 2025-07-18 22:02:04,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40818, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1182402107_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754906_14082, duration(ns): 20047526 2025-07-18 22:02:04,034 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754906_14082, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-18 22:02:04,866 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754906_14082 replica FinalizedReplica, blk_1073754906_14082, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754906 for deletion 2025-07-18 22:02:04,868 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754906_14082 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754906 2025-07-18 22:03:04,008 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754907_14083 src: /192.168.158.5:41058 dest: /192.168.158.4:9866 2025-07-18 22:03:04,027 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41058, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_154269288_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754907_14083, duration(ns): 17284233 2025-07-18 22:03:04,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754907_14083, type=LAST_IN_PIPELINE terminating 
2025-07-18 22:03:04,873 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754907_14083 replica FinalizedReplica, blk_1073754907_14083, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754907 for deletion 2025-07-18 22:03:04,874 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754907_14083 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754907 2025-07-18 22:07:14,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754911_14087 src: /192.168.158.7:54826 dest: /192.168.158.4:9866 2025-07-18 22:07:14,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54826, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1187512620_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754911_14087, duration(ns): 16196635 2025-07-18 22:07:14,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754911_14087, type=LAST_IN_PIPELINE terminating 2025-07-18 22:07:19,884 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754911_14087 replica FinalizedReplica, blk_1073754911_14087, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754911 for deletion 2025-07-18 22:07:19,885 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754911_14087 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754911 2025-07-18 22:09:14,018 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754913_14089 src: /192.168.158.8:48456 dest: /192.168.158.4:9866 2025-07-18 22:09:14,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48456, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_886212155_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754913_14089, duration(ns): 16061878 2025-07-18 22:09:14,037 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754913_14089, type=LAST_IN_PIPELINE terminating 2025-07-18 22:09:16,887 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754913_14089 replica FinalizedReplica, blk_1073754913_14089, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754913 for deletion 2025-07-18 22:09:16,888 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754913_14089 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754913 2025-07-18 22:10:14,043 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754914_14090 src: /192.168.158.9:33180 dest: /192.168.158.4:9866 
2025-07-18 22:10:14,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33180, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_940449217_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754914_14090, duration(ns): 16373721 2025-07-18 22:10:14,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754914_14090, type=LAST_IN_PIPELINE terminating 2025-07-18 22:10:16,890 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754914_14090 replica FinalizedReplica, blk_1073754914_14090, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754914 for deletion 2025-07-18 22:10:16,891 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754914_14090 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754914 2025-07-18 22:14:19,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754918_14094 src: /192.168.158.1:34802 dest: /192.168.158.4:9866 2025-07-18 22:14:19,054 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34802, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2094565439_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754918_14094, duration(ns): 25785720 2025-07-18 22:14:19,054 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073754918_14094, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-18 22:14:22,899 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754918_14094 replica FinalizedReplica, blk_1073754918_14094, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754918 for deletion 2025-07-18 22:14:22,900 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754918_14094 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754918 2025-07-18 22:16:19,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754920_14096 src: /192.168.158.1:57104 dest: /192.168.158.4:9866 2025-07-18 22:16:19,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57104, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_724939340_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754920_14096, duration(ns): 22791488 2025-07-18 22:16:19,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754920_14096, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-18 22:16:19,905 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754920_14096 replica FinalizedReplica, blk_1073754920_14096, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754920 for deletion 2025-07-18 22:16:19,906 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754920_14096 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754920 2025-07-18 22:18:24,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754922_14098 src: /192.168.158.1:56374 dest: /192.168.158.4:9866 2025-07-18 22:18:24,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56374, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-992575641_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754922_14098, duration(ns): 26423942 2025-07-18 22:18:24,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754922_14098, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-18 22:18:25,912 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754922_14098 replica FinalizedReplica, blk_1073754922_14098, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754922 for deletion 2025-07-18 22:18:25,913 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754922_14098 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754922 2025-07-18 22:21:24,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754925_14101 src: /192.168.158.1:43426 dest: /192.168.158.4:9866 2025-07-18 22:21:24,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43426, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1362897560_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754925_14101, duration(ns): 24616055 2025-07-18 22:21:24,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754925_14101, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-18 22:21:25,914 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754925_14101 replica FinalizedReplica, blk_1073754925_14101, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754925 for deletion 2025-07-18 22:21:25,915 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754925_14101 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754925 2025-07-18 22:23:29,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754927_14103 src: /192.168.158.1:45164 dest: /192.168.158.4:9866 2025-07-18 22:23:29,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: 
src: /192.168.158.1:45164, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1156154272_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754927_14103, duration(ns): 24332577
2025-07-18 22:23:29,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754927_14103, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-18 22:23:37,919 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754927_14103 replica FinalizedReplica, blk_1073754927_14103, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754927 for deletion
2025-07-18 22:23:37,920 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754927_14103 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754927
2025-07-18 22:25:29,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754929_14105 src: /192.168.158.1:50920 dest: /192.168.158.4:9866
2025-07-18 22:25:29,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50920, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-386474630_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754929_14105, duration(ns): 24398519
2025-07-18 22:25:29,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754929_14105, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-18 22:25:37,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754929_14105 replica FinalizedReplica, blk_1073754929_14105, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754929 for deletion
2025-07-18 22:25:37,929 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754929_14105 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754929
2025-07-18 22:26:34,045 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754930_14106 src: /192.168.158.5:55846 dest: /192.168.158.4:9866
2025-07-18 22:26:34,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55846, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_705292032_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754930_14106, duration(ns): 17022576
2025-07-18 22:26:34,065 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754930_14106, type=LAST_IN_PIPELINE terminating
2025-07-18 22:26:37,929 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754930_14106 replica FinalizedReplica, blk_1073754930_14106, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754930 for deletion
2025-07-18 22:26:37,930 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754930_14106 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754930
2025-07-18 22:27:34,048 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754931_14107 src: /192.168.158.7:37566 dest: /192.168.158.4:9866
2025-07-18 22:27:34,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37566, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_501768218_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754931_14107, duration(ns): 21425446
2025-07-18 22:27:34,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754931_14107, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 22:27:40,930 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754931_14107 replica FinalizedReplica, blk_1073754931_14107, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754931 for deletion
2025-07-18 22:27:40,931 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754931_14107 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754931
2025-07-18 22:29:34,045 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754933_14109 src: /192.168.158.1:60978 dest: /192.168.158.4:9866
2025-07-18 22:29:34,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60978, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1711205373_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754933_14109, duration(ns): 25415185
2025-07-18 22:29:34,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754933_14109, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-18 22:29:37,933 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754933_14109 replica FinalizedReplica, blk_1073754933_14109, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754933 for deletion
2025-07-18 22:29:37,934 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754933_14109 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754933
2025-07-18 22:30:34,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754934_14110 src: /192.168.158.1:51950 dest: /192.168.158.4:9866
2025-07-18 22:30:34,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51950, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1676730435_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754934_14110, duration(ns): 22686105
2025-07-18 22:30:34,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754934_14110, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-18 22:30:40,934 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754934_14110 replica FinalizedReplica, blk_1073754934_14110, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754934 for deletion
2025-07-18 22:30:40,935 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754934_14110 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754934
2025-07-18 22:32:39,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754936_14112 src: /192.168.158.1:38424 dest: /192.168.158.4:9866
2025-07-18 22:32:39,095 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38424, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1658805696_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754936_14112, duration(ns): 25063218
2025-07-18 22:32:39,096 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754936_14112, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-18 22:32:46,940 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754936_14112 replica FinalizedReplica, blk_1073754936_14112, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754936 for deletion
2025-07-18 22:32:46,941 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754936_14112 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754936
2025-07-18 22:35:39,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754939_14115 src: /192.168.158.8:55906 dest: /192.168.158.4:9866
2025-07-18 22:35:39,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55906, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1684551592_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754939_14115, duration(ns): 17389410
2025-07-18 22:35:39,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754939_14115, type=LAST_IN_PIPELINE terminating
2025-07-18 22:35:43,946 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754939_14115 replica FinalizedReplica, blk_1073754939_14115, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754939 for deletion
2025-07-18 22:35:43,947 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754939_14115 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754939
2025-07-18 22:36:44,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754940_14116 src: /192.168.158.8:36348 dest: /192.168.158.4:9866
2025-07-18 22:36:44,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36348, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_218057293_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754940_14116, duration(ns): 24186878
2025-07-18 22:36:44,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754940_14116, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 22:36:49,946 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754940_14116 replica FinalizedReplica, blk_1073754940_14116, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754940 for deletion
2025-07-18 22:36:49,947 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754940_14116 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754940
2025-07-18 22:37:44,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754941_14117 src: /192.168.158.5:58816 dest: /192.168.158.4:9866
2025-07-18 22:37:44,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58816, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1662379945_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754941_14117, duration(ns): 22800893
2025-07-18 22:37:44,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754941_14117, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 22:37:49,951 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754941_14117 replica FinalizedReplica, blk_1073754941_14117, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754941 for deletion
2025-07-18 22:37:49,952 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754941_14117 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754941
2025-07-18 22:38:44,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754942_14118 src: /192.168.158.1:58564 dest: /192.168.158.4:9866
2025-07-18 22:38:44,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58564, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1326312611_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754942_14118, duration(ns): 25009239
2025-07-18 22:38:44,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754942_14118, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-18 22:38:49,955 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754942_14118 replica FinalizedReplica, blk_1073754942_14118, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754942 for deletion
2025-07-18 22:38:49,956 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754942_14118 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754942
2025-07-18 22:39:49,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754943_14119 src: /192.168.158.6:56542 dest: /192.168.158.4:9866
2025-07-18 22:39:49,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56542, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-161717553_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754943_14119, duration(ns): 16863890
2025-07-18 22:39:49,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754943_14119, type=LAST_IN_PIPELINE terminating
2025-07-18 22:39:52,956 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754943_14119 replica FinalizedReplica, blk_1073754943_14119, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754943 for deletion
2025-07-18 22:39:52,957 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754943_14119 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754943
2025-07-18 22:40:49,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754944_14120 src: /192.168.158.7:33262 dest: /192.168.158.4:9866
2025-07-18 22:40:49,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33262, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_755376924_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754944_14120, duration(ns): 16881560
2025-07-18 22:40:49,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754944_14120, type=LAST_IN_PIPELINE terminating
2025-07-18 22:40:52,957 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754944_14120 replica FinalizedReplica, blk_1073754944_14120, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754944 for deletion
2025-07-18 22:40:52,959 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754944_14120 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754944
2025-07-18 22:41:49,065 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754945_14121 src: /192.168.158.1:52048 dest: /192.168.158.4:9866
2025-07-18 22:41:49,098 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52048, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_419363688_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754945_14121, duration(ns): 23343946
2025-07-18 22:41:49,098 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754945_14121, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-18 22:41:52,959 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754945_14121 replica FinalizedReplica, blk_1073754945_14121, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754945 for deletion
2025-07-18 22:41:52,961 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754945_14121 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754945
2025-07-18 22:42:54,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754946_14122 src: /192.168.158.1:59830 dest: /192.168.158.4:9866
2025-07-18 22:42:54,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59830, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1364827801_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754946_14122, duration(ns): 24512777
2025-07-18 22:42:54,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754946_14122, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-18 22:42:58,960 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754946_14122 replica FinalizedReplica, blk_1073754946_14122, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754946 for deletion
2025-07-18 22:42:58,961 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754946_14122 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754946
2025-07-18 22:43:54,073 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754947_14123 src: /192.168.158.1:42274 dest: /192.168.158.4:9866
2025-07-18 22:43:54,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42274, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1639167142_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754947_14123, duration(ns): 24125756
2025-07-18 22:43:54,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754947_14123, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-18 22:44:01,962 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754947_14123 replica FinalizedReplica, blk_1073754947_14123, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754947 for deletion
2025-07-18 22:44:01,963 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754947_14123 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754947
2025-07-18 22:44:54,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754948_14124 src: /192.168.158.6:36038 dest: /192.168.158.4:9866
2025-07-18 22:44:54,110 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36038, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-296182323_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754948_14124, duration(ns): 21392147
2025-07-18 22:44:54,110 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754948_14124, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 22:44:58,964 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754948_14124 replica FinalizedReplica, blk_1073754948_14124, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754948 for deletion
2025-07-18 22:44:58,965 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754948_14124 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754948
2025-07-18 22:46:59,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754950_14126 src: /192.168.158.8:37946 dest: /192.168.158.4:9866
2025-07-18 22:46:59,110 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37946, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1407226728_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754950_14126, duration(ns): 21912497
2025-07-18 22:46:59,111 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754950_14126, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-18 22:47:04,969 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754950_14126 replica FinalizedReplica, blk_1073754950_14126, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754950 for deletion
2025-07-18 22:47:04,971 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754950_14126 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754950
2025-07-18 22:48:59,074 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754952_14128 src: /192.168.158.1:52572 dest: /192.168.158.4:9866
2025-07-18 22:48:59,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52572, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1007526576_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754952_14128, duration(ns): 23638421
2025-07-18 22:48:59,108 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754952_14128, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-18 22:49:01,974 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754952_14128 replica FinalizedReplica, blk_1073754952_14128, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754952 for deletion
2025-07-18 22:49:01,975 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754952_14128 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754952
2025-07-18 22:49:59,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754953_14129 src: /192.168.158.1:58654 dest: /192.168.158.4:9866
2025-07-18 22:49:59,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58654, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2040943158_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754953_14129, duration(ns): 28706548
2025-07-18 22:49:59,128 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754953_14129, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-18 22:50:04,977 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754953_14129 replica FinalizedReplica, blk_1073754953_14129, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754953 for deletion
2025-07-18 22:50:04,978 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754953_14129 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754953
2025-07-18 22:50:59,112 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754954_14130 src: /192.168.158.9:57018 dest: /192.168.158.4:9866
2025-07-18 22:50:59,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57018, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1313179763_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754954_14130, duration(ns): 19578117
2025-07-18 22:50:59,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754954_14130, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 22:51:04,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754954_14130 replica FinalizedReplica, blk_1073754954_14130, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754954 for deletion
2025-07-18 22:51:04,982 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754954_14130 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754954
2025-07-18 22:54:04,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754957_14133 src: /192.168.158.1:40146 dest: /192.168.158.4:9866
2025-07-18 22:54:04,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40146, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1510524169_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754957_14133, duration(ns): 25508538
2025-07-18 22:54:04,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754957_14133, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-18 22:54:07,983 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754957_14133 replica FinalizedReplica, blk_1073754957_14133, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754957 for deletion
2025-07-18 22:54:07,984 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754957_14133 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754957
2025-07-18 22:55:04,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754958_14134 src: /192.168.158.8:51500 dest: /192.168.158.4:9866
2025-07-18 22:55:04,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51500, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2078513575_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754958_14134, duration(ns): 20079107
2025-07-18 22:55:04,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754958_14134, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 22:55:10,986 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754958_14134 replica FinalizedReplica, blk_1073754958_14134, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754958 for deletion
2025-07-18 22:55:10,987 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754958_14134 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754958
2025-07-18 22:56:09,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754959_14135 src: /192.168.158.1:59510 dest: /192.168.158.4:9866
2025-07-18 22:56:09,145 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59510, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_596519068_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754959_14135, duration(ns): 41276223
2025-07-18 22:56:09,145 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754959_14135, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-18 22:56:16,990 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754959_14135 replica FinalizedReplica, blk_1073754959_14135, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754959 for deletion
2025-07-18 22:56:16,991 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754959_14135 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754959
2025-07-18 22:59:09,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754962_14138 src: /192.168.158.1:49212 dest: /192.168.158.4:9866
2025-07-18 22:59:09,143 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49212, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1961855101_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754962_14138, duration(ns): 26614444
2025-07-18 22:59:09,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754962_14138, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-18 22:59:13,998 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754962_14138 replica FinalizedReplica, blk_1073754962_14138, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754962 for deletion
2025-07-18 22:59:13,999 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754962_14138 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754962
2025-07-18 23:00:09,119 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754963_14139 src: /192.168.158.7:43644 dest: /192.168.158.4:9866
2025-07-18 23:00:09,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43644, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1804865308_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754963_14139, duration(ns): 19418279
2025-07-18 23:00:09,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754963_14139, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 23:00:14,003 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754963_14139 replica FinalizedReplica, blk_1073754963_14139, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754963 for deletion
2025-07-18 23:00:14,004 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754963_14139 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754963
2025-07-18 23:01:09,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754964_14140 src: /192.168.158.8:58712 dest: /192.168.158.4:9866
2025-07-18 23:01:09,140 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1403296158_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754964_14140, duration(ns): 19821795
2025-07-18 23:01:09,140 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754964_14140, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-18 23:01:14,006 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754964_14140 replica FinalizedReplica, blk_1073754964_14140, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn
getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754964 for deletion 2025-07-18 23:01:14,007 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754964_14140 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754964 2025-07-18 23:02:09,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754965_14141 src: /192.168.158.9:39288 dest: /192.168.158.4:9866 2025-07-18 23:02:09,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39288, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1452145367_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754965_14141, duration(ns): 15895964 2025-07-18 23:02:09,134 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754965_14141, type=LAST_IN_PIPELINE terminating 2025-07-18 23:02:14,010 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754965_14141 replica FinalizedReplica, blk_1073754965_14141, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754965 for deletion 2025-07-18 23:02:14,011 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754965_14141 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754965 2025-07-18 23:03:14,147 
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754966_14142 src: /192.168.158.5:49948 dest: /192.168.158.4:9866 2025-07-18 23:03:14,173 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49948, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1551058648_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754966_14142, duration(ns): 20584675 2025-07-18 23:03:14,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754966_14142, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-18 23:03:20,013 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754966_14142 replica FinalizedReplica, blk_1073754966_14142, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754966 for deletion 2025-07-18 23:03:20,014 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754966_14142 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754966 2025-07-18 23:04:14,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754967_14143 src: /192.168.158.6:34394 dest: /192.168.158.4:9866 2025-07-18 23:04:14,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34394, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_950468841_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754967_14143, duration(ns): 18967311 2025-07-18 23:04:14,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754967_14143, type=LAST_IN_PIPELINE terminating 2025-07-18 23:04:20,014 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754967_14143 replica FinalizedReplica, blk_1073754967_14143, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754967 for deletion 2025-07-18 23:04:20,016 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754967_14143 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754967 2025-07-18 23:07:14,129 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754970_14146 src: /192.168.158.8:48964 dest: /192.168.158.4:9866 2025-07-18 23:07:14,152 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48964, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1820934170_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754970_14146, duration(ns): 20826403 2025-07-18 23:07:14,152 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754970_14146, type=LAST_IN_PIPELINE terminating 2025-07-18 23:07:20,020 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754970_14146 replica 
FinalizedReplica, blk_1073754970_14146, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754970 for deletion 2025-07-18 23:07:20,021 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754970_14146 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754970 2025-07-18 23:08:14,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754971_14147 src: /192.168.158.5:52786 dest: /192.168.158.4:9866 2025-07-18 23:08:14,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52786, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_391320601_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754971_14147, duration(ns): 16904737 2025-07-18 23:08:14,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754971_14147, type=LAST_IN_PIPELINE terminating 2025-07-18 23:08:20,020 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754971_14147 replica FinalizedReplica, blk_1073754971_14147, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754971 for deletion 2025-07-18 23:08:20,021 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754971_14147 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754971 2025-07-18 23:11:14,132 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754974_14150 src: /192.168.158.5:35338 dest: /192.168.158.4:9866 2025-07-18 23:11:14,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35338, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1274492435_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754974_14150, duration(ns): 19180818 2025-07-18 23:11:14,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754974_14150, type=LAST_IN_PIPELINE terminating 2025-07-18 23:11:20,025 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754974_14150 replica FinalizedReplica, blk_1073754974_14150, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754974 for deletion 2025-07-18 23:11:20,026 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754974_14150 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754974 2025-07-18 23:12:14,141 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754975_14151 src: /192.168.158.8:43866 dest: /192.168.158.4:9866 2025-07-18 23:12:14,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43866, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-721653006_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754975_14151, duration(ns): 17285213 2025-07-18 23:12:14,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754975_14151, type=LAST_IN_PIPELINE terminating 2025-07-18 23:12:17,025 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754975_14151 replica FinalizedReplica, blk_1073754975_14151, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754975 for deletion 2025-07-18 23:12:17,026 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754975_14151 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754975 2025-07-18 23:13:14,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754976_14152 src: /192.168.158.1:44486 dest: /192.168.158.4:9866 2025-07-18 23:13:14,194 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44486, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_927142779_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754976_14152, duration(ns): 47285219 2025-07-18 23:13:14,194 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754976_14152, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-18 
23:13:17,027 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754976_14152 replica FinalizedReplica, blk_1073754976_14152, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754976 for deletion 2025-07-18 23:13:17,028 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754976_14152 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754976 2025-07-18 23:15:14,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754978_14154 src: /192.168.158.6:39094 dest: /192.168.158.4:9866 2025-07-18 23:15:14,163 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39094, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2040542881_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754978_14154, duration(ns): 19865622 2025-07-18 23:15:14,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754978_14154, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-18 23:15:20,030 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754978_14154 replica FinalizedReplica, blk_1073754978_14154, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754978 for deletion 
2025-07-18 23:15:20,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754978_14154 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754978 2025-07-18 23:16:14,143 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754979_14155 src: /192.168.158.5:51172 dest: /192.168.158.4:9866 2025-07-18 23:16:14,163 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51172, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-321564138_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754979_14155, duration(ns): 18299827 2025-07-18 23:16:14,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754979_14155, type=LAST_IN_PIPELINE terminating 2025-07-18 23:16:20,034 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754979_14155 replica FinalizedReplica, blk_1073754979_14155, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754979 for deletion 2025-07-18 23:16:20,035 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754979_14155 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754979 2025-07-18 23:21:14,139 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754984_14160 src: /192.168.158.6:36748 
dest: /192.168.158.4:9866 2025-07-18 23:21:14,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36748, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_315418673_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754984_14160, duration(ns): 19824724 2025-07-18 23:21:14,165 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754984_14160, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-18 23:21:17,045 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754984_14160 replica FinalizedReplica, blk_1073754984_14160, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754984 for deletion 2025-07-18 23:21:17,046 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754984_14160 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754984 2025-07-18 23:23:14,132 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754986_14162 src: /192.168.158.1:35324 dest: /192.168.158.4:9866 2025-07-18 23:23:14,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35324, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1091978956_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754986_14162, duration(ns): 23938033 2025-07-18 23:23:14,167 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754986_14162, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-18 23:23:20,053 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754986_14162 replica FinalizedReplica, blk_1073754986_14162, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754986 for deletion 2025-07-18 23:23:20,054 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754986_14162 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754986 2025-07-18 23:25:14,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754988_14164 src: /192.168.158.1:54848 dest: /192.168.158.4:9866 2025-07-18 23:25:14,169 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54848, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_998136449_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754988_14164, duration(ns): 26277005 2025-07-18 23:25:14,169 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754988_14164, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-18 23:25:17,058 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754988_14164 replica FinalizedReplica, blk_1073754988_14164, 
FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754988 for deletion 2025-07-18 23:25:17,060 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754988_14164 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754988 2025-07-18 23:27:19,162 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754990_14166 src: /192.168.158.5:44200 dest: /192.168.158.4:9866 2025-07-18 23:27:19,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44200, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1547487178_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754990_14166, duration(ns): 22015817 2025-07-18 23:27:19,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754990_14166, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-18 23:27:23,062 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754990_14166 replica FinalizedReplica, blk_1073754990_14166, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754990 for deletion 2025-07-18 23:27:23,064 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754990_14166 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754990 2025-07-18 23:31:24,158 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754994_14170 src: /192.168.158.5:50846 dest: /192.168.158.4:9866 2025-07-18 23:31:24,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50846, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1749397391_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754994_14170, duration(ns): 20316915 2025-07-18 23:31:24,185 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754994_14170, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-18 23:31:29,076 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754994_14170 replica FinalizedReplica, blk_1073754994_14170, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754994 for deletion 2025-07-18 23:31:29,077 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754994_14170 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754994 2025-07-18 23:32:24,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754995_14171 src: /192.168.158.1:38082 dest: /192.168.158.4:9866 2025-07-18 23:32:24,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:38082, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-258005447_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754995_14171, duration(ns): 22886891 2025-07-18 23:32:24,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754995_14171, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-18 23:32:29,077 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754995_14171 replica FinalizedReplica, blk_1073754995_14171, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754995 for deletion 2025-07-18 23:32:29,080 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754995_14171 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754995 2025-07-18 23:33:24,165 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754996_14172 src: /192.168.158.9:48426 dest: /192.168.158.4:9866 2025-07-18 23:33:24,191 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48426, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1905292790_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754996_14172, duration(ns): 20101062 2025-07-18 23:33:24,191 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073754996_14172, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 23:33:29,083 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754996_14172 replica FinalizedReplica, blk_1073754996_14172, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754996 for deletion
2025-07-18 23:33:29,084 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754996_14172 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754996
2025-07-18 23:34:24,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073754997_14173 src: /192.168.158.8:54324 dest: /192.168.158.4:9866
2025-07-18 23:34:24,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54324, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_900808483_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073754997_14173, duration(ns): 23322491
2025-07-18 23:34:24,191 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073754997_14173, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 23:34:29,082 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073754997_14173 replica FinalizedReplica, blk_1073754997_14173, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754997 for deletion
2025-07-18 23:34:29,084 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073754997_14173 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073754997
2025-07-18 23:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-18 23:37:29,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755000_14176 src: /192.168.158.8:35186 dest: /192.168.158.4:9866
2025-07-18 23:37:29,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35186, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-462452132_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755000_14176, duration(ns): 18594762
2025-07-18 23:37:29,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755000_14176, type=LAST_IN_PIPELINE terminating
2025-07-18 23:37:35,090 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755000_14176 replica FinalizedReplica, blk_1073755000_14176, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755000 for deletion
2025-07-18 23:37:35,091 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755000_14176 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755000
2025-07-18 23:39:29,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755002_14178 src: /192.168.158.5:33994 dest: /192.168.158.4:9866
2025-07-18 23:39:29,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33994, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-746573054_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755002_14178, duration(ns): 17948418
2025-07-18 23:39:29,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755002_14178, type=LAST_IN_PIPELINE terminating
2025-07-18 23:39:32,092 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755002_14178 replica FinalizedReplica, blk_1073755002_14178, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755002 for deletion
2025-07-18 23:39:32,093 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755002_14178 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755002
2025-07-18 23:40:29,276 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755003_14179 src: /192.168.158.1:50292 dest: /192.168.158.4:9866
2025-07-18 23:40:29,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50292, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-132514824_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755003_14179, duration(ns): 23382210
2025-07-18 23:40:29,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755003_14179, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-18 23:40:35,095 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755003_14179 replica FinalizedReplica, blk_1073755003_14179, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755003 for deletion
2025-07-18 23:40:35,096 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755003_14179 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755003
2025-07-18 23:41:29,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755004_14180 src: /192.168.158.7:60066 dest: /192.168.158.4:9866
2025-07-18 23:41:29,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60066, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1870888394_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755004_14180, duration(ns): 23671707
2025-07-18 23:41:29,208 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755004_14180, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-18 23:41:35,098 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755004_14180 replica FinalizedReplica, blk_1073755004_14180, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755004 for deletion
2025-07-18 23:41:35,099 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755004_14180 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755004
2025-07-18 23:44:34,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755007_14183 src: /192.168.158.5:40030 dest: /192.168.158.4:9866
2025-07-18 23:44:34,216 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40030, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_384756304_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755007_14183, duration(ns): 20944512
2025-07-18 23:44:34,217 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755007_14183, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-18 23:44:38,104 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755007_14183 replica FinalizedReplica, blk_1073755007_14183, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755007 for deletion
2025-07-18 23:44:38,106 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755007_14183 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755007
2025-07-18 23:46:34,176 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755009_14185 src: /192.168.158.1:50896 dest: /192.168.158.4:9866
2025-07-18 23:46:34,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50896, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1789513573_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755009_14185, duration(ns): 23245879
2025-07-18 23:46:34,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755009_14185, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-18 23:46:38,108 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755009_14185 replica FinalizedReplica, blk_1073755009_14185, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755009 for deletion
2025-07-18 23:46:38,109 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755009_14185 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755009
2025-07-18 23:48:34,196 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755011_14187 src: /192.168.158.9:54954 dest: /192.168.158.4:9866
2025-07-18 23:48:34,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54954, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1493617669_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755011_14187, duration(ns): 24649705
2025-07-18 23:48:34,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755011_14187, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-18 23:48:38,115 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755011_14187 replica FinalizedReplica, blk_1073755011_14187, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755011 for deletion
2025-07-18 23:48:38,116 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755011_14187 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755011
2025-07-18 23:51:44,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755014_14190 src: /192.168.158.8:60750 dest: /192.168.158.4:9866
2025-07-18 23:51:44,213 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60750, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1628754711_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755014_14190, duration(ns): 19791813
2025-07-18 23:51:44,213 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755014_14190, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-18 23:51:50,122 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755014_14190 replica FinalizedReplica, blk_1073755014_14190, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755014 for deletion
2025-07-18 23:51:50,123 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755014_14190 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755014
2025-07-18 23:53:49,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755016_14192 src: /192.168.158.6:47798 dest: /192.168.158.4:9866
2025-07-18 23:53:49,195 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47798, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_997776098_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755016_14192, duration(ns): 16081819
2025-07-18 23:53:49,195 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755016_14192, type=LAST_IN_PIPELINE terminating
2025-07-18 23:53:53,126 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755016_14192 replica FinalizedReplica, blk_1073755016_14192, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755016 for deletion
2025-07-18 23:53:53,127 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755016_14192 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755016
2025-07-18 23:54:49,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755017_14193 src: /192.168.158.9:59562 dest: /192.168.158.4:9866
2025-07-18 23:54:49,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59562, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1501244440_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755017_14193, duration(ns): 17768895
2025-07-18 23:54:49,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755017_14193, type=LAST_IN_PIPELINE terminating
2025-07-18 23:54:56,129 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755017_14193 replica FinalizedReplica, blk_1073755017_14193, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755017 for deletion
2025-07-18 23:54:56,130 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755017_14193 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755017
2025-07-18 23:55:49,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755018_14194 src: /192.168.158.6:42920 dest: /192.168.158.4:9866
2025-07-18 23:55:49,205 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42920, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_948689410_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755018_14194, duration(ns): 16395586
2025-07-18 23:55:49,206 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755018_14194, type=LAST_IN_PIPELINE terminating
2025-07-18 23:55:53,131 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755018_14194 replica FinalizedReplica, blk_1073755018_14194, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755018 for deletion
2025-07-18 23:55:53,132 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755018_14194 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755018
2025-07-18 23:57:54,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755020_14196 src: /192.168.158.8:53070 dest: /192.168.158.4:9866
2025-07-18 23:57:54,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53070, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1075111549_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755020_14196, duration(ns): 22320761
2025-07-18 23:57:54,230 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755020_14196, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-18 23:57:59,139 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755020_14196 replica FinalizedReplica, blk_1073755020_14196, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755020 for deletion
2025-07-18 23:57:59,140 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755020_14196 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755020
2025-07-19 00:09:04,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755031_14207 src: /192.168.158.5:44976 dest: /192.168.158.4:9866
2025-07-19 00:09:04,279 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44976, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-318984845_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755031_14207, duration(ns): 20958217
2025-07-19 00:09:04,279 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755031_14207, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 00:09:08,173 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755031_14207 replica FinalizedReplica, blk_1073755031_14207, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755031 for deletion
2025-07-19 00:09:08,174 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755031_14207 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755031
2025-07-19 00:15:09,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755037_14213 src: /192.168.158.6:53706 dest: /192.168.158.4:9866
2025-07-19 00:15:09,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53706, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_206910605_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755037_14213, duration(ns): 17017694
2025-07-19 00:15:09,246 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755037_14213, type=LAST_IN_PIPELINE terminating
2025-07-19 00:15:14,190 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755037_14213 replica FinalizedReplica, blk_1073755037_14213, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755037 for deletion
2025-07-19 00:15:14,191 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755037_14213 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755037
2025-07-19 00:20:29,232 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755042_14218 src: /192.168.158.9:42816 dest: /192.168.158.4:9866
2025-07-19 00:20:29,255 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42816, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1377194129_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755042_14218, duration(ns): 20755651
2025-07-19 00:20:29,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755042_14218, type=LAST_IN_PIPELINE terminating
2025-07-19 00:20:35,207 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755042_14218 replica FinalizedReplica, blk_1073755042_14218, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755042 for deletion
2025-07-19 00:20:35,208 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755042_14218 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755042
2025-07-19 00:22:29,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755044_14220 src: /192.168.158.1:42250 dest: /192.168.158.4:9866
2025-07-19 00:22:29,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42250, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2033144836_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755044_14220, duration(ns): 24898630
2025-07-19 00:22:29,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755044_14220, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-19 00:22:32,213 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755044_14220 replica FinalizedReplica, blk_1073755044_14220, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755044 for deletion
2025-07-19 00:22:32,215 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755044_14220 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755044
2025-07-19 00:25:29,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755047_14223 src: /192.168.158.1:55712 dest: /192.168.158.4:9866
2025-07-19 00:25:29,247 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-57425086_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755047_14223, duration(ns): 24073731
2025-07-19 00:25:29,248 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755047_14223, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-19 00:25:32,220 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755047_14223 replica FinalizedReplica, blk_1073755047_14223, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755047 for deletion
2025-07-19 00:25:32,221 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755047_14223 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755047
2025-07-19 00:26:29,228 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755048_14224 src: /192.168.158.9:36712 dest: /192.168.158.4:9866
2025-07-19 00:26:29,247 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2143848825_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755048_14224, duration(ns): 17192885
2025-07-19 00:26:29,248 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755048_14224, type=LAST_IN_PIPELINE terminating
2025-07-19 00:26:35,224 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755048_14224 replica FinalizedReplica, blk_1073755048_14224, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755048 for deletion
2025-07-19 00:26:35,225 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755048_14224 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755048
2025-07-19 00:28:34,268 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755050_14226 src: /192.168.158.7:57254 dest: /192.168.158.4:9866
2025-07-19 00:28:34,295 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57254, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-681911276_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755050_14226, duration(ns): 21529366
2025-07-19 00:28:34,296 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755050_14226, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 00:28:41,229 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755050_14226 replica FinalizedReplica, blk_1073755050_14226, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755050 for deletion
2025-07-19 00:28:41,230 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755050_14226 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755050
2025-07-19 00:29:34,244 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755051_14227 src: /192.168.158.1:50642 dest: /192.168.158.4:9866
2025-07-19 00:29:34,279 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50642, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1494844790_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755051_14227, duration(ns): 25843860
2025-07-19 00:29:34,279 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755051_14227, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-19 00:29:38,230 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755051_14227 replica FinalizedReplica, blk_1073755051_14227, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755051 for deletion
2025-07-19 00:29:38,231 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755051_14227 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755051
2025-07-19 00:31:44,255 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755053_14229 src: /192.168.158.5:46424 dest: /192.168.158.4:9866
2025-07-19 00:31:44,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46424, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_657662565_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755053_14229, duration(ns): 21098080
2025-07-19 00:31:44,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755053_14229, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 00:31:47,232 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755053_14229 replica FinalizedReplica, blk_1073755053_14229, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755053 for deletion
2025-07-19 00:31:47,233 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755053_14229 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755053
2025-07-19 00:35:44,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755057_14233 src: /192.168.158.1:41714 dest: /192.168.158.4:9866
2025-07-19 00:35:44,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41714, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1695894688_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755057_14233, duration(ns): 23289230
2025-07-19 00:35:44,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755057_14233, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-19 00:35:47,238 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755057_14233 replica FinalizedReplica, blk_1073755057_14233, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755057 for deletion
2025-07-19 00:35:47,239 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755057_14233 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755057
2025-07-19 00:36:44,238 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755058_14234 src: /192.168.158.1:57070 dest: /192.168.158.4:9866
2025-07-19 00:36:44,270 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57070, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1862706139_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755058_14234, duration(ns): 23464253
2025-07-19 00:36:44,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755058_14234, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-19 00:36:50,242 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755058_14234 replica FinalizedReplica, blk_1073755058_14234, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755058 for deletion
2025-07-19 00:36:50,243 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755058_14234 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755058
2025-07-19 00:37:49,253 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755059_14235 src: /192.168.158.8:43384 dest: /192.168.158.4:9866
2025-07-19 00:37:49,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43384, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_477653072_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755059_14235, duration(ns): 17090332
2025-07-19 00:37:49,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755059_14235, type=LAST_IN_PIPELINE terminating
2025-07-19 00:37:56,243 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755059_14235 replica FinalizedReplica, blk_1073755059_14235, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755059 for deletion
2025-07-19 00:37:56,244 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755059_14235 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755059
2025-07-19 00:38:49,268 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755060_14236 src: /192.168.158.9:38362 dest: /192.168.158.4:9866
2025-07-19 00:38:49,297 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38362, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-144896539_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755060_14236, duration(ns): 23261756
2025-07-19 00:38:49,298 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755060_14236, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 00:38:53,244 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755060_14236 replica FinalizedReplica, blk_1073755060_14236, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755060 for deletion
2025-07-19 00:38:53,246 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755060_14236 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755060
2025-07-19 00:39:49,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755061_14237 src: 
/192.168.158.9:48874 dest: /192.168.158.4:9866 2025-07-19 00:39:49,286 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48874, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1301781190_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755061_14237, duration(ns): 17070802 2025-07-19 00:39:49,286 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755061_14237, type=LAST_IN_PIPELINE terminating 2025-07-19 00:39:53,246 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755061_14237 replica FinalizedReplica, blk_1073755061_14237, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755061 for deletion 2025-07-19 00:39:53,248 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755061_14237 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755061 2025-07-19 00:40:49,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755062_14238 src: /192.168.158.7:46634 dest: /192.168.158.4:9866 2025-07-19 00:40:49,286 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46634, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-588533294_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755062_14238, duration(ns): 20301217 2025-07-19 00:40:49,286 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755062_14238, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-19 00:40:53,247 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755062_14238 replica FinalizedReplica, blk_1073755062_14238, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755062 for deletion 2025-07-19 00:40:53,248 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755062_14238 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755062 2025-07-19 00:41:49,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755063_14239 src: /192.168.158.8:50202 dest: /192.168.158.4:9866 2025-07-19 00:41:49,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50202, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_906365454_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755063_14239, duration(ns): 20758545 2025-07-19 00:41:49,283 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755063_14239, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-19 00:41:53,251 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755063_14239 replica FinalizedReplica, blk_1073755063_14239, FINALIZED getNumBytes() = 56 
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755063 for deletion 2025-07-19 00:41:53,252 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755063_14239 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755063 2025-07-19 00:42:49,259 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755064_14240 src: /192.168.158.1:58764 dest: /192.168.158.4:9866 2025-07-19 00:42:49,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58764, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_522772562_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755064_14240, duration(ns): 25798842 2025-07-19 00:42:49,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755064_14240, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-19 00:42:56,251 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755064_14240 replica FinalizedReplica, blk_1073755064_14240, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755064 for deletion 2025-07-19 00:42:56,252 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755064_14240 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755064 2025-07-19 00:44:49,270 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755066_14242 src: /192.168.158.7:56770 dest: /192.168.158.4:9866 2025-07-19 00:44:49,290 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56770, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1347835787_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755066_14242, duration(ns): 17180484 2025-07-19 00:44:49,290 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755066_14242, type=LAST_IN_PIPELINE terminating 2025-07-19 00:44:56,258 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755066_14242 replica FinalizedReplica, blk_1073755066_14242, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755066 for deletion 2025-07-19 00:44:56,260 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755066_14242 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755066 2025-07-19 00:45:49,268 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755067_14243 src: /192.168.158.1:41776 dest: /192.168.158.4:9866 2025-07-19 00:45:49,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41776, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1622680215_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755067_14243, duration(ns): 26368255 2025-07-19 00:45:49,304 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755067_14243, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-19 00:45:53,260 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755067_14243 replica FinalizedReplica, blk_1073755067_14243, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755067 for deletion 2025-07-19 00:45:53,261 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755067_14243 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755067 2025-07-19 00:47:49,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755069_14245 src: /192.168.158.7:43600 dest: /192.168.158.4:9866 2025-07-19 00:47:49,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43600, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1531113019_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755069_14245, duration(ns): 17684644 2025-07-19 00:47:49,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755069_14245, type=LAST_IN_PIPELINE terminating 2025-07-19 
00:47:53,267 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755069_14245 replica FinalizedReplica, blk_1073755069_14245, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755069 for deletion 2025-07-19 00:47:53,268 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755069_14245 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755069 2025-07-19 00:48:49,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755070_14246 src: /192.168.158.8:50106 dest: /192.168.158.4:9866 2025-07-19 00:48:49,274 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50106, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-197012706_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755070_14246, duration(ns): 17676675 2025-07-19 00:48:49,274 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755070_14246, type=LAST_IN_PIPELINE terminating 2025-07-19 00:48:56,269 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755070_14246 replica FinalizedReplica, blk_1073755070_14246, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755070 for deletion 2025-07-19 00:48:56,270 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755070_14246 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755070 2025-07-19 00:49:49,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755071_14247 src: /192.168.158.8:48656 dest: /192.168.158.4:9866 2025-07-19 00:49:49,274 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48656, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_481161935_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755071_14247, duration(ns): 18928264 2025-07-19 00:49:49,274 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755071_14247, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-19 00:49:53,270 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755071_14247 replica FinalizedReplica, blk_1073755071_14247, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755071 for deletion 2025-07-19 00:49:53,271 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755071_14247 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755071 2025-07-19 00:50:54,288 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755072_14248 src: 
/192.168.158.1:47098 dest: /192.168.158.4:9866 2025-07-19 00:50:54,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47098, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1861341876_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755072_14248, duration(ns): 24979935 2025-07-19 00:50:54,323 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755072_14248, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-19 00:51:02,270 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755072_14248 replica FinalizedReplica, blk_1073755072_14248, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755072 for deletion 2025-07-19 00:51:02,271 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755072_14248 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755072 2025-07-19 00:52:54,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755074_14250 src: /192.168.158.6:49696 dest: /192.168.158.4:9866 2025-07-19 00:52:54,289 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49696, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_366135627_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755074_14250, duration(ns): 16108061 
2025-07-19 00:52:54,289 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755074_14250, type=LAST_IN_PIPELINE terminating 2025-07-19 00:53:02,271 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755074_14250 replica FinalizedReplica, blk_1073755074_14250, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755074 for deletion 2025-07-19 00:53:02,272 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755074_14250 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755074 2025-07-19 00:53:59,295 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755075_14251 src: /192.168.158.7:42398 dest: /192.168.158.4:9866 2025-07-19 00:53:59,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42398, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-651766419_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755075_14251, duration(ns): 20936943 2025-07-19 00:53:59,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755075_14251, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-19 00:54:02,275 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755075_14251 replica FinalizedReplica, blk_1073755075_14251, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755075 for deletion 2025-07-19 00:54:02,276 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755075_14251 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755075 2025-07-19 00:54:59,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755076_14252 src: /192.168.158.8:51318 dest: /192.168.158.4:9866 2025-07-19 00:54:59,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51318, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_582376376_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755076_14252, duration(ns): 17398568 2025-07-19 00:54:59,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755076_14252, type=LAST_IN_PIPELINE terminating 2025-07-19 00:55:02,276 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755076_14252 replica FinalizedReplica, blk_1073755076_14252, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755076 for deletion 2025-07-19 00:55:02,278 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755076_14252 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755076 2025-07-19 01:00:04,315 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755081_14257 src: /192.168.158.1:36082 dest: /192.168.158.4:9866 2025-07-19 01:00:04,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36082, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1727160345_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755081_14257, duration(ns): 25174531 2025-07-19 01:00:04,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755081_14257, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-19 01:00:08,290 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755081_14257 replica FinalizedReplica, blk_1073755081_14257, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755081 for deletion 2025-07-19 01:00:08,292 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755081_14257 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755081 2025-07-19 01:01:09,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755082_14258 src: /192.168.158.5:58968 dest: /192.168.158.4:9866 2025-07-19 01:01:09,338 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.5:58968, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1376592247_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755082_14258, duration(ns): 21134878 2025-07-19 01:01:09,338 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755082_14258, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-19 01:01:17,296 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755082_14258 replica FinalizedReplica, blk_1073755082_14258, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755082 for deletion 2025-07-19 01:01:17,297 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755082_14258 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755082 2025-07-19 01:02:09,317 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755083_14259 src: /192.168.158.8:33820 dest: /192.168.158.4:9866 2025-07-19 01:02:09,337 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33820, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1145301914_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755083_14259, duration(ns): 18150289 2025-07-19 01:02:09,338 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755083_14259, 
type=LAST_IN_PIPELINE terminating 2025-07-19 01:02:17,301 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755083_14259 replica FinalizedReplica, blk_1073755083_14259, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755083 for deletion 2025-07-19 01:02:17,302 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755083_14259 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755083 2025-07-19 01:04:09,296 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755085_14261 src: /192.168.158.1:43382 dest: /192.168.158.4:9866 2025-07-19 01:04:09,332 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43382, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-961360452_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755085_14261, duration(ns): 26940078 2025-07-19 01:04:09,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755085_14261, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-19 01:04:14,306 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755085_14261 replica FinalizedReplica, blk_1073755085_14261, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755085 for deletion
2025-07-19 01:04:14,308 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755085_14261 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755085
2025-07-19 01:07:09,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755088_14264 src: /192.168.158.6:42806 dest: /192.168.158.4:9866
2025-07-19 01:07:09,330 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42806, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1607992163_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755088_14264, duration(ns): 16406191
2025-07-19 01:07:09,330 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755088_14264, type=LAST_IN_PIPELINE terminating
2025-07-19 01:07:14,310 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755088_14264 replica FinalizedReplica, blk_1073755088_14264, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755088 for deletion
2025-07-19 01:07:14,311 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755088_14264 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755088
2025-07-19 01:10:09,312 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755091_14267 src: /192.168.158.1:50510 dest: /192.168.158.4:9866
2025-07-19 01:10:09,348 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50510, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-57051168_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755091_14267, duration(ns): 25578468
2025-07-19 01:10:09,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755091_14267, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-19 01:10:14,319 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755091_14267 replica FinalizedReplica, blk_1073755091_14267, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755091 for deletion
2025-07-19 01:10:14,320 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755091_14267 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755091
2025-07-19 01:11:09,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755092_14268 src: /192.168.158.8:46116 dest: /192.168.158.4:9866
2025-07-19 01:11:09,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46116, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_236912091_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755092_14268, duration(ns): 19163791
2025-07-19 01:11:09,343 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755092_14268, type=LAST_IN_PIPELINE terminating
2025-07-19 01:11:14,323 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755092_14268 replica FinalizedReplica, blk_1073755092_14268, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755092 for deletion
2025-07-19 01:11:14,324 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755092_14268 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755092
2025-07-19 01:12:09,314 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755093_14269 src: /192.168.158.5:52194 dest: /192.168.158.4:9866
2025-07-19 01:12:09,343 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52194, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1783543762_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755093_14269, duration(ns): 23048007
2025-07-19 01:12:09,343 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755093_14269, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 01:12:14,323 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755093_14269 replica FinalizedReplica, blk_1073755093_14269, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755093 for deletion
2025-07-19 01:12:14,324 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755093_14269 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755093
2025-07-19 01:16:19,304 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755097_14273 src: /192.168.158.5:40512 dest: /192.168.158.4:9866
2025-07-19 01:16:19,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40512, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_583310597_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755097_14273, duration(ns): 20132794
2025-07-19 01:16:19,330 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755097_14273, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 01:16:23,330 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755097_14273 replica FinalizedReplica, blk_1073755097_14273, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755097 for deletion
2025-07-19 01:16:23,332 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755097_14273 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755097
2025-07-19 01:17:19,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755098_14274 src: /192.168.158.1:33810 dest: /192.168.158.4:9866
2025-07-19 01:17:19,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33810, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1246819478_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755098_14274, duration(ns): 27893022
2025-07-19 01:17:19,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755098_14274, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-19 01:17:23,334 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755098_14274 replica FinalizedReplica, blk_1073755098_14274, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755098 for deletion
2025-07-19 01:17:23,335 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755098_14274 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755098
2025-07-19 01:19:19,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755100_14276 src: /192.168.158.1:52262 dest: /192.168.158.4:9866
2025-07-19 01:19:19,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52262, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1648579534_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755100_14276, duration(ns): 25252103
2025-07-19 01:19:19,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755100_14276, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-19 01:19:23,340 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755100_14276 replica FinalizedReplica, blk_1073755100_14276, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755100 for deletion
2025-07-19 01:19:23,341 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755100_14276 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755100
2025-07-19 01:21:19,320 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755102_14278 src: /192.168.158.8:46184 dest: /192.168.158.4:9866
2025-07-19 01:21:19,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46184, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_715937368_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755102_14278, duration(ns): 20039677
2025-07-19 01:21:19,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755102_14278, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 01:21:23,342 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755102_14278 replica FinalizedReplica, blk_1073755102_14278, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755102 for deletion
2025-07-19 01:21:23,343 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755102_14278 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755102
2025-07-19 01:24:19,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755105_14281 src: /192.168.158.9:52416 dest: /192.168.158.4:9866
2025-07-19 01:24:19,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52416, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-693337511_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755105_14281, duration(ns): 16047299
2025-07-19 01:24:19,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755105_14281, type=LAST_IN_PIPELINE terminating
2025-07-19 01:24:23,349 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755105_14281 replica FinalizedReplica, blk_1073755105_14281, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755105 for deletion
2025-07-19 01:24:23,351 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755105_14281 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755105
2025-07-19 01:25:19,323 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755106_14282 src: /192.168.158.1:56530 dest: /192.168.158.4:9866
2025-07-19 01:25:19,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56530, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-950863505_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755106_14282, duration(ns): 24065430
2025-07-19 01:25:19,357 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755106_14282, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-19 01:25:23,351 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755106_14282 replica FinalizedReplica, blk_1073755106_14282, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755106 for deletion
2025-07-19 01:25:23,353 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755106_14282 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755106
2025-07-19 01:27:24,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755108_14284 src: /192.168.158.9:48196 dest: /192.168.158.4:9866
2025-07-19 01:27:24,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48196, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_316756617_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755108_14284, duration(ns): 16579667
2025-07-19 01:27:24,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755108_14284, type=LAST_IN_PIPELINE terminating
2025-07-19 01:27:29,356 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755108_14284 replica FinalizedReplica, blk_1073755108_14284, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755108 for deletion
2025-07-19 01:27:29,357 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755108_14284 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755108
2025-07-19 01:28:24,330 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755109_14285 src: /192.168.158.6:32924 dest: /192.168.158.4:9866
2025-07-19 01:28:24,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:32924, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1773570712_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755109_14285, duration(ns): 19114235
2025-07-19 01:28:24,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755109_14285, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 01:28:26,357 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755109_14285 replica FinalizedReplica, blk_1073755109_14285, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755109 for deletion
2025-07-19 01:28:26,358 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755109_14285 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755109
2025-07-19 01:32:29,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755113_14289 src: /192.168.158.5:54282 dest: /192.168.158.4:9866
2025-07-19 01:32:29,357 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54282, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1703652111_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755113_14289, duration(ns): 18983447
2025-07-19 01:32:29,357 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755113_14289, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-19 01:32:32,363 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755113_14289 replica FinalizedReplica, blk_1073755113_14289, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755113 for deletion
2025-07-19 01:32:32,365 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755113_14289 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755113
2025-07-19 01:33:29,335 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755114_14290 src: /192.168.158.6:46614 dest: /192.168.158.4:9866
2025-07-19 01:33:29,361 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46614, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_128668320_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755114_14290, duration(ns): 20524727
2025-07-19 01:33:29,362 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755114_14290, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 01:33:32,368 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755114_14290 replica FinalizedReplica, blk_1073755114_14290, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755114 for deletion
2025-07-19 01:33:32,369 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755114_14290 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755114
2025-07-19 01:41:39,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755122_14298 src: /192.168.158.6:39974 dest: /192.168.158.4:9866
2025-07-19 01:41:39,371 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39974, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1271259649_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755122_14298, duration(ns): 19760654
2025-07-19 01:41:39,371 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755122_14298, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 01:41:41,384 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755122_14298 replica FinalizedReplica, blk_1073755122_14298, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755122 for deletion
2025-07-19 01:41:41,385 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755122_14298 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755122
2025-07-19 01:43:39,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755124_14300 src: /192.168.158.1:38546 dest: /192.168.158.4:9866
2025-07-19 01:43:39,380 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38546, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-715116364_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755124_14300, duration(ns): 25099891
2025-07-19 01:43:39,380 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755124_14300, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-19 01:43:44,388 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755124_14300 replica FinalizedReplica, blk_1073755124_14300, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755124 for deletion
2025-07-19 01:43:44,389 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755124_14300 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755124
2025-07-19 01:44:39,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755125_14301 src: /192.168.158.1:43708 dest: /192.168.158.4:9866
2025-07-19 01:44:39,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43708, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1507428971_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755125_14301, duration(ns): 24247510
2025-07-19 01:44:39,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755125_14301, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-19 01:44:41,390 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755125_14301 replica FinalizedReplica, blk_1073755125_14301, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755125 for deletion
2025-07-19 01:44:41,391 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755125_14301 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755125
2025-07-19 01:47:39,362 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755128_14304 src: /192.168.158.1:56238 dest: /192.168.158.4:9866
2025-07-19 01:47:39,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56238, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1910337922_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755128_14304, duration(ns): 26502232
2025-07-19 01:47:39,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755128_14304, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-19 01:47:41,396 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755128_14304 replica FinalizedReplica, blk_1073755128_14304, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755128 for deletion
2025-07-19 01:47:41,397 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755128_14304 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755128
2025-07-19 01:48:39,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755129_14305 src: /192.168.158.1:51624 dest: /192.168.158.4:9866
2025-07-19 01:48:39,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51624, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1734366643_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755129_14305, duration(ns): 23434021
2025-07-19 01:48:39,387 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755129_14305, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-19 01:48:41,396 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755129_14305 replica FinalizedReplica, blk_1073755129_14305, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755129 for deletion
2025-07-19 01:48:41,398 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755129_14305 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755129
2025-07-19 01:49:39,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755130_14306 src: /192.168.158.8:46062 dest: /192.168.158.4:9866
2025-07-19 01:49:39,387 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46062, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1831998099_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755130_14306, duration(ns): 21346129
2025-07-19 01:49:39,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755130_14306, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-19 01:49:44,399 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755130_14306 replica FinalizedReplica, blk_1073755130_14306, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755130 for deletion
2025-07-19 01:49:44,400 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755130_14306 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755130
2025-07-19 01:53:39,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755134_14310 src: /192.168.158.1:41474 dest: /192.168.158.4:9866
2025-07-19 01:53:39,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41474, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1612919678_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755134_14310, duration(ns): 22556455
2025-07-19 01:53:39,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755134_14310, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-19 01:53:41,409 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755134_14310 replica FinalizedReplica, blk_1073755134_14310, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755134 for deletion
2025-07-19 01:53:41,410 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755134_14310 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755134
2025-07-19 01:54:39,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755135_14311 src: /192.168.158.5:42974 dest: /192.168.158.4:9866
2025-07-19 01:54:39,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42974, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_631190828_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755135_14311, duration(ns): 22261823
2025-07-19 01:54:39,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755135_14311, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 01:54:44,416 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755135_14311 replica FinalizedReplica, blk_1073755135_14311, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755135 for deletion
2025-07-19 01:54:44,417 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755135_14311 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir19/blk_1073755135
2025-07-19 01:55:39,369 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755136_14312 src: /192.168.158.1:56984 dest: /192.168.158.4:9866
2025-07-19 01:55:39,408 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56984, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1589418825_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755136_14312, duration(ns): 30146263
2025-07-19 01:55:39,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755136_14312, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-19 01:55:41,418 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755136_14312 replica FinalizedReplica, blk_1073755136_14312, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755136 for deletion
2025-07-19 01:55:41,419 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755136_14312 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755136
2025-07-19 01:56:44,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755137_14313 src: /192.168.158.1:41742 dest: /192.168.158.4:9866
2025-07-19 01:56:44,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41742, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1440799593_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755137_14313, duration(ns): 23576536
2025-07-19 01:56:44,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755137_14313, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-19 01:56:47,424 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755137_14313 replica FinalizedReplica, blk_1073755137_14313, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755137 for deletion
2025-07-19 01:56:47,425 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755137_14313 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755137
2025-07-19 02:00:49,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755141_14317 src: /192.168.158.1:53810 dest: /192.168.158.4:9866
2025-07-19 02:00:49,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53810, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1430023927_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755141_14317, duration(ns): 23408328
2025-07-19 02:00:49,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755141_14317, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-19 02:00:53,431 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755141_14317 replica FinalizedReplica, blk_1073755141_14317, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755141 for deletion
2025-07-19 02:00:53,432 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755141_14317 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755141
2025-07-19 02:01:49,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755142_14318 src: /192.168.158.5:44206 dest: /192.168.158.4:9866
2025-07-19 02:01:49,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44206, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2139962041_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755142_14318, duration(ns): 21628776
2025-07-19 02:01:49,407 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755142_14318, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-19 02:01:53,434 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755142_14318 replica FinalizedReplica, blk_1073755142_14318, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755142 for deletion
2025-07-19 02:01:53,435 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755142_14318 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755142
2025-07-19 02:03:49,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755144_14320 src: /192.168.158.9:50624 dest: /192.168.158.4:9866
2025-07-19 02:03:49,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.9:50624, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-219215632_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755144_14320, duration(ns): 16162473 2025-07-19 02:03:49,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755144_14320, type=LAST_IN_PIPELINE terminating 2025-07-19 02:03:53,436 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755144_14320 replica FinalizedReplica, blk_1073755144_14320, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755144 for deletion 2025-07-19 02:03:53,437 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755144_14320 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755144 2025-07-19 02:04:49,391 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755145_14321 src: /192.168.158.6:47154 dest: /192.168.158.4:9866 2025-07-19 02:04:49,412 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47154, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_215142320_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755145_14321, duration(ns): 19196639 2025-07-19 02:04:49,412 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755145_14321, type=LAST_IN_PIPELINE terminating 2025-07-19 
02:04:53,440 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755145_14321 replica FinalizedReplica, blk_1073755145_14321, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755145 for deletion 2025-07-19 02:04:53,441 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755145_14321 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755145 2025-07-19 02:06:49,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755147_14323 src: /192.168.158.1:35780 dest: /192.168.158.4:9866 2025-07-19 02:06:49,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-506743791_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755147_14323, duration(ns): 25087574 2025-07-19 02:06:49,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755147_14323, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-19 02:06:56,442 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755147_14323 replica FinalizedReplica, blk_1073755147_14323, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755147 for deletion 2025-07-19 02:06:56,443 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755147_14323 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755147 2025-07-19 02:09:54,390 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755150_14326 src: /192.168.158.1:58072 dest: /192.168.158.4:9866 2025-07-19 02:09:54,424 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58072, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1812349233_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755150_14326, duration(ns): 24754512 2025-07-19 02:09:54,424 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755150_14326, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-19 02:09:56,445 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755150_14326 replica FinalizedReplica, blk_1073755150_14326, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755150 for deletion 2025-07-19 02:09:56,446 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755150_14326 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755150 2025-07-19 02:10:59,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755151_14327 src: /192.168.158.8:40230 dest: /192.168.158.4:9866 2025-07-19 02:10:59,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40230, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1027970687_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755151_14327, duration(ns): 16477729 2025-07-19 02:10:59,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755151_14327, type=LAST_IN_PIPELINE terminating 2025-07-19 02:11:05,448 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755151_14327 replica FinalizedReplica, blk_1073755151_14327, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755151 for deletion 2025-07-19 02:11:05,449 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755151_14327 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755151 2025-07-19 02:11:59,392 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755152_14328 src: /192.168.158.1:34714 dest: /192.168.158.4:9866 2025-07-19 02:11:59,425 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34714, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_969924337_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755152_14328, duration(ns): 24344476 2025-07-19 02:11:59,426 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755152_14328, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-19 02:12:02,450 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755152_14328 replica FinalizedReplica, blk_1073755152_14328, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755152 for deletion 2025-07-19 02:12:02,451 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755152_14328 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755152 2025-07-19 02:12:59,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755153_14329 src: /192.168.158.6:37330 dest: /192.168.158.4:9866 2025-07-19 02:12:59,419 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37330, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_690480783_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755153_14329, duration(ns): 17633697 2025-07-19 02:12:59,420 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755153_14329, type=LAST_IN_PIPELINE terminating 2025-07-19 
02:13:05,454 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755153_14329 replica FinalizedReplica, blk_1073755153_14329, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755153 for deletion 2025-07-19 02:13:05,455 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755153_14329 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755153 2025-07-19 02:14:59,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755155_14331 src: /192.168.158.7:39502 dest: /192.168.158.4:9866 2025-07-19 02:14:59,425 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39502, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1761999553_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755155_14331, duration(ns): 21621527 2025-07-19 02:14:59,426 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755155_14331, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-19 02:15:02,457 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755155_14331 replica FinalizedReplica, blk_1073755155_14331, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755155 for deletion 
2025-07-19 02:15:02,458 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755155_14331 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755155
2025-07-19 02:15:59,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755156_14332 src: /192.168.158.7:48478 dest: /192.168.158.4:9866
2025-07-19 02:15:59,427 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48478, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-770317165_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755156_14332, duration(ns): 20886152
2025-07-19 02:15:59,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755156_14332, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 02:16:02,456 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755156_14332 replica FinalizedReplica, blk_1073755156_14332, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755156 for deletion
2025-07-19 02:16:02,457 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755156_14332 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755156
2025-07-19 02:16:59,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755157_14333 src: /192.168.158.8:41126 dest: /192.168.158.4:9866
2025-07-19 02:16:59,425 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41126, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_55517012_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755157_14333, duration(ns): 18478979
2025-07-19 02:16:59,426 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755157_14333, type=LAST_IN_PIPELINE terminating
2025-07-19 02:17:02,459 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755157_14333 replica FinalizedReplica, blk_1073755157_14333, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755157 for deletion
2025-07-19 02:17:02,460 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755157_14333 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755157
2025-07-19 02:17:59,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755158_14334 src: /192.168.158.1:59676 dest: /192.168.158.4:9866
2025-07-19 02:17:59,434 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59676, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1555000203_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755158_14334, duration(ns): 25451099
2025-07-19 02:17:59,434 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755158_14334, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-19 02:18:05,460 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755158_14334 replica FinalizedReplica, blk_1073755158_14334, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755158 for deletion
2025-07-19 02:18:05,462 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755158_14334 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755158
2025-07-19 02:18:59,408 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755159_14335 src: /192.168.158.8:39886 dest: /192.168.158.4:9866
2025-07-19 02:18:59,436 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39886, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1812786189_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755159_14335, duration(ns): 22715479
2025-07-19 02:18:59,436 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755159_14335, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 02:19:02,462 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755159_14335 replica FinalizedReplica, blk_1073755159_14335, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755159 for deletion
2025-07-19 02:19:02,463 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755159_14335 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755159
2025-07-19 02:21:04,407 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755161_14337 src: /192.168.158.6:54594 dest: /192.168.158.4:9866
2025-07-19 02:21:04,434 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54594, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_452186000_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755161_14337, duration(ns): 21166605
2025-07-19 02:21:04,434 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755161_14337, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 02:21:08,466 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755161_14337 replica FinalizedReplica, blk_1073755161_14337, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755161 for deletion
2025-07-19 02:21:08,467 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755161_14337 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755161
2025-07-19 02:22:04,412 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755162_14338 src: /192.168.158.6:60490 dest: /192.168.158.4:9866
2025-07-19 02:22:04,431 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60490, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1103623325_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755162_14338, duration(ns): 16038905
2025-07-19 02:22:04,431 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755162_14338, type=LAST_IN_PIPELINE terminating
2025-07-19 02:22:08,468 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755162_14338 replica FinalizedReplica, blk_1073755162_14338, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755162 for deletion
2025-07-19 02:22:08,469 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755162_14338 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755162
2025-07-19 02:23:04,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755163_14339 src: /192.168.158.1:47106 dest: /192.168.158.4:9866
2025-07-19 02:23:04,447 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47106, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-456020837_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755163_14339, duration(ns): 25689224
2025-07-19 02:23:04,448 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755163_14339, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-19 02:23:08,469 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755163_14339 replica FinalizedReplica, blk_1073755163_14339, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755163 for deletion
2025-07-19 02:23:08,471 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755163_14339 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755163
2025-07-19 02:24:04,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755164_14340 src: /192.168.158.1:32878 dest: /192.168.158.4:9866
2025-07-19 02:24:04,447 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:32878, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_587500108_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755164_14340, duration(ns): 27160864
2025-07-19 02:24:04,447 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755164_14340, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-19 02:24:08,474 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755164_14340 replica FinalizedReplica, blk_1073755164_14340, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755164 for deletion
2025-07-19 02:24:08,475 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755164_14340 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755164
2025-07-19 02:26:04,423 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755166_14342 src: /192.168.158.8:59132 dest: /192.168.158.4:9866
2025-07-19 02:26:04,450 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59132, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_88861924_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755166_14342, duration(ns): 21588043
2025-07-19 02:26:04,450 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755166_14342, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 02:26:11,478 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755166_14342 replica FinalizedReplica, blk_1073755166_14342, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755166 for deletion
2025-07-19 02:26:11,479 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755166_14342 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755166
2025-07-19 02:28:04,425 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755168_14344 src: /192.168.158.7:44288 dest: /192.168.158.4:9866
2025-07-19 02:28:04,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44288, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_820273626_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755168_14344, duration(ns): 17868978
2025-07-19 02:28:04,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755168_14344, type=LAST_IN_PIPELINE terminating
2025-07-19 02:28:08,480 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755168_14344 replica FinalizedReplica, blk_1073755168_14344, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755168 for deletion
2025-07-19 02:28:08,482 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755168_14344 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755168
2025-07-19 02:30:04,436 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755170_14346 src: /192.168.158.5:40038 dest: /192.168.158.4:9866
2025-07-19 02:30:04,463 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40038, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_688438233_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755170_14346, duration(ns): 20903247
2025-07-19 02:30:04,463 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755170_14346, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-19 02:30:08,483 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755170_14346 replica FinalizedReplica, blk_1073755170_14346, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755170 for deletion
2025-07-19 02:30:08,484 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755170_14346 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755170
2025-07-19 02:31:04,426 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755171_14347 src: /192.168.158.1:35710 dest: /192.168.158.4:9866
2025-07-19 02:31:04,460 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35710, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-316517781_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755171_14347, duration(ns): 24032117
2025-07-19 02:31:04,460 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755171_14347, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-19 02:31:08,484 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755171_14347 replica FinalizedReplica, blk_1073755171_14347, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755171 for deletion
2025-07-19 02:31:08,486 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755171_14347 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755171
2025-07-19 02:32:04,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755172_14348 src: /192.168.158.7:55080 dest: /192.168.158.4:9866
2025-07-19 02:32:04,435 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55080, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-664187142_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755172_14348, duration(ns): 20610395
2025-07-19 02:32:04,435 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755172_14348, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-19 02:32:08,484 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755172_14348 replica FinalizedReplica, blk_1073755172_14348, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755172 for deletion
2025-07-19 02:32:08,485 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755172_14348 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755172
2025-07-19 02:33:09,426 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755173_14349 src: /192.168.158.1:38794 dest: /192.168.158.4:9866
2025-07-19 02:33:09,461 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38794, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1126132767_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755173_14349, duration(ns): 25602579
2025-07-19 02:33:09,461 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755173_14349, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-19 02:33:14,486 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755173_14349 replica FinalizedReplica, blk_1073755173_14349, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755173 for deletion
2025-07-19 02:33:14,487 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755173_14349 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755173 2025-07-19 02:34:09,432 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755174_14350 src: /192.168.158.7:37816 dest: /192.168.158.4:9866 2025-07-19 02:34:09,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37816, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2001305050_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755174_14350, duration(ns): 15438590 2025-07-19 02:34:09,450 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755174_14350, type=LAST_IN_PIPELINE terminating 2025-07-19 02:34:11,489 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755174_14350 replica FinalizedReplica, blk_1073755174_14350, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755174 for deletion 2025-07-19 02:34:11,490 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755174_14350 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755174 2025-07-19 02:36:09,436 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755176_14352 src: /192.168.158.9:58494 
dest: /192.168.158.4:9866 2025-07-19 02:36:09,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58494, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1247847441_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755176_14352, duration(ns): 23927106 2025-07-19 02:36:09,466 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755176_14352, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-19 02:36:14,493 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755176_14352 replica FinalizedReplica, blk_1073755176_14352, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755176 for deletion 2025-07-19 02:36:14,495 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755176_14352 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755176 2025-07-19 02:37:14,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755177_14353 src: /192.168.158.5:54370 dest: /192.168.158.4:9866 2025-07-19 02:37:14,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54370, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-822738312_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755177_14353, duration(ns): 16969761 2025-07-19 02:37:14,462 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755177_14353, type=LAST_IN_PIPELINE terminating 2025-07-19 02:37:17,493 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755177_14353 replica FinalizedReplica, blk_1073755177_14353, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755177 for deletion 2025-07-19 02:37:17,495 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755177_14353 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755177 2025-07-19 02:38:14,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755178_14354 src: /192.168.158.8:51026 dest: /192.168.158.4:9866 2025-07-19 02:38:14,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51026, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1109148469_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755178_14354, duration(ns): 17516636 2025-07-19 02:38:14,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755178_14354, type=LAST_IN_PIPELINE terminating 2025-07-19 02:38:17,498 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755178_14354 replica FinalizedReplica, blk_1073755178_14354, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755178 for deletion 2025-07-19 02:38:17,499 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755178_14354 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755178 2025-07-19 02:39:19,433 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755179_14355 src: /192.168.158.8:50486 dest: /192.168.158.4:9866 2025-07-19 02:39:19,461 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50486, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1694541064_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755179_14355, duration(ns): 22322896 2025-07-19 02:39:19,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755179_14355, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-19 02:39:23,498 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755179_14355 replica FinalizedReplica, blk_1073755179_14355, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755179 for deletion 2025-07-19 02:39:23,499 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755179_14355 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755179 2025-07-19 02:40:19,432 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755180_14356 src: /192.168.158.1:57136 dest: /192.168.158.4:9866 2025-07-19 02:40:19,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_844888344_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755180_14356, duration(ns): 25506428 2025-07-19 02:40:19,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755180_14356, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-19 02:40:23,498 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755180_14356 replica FinalizedReplica, blk_1073755180_14356, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755180 for deletion 2025-07-19 02:40:23,499 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755180_14356 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755180 2025-07-19 02:41:19,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755181_14357 src: /192.168.158.7:50416 dest: /192.168.158.4:9866 2025-07-19 02:41:19,469 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.7:50416, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1408011461_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755181_14357, duration(ns): 20975526 2025-07-19 02:41:19,469 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755181_14357, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-19 02:41:26,502 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755181_14357 replica FinalizedReplica, blk_1073755181_14357, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755181 for deletion 2025-07-19 02:41:26,503 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755181_14357 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755181 2025-07-19 02:42:19,437 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755182_14358 src: /192.168.158.1:59122 dest: /192.168.158.4:9866 2025-07-19 02:42:19,471 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59122, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-551032336_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755182_14358, duration(ns): 24020290 2025-07-19 02:42:19,471 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755182_14358, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-19 02:42:23,504 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755182_14358 replica FinalizedReplica, blk_1073755182_14358, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755182 for deletion 2025-07-19 02:42:23,505 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755182_14358 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755182 2025-07-19 02:43:19,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755183_14359 src: /192.168.158.9:49830 dest: /192.168.158.4:9866 2025-07-19 02:43:19,468 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49830, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_438921165_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755183_14359, duration(ns): 20598177 2025-07-19 02:43:19,468 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755183_14359, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-19 02:43:23,506 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755183_14359 replica FinalizedReplica, blk_1073755183_14359, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755183 for deletion 2025-07-19 02:43:23,508 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755183_14359 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755183 2025-07-19 02:45:19,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755185_14361 src: /192.168.158.8:44160 dest: /192.168.158.4:9866 2025-07-19 02:45:19,480 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44160, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-130440710_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755185_14361, duration(ns): 23148686 2025-07-19 02:45:19,480 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755185_14361, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-19 02:45:23,510 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755185_14361 replica FinalizedReplica, blk_1073755185_14361, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755185 for deletion 2025-07-19 02:45:23,511 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755185_14361 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755185 2025-07-19 02:46:19,448 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755186_14362 src: /192.168.158.6:52238 dest: /192.168.158.4:9866 2025-07-19 02:46:19,474 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52238, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2033908451_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755186_14362, duration(ns): 20361897 2025-07-19 02:46:19,474 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755186_14362, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-19 02:46:23,513 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755186_14362 replica FinalizedReplica, blk_1073755186_14362, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755186 for deletion 2025-07-19 02:46:23,514 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755186_14362 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755186 2025-07-19 02:48:24,450 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755188_14364 src: /192.168.158.8:48436 dest: /192.168.158.4:9866 2025-07-19 02:48:24,477 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.8:48436, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-676567800_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755188_14364, duration(ns): 20814640 2025-07-19 02:48:24,477 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755188_14364, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-19 02:48:26,515 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755188_14364 replica FinalizedReplica, blk_1073755188_14364, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755188 for deletion 2025-07-19 02:48:26,516 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755188_14364 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755188 2025-07-19 02:49:24,447 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755189_14365 src: /192.168.158.1:52140 dest: /192.168.158.4:9866 2025-07-19 02:49:24,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52140, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1381234691_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755189_14365, duration(ns): 26095323 2025-07-19 02:49:24,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755189_14365, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-19 02:49:26,515 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755189_14365 replica FinalizedReplica, blk_1073755189_14365, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755189 for deletion 2025-07-19 02:49:26,516 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755189_14365 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755189 2025-07-19 02:51:29,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755191_14367 src: /192.168.158.1:59362 dest: /192.168.158.4:9866 2025-07-19 02:51:29,489 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59362, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2147203956_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755191_14367, duration(ns): 26368884 2025-07-19 02:51:29,489 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755191_14367, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-19 02:51:32,518 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755191_14367 replica FinalizedReplica, blk_1073755191_14367, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755191 for deletion 2025-07-19 02:51:32,519 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755191_14367 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755191 2025-07-19 02:54:29,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755194_14370 src: /192.168.158.6:36946 dest: /192.168.158.4:9866 2025-07-19 02:54:29,496 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36946, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-715531189_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755194_14370, duration(ns): 20896976 2025-07-19 02:54:29,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755194_14370, type=LAST_IN_PIPELINE terminating 2025-07-19 02:54:32,524 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755194_14370 replica FinalizedReplica, blk_1073755194_14370, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755194 for deletion 2025-07-19 02:54:32,525 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755194_14370 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755194 2025-07-19 02:57:34,466 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755197_14373 src: /192.168.158.5:46844 dest: /192.168.158.4:9866 2025-07-19 02:57:34,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46844, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_608623449_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755197_14373, duration(ns): 19849761 2025-07-19 02:57:34,492 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755197_14373, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-19 02:57:38,532 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755197_14373 replica FinalizedReplica, blk_1073755197_14373, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755197 for deletion 2025-07-19 02:57:38,533 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755197_14373 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755197 2025-07-19 03:00:39,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755200_14376 src: /192.168.158.5:45014 dest: /192.168.158.4:9866 2025-07-19 03:00:39,499 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45014, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-252377440_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755200_14376, duration(ns): 20497736
2025-07-19 03:00:39,499 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755200_14376, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 03:00:44,537 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755200_14376 replica FinalizedReplica, blk_1073755200_14376, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755200 for deletion
2025-07-19 03:00:44,538 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755200_14376 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755200
2025-07-19 03:01:39,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755201_14377 src: /192.168.158.1:56980 dest: /192.168.158.4:9866
2025-07-19 03:01:39,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56980, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1179433579_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755201_14377, duration(ns): 23147267
2025-07-19 03:01:39,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755201_14377, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-19 03:01:44,537 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755201_14377 replica FinalizedReplica, blk_1073755201_14377, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755201 for deletion
2025-07-19 03:01:44,539 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755201_14377 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755201
2025-07-19 03:04:44,481 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755204_14380 src: /192.168.158.5:32830 dest: /192.168.158.4:9866
2025-07-19 03:04:44,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:32830, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-189208268_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755204_14380, duration(ns): 16674977
2025-07-19 03:04:44,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755204_14380, type=LAST_IN_PIPELINE terminating
2025-07-19 03:04:50,545 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755204_14380 replica FinalizedReplica, blk_1073755204_14380, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755204 for deletion
2025-07-19 03:04:50,546 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755204_14380 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755204
2025-07-19 03:06:44,481 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755206_14382 src: /192.168.158.7:47512 dest: /192.168.158.4:9866
2025-07-19 03:06:44,507 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47512, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_931198716_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755206_14382, duration(ns): 20481781
2025-07-19 03:06:44,507 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755206_14382, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-19 03:06:47,554 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755206_14382 replica FinalizedReplica, blk_1073755206_14382, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755206 for deletion
2025-07-19 03:06:47,555 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755206_14382 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755206
2025-07-19 03:07:44,479 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755207_14383 src: /192.168.158.8:46750 dest: /192.168.158.4:9866
2025-07-19 03:07:44,508 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46750, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1883282479_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755207_14383, duration(ns): 23799778
2025-07-19 03:07:44,509 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755207_14383, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 03:07:47,556 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755207_14383 replica FinalizedReplica, blk_1073755207_14383, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755207 for deletion
2025-07-19 03:07:47,558 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755207_14383 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755207
2025-07-19 03:08:44,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755208_14384 src: /192.168.158.6:33984 dest: /192.168.158.4:9866
2025-07-19 03:08:44,515 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33984, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-40514413_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755208_14384, duration(ns): 24793495
2025-07-19 03:08:44,515 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755208_14384, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-19 03:08:47,561 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755208_14384 replica FinalizedReplica, blk_1073755208_14384, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755208 for deletion
2025-07-19 03:08:47,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755208_14384 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755208
2025-07-19 03:09:44,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755209_14385 src: /192.168.158.8:50608 dest: /192.168.158.4:9866
2025-07-19 03:09:44,513 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50608, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_499374133_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755209_14385, duration(ns): 21560901
2025-07-19 03:09:44,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755209_14385, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 03:09:47,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755209_14385 replica FinalizedReplica, blk_1073755209_14385, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755209 for deletion
2025-07-19 03:09:47,564 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755209_14385 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755209
2025-07-19 03:11:49,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755211_14387 src: /192.168.158.6:36118 dest: /192.168.158.4:9866
2025-07-19 03:11:49,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36118, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1480230649_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755211_14387, duration(ns): 16201418
2025-07-19 03:11:49,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755211_14387, type=LAST_IN_PIPELINE terminating
2025-07-19 03:11:56,567 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755211_14387 replica FinalizedReplica, blk_1073755211_14387, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755211 for deletion
2025-07-19 03:11:56,568 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755211_14387 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755211
2025-07-19 03:12:49,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755212_14388 src: /192.168.158.8:44256 dest: /192.168.158.4:9866
2025-07-19 03:12:49,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44256, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1905219307_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755212_14388, duration(ns): 22244889
2025-07-19 03:12:49,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755212_14388, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 03:12:53,570 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755212_14388 replica FinalizedReplica, blk_1073755212_14388, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755212 for deletion
2025-07-19 03:12:53,572 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755212_14388 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755212
2025-07-19 03:16:54,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755216_14392 src: /192.168.158.6:49136 dest: /192.168.158.4:9866
2025-07-19 03:16:54,522 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2869240_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755216_14392, duration(ns): 16531145
2025-07-19 03:16:54,522 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755216_14392, type=LAST_IN_PIPELINE terminating
2025-07-19 03:16:56,582 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755216_14392 replica FinalizedReplica, blk_1073755216_14392, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755216 for deletion
2025-07-19 03:16:56,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755216_14392 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755216
2025-07-19 03:19:54,501 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755219_14395 src: /192.168.158.9:57468 dest: /192.168.158.4:9866
2025-07-19 03:19:54,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57468, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1947393519_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755219_14395, duration(ns): 16493620
2025-07-19 03:19:54,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755219_14395, type=LAST_IN_PIPELINE terminating
2025-07-19 03:19:56,588 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755219_14395 replica FinalizedReplica, blk_1073755219_14395, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755219 for deletion
2025-07-19 03:19:56,590 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755219_14395 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755219
2025-07-19 03:21:59,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755221_14397 src: /192.168.158.1:54896 dest: /192.168.158.4:9866
2025-07-19 03:21:59,534 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54896, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1357212478_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755221_14397, duration(ns): 24750985
2025-07-19 03:21:59,534 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755221_14397, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-19 03:22:02,592 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755221_14397 replica FinalizedReplica, blk_1073755221_14397, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755221 for deletion
2025-07-19 03:22:02,593 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755221_14397 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755221
2025-07-19 03:28:09,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755227_14403 src: /192.168.158.9:33146 dest: /192.168.158.4:9866
2025-07-19 03:28:09,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33146, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-356093890_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755227_14403, duration(ns): 19432052
2025-07-19 03:28:09,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755227_14403, type=LAST_IN_PIPELINE terminating
2025-07-19 03:28:11,600 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755227_14403 replica FinalizedReplica, blk_1073755227_14403, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755227 for deletion
2025-07-19 03:28:11,601 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755227_14403 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755227
2025-07-19 03:31:14,508 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755230_14406 src: /192.168.158.8:46086 dest: /192.168.158.4:9866
2025-07-19 03:31:14,538 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46086, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1915495453_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755230_14406, duration(ns): 24417128
2025-07-19 03:31:14,539 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755230_14406, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 03:31:20,608 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755230_14406 replica FinalizedReplica, blk_1073755230_14406, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755230 for deletion
2025-07-19 03:31:20,609 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755230_14406 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755230
2025-07-19 03:36:19,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755235_14411 src: /192.168.158.5:58100 dest: /192.168.158.4:9866
2025-07-19 03:36:19,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58100, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1545022897_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755235_14411, duration(ns): 16189866
2025-07-19 03:36:19,538 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755235_14411, type=LAST_IN_PIPELINE terminating
2025-07-19 03:36:26,619 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755235_14411 replica FinalizedReplica, blk_1073755235_14411, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755235 for deletion
2025-07-19 03:36:26,620 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755235_14411 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755235
2025-07-19 03:39:24,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755238_14414 src: /192.168.158.6:42438 dest: /192.168.158.4:9866
2025-07-19 03:39:24,558 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42438, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1866901207_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755238_14414, duration(ns): 18872158
2025-07-19 03:39:24,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755238_14414, type=LAST_IN_PIPELINE terminating
2025-07-19 03:39:26,627 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755238_14414 replica FinalizedReplica, blk_1073755238_14414, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755238 for deletion
2025-07-19 03:39:26,629 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755238_14414 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755238
2025-07-19 03:40:29,529 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755239_14415 src: /192.168.158.1:45998 dest: /192.168.158.4:9866
2025-07-19 03:40:29,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45998, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1662272801_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755239_14415, duration(ns): 24128818
2025-07-19 03:40:29,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755239_14415, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-19 03:40:35,629 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755239_14415 replica FinalizedReplica, blk_1073755239_14415, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755239 for deletion
2025-07-19 03:40:35,630 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755239_14415 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755239
2025-07-19 03:42:29,538 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755241_14417 src: /192.168.158.1:37540 dest: /192.168.158.4:9866
2025-07-19 03:42:29,574 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37540, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-244209354_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755241_14417, duration(ns): 25917306
2025-07-19 03:42:29,574 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755241_14417, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-19 03:42:35,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755241_14417 replica FinalizedReplica, blk_1073755241_14417, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755241 for deletion
2025-07-19 03:42:35,637 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755241_14417 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755241
2025-07-19 03:43:29,544 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755242_14418 src: /192.168.158.6:43234 dest: /192.168.158.4:9866
2025-07-19 03:43:29,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43234, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_337146995_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755242_14418, duration(ns): 18834844
2025-07-19 03:43:29,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755242_14418, type=LAST_IN_PIPELINE terminating
2025-07-19 03:43:32,639 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755242_14418 replica FinalizedReplica, blk_1073755242_14418, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755242 for deletion
2025-07-19 03:43:32,640 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755242_14418 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755242
2025-07-19 03:44:34,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755243_14419 src: /192.168.158.1:48654 dest: /192.168.158.4:9866
2025-07-19 03:44:34,554 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48654, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-86514374_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755243_14419, duration(ns): 23871079
2025-07-19 03:44:34,554 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755243_14419, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-19 03:44:41,641 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755243_14419 replica FinalizedReplica, blk_1073755243_14419, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755243 for deletion
2025-07-19 03:44:41,642 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755243_14419 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755243
2025-07-19 03:46:34,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755245_14421 src: /192.168.158.1:45302 dest: /192.168.158.4:9866
2025-07-19 03:46:34,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45302, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1541997878_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755245_14421, duration(ns): 23521040
2025-07-19 03:46:34,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755245_14421, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-19 03:46:41,644 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755245_14421 replica FinalizedReplica, blk_1073755245_14421, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755245 for deletion
2025-07-19 03:46:41,645 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755245_14421 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755245
2025-07-19 03:48:39,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755247_14423 src: /192.168.158.6:38076 dest: /192.168.158.4:9866
2025-07-19 03:48:39,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38076, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1586514266_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755247_14423, duration(ns): 17467078
2025-07-19 03:48:39,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755247_14423, type=LAST_IN_PIPELINE terminating
2025-07-19 03:48:41,647 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755247_14423 replica FinalizedReplica, blk_1073755247_14423, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755247 for deletion
2025-07-19 03:48:41,648 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755247_14423 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755247
2025-07-19 03:49:39,540 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755248_14424 src: /192.168.158.5:54950 dest: /192.168.158.4:9866
2025-07-19 03:49:39,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54950, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-352706997_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755248_14424, duration(ns): 16920889
2025-07-19 03:49:39,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755248_14424, type=LAST_IN_PIPELINE terminating
2025-07-19 03:49:41,650 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755248_14424 replica FinalizedReplica, blk_1073755248_14424, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755248 for deletion
2025-07-19 03:49:41,651 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755248_14424 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755248
2025-07-19 03:50:39,538 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755249_14425 src: /192.168.158.1:48984 dest: /192.168.158.4:9866
2025-07-19 03:50:39,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48984, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1802668869_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755249_14425, duration(ns): 22865560
2025-07-19 03:50:39,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755249_14425, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-19 03:50:41,654 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755249_14425 replica FinalizedReplica, blk_1073755249_14425, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755249 for deletion
2025-07-19 03:50:41,655 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755249_14425 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755249
2025-07-19 03:51:39,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755250_14426 src: /192.168.158.5:40782 dest: /192.168.158.4:9866
2025-07-19 03:51:39,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40782, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1044901608_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755250_14426, duration(ns): 17332836
2025-07-19 03:51:39,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755250_14426, type=LAST_IN_PIPELINE terminating
2025-07-19 03:51:41,655 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755250_14426 replica FinalizedReplica, blk_1073755250_14426, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755250 for deletion
2025-07-19 03:51:41,657 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755250_14426 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755250
2025-07-19 03:53:44,548 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755252_14428 src: /192.168.158.7:34848 dest: /192.168.158.4:9866
2025-07-19 03:53:44,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34848, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1722408124_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755252_14428, duration(ns): 17349047
2025-07-19 03:53:44,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755252_14428, type=LAST_IN_PIPELINE terminating
2025-07-19 03:53:47,660 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755252_14428 replica FinalizedReplica, blk_1073755252_14428, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755252 for deletion
2025-07-19 03:53:47,661 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755252_14428 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755252
2025-07-19 03:54:44,540 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755253_14429 src: /192.168.158.1:51528 dest: /192.168.158.4:9866
2025-07-19 03:54:44,575 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51528, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-703712875_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755253_14429, duration(ns): 25868195
2025-07-19 03:54:44,575 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755253_14429, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-19 03:54:47,663 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755253_14429 replica FinalizedReplica, blk_1073755253_14429, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755253 for deletion
2025-07-19 03:54:47,664 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755253_14429 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755253
2025-07-19 03:55:49,539 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving
BP-1059995147-192.168.158.1-1752101929360:blk_1073755254_14430 src: /192.168.158.1:33560 dest: /192.168.158.4:9866 2025-07-19 03:55:49,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33560, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1941868575_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755254_14430, duration(ns): 23369409 2025-07-19 03:55:49,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755254_14430, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-19 03:55:56,665 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755254_14430 replica FinalizedReplica, blk_1073755254_14430, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755254 for deletion 2025-07-19 03:55:56,666 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755254_14430 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755254 2025-07-19 03:56:49,544 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755255_14431 src: /192.168.158.1:47336 dest: /192.168.158.4:9866 2025-07-19 03:56:49,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_99702093_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073755255_14431, duration(ns): 26605547 2025-07-19 03:56:49,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755255_14431, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-19 03:56:53,668 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755255_14431 replica FinalizedReplica, blk_1073755255_14431, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755255 for deletion 2025-07-19 03:56:53,674 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755255_14431 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755255 2025-07-19 03:57:49,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755256_14432 src: /192.168.158.1:49680 dest: /192.168.158.4:9866 2025-07-19 03:57:49,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49680, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_102613526_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755256_14432, duration(ns): 24822532 2025-07-19 03:57:49,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755256_14432, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-19 03:57:53,668 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755256_14432 replica FinalizedReplica, blk_1073755256_14432, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755256 for deletion 2025-07-19 03:57:53,670 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755256_14432 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755256 2025-07-19 03:59:17,678 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f4c, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 5 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5. 
2025-07-19 03:59:17,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-19 04:01:54,587 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755260_14436 src: /192.168.158.5:40844 dest: /192.168.158.4:9866
2025-07-19 04:01:54,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40844, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1198743804_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755260_14436, duration(ns): 15835679
2025-07-19 04:01:54,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755260_14436, type=LAST_IN_PIPELINE terminating
2025-07-19 04:01:59,683 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755260_14436 replica FinalizedReplica, blk_1073755260_14436, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755260 for deletion
2025-07-19 04:01:59,684 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755260_14436 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755260
2025-07-19 04:02:54,558 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755261_14437 src: /192.168.158.7:45804 dest: /192.168.158.4:9866
2025-07-19 04:02:54,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45804, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1323317612_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755261_14437, duration(ns): 19573448
2025-07-19 04:02:54,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755261_14437, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-19 04:02:59,686 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755261_14437 replica FinalizedReplica, blk_1073755261_14437, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755261 for deletion
2025-07-19 04:02:59,687 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755261_14437 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755261
2025-07-19 04:03:54,557 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755262_14438 src: /192.168.158.1:49358 dest: /192.168.158.4:9866
2025-07-19 04:03:54,591 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49358, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1420247458_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755262_14438, duration(ns): 25063883
2025-07-19 04:03:54,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755262_14438, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-19 04:03:56,689 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755262_14438 replica FinalizedReplica, blk_1073755262_14438, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755262 for deletion
2025-07-19 04:03:56,690 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755262_14438 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755262
2025-07-19 04:06:54,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755265_14441 src: /192.168.158.7:36650 dest: /192.168.158.4:9866
2025-07-19 04:06:54,587 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36650, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1037268237_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755265_14441, duration(ns): 17358551
2025-07-19 04:06:54,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755265_14441, type=LAST_IN_PIPELINE terminating
2025-07-19 04:06:59,696 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755265_14441 replica FinalizedReplica, blk_1073755265_14441, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755265 for deletion
2025-07-19 04:06:59,698 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755265_14441 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755265
2025-07-19 04:07:54,570 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755266_14442 src: /192.168.158.1:56432 dest: /192.168.158.4:9866
2025-07-19 04:07:54,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56432, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1812620611_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755266_14442, duration(ns): 26292066
2025-07-19 04:07:54,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755266_14442, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-19 04:07:56,698 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755266_14442 replica FinalizedReplica, blk_1073755266_14442, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755266 for deletion
2025-07-19 04:07:56,699 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755266_14442 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755266
2025-07-19 04:08:54,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755267_14443 src: /192.168.158.9:48550 dest: /192.168.158.4:9866
2025-07-19 04:08:54,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48550, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1205349804_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755267_14443, duration(ns): 15684692
2025-07-19 04:08:54,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755267_14443, type=LAST_IN_PIPELINE terminating
2025-07-19 04:08:56,701 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755267_14443 replica FinalizedReplica, blk_1073755267_14443, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755267 for deletion
2025-07-19 04:08:56,702 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755267_14443 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755267
2025-07-19 04:16:04,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755274_14450 src: /192.168.158.7:55486 dest: /192.168.158.4:9866
2025-07-19 04:16:04,603 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55486, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1716395270_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755274_14450, duration(ns): 16071108
2025-07-19 04:16:04,603 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755274_14450, type=LAST_IN_PIPELINE terminating
2025-07-19 04:16:08,713 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755274_14450 replica FinalizedReplica, blk_1073755274_14450, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755274 for deletion
2025-07-19 04:16:08,715 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755274_14450 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755274
2025-07-19 04:17:04,594 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755275_14451 src: /192.168.158.6:47816 dest: /192.168.158.4:9866
2025-07-19 04:17:04,623 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47816, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1318836434_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755275_14451, duration(ns): 22783382
2025-07-19 04:17:04,623 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755275_14451, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 04:17:11,716 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755275_14451 replica FinalizedReplica, blk_1073755275_14451, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755275 for deletion
2025-07-19 04:17:11,717 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755275_14451 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755275
2025-07-19 04:18:04,587 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755276_14452 src: /192.168.158.6:46988 dest: /192.168.158.4:9866
2025-07-19 04:18:04,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46988, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1025191817_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755276_14452, duration(ns): 16655062
2025-07-19 04:18:04,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755276_14452, type=LAST_IN_PIPELINE terminating
2025-07-19 04:18:08,718 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755276_14452 replica FinalizedReplica, blk_1073755276_14452, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755276 for deletion
2025-07-19 04:18:08,719 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755276_14452 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755276
2025-07-19 04:19:04,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755277_14453 src: /192.168.158.1:33586 dest: /192.168.158.4:9866
2025-07-19 04:19:04,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33586, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1095038039_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755277_14453, duration(ns): 22888683
2025-07-19 04:19:04,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755277_14453, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-19 04:19:08,722 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755277_14453 replica FinalizedReplica, blk_1073755277_14453, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755277 for deletion
2025-07-19 04:19:08,723 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755277_14453 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755277
2025-07-19 04:22:04,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755280_14456 src: /192.168.158.9:36740 dest: /192.168.158.4:9866
2025-07-19 04:22:04,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36740, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_430274708_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755280_14456, duration(ns): 19391161
2025-07-19 04:22:04,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755280_14456, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 04:22:11,727 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755280_14456 replica FinalizedReplica, blk_1073755280_14456, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755280 for deletion
2025-07-19 04:22:11,728 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755280_14456 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755280
2025-07-19 04:25:09,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755283_14459 src: /192.168.158.9:33480 dest: /192.168.158.4:9866
2025-07-19 04:25:09,611 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33480, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-173380886_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755283_14459, duration(ns): 16041626
2025-07-19 04:25:09,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755283_14459, type=LAST_IN_PIPELINE terminating
2025-07-19 04:25:11,729 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755283_14459 replica FinalizedReplica, blk_1073755283_14459, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755283 for deletion
2025-07-19 04:25:11,731 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755283_14459 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755283
2025-07-19 04:26:09,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755284_14460 src: /192.168.158.1:38028 dest: /192.168.158.4:9866
2025-07-19 04:26:09,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38028, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1894131481_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755284_14460, duration(ns): 26798429
2025-07-19 04:26:09,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755284_14460, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-19 04:26:11,731 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755284_14460 replica FinalizedReplica, blk_1073755284_14460, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755284 for deletion
2025-07-19 04:26:11,733 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755284_14460 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755284
2025-07-19 04:28:09,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755286_14462 src: /192.168.158.5:35348 dest: /192.168.158.4:9866
2025-07-19 04:28:09,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35348, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-188111633_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755286_14462, duration(ns): 20197842
2025-07-19 04:28:09,627 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755286_14462, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 04:28:11,736 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755286_14462 replica FinalizedReplica, blk_1073755286_14462, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755286 for deletion
2025-07-19 04:28:11,737 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755286_14462 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755286
2025-07-19 04:29:14,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755287_14463 src: /192.168.158.7:42564 dest: /192.168.158.4:9866
2025-07-19 04:29:14,671 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42564, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-234045292_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755287_14463, duration(ns): 15417209
2025-07-19 04:29:14,671 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755287_14463, type=LAST_IN_PIPELINE terminating
2025-07-19 04:29:17,739 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755287_14463 replica FinalizedReplica, blk_1073755287_14463, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755287 for deletion
2025-07-19 04:29:17,740 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755287_14463 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755287
2025-07-19 04:30:14,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755288_14464 src: /192.168.158.1:53196 dest: /192.168.158.4:9866
2025-07-19 04:30:14,638 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53196, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_139724740_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755288_14464, duration(ns): 23255507
2025-07-19 04:30:14,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755288_14464, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-19 04:30:17,742 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755288_14464 replica FinalizedReplica, blk_1073755288_14464, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755288 for deletion
2025-07-19 04:30:17,743 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755288_14464 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755288
2025-07-19 04:31:19,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755289_14465 src: /192.168.158.9:46840 dest: /192.168.158.4:9866
2025-07-19 04:31:19,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46840, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_940338482_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755289_14465, duration(ns): 20539898
2025-07-19 04:31:19,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755289_14465, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 04:31:23,743 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755289_14465 replica FinalizedReplica, blk_1073755289_14465, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755289 for deletion
2025-07-19 04:31:23,744 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755289_14465 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755289
2025-07-19 04:32:19,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755290_14466 src: /192.168.158.1:43122 dest: /192.168.158.4:9866
2025-07-19 04:32:19,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43122, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1493004586_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755290_14466, duration(ns): 25441267
2025-07-19 04:32:19,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755290_14466, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-19 04:32:23,746 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755290_14466 replica FinalizedReplica, blk_1073755290_14466, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755290 for deletion 2025-07-19 04:32:23,747 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755290_14466 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755290 2025-07-19 04:36:24,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755294_14470 src: /192.168.158.5:51674 dest: /192.168.158.4:9866 2025-07-19 04:36:24,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51674, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_146257363_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755294_14470, duration(ns): 17741050 2025-07-19 04:36:24,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755294_14470, type=LAST_IN_PIPELINE terminating 2025-07-19 04:36:26,754 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755294_14470 replica FinalizedReplica, blk_1073755294_14470, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755294 for deletion 2025-07-19 04:36:26,755 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755294_14470 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755294 2025-07-19 04:37:24,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755295_14471 src: /192.168.158.1:51898 dest: /192.168.158.4:9866 2025-07-19 04:37:24,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2045361730_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755295_14471, duration(ns): 27094186 2025-07-19 04:37:24,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755295_14471, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-19 04:37:26,757 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755295_14471 replica FinalizedReplica, blk_1073755295_14471, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755295 for deletion 2025-07-19 04:37:26,758 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755295_14471 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755295 2025-07-19 04:39:24,619 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755297_14473 src: /192.168.158.8:43704 dest: /192.168.158.4:9866 2025-07-19 04:39:24,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: 
src: /192.168.158.8:43704, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-396795490_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755297_14473, duration(ns): 15722197 2025-07-19 04:39:24,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755297_14473, type=LAST_IN_PIPELINE terminating 2025-07-19 04:39:26,759 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755297_14473 replica FinalizedReplica, blk_1073755297_14473, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755297 for deletion 2025-07-19 04:39:26,760 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755297_14473 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755297 2025-07-19 04:40:24,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755298_14474 src: /192.168.158.1:34090 dest: /192.168.158.4:9866 2025-07-19 04:40:24,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34090, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1658431278_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755298_14474, duration(ns): 25904553 2025-07-19 04:40:24,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755298_14474, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-19 04:40:29,762 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755298_14474 replica FinalizedReplica, blk_1073755298_14474, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755298 for deletion 2025-07-19 04:40:29,763 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755298_14474 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755298 2025-07-19 04:42:24,619 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755300_14476 src: /192.168.158.1:43746 dest: /192.168.158.4:9866 2025-07-19 04:42:24,655 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43746, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1264923907_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755300_14476, duration(ns): 27514141 2025-07-19 04:42:24,655 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755300_14476, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-19 04:42:26,762 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755300_14476 replica FinalizedReplica, blk_1073755300_14476, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755300 for deletion 2025-07-19 04:42:26,763 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755300_14476 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755300 2025-07-19 04:44:29,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755302_14478 src: /192.168.158.1:52308 dest: /192.168.158.4:9866 2025-07-19 04:44:29,661 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52308, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_842907271_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755302_14478, duration(ns): 26665483 2025-07-19 04:44:29,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755302_14478, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-19 04:44:32,765 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755302_14478 replica FinalizedReplica, blk_1073755302_14478, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755302 for deletion 2025-07-19 04:44:32,766 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755302_14478 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755302 2025-07-19 04:45:29,623 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755303_14479 src: /192.168.158.5:53058 dest: /192.168.158.4:9866 2025-07-19 04:45:29,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53058, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-712854925_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755303_14479, duration(ns): 19424655 2025-07-19 04:45:29,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755303_14479, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-19 04:45:32,765 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755303_14479 replica FinalizedReplica, blk_1073755303_14479, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755303 for deletion 2025-07-19 04:45:32,766 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755303_14479 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755303 2025-07-19 04:47:34,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755305_14481 src: /192.168.158.9:32812 dest: /192.168.158.4:9866 2025-07-19 04:47:34,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.9:32812, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1112051042_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755305_14481, duration(ns): 15701158 2025-07-19 04:47:34,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755305_14481, type=LAST_IN_PIPELINE terminating 2025-07-19 04:47:35,770 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755305_14481 replica FinalizedReplica, blk_1073755305_14481, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755305 for deletion 2025-07-19 04:47:35,771 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755305_14481 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755305 2025-07-19 04:49:39,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755307_14483 src: /192.168.158.9:36376 dest: /192.168.158.4:9866 2025-07-19 04:49:39,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36376, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1893167810_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755307_14483, duration(ns): 15898383 2025-07-19 04:49:39,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755307_14483, type=LAST_IN_PIPELINE terminating 2025-07-19 
04:49:44,773 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755307_14483 replica FinalizedReplica, blk_1073755307_14483, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755307 for deletion 2025-07-19 04:49:44,774 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755307_14483 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755307 2025-07-19 04:50:44,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755308_14484 src: /192.168.158.6:50800 dest: /192.168.158.4:9866 2025-07-19 04:50:44,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50800, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_692007859_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755308_14484, duration(ns): 21399524 2025-07-19 04:50:44,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755308_14484, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-19 04:50:50,775 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755308_14484 replica FinalizedReplica, blk_1073755308_14484, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755308 for deletion 
2025-07-19 04:50:50,776 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755308_14484 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755308 2025-07-19 04:53:44,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755311_14487 src: /192.168.158.5:40594 dest: /192.168.158.4:9866 2025-07-19 04:53:44,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40594, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_914416301_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755311_14487, duration(ns): 17261159 2025-07-19 04:53:44,654 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755311_14487, type=LAST_IN_PIPELINE terminating 2025-07-19 04:53:50,782 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755311_14487 replica FinalizedReplica, blk_1073755311_14487, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755311 for deletion 2025-07-19 04:53:50,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755311_14487 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755311 2025-07-19 04:56:44,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755314_14490 src: /192.168.158.6:55416 
dest: /192.168.158.4:9866 2025-07-19 04:56:44,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55416, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1558683549_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755314_14490, duration(ns): 15330084 2025-07-19 04:56:44,663 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755314_14490, type=LAST_IN_PIPELINE terminating 2025-07-19 04:56:50,788 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755314_14490 replica FinalizedReplica, blk_1073755314_14490, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755314 for deletion 2025-07-19 04:56:50,790 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755314_14490 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755314 2025-07-19 04:57:44,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755315_14491 src: /192.168.158.1:44336 dest: /192.168.158.4:9866 2025-07-19 04:57:44,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-403173715_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755315_14491, duration(ns): 24828980 2025-07-19 04:57:44,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755315_14491, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-19 04:57:47,789 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755315_14491 replica FinalizedReplica, blk_1073755315_14491, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755315 for deletion 2025-07-19 04:57:47,790 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755315_14491 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755315 2025-07-19 04:58:44,642 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755316_14492 src: /192.168.158.9:54528 dest: /192.168.158.4:9866 2025-07-19 04:58:44,663 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54528, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_126827969_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755316_14492, duration(ns): 18695894 2025-07-19 04:58:44,663 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755316_14492, type=LAST_IN_PIPELINE terminating 2025-07-19 04:58:50,790 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755316_14492 replica FinalizedReplica, blk_1073755316_14492, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn 
getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755316 for deletion 2025-07-19 04:58:50,791 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755316_14492 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755316 2025-07-19 04:59:44,646 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755317_14493 src: /192.168.158.8:43226 dest: /192.168.158.4:9866 2025-07-19 04:59:44,673 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43226, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1363341088_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755317_14493, duration(ns): 21489840 2025-07-19 04:59:44,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755317_14493, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-19 04:59:50,795 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755317_14493 replica FinalizedReplica, blk_1073755317_14493, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755317 for deletion 2025-07-19 04:59:50,796 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755317_14493 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755317 2025-07-19 05:00:44,646 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755318_14494 src: /192.168.158.9:34464 dest: /192.168.158.4:9866 2025-07-19 05:00:44,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34464, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_23943895_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755318_14494, duration(ns): 16589889 2025-07-19 05:00:44,665 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755318_14494, type=LAST_IN_PIPELINE terminating 2025-07-19 05:00:50,799 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755318_14494 replica FinalizedReplica, blk_1073755318_14494, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755318 for deletion 2025-07-19 05:00:50,800 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755318_14494 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755318 2025-07-19 05:01:44,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755319_14495 src: /192.168.158.1:46164 dest: /192.168.158.4:9866 2025-07-19 05:01:44,676 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46164, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1151391347_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755319_14495, duration(ns): 24983909
2025-07-19 05:01:44,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755319_14495, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-19 05:01:47,801 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755319_14495 replica FinalizedReplica, blk_1073755319_14495, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755319 for deletion
2025-07-19 05:01:47,802 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755319_14495 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755319
2025-07-19 05:02:44,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755320_14496 src: /192.168.158.1:43770 dest: /192.168.158.4:9866
2025-07-19 05:02:44,686 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43770, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1057272574_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755320_14496, duration(ns): 24700815
2025-07-19 05:02:44,686 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755320_14496, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-19 05:02:50,801 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755320_14496 replica FinalizedReplica, blk_1073755320_14496, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755320 for deletion
2025-07-19 05:02:50,802 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755320_14496 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755320
2025-07-19 05:05:44,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755323_14499 src: /192.168.158.7:39620 dest: /192.168.158.4:9866
2025-07-19 05:05:44,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39620, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1028995760_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755323_14499, duration(ns): 19541417
2025-07-19 05:05:44,678 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755323_14499, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 05:05:47,804 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755323_14499 replica FinalizedReplica, blk_1073755323_14499, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755323 for deletion
2025-07-19 05:05:47,806 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755323_14499 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755323
2025-07-19 05:09:49,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755327_14503 src: /192.168.158.5:48436 dest: /192.168.158.4:9866
2025-07-19 05:09:49,685 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48436, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_733830054_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755327_14503, duration(ns): 21221838
2025-07-19 05:09:49,685 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755327_14503, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-19 05:09:50,809 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755327_14503 replica FinalizedReplica, blk_1073755327_14503, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755327 for deletion
2025-07-19 05:09:50,811 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755327_14503 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755327
2025-07-19 05:10:49,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755328_14504 src: /192.168.158.8:60794 dest: /192.168.158.4:9866
2025-07-19 05:10:49,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60794, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-91533286_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755328_14504, duration(ns): 17053998
2025-07-19 05:10:49,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755328_14504, type=LAST_IN_PIPELINE terminating
2025-07-19 05:10:53,812 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755328_14504 replica FinalizedReplica, blk_1073755328_14504, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755328 for deletion
2025-07-19 05:10:53,813 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755328_14504 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755328
2025-07-19 05:14:59,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755332_14508 src: /192.168.158.7:44408 dest: /192.168.158.4:9866
2025-07-19 05:14:59,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44408, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1971518384_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755332_14508, duration(ns): 20423516
2025-07-19 05:14:59,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755332_14508, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-19 05:15:02,819 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755332_14508 replica FinalizedReplica, blk_1073755332_14508, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755332 for deletion
2025-07-19 05:15:02,821 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755332_14508 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755332
2025-07-19 05:17:04,665 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755334_14510 src: /192.168.158.7:45032 dest: /192.168.158.4:9866
2025-07-19 05:17:04,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45032, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_176595484_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755334_14510, duration(ns): 15772710
2025-07-19 05:17:04,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755334_14510, type=LAST_IN_PIPELINE terminating
2025-07-19 05:17:05,824 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755334_14510 replica FinalizedReplica, blk_1073755334_14510, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755334 for deletion
2025-07-19 05:17:05,825 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755334_14510 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755334
2025-07-19 05:18:09,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755335_14511 src: /192.168.158.9:53638 dest: /192.168.158.4:9866
2025-07-19 05:18:09,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53638, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1512891290_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755335_14511, duration(ns): 17860787
2025-07-19 05:18:09,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755335_14511, type=LAST_IN_PIPELINE terminating
2025-07-19 05:18:14,827 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755335_14511 replica FinalizedReplica, blk_1073755335_14511, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755335 for deletion
2025-07-19 05:18:14,828 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755335_14511 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755335
2025-07-19 05:21:09,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755338_14514 src: /192.168.158.9:51092 dest: /192.168.158.4:9866
2025-07-19 05:21:09,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51092, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1701393263_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755338_14514, duration(ns): 16274281
2025-07-19 05:21:09,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755338_14514, type=LAST_IN_PIPELINE terminating
2025-07-19 05:21:11,831 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755338_14514 replica FinalizedReplica, blk_1073755338_14514, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755338 for deletion
2025-07-19 05:21:11,832 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755338_14514 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755338
2025-07-19 05:28:24,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755345_14521 src: /192.168.158.7:43400 dest: /192.168.158.4:9866
2025-07-19 05:28:24,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43400, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_432975786_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755345_14521, duration(ns): 18879672
2025-07-19 05:28:24,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755345_14521, type=LAST_IN_PIPELINE terminating
2025-07-19 05:28:26,842 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755345_14521 replica FinalizedReplica, blk_1073755345_14521, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755345 for deletion
2025-07-19 05:28:26,843 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755345_14521 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755345
2025-07-19 05:30:24,675 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755347_14523 src: /192.168.158.1:56002 dest: /192.168.158.4:9866
2025-07-19 05:30:24,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56002, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1800644609_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755347_14523, duration(ns): 24562313
2025-07-19 05:30:24,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755347_14523, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-19 05:30:26,845 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755347_14523 replica FinalizedReplica, blk_1073755347_14523, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755347 for deletion
2025-07-19 05:30:26,846 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755347_14523 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755347
2025-07-19 05:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-19 05:37:34,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755354_14530 src: /192.168.158.5:59810 dest: /192.168.158.4:9866
2025-07-19 05:37:34,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59810, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1403528195_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755354_14530, duration(ns): 17088986
2025-07-19 05:37:34,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755354_14530, type=LAST_IN_PIPELINE terminating
2025-07-19 05:37:38,857 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755354_14530 replica FinalizedReplica, blk_1073755354_14530, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755354 for deletion
2025-07-19 05:37:38,858 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755354_14530 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755354
2025-07-19 05:38:39,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755355_14531 src: /192.168.158.7:58890 dest: /192.168.158.4:9866
2025-07-19 05:38:39,718 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58890, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1520895923_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755355_14531, duration(ns): 19326538
2025-07-19 05:38:39,718 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755355_14531, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 05:38:41,857 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755355_14531 replica FinalizedReplica, blk_1073755355_14531, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755355 for deletion
2025-07-19 05:38:41,858 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755355_14531 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755355
2025-07-19 05:39:44,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755356_14532 src: /192.168.158.9:42090 dest: /192.168.158.4:9866
2025-07-19 05:39:44,713 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42090, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1090009748_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755356_14532, duration(ns): 17115568
2025-07-19 05:39:44,713 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755356_14532, type=LAST_IN_PIPELINE terminating
2025-07-19 05:39:47,859 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755356_14532 replica FinalizedReplica, blk_1073755356_14532, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755356 for deletion
2025-07-19 05:39:47,860 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755356_14532 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755356
2025-07-19 05:40:44,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755357_14533 src: /192.168.158.6:33896 dest: /192.168.158.4:9866
2025-07-19 05:40:44,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33896, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1118952926_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755357_14533, duration(ns): 18213921
2025-07-19 05:40:44,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755357_14533, type=LAST_IN_PIPELINE terminating
2025-07-19 05:40:47,860 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755357_14533 replica FinalizedReplica, blk_1073755357_14533, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755357 for deletion
2025-07-19 05:40:47,861 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755357_14533 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755357
2025-07-19 05:41:49,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755358_14534 src: /192.168.158.8:36184 dest: /192.168.158.4:9866
2025-07-19 05:41:49,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36184, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-67575541_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755358_14534, duration(ns): 17599066
2025-07-19 05:41:49,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755358_14534, type=LAST_IN_PIPELINE terminating
2025-07-19 05:41:53,861 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755358_14534 replica FinalizedReplica, blk_1073755358_14534, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755358 for deletion
2025-07-19 05:41:53,862 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755358_14534 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755358
2025-07-19 05:44:54,698 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755361_14537 src: /192.168.158.9:33118 dest: /192.168.158.4:9866
2025-07-19 05:44:54,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33118, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1640937537_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755361_14537, duration(ns): 20336178
2025-07-19 05:44:54,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755361_14537, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-19 05:44:56,867 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755361_14537 replica FinalizedReplica, blk_1073755361_14537, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755361 for deletion
2025-07-19 05:44:56,869 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755361_14537 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755361
2025-07-19 05:46:54,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755363_14539 src: /192.168.158.9:60564 dest: /192.168.158.4:9866
2025-07-19 05:46:54,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60564, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1840392547_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755363_14539, duration(ns): 18133374
2025-07-19 05:46:54,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755363_14539, type=LAST_IN_PIPELINE terminating
2025-07-19 05:46:56,870 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755363_14539 replica FinalizedReplica, blk_1073755363_14539, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755363 for deletion
2025-07-19 05:46:56,871 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755363_14539 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755363
2025-07-19 05:49:59,705 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755366_14542 src: /192.168.158.9:45304 dest: /192.168.158.4:9866
2025-07-19 05:49:59,730 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45304, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1075623902_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755366_14542, duration(ns): 19578202
2025-07-19 05:49:59,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755366_14542, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 05:50:02,875 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755366_14542 replica FinalizedReplica, blk_1073755366_14542, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755366 for deletion
2025-07-19 05:50:02,876 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755366_14542 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755366
2025-07-19 05:51:59,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755368_14544 src: /192.168.158.8:36358 dest: /192.168.158.4:9866
2025-07-19 05:51:59,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36358, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1239273489_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755368_14544, duration(ns): 19220377
2025-07-19 05:51:59,739 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755368_14544, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-19 05:52:05,881 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755368_14544 replica FinalizedReplica, blk_1073755368_14544, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755368 for deletion
2025-07-19 05:52:05,882 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755368_14544 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755368
2025-07-19 05:54:59,713 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755371_14547 src: /192.168.158.8:45282 dest: /192.168.158.4:9866
2025-07-19 05:54:59,739 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45282, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1926923200_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755371_14547, duration(ns): 20117732
2025-07-19 05:54:59,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755371_14547, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-19 05:55:02,889 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755371_14547 replica FinalizedReplica, blk_1073755371_14547, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755371 for deletion
2025-07-19 05:55:02,890 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755371_14547 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755371
2025-07-19 05:55:59,720 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755372_14548 src: /192.168.158.8:53382 dest: /192.168.158.4:9866
2025-07-19 05:55:59,746 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53382, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_222702451_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755372_14548, duration(ns): 20976541
2025-07-19 05:55:59,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755372_14548, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-19 05:56:02,889 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755372_14548 replica FinalizedReplica, blk_1073755372_14548, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755372 for deletion
2025-07-19 05:56:02,890 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755372_14548 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755372
2025-07-19 05:56:59,719 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755373_14549 src: /192.168.158.9:48978 dest: /192.168.158.4:9866
2025-07-19 05:56:59,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48978, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_64554719_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755373_14549, duration(ns): 20299809
2025-07-19 05:56:59,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755373_14549, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 05:57:02,893 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755373_14549 replica FinalizedReplica, blk_1073755373_14549, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755373 for deletion
2025-07-19 05:57:02,894 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755373_14549 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755373
2025-07-19 05:58:59,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755375_14551 src: /192.168.158.5:49398 dest: /192.168.158.4:9866
2025-07-19 05:58:59,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49398, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-342114534_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755375_14551, duration(ns): 17703631
2025-07-19 05:58:59,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755375_14551, type=LAST_IN_PIPELINE terminating
2025-07-19 05:59:05,894 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755375_14551 replica FinalizedReplica, blk_1073755375_14551, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755375 for deletion
2025-07-19 05:59:05,895 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755375_14551 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755375
2025-07-19 05:59:59,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755376_14552 src: /192.168.158.8:39294 dest: /192.168.158.4:9866
2025-07-19 05:59:59,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39294, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1758169186_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755376_14552, duration(ns): 17902430
2025-07-19 05:59:59,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755376_14552, type=LAST_IN_PIPELINE terminating
2025-07-19 06:00:02,895 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755376_14552 replica FinalizedReplica, blk_1073755376_14552, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755376 for deletion
2025-07-19 06:00:02,897 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755376_14552 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755376
2025-07-19 06:01:04,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755377_14553 src: /192.168.158.6:41316 dest: /192.168.158.4:9866
2025-07-19 06:01:04,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41316, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1284634956_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755377_14553, duration(ns): 16769328
2025-07-19 06:01:04,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755377_14553, type=LAST_IN_PIPELINE terminating
2025-07-19 06:01:08,897 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755377_14553 replica FinalizedReplica, blk_1073755377_14553, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755377 for deletion
2025-07-19 06:01:08,898 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755377_14553 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755377
2025-07-19 06:04:04,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755380_14556 src: /192.168.158.8:46244 dest: /192.168.158.4:9866
2025-07-19 06:04:04,755 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46244, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_775909952_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755380_14556, duration(ns): 17538014
2025-07-19 06:04:04,755 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755380_14556, type=LAST_IN_PIPELINE terminating
2025-07-19 06:04:05,900 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755380_14556 replica FinalizedReplica, blk_1073755380_14556, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755380 for deletion
2025-07-19 06:04:05,901 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755380_14556 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755380
2025-07-19 06:05:04,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755381_14557 src: /192.168.158.9:56416 dest: /192.168.158.4:9866
2025-07-19 06:05:04,762 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56416, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-77201923_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755381_14557, duration(ns): 19734312 2025-07-19 06:05:04,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755381_14557, type=LAST_IN_PIPELINE terminating 2025-07-19 06:05:08,901 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755381_14557 replica FinalizedReplica, blk_1073755381_14557, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755381 for deletion 2025-07-19 06:05:08,902 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755381_14557 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755381 2025-07-19 06:06:04,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755382_14558 src: /192.168.158.5:59296 dest: /192.168.158.4:9866 2025-07-19 06:06:04,760 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59296, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-244784871_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755382_14558, duration(ns): 19169907 2025-07-19 06:06:04,760 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073755382_14558, type=LAST_IN_PIPELINE terminating 2025-07-19 06:06:05,903 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755382_14558 replica FinalizedReplica, blk_1073755382_14558, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755382 for deletion 2025-07-19 06:06:05,904 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755382_14558 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755382 2025-07-19 06:07:04,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755383_14559 src: /192.168.158.5:54694 dest: /192.168.158.4:9866 2025-07-19 06:07:04,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54694, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-609335989_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755383_14559, duration(ns): 23492977 2025-07-19 06:07:04,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755383_14559, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-19 06:07:08,903 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755383_14559 replica FinalizedReplica, blk_1073755383_14559, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755383 for deletion 2025-07-19 06:07:08,904 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755383_14559 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755383 2025-07-19 06:08:04,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755384_14560 src: /192.168.158.5:47570 dest: /192.168.158.4:9866 2025-07-19 06:08:04,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47570, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_138439924_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755384_14560, duration(ns): 20614013 2025-07-19 06:08:04,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755384_14560, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-19 06:08:05,903 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755384_14560 replica FinalizedReplica, blk_1073755384_14560, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755384 for deletion 2025-07-19 06:08:05,904 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755384_14560 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755384 
2025-07-19 06:09:04,734 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755385_14561 src: /192.168.158.1:58848 dest: /192.168.158.4:9866 2025-07-19 06:09:04,768 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58848, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1239451717_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755385_14561, duration(ns): 23995669 2025-07-19 06:09:04,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755385_14561, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-19 06:09:05,905 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755385_14561 replica FinalizedReplica, blk_1073755385_14561, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755385 for deletion 2025-07-19 06:09:05,908 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755385_14561 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755385 2025-07-19 06:11:04,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755387_14563 src: /192.168.158.8:35690 dest: /192.168.158.4:9866 2025-07-19 06:11:04,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35690, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: 
DFSClient_NONMAPREDUCE_529160414_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755387_14563, duration(ns): 19934003 2025-07-19 06:11:04,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755387_14563, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-19 06:11:05,908 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755387_14563 replica FinalizedReplica, blk_1073755387_14563, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755387 for deletion 2025-07-19 06:11:05,909 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755387_14563 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755387 2025-07-19 06:12:04,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755388_14564 src: /192.168.158.6:46896 dest: /192.168.158.4:9866 2025-07-19 06:12:04,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46896, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-11509096_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755388_14564, duration(ns): 15726130 2025-07-19 06:12:04,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755388_14564, type=LAST_IN_PIPELINE terminating 2025-07-19 06:12:05,911 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755388_14564 replica FinalizedReplica, blk_1073755388_14564, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755388 for deletion 2025-07-19 06:12:05,912 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755388_14564 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir20/blk_1073755388 2025-07-19 06:17:09,755 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755393_14569 src: /192.168.158.7:51938 dest: /192.168.158.4:9866 2025-07-19 06:17:09,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51938, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_421196532_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755393_14569, duration(ns): 21841638 2025-07-19 06:17:09,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755393_14569, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-19 06:17:11,922 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755393_14569 replica FinalizedReplica, blk_1073755393_14569, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755393 for deletion 2025-07-19 06:17:11,923 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755393_14569 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755393 2025-07-19 06:20:19,789 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755396_14572 src: /192.168.158.8:52122 dest: /192.168.158.4:9866 2025-07-19 06:20:19,807 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52122, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1715279752_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755396_14572, duration(ns): 15633044 2025-07-19 06:20:19,807 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755396_14572, type=LAST_IN_PIPELINE terminating 2025-07-19 06:20:20,926 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755396_14572 replica FinalizedReplica, blk_1073755396_14572, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755396 for deletion 2025-07-19 06:20:20,927 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755396_14572 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755396 2025-07-19 06:22:24,768 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755398_14574 src: /192.168.158.6:46836 dest: /192.168.158.4:9866 
2025-07-19 06:22:24,786 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46836, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-67992825_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755398_14574, duration(ns): 15917918 2025-07-19 06:22:24,786 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755398_14574, type=LAST_IN_PIPELINE terminating 2025-07-19 06:22:26,929 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755398_14574 replica FinalizedReplica, blk_1073755398_14574, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755398 for deletion 2025-07-19 06:22:26,930 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755398_14574 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755398 2025-07-19 06:23:29,760 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755399_14575 src: /192.168.158.6:51214 dest: /192.168.158.4:9866 2025-07-19 06:23:29,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51214, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-285445732_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755399_14575, duration(ns): 21524486 2025-07-19 06:23:29,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073755399_14575, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-19 06:23:32,929 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755399_14575 replica FinalizedReplica, blk_1073755399_14575, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755399 for deletion 2025-07-19 06:23:32,931 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755399_14575 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755399 2025-07-19 06:24:29,760 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755400_14576 src: /192.168.158.6:33450 dest: /192.168.158.4:9866 2025-07-19 06:24:29,786 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33450, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2099992090_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755400_14576, duration(ns): 20040575 2025-07-19 06:24:29,786 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755400_14576, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-19 06:24:32,932 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755400_14576 replica FinalizedReplica, blk_1073755400_14576, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn 
getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755400 for deletion 2025-07-19 06:24:32,933 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755400_14576 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755400 2025-07-19 06:26:29,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755402_14578 src: /192.168.158.1:33010 dest: /192.168.158.4:9866 2025-07-19 06:26:29,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33010, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-765581393_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755402_14578, duration(ns): 22866623 2025-07-19 06:26:29,793 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755402_14578, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-19 06:26:32,935 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755402_14578 replica FinalizedReplica, blk_1073755402_14578, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755402 for deletion 2025-07-19 06:26:32,936 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755402_14578 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755402 2025-07-19 06:28:34,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755404_14580 src: /192.168.158.1:59094 dest: /192.168.158.4:9866 2025-07-19 06:28:34,800 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59094, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_837330211_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755404_14580, duration(ns): 24343235 2025-07-19 06:28:34,800 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755404_14580, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-19 06:28:35,938 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755404_14580 replica FinalizedReplica, blk_1073755404_14580, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755404 for deletion 2025-07-19 06:28:35,939 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755404_14580 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755404 2025-07-19 06:31:39,773 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755407_14583 src: /192.168.158.6:46706 dest: /192.168.158.4:9866 2025-07-19 06:31:39,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.6:46706, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1431253866_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755407_14583, duration(ns): 18222571 2025-07-19 06:31:39,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755407_14583, type=LAST_IN_PIPELINE terminating 2025-07-19 06:31:44,945 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755407_14583 replica FinalizedReplica, blk_1073755407_14583, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755407 for deletion 2025-07-19 06:31:44,947 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755407_14583 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755407 2025-07-19 06:32:44,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755408_14584 src: /192.168.158.8:60694 dest: /192.168.158.4:9866 2025-07-19 06:32:44,820 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60694, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-527650143_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755408_14584, duration(ns): 20241204 2025-07-19 06:32:44,820 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755408_14584, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=1:[192.168.158.9:9866] terminating 2025-07-19 06:32:47,948 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755408_14584 replica FinalizedReplica, blk_1073755408_14584, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755408 for deletion 2025-07-19 06:32:47,949 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755408_14584 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755408 2025-07-19 06:33:44,772 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755409_14585 src: /192.168.158.1:47308 dest: /192.168.158.4:9866 2025-07-19 06:33:44,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47308, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2036676321_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755409_14585, duration(ns): 23849982 2025-07-19 06:33:44,805 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755409_14585, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-19 06:33:47,951 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755409_14585 replica FinalizedReplica, blk_1073755409_14585, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755409 for deletion
2025-07-19 06:33:47,952 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755409_14585 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755409
2025-07-19 06:40:59,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755416_14592 src: /192.168.158.9:58922 dest: /192.168.158.4:9866
2025-07-19 06:40:59,807 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58922, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-95484518_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755416_14592, duration(ns): 17410940
2025-07-19 06:40:59,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755416_14592, type=LAST_IN_PIPELINE terminating
2025-07-19 06:41:02,962 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755416_14592 replica FinalizedReplica, blk_1073755416_14592, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755416 for deletion
2025-07-19 06:41:02,963 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755416_14592 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755416
2025-07-19 06:45:04,789 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755420_14596 src: /192.168.158.9:58622 dest: /192.168.158.4:9866
2025-07-19 06:45:04,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58622, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1063782225_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755420_14596, duration(ns): 20279455
2025-07-19 06:45:04,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755420_14596, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 06:45:05,973 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755420_14596 replica FinalizedReplica, blk_1073755420_14596, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755420 for deletion
2025-07-19 06:45:05,974 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755420_14596 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755420
2025-07-19 06:47:04,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755422_14598 src: /192.168.158.5:40154 dest: /192.168.158.4:9866
2025-07-19 06:47:04,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40154, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1426808446_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755422_14598, duration(ns): 16739641
2025-07-19 06:47:04,817 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755422_14598, type=LAST_IN_PIPELINE terminating
2025-07-19 06:47:05,977 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755422_14598 replica FinalizedReplica, blk_1073755422_14598, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755422 for deletion
2025-07-19 06:47:05,978 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755422_14598 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755422
2025-07-19 06:49:04,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755424_14600 src: /192.168.158.9:57328 dest: /192.168.158.4:9866
2025-07-19 06:49:04,821 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57328, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-318509663_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755424_14600, duration(ns): 21485610
2025-07-19 06:49:04,821 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755424_14600, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 06:49:05,982 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755424_14600 replica FinalizedReplica, blk_1073755424_14600, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755424 for deletion
2025-07-19 06:49:05,983 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755424_14600 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755424
2025-07-19 06:50:04,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755425_14601 src: /192.168.158.9:52074 dest: /192.168.158.4:9866
2025-07-19 06:50:04,824 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52074, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_413354648_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755425_14601, duration(ns): 21641013
2025-07-19 06:50:04,824 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755425_14601, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 06:50:05,985 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755425_14601 replica FinalizedReplica, blk_1073755425_14601, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755425 for deletion
2025-07-19 06:50:05,986 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755425_14601 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755425
2025-07-19 06:52:14,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755427_14603 src: /192.168.158.8:39790 dest: /192.168.158.4:9866
2025-07-19 06:52:14,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39790, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1329803491_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755427_14603, duration(ns): 24706281
2025-07-19 06:52:14,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755427_14603, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 06:52:17,988 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755427_14603 replica FinalizedReplica, blk_1073755427_14603, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755427 for deletion
2025-07-19 06:52:17,989 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755427_14603 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755427
2025-07-19 06:53:14,795 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755428_14604 src: /192.168.158.6:44558 dest: /192.168.158.4:9866
2025-07-19 06:53:14,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44558, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-807934023_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755428_14604, duration(ns): 20977464
2025-07-19 06:53:14,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755428_14604, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-19 06:53:20,991 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755428_14604 replica FinalizedReplica, blk_1073755428_14604, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755428 for deletion
2025-07-19 06:53:20,992 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755428_14604 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755428
2025-07-19 06:57:24,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755432_14608 src: /192.168.158.9:57644 dest: /192.168.158.4:9866
2025-07-19 06:57:24,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57644, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1080256283_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755432_14608, duration(ns): 17123805
2025-07-19 06:57:24,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755432_14608, type=LAST_IN_PIPELINE terminating
2025-07-19 06:57:27,001 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755432_14608 replica FinalizedReplica, blk_1073755432_14608, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755432 for deletion
2025-07-19 06:57:27,002 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755432_14608 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755432
2025-07-19 06:59:24,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755434_14610 src: /192.168.158.1:58126 dest: /192.168.158.4:9866
2025-07-19 06:59:24,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58126, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1716277424_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755434_14610, duration(ns): 24661562
2025-07-19 06:59:24,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755434_14610, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-19 06:59:27,008 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755434_14610 replica FinalizedReplica, blk_1073755434_14610, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755434 for deletion
2025-07-19 06:59:27,009 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755434_14610 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755434
2025-07-19 07:00:29,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755435_14611 src: /192.168.158.1:55686 dest: /192.168.158.4:9866
2025-07-19 07:00:29,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55686, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-218586471_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755435_14611, duration(ns): 26228212
2025-07-19 07:00:29,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755435_14611, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-19 07:00:36,011 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755435_14611 replica FinalizedReplica, blk_1073755435_14611, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755435 for deletion
2025-07-19 07:00:36,012 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755435_14611 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755435
2025-07-19 07:02:34,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755437_14613 src: /192.168.158.7:55894 dest: /192.168.158.4:9866
2025-07-19 07:02:34,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55894, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1166899360_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755437_14613, duration(ns): 18163582
2025-07-19 07:02:34,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755437_14613, type=LAST_IN_PIPELINE terminating
2025-07-19 07:02:39,015 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755437_14613 replica FinalizedReplica, blk_1073755437_14613, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755437 for deletion
2025-07-19 07:02:39,016 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755437_14613 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755437
2025-07-19 07:03:34,801 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755438_14614 src: /192.168.158.1:33160 dest: /192.168.158.4:9866
2025-07-19 07:03:34,834 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33160, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1871229571_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755438_14614, duration(ns): 23439309
2025-07-19 07:03:34,834 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755438_14614, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-19 07:03:36,017 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755438_14614 replica FinalizedReplica, blk_1073755438_14614, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755438 for deletion
2025-07-19 07:03:36,018 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755438_14614 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755438
2025-07-19 07:05:34,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755440_14616 src: /192.168.158.1:52550 dest: /192.168.158.4:9866
2025-07-19 07:05:34,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52550, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_378917573_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755440_14616, duration(ns): 24877167
2025-07-19 07:05:34,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755440_14616, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-19 07:05:36,022 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755440_14616 replica FinalizedReplica, blk_1073755440_14616, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755440 for deletion
2025-07-19 07:05:36,023 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755440_14616 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755440
2025-07-19 07:06:39,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755441_14617 src: /192.168.158.1:48824 dest: /192.168.158.4:9866
2025-07-19 07:06:39,862 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48824, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1428433144_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755441_14617, duration(ns): 22565161
2025-07-19 07:06:39,862 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755441_14617, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-19 07:06:42,025 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755441_14617 replica FinalizedReplica, blk_1073755441_14617, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755441 for deletion
2025-07-19 07:06:42,027 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755441_14617 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755441
2025-07-19 07:08:44,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755443_14619 src: /192.168.158.1:51070 dest: /192.168.158.4:9866
2025-07-19 07:08:44,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51070, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-743386374_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755443_14619, duration(ns): 24731410
2025-07-19 07:08:44,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755443_14619, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-19 07:08:48,030 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755443_14619 replica FinalizedReplica, blk_1073755443_14619, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755443 for deletion
2025-07-19 07:08:48,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755443_14619 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755443
2025-07-19 07:10:44,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755445_14621 src: /192.168.158.1:47158 dest: /192.168.158.4:9866
2025-07-19 07:10:44,851 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47158, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-842678533_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755445_14621, duration(ns): 24246751
2025-07-19 07:10:44,851 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755445_14621, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-19 07:10:48,034 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755445_14621 replica FinalizedReplica, blk_1073755445_14621, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755445 for deletion
2025-07-19 07:10:48,035 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755445_14621 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755445
2025-07-19 07:11:49,826 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755446_14622 src: /192.168.158.9:47890 dest: /192.168.158.4:9866
2025-07-19 07:11:49,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47890, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1526323527_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755446_14622, duration(ns): 21475778
2025-07-19 07:11:49,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755446_14622, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 07:11:51,036 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755446_14622 replica FinalizedReplica, blk_1073755446_14622, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755446 for deletion
2025-07-19 07:11:51,037 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755446_14622 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755446
2025-07-19 07:13:49,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755448_14624 src: /192.168.158.1:50202 dest: /192.168.158.4:9866
2025-07-19 07:13:49,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50202, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1923489074_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755448_14624, duration(ns): 24960769
2025-07-19 07:13:49,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755448_14624, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-19 07:13:54,041 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755448_14624 replica FinalizedReplica, blk_1073755448_14624, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755448 for deletion
2025-07-19 07:13:54,043 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755448_14624 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755448
2025-07-19 07:15:54,826 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755450_14626 src: /192.168.158.7:49728 dest: /192.168.158.4:9866
2025-07-19 07:15:54,846 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49728, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-306716755_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755450_14626, duration(ns): 18365889
2025-07-19 07:15:54,847 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755450_14626, type=LAST_IN_PIPELINE terminating
2025-07-19 07:16:00,045 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755450_14626 replica FinalizedReplica, blk_1073755450_14626, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755450 for deletion
2025-07-19 07:16:00,046 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755450_14626 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755450
2025-07-19 07:16:54,826 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755451_14627 src: /192.168.158.5:56458 dest: /192.168.158.4:9866
2025-07-19 07:16:54,851 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56458, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_42704245_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755451_14627, duration(ns): 20040715
2025-07-19 07:16:54,852 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755451_14627, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-19 07:17:00,050 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755451_14627 replica FinalizedReplica, blk_1073755451_14627, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755451 for deletion
2025-07-19 07:17:00,051 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755451_14627 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755451
2025-07-19 07:18:59,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755453_14629 src: /192.168.158.6:49552 dest: /192.168.158.4:9866
2025-07-19 07:18:59,847 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49552, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_782544853_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755453_14629, duration(ns): 17554354
2025-07-19 07:18:59,847 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755453_14629, type=LAST_IN_PIPELINE terminating
2025-07-19 07:19:03,055 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755453_14629 replica FinalizedReplica, blk_1073755453_14629, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755453 for deletion
2025-07-19 07:19:03,056 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755453_14629 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755453
2025-07-19 07:19:59,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755454_14630 src: /192.168.158.1:60524 dest: /192.168.158.4:9866
2025-07-19 07:19:59,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60524, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1167049417_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755454_14630, duration(ns): 25135224
2025-07-19 07:19:59,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755454_14630, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-19 07:20:03,055 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755454_14630 replica FinalizedReplica, blk_1073755454_14630, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755454 for deletion
2025-07-19 07:20:03,057 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755454_14630 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755454
2025-07-19 07:21:59,837 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755456_14632 src: /192.168.158.1:53794 dest: /192.168.158.4:9866
2025-07-19 07:21:59,870 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53794, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_734342546_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755456_14632, duration(ns): 24375647
2025-07-19 07:21:59,870 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755456_14632, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-19 07:22:03,061 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755456_14632 replica FinalizedReplica, blk_1073755456_14632, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755456 for deletion
2025-07-19 07:22:03,063 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755456_14632 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755456
2025-07-19 07:22:59,841 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755457_14633 src: /192.168.158.9:53834 dest: /192.168.158.4:9866
2025-07-19 07:22:59,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53834, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1922468833_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755457_14633, duration(ns): 17031466
2025-07-19 07:22:59,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755457_14633, type=LAST_IN_PIPELINE terminating
2025-07-19 07:23:03,064 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755457_14633 replica FinalizedReplica, blk_1073755457_14633, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755457 for deletion
2025-07-19 07:23:03,065 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755457_14633 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755457
2025-07-19 07:30:04,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755464_14640 src: /192.168.158.7:36118 dest: /192.168.158.4:9866
2025-07-19 07:30:04,870 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36118, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1842693386_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755464_14640, duration(ns): 21785933
2025-07-19 07:30:04,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755464_14640, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 07:30:09,076 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755464_14640 replica FinalizedReplica, blk_1073755464_14640, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755464 for deletion
2025-07-19 07:30:09,077 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755464_14640 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755464
2025-07-19 07:31:04,851 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755465_14641 src: /192.168.158.7:55288 dest: /192.168.158.4:9866
2025-07-19 07:31:04,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55288, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_287759659_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755465_14641, duration(ns): 17685128
2025-07-19 07:31:04,872 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755465_14641, type=LAST_IN_PIPELINE terminating
2025-07-19 07:31:09,080 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755465_14641 replica FinalizedReplica, blk_1073755465_14641, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755465 for deletion
2025-07-19 07:31:09,081 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755465_14641 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755465
2025-07-19 07:32:04,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755466_14642 src: /192.168.158.8:56318 dest: /192.168.158.4:9866
2025-07-19 07:32:04,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56318, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1235914773_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755466_14642, duration(ns): 18381465
2025-07-19 07:32:04,874 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755466_14642, type=LAST_IN_PIPELINE terminating 2025-07-19 07:32:06,080 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755466_14642 replica FinalizedReplica, blk_1073755466_14642, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755466 for deletion 2025-07-19 07:32:06,081 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755466_14642 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755466 2025-07-19 07:33:04,852 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755467_14643 src: /192.168.158.7:37712 dest: /192.168.158.4:9866 2025-07-19 07:33:04,872 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1399717835_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755467_14643, duration(ns): 17968717 2025-07-19 07:33:04,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755467_14643, type=LAST_IN_PIPELINE terminating 2025-07-19 07:33:09,083 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755467_14643 replica FinalizedReplica, blk_1073755467_14643, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755467 for deletion 2025-07-19 07:33:09,084 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755467_14643 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755467 2025-07-19 07:34:04,852 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755468_14644 src: /192.168.158.5:58524 dest: /192.168.158.4:9866 2025-07-19 07:34:04,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58524, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_993518990_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755468_14644, duration(ns): 19317505 2025-07-19 07:34:04,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755468_14644, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-19 07:34:06,087 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755468_14644 replica FinalizedReplica, blk_1073755468_14644, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755468 for deletion 2025-07-19 07:34:06,088 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755468_14644 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755468 
2025-07-19 07:35:04,852 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755469_14645 src: /192.168.158.6:33488 dest: /192.168.158.4:9866 2025-07-19 07:35:04,883 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33488, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1483028460_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755469_14645, duration(ns): 21856372 2025-07-19 07:35:04,883 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755469_14645, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-19 07:35:06,091 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755469_14645 replica FinalizedReplica, blk_1073755469_14645, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755469 for deletion 2025-07-19 07:35:06,092 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755469_14645 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755469 2025-07-19 07:36:04,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755470_14646 src: /192.168.158.1:38454 dest: /192.168.158.4:9866 2025-07-19 07:36:04,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38454, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2081077920_236, offset: 0, 
srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755470_14646, duration(ns): 25611783 2025-07-19 07:36:04,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755470_14646, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-19 07:36:06,090 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755470_14646 replica FinalizedReplica, blk_1073755470_14646, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755470 for deletion 2025-07-19 07:36:06,091 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755470_14646 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755470 2025-07-19 07:37:04,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755471_14647 src: /192.168.158.9:57758 dest: /192.168.158.4:9866 2025-07-19 07:37:04,880 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57758, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_811212495_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755471_14647, duration(ns): 17140292 2025-07-19 07:37:04,880 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755471_14647, type=LAST_IN_PIPELINE terminating 2025-07-19 07:37:09,092 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755471_14647 replica FinalizedReplica, blk_1073755471_14647, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755471 for deletion 2025-07-19 07:37:09,094 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755471_14647 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755471 2025-07-19 07:39:04,857 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755473_14649 src: /192.168.158.1:52286 dest: /192.168.158.4:9866 2025-07-19 07:39:04,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52286, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_402752966_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755473_14649, duration(ns): 25634859 2025-07-19 07:39:04,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755473_14649, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-19 07:39:06,095 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755473_14649 replica FinalizedReplica, blk_1073755473_14649, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755473 for deletion 
2025-07-19 07:39:06,096 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755473_14649 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755473 2025-07-19 07:40:04,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755474_14650 src: /192.168.158.6:59286 dest: /192.168.158.4:9866 2025-07-19 07:40:04,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59286, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_512698612_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755474_14650, duration(ns): 16640641 2025-07-19 07:40:04,885 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755474_14650, type=LAST_IN_PIPELINE terminating 2025-07-19 07:40:06,095 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755474_14650 replica FinalizedReplica, blk_1073755474_14650, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755474 for deletion 2025-07-19 07:40:06,096 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755474_14650 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755474 2025-07-19 07:50:04,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755484_14660 src: /192.168.158.7:58136 
dest: /192.168.158.4:9866 2025-07-19 07:50:04,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1058593441_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755484_14660, duration(ns): 17379509 2025-07-19 07:50:04,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755484_14660, type=LAST_IN_PIPELINE terminating 2025-07-19 07:50:09,111 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755484_14660 replica FinalizedReplica, blk_1073755484_14660, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755484 for deletion 2025-07-19 07:50:09,112 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755484_14660 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755484 2025-07-19 07:51:04,880 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755485_14661 src: /192.168.158.6:46614 dest: /192.168.158.4:9866 2025-07-19 07:51:04,906 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46614, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1678643915_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755485_14661, duration(ns): 20478719 2025-07-19 07:51:04,907 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755485_14661, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-19 07:51:09,114 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755485_14661 replica FinalizedReplica, blk_1073755485_14661, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755485 for deletion 2025-07-19 07:51:09,115 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755485_14661 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755485 2025-07-19 07:53:04,885 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755487_14663 src: /192.168.158.7:38850 dest: /192.168.158.4:9866 2025-07-19 07:53:04,911 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38850, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_61882705_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755487_14663, duration(ns): 20512716 2025-07-19 07:53:04,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755487_14663, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-19 07:53:06,117 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755487_14663 replica FinalizedReplica, blk_1073755487_14663, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755487 for deletion 2025-07-19 07:53:06,118 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755487_14663 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755487 2025-07-19 07:54:04,887 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755488_14664 src: /192.168.158.9:46958 dest: /192.168.158.4:9866 2025-07-19 07:54:04,907 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46958, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1005893531_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755488_14664, duration(ns): 17064709 2025-07-19 07:54:04,907 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755488_14664, type=LAST_IN_PIPELINE terminating 2025-07-19 07:54:09,122 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755488_14664 replica FinalizedReplica, blk_1073755488_14664, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755488 for deletion 2025-07-19 07:54:09,123 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755488_14664 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755488 2025-07-19 
07:57:04,894 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755491_14667 src: /192.168.158.1:50742 dest: /192.168.158.4:9866 2025-07-19 07:57:04,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50742, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1640819287_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755491_14667, duration(ns): 23930768 2025-07-19 07:57:04,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755491_14667, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-19 07:57:06,125 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755491_14667 replica FinalizedReplica, blk_1073755491_14667, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755491 for deletion 2025-07-19 07:57:06,126 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755491_14667 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755491 2025-07-19 08:03:09,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755497_14673 src: /192.168.158.5:54390 dest: /192.168.158.4:9866 2025-07-19 08:03:09,924 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54390, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-118369978_236, offset: 
0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755497_14673, duration(ns): 18020282 2025-07-19 08:03:09,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755497_14673, type=LAST_IN_PIPELINE terminating 2025-07-19 08:03:12,136 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755497_14673 replica FinalizedReplica, blk_1073755497_14673, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755497 for deletion 2025-07-19 08:03:12,137 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755497_14673 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755497 2025-07-19 08:04:09,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755498_14674 src: /192.168.158.1:45474 dest: /192.168.158.4:9866 2025-07-19 08:04:09,931 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45474, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_237349243_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755498_14674, duration(ns): 23482226 2025-07-19 08:04:09,931 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755498_14674, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-19 08:04:15,139 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755498_14674 replica FinalizedReplica, blk_1073755498_14674, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755498 for deletion 2025-07-19 08:04:15,140 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755498_14674 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755498 2025-07-19 08:06:09,905 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755500_14676 src: /192.168.158.8:34898 dest: /192.168.158.4:9866 2025-07-19 08:06:09,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1281227555_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755500_14676, duration(ns): 21119203 2025-07-19 08:06:09,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755500_14676, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-19 08:06:12,142 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755500_14676 replica FinalizedReplica, blk_1073755500_14676, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755500 for deletion 2025-07-19 08:06:12,143 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755500_14676 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755500 2025-07-19 08:08:09,908 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755502_14678 src: /192.168.158.1:55630 dest: /192.168.158.4:9866 2025-07-19 08:08:09,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55630, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_888186017_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755502_14678, duration(ns): 25203937 2025-07-19 08:08:09,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755502_14678, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-19 08:08:12,148 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755502_14678 replica FinalizedReplica, blk_1073755502_14678, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755502 for deletion 2025-07-19 08:08:12,149 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755502_14678 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755502 2025-07-19 08:10:09,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073755504_14680 src: /192.168.158.8:47930 dest: /192.168.158.4:9866
2025-07-19 08:10:09,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47930, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1669583142_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755504_14680, duration(ns): 16344755
2025-07-19 08:10:09,935 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755504_14680, type=LAST_IN_PIPELINE terminating
2025-07-19 08:10:12,156 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755504_14680 replica FinalizedReplica, blk_1073755504_14680, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755504 for deletion
2025-07-19 08:10:12,157 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755504_14680 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755504
2025-07-19 08:13:14,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755507_14683 src: /192.168.158.5:46768 dest: /192.168.158.4:9866
2025-07-19 08:13:14,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46768, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1381528293_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755507_14683, duration(ns): 21707202
2025-07-19 08:13:14,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755507_14683, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 08:13:21,163 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755507_14683 replica FinalizedReplica, blk_1073755507_14683, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755507 for deletion
2025-07-19 08:13:21,164 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755507_14683 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755507
2025-07-19 08:15:19,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755509_14685 src: /192.168.158.1:36024 dest: /192.168.158.4:9866
2025-07-19 08:15:19,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36024, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-469675719_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755509_14685, duration(ns): 25931881
2025-07-19 08:15:19,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755509_14685, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-19 08:15:21,168 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755509_14685 replica FinalizedReplica, blk_1073755509_14685, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755509 for deletion
2025-07-19 08:15:21,169 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755509_14685 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755509
2025-07-19 08:17:19,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755511_14687 src: /192.168.158.9:34234 dest: /192.168.158.4:9866
2025-07-19 08:17:19,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34234, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1170627660_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755511_14687, duration(ns): 16682105
2025-07-19 08:17:19,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755511_14687, type=LAST_IN_PIPELINE terminating
2025-07-19 08:17:21,172 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755511_14687 replica FinalizedReplica, blk_1073755511_14687, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755511 for deletion
2025-07-19 08:17:21,173 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755511_14687 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755511
2025-07-19 08:18:19,924 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755512_14688 src: /192.168.158.8:53780 dest: /192.168.158.4:9866
2025-07-19 08:18:19,953 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_816815866_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755512_14688, duration(ns): 22752679
2025-07-19 08:18:19,953 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755512_14688, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-19 08:18:21,176 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755512_14688 replica FinalizedReplica, blk_1073755512_14688, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755512 for deletion
2025-07-19 08:18:21,177 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755512_14688 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755512
2025-07-19 08:19:19,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755513_14689 src: /192.168.158.6:39192 dest: /192.168.158.4:9866
2025-07-19 08:19:19,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39192, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_768104890_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755513_14689, duration(ns): 16068382
2025-07-19 08:19:19,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755513_14689, type=LAST_IN_PIPELINE terminating
2025-07-19 08:19:21,177 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755513_14689 replica FinalizedReplica, blk_1073755513_14689, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755513 for deletion
2025-07-19 08:19:21,179 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755513_14689 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755513
2025-07-19 08:21:19,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755515_14691 src: /192.168.158.1:50878 dest: /192.168.158.4:9866
2025-07-19 08:21:19,964 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50878, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1840879122_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755515_14691, duration(ns): 27135899
2025-07-19 08:21:19,964 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755515_14691, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-19 08:21:24,178 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755515_14691 replica FinalizedReplica, blk_1073755515_14691, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755515 for deletion
2025-07-19 08:21:24,179 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755515_14691 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755515
2025-07-19 08:22:19,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755516_14692 src: /192.168.158.1:45500 dest: /192.168.158.4:9866
2025-07-19 08:22:19,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45500, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_26111929_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755516_14692, duration(ns): 23583037
2025-07-19 08:22:19,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755516_14692, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-19 08:22:21,181 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755516_14692 replica FinalizedReplica, blk_1073755516_14692, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755516 for deletion
2025-07-19 08:22:21,182 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755516_14692 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755516
2025-07-19 08:23:19,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755517_14693 src: /192.168.158.8:44186 dest: /192.168.158.4:9866
2025-07-19 08:23:19,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44186, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_88424779_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755517_14693, duration(ns): 16806523
2025-07-19 08:23:19,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755517_14693, type=LAST_IN_PIPELINE terminating
2025-07-19 08:23:21,185 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755517_14693 replica FinalizedReplica, blk_1073755517_14693, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755517 for deletion
2025-07-19 08:23:21,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755517_14693 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755517
2025-07-19 08:24:24,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755518_14694 src: /192.168.158.1:48014 dest: /192.168.158.4:9866
2025-07-19 08:24:24,965 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48014, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_305709223_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755518_14694, duration(ns): 24985530
2025-07-19 08:24:24,965 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755518_14694, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-19 08:24:27,187 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755518_14694 replica FinalizedReplica, blk_1073755518_14694, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755518 for deletion
2025-07-19 08:24:27,188 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755518_14694 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755518
2025-07-19 08:28:29,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755522_14698 src: /192.168.158.1:56216 dest: /192.168.158.4:9866
2025-07-19 08:28:29,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56216, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_3778471_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755522_14698, duration(ns): 24536755
2025-07-19 08:28:29,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755522_14698, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-19 08:28:36,198 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755522_14698 replica FinalizedReplica, blk_1073755522_14698, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755522 for deletion
2025-07-19 08:28:36,199 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755522_14698 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755522
2025-07-19 08:30:29,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755524_14700 src: /192.168.158.8:58712 dest: /192.168.158.4:9866
2025-07-19 08:30:29,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-605293171_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755524_14700, duration(ns): 16476615
2025-07-19 08:30:29,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755524_14700, type=LAST_IN_PIPELINE terminating
2025-07-19 08:30:33,199 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755524_14700 replica FinalizedReplica, blk_1073755524_14700, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755524 for deletion
2025-07-19 08:30:33,201 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755524_14700 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755524
2025-07-19 08:31:29,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755525_14701 src: /192.168.158.1:56612 dest: /192.168.158.4:9866
2025-07-19 08:31:29,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56612, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1787435255_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755525_14701, duration(ns): 23958187
2025-07-19 08:31:29,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755525_14701, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-19 08:31:36,202 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755525_14701 replica FinalizedReplica, blk_1073755525_14701, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755525 for deletion
2025-07-19 08:31:36,203 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755525_14701 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755525
2025-07-19 08:32:29,949 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755526_14702 src: /192.168.158.6:41594 dest: /192.168.158.4:9866
2025-07-19 08:32:29,975 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41594, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-13159770_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755526_14702, duration(ns): 20347745
2025-07-19 08:32:29,975 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755526_14702, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-19 08:32:33,204 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755526_14702 replica FinalizedReplica, blk_1073755526_14702, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755526 for deletion
2025-07-19 08:32:33,205 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755526_14702 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755526
2025-07-19 08:35:29,952 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755529_14705 src: /192.168.158.8:52202 dest: /192.168.158.4:9866
2025-07-19 08:35:29,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52202, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_492905580_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755529_14705, duration(ns): 17456265
2025-07-19 08:35:29,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755529_14705, type=LAST_IN_PIPELINE terminating
2025-07-19 08:35:36,207 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755529_14705 replica FinalizedReplica, blk_1073755529_14705, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755529 for deletion
2025-07-19 08:35:36,209 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755529_14705 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755529
2025-07-19 08:40:34,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755534_14710 src: /192.168.158.6:44466 dest: /192.168.158.4:9866
2025-07-19 08:40:34,981 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44466, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-197054354_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755534_14710, duration(ns): 16778873
2025-07-19 08:40:34,981 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755534_14710, type=LAST_IN_PIPELINE terminating
2025-07-19 08:40:36,213 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755534_14710 replica FinalizedReplica, blk_1073755534_14710, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755534 for deletion
2025-07-19 08:40:36,214 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755534_14710 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755534
2025-07-19 08:41:34,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755535_14711 src: /192.168.158.5:42182 dest: /192.168.158.4:9866
2025-07-19 08:41:34,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42182, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1014352389_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755535_14711, duration(ns): 19819901
2025-07-19 08:41:34,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755535_14711, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 08:41:36,216 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755535_14711 replica FinalizedReplica, blk_1073755535_14711, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755535 for deletion
2025-07-19 08:41:36,217 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755535_14711 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755535
2025-07-19 08:42:34,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755536_14712 src: /192.168.158.7:45460 dest: /192.168.158.4:9866
2025-07-19 08:42:34,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45460, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1343888735_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755536_14712, duration(ns): 22877842
2025-07-19 08:42:34,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755536_14712, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-19 08:42:36,220 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755536_14712 replica FinalizedReplica, blk_1073755536_14712, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755536 for deletion
2025-07-19 08:42:36,221 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755536_14712 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755536
2025-07-19 08:43:34,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755537_14713 src: /192.168.158.1:38134 dest: /192.168.158.4:9866
2025-07-19 08:43:34,993 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38134, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1111463995_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755537_14713, duration(ns): 22912301
2025-07-19 08:43:34,993 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755537_14713, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-19 08:43:36,223 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755537_14713 replica FinalizedReplica, blk_1073755537_14713, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755537 for deletion
2025-07-19 08:43:36,224 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755537_14713 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755537
2025-07-19 08:45:34,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755539_14715 src: /192.168.158.1:60868 dest: /192.168.158.4:9866
2025-07-19 08:45:35,008 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60868, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1380341094_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755539_14715, duration(ns): 27326007
2025-07-19 08:45:35,008 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755539_14715, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-19 08:45:36,224 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755539_14715 replica FinalizedReplica, blk_1073755539_14715, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755539 for deletion
2025-07-19 08:45:36,225 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755539_14715 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755539
2025-07-19 08:49:34,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755543_14719 src: /192.168.158.5:33582 dest: /192.168.158.4:9866
2025-07-19 08:49:35,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33582, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-260027011_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755543_14719, duration(ns): 17000823
2025-07-19 08:49:35,012 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755543_14719, type=LAST_IN_PIPELINE terminating
2025-07-19 08:49:39,229 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755543_14719 replica FinalizedReplica, blk_1073755543_14719, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755543 for deletion
2025-07-19 08:49:39,231 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755543_14719 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755543
2025-07-19 08:50:34,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755544_14720 src: /192.168.158.1:50550 dest: /192.168.158.4:9866
2025-07-19 08:50:35,018 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50550, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1443849104_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755544_14720, duration(ns): 22007197
2025-07-19 08:50:35,018 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755544_14720, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-19 08:50:36,232 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755544_14720 replica FinalizedReplica, blk_1073755544_14720, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755544 for deletion
2025-07-19 08:50:36,233 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755544_14720 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755544
2025-07-19 08:53:39,990 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755547_14723 src: /192.168.158.6:50576 dest: /192.168.158.4:9866
2025-07-19 08:53:40,017 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50576, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1156978932_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755547_14723, duration(ns): 21421880
2025-07-19 08:53:40,017 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755547_14723, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 08:53:42,237 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755547_14723 replica FinalizedReplica, blk_1073755547_14723, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755547 for deletion
2025-07-19 08:53:42,238 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755547_14723 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755547
2025-07-19 08:55:44,988 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755549_14725 src: /192.168.158.1:49140 dest: /192.168.158.4:9866
2025-07-19 08:55:45,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49140, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1710199387_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755549_14725, duration(ns): 25517681
2025-07-19 08:55:45,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755549_14725, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-19 08:55:51,241 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755549_14725 replica FinalizedReplica, blk_1073755549_14725, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755549 for deletion
2025-07-19 08:55:51,242 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755549_14725 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755549
2025-07-19 08:56:45,004 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755550_14726 src: /192.168.158.8:37640 dest: /192.168.158.4:9866
2025-07-19 08:56:45,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37640, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-576474396_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755550_14726, duration(ns): 18725136
2025-07-19 08:56:45,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755550_14726, type=LAST_IN_PIPELINE terminating
2025-07-19 08:56:48,243 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755550_14726 replica FinalizedReplica, blk_1073755550_14726, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755550 for deletion
2025-07-19 08:56:48,244 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755550_14726 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755550
2025-07-19 08:58:44,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755552_14728 src: /192.168.158.1:36404 dest: /192.168.158.4:9866
2025-07-19 08:58:45,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36404, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1655429848_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755552_14728, duration(ns): 25495079
2025-07-19 08:58:45,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755552_14728, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-19 08:58:48,249 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755552_14728 replica FinalizedReplica, blk_1073755552_14728, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755552 for deletion
2025-07-19 08:58:48,250 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755552_14728 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755552
2025-07-19 08:59:44,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755553_14729 src: /192.168.158.8:44002 dest: /192.168.158.4:9866
2025-07-19 08:59:45,032 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44002, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_503961057_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755553_14729, duration(ns): 26547259
2025-07-19 08:59:45,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755553_14729, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-19 08:59:48,253 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755553_14729 replica FinalizedReplica, blk_1073755553_14729, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755553 for deletion 2025-07-19 08:59:48,254 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755553_14729 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755553 2025-07-19 09:08:49,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755562_14738 src: /192.168.158.1:42558 dest: /192.168.158.4:9866 2025-07-19 09:08:50,027 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42558, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-586650914_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755562_14738, duration(ns): 24875670 2025-07-19 09:08:50,027 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755562_14738, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-19 09:08:54,272 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755562_14738 replica FinalizedReplica, blk_1073755562_14738, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755562 for deletion 2025-07-19 09:08:54,273 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755562_14738 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755562 2025-07-19 09:09:49,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755563_14739 src: /192.168.158.1:60768 dest: /192.168.158.4:9866 2025-07-19 09:09:50,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60768, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1493044302_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755563_14739, duration(ns): 22315724 2025-07-19 09:09:50,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755563_14739, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-19 09:09:51,272 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755563_14739 replica FinalizedReplica, blk_1073755563_14739, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755563 for deletion 2025-07-19 09:09:51,273 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755563_14739 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755563 2025-07-19 09:10:49,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755564_14740 src: /192.168.158.8:45688 dest: /192.168.158.4:9866 2025-07-19 09:10:50,026 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: 
src: /192.168.158.8:45688, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-718359520_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755564_14740, duration(ns): 21674371 2025-07-19 09:10:50,027 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755564_14740, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-19 09:10:51,274 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755564_14740 replica FinalizedReplica, blk_1073755564_14740, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755564 for deletion 2025-07-19 09:10:51,275 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755564_14740 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755564 2025-07-19 09:11:49,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755565_14741 src: /192.168.158.1:50240 dest: /192.168.158.4:9866 2025-07-19 09:11:50,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50240, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-401290134_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755565_14741, duration(ns): 24646533 2025-07-19 09:11:50,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755565_14741, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-19 09:11:51,274 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755565_14741 replica FinalizedReplica, blk_1073755565_14741, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755565 for deletion 2025-07-19 09:11:51,275 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755565_14741 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755565 2025-07-19 09:12:49,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755566_14742 src: /192.168.158.1:57896 dest: /192.168.158.4:9866 2025-07-19 09:12:50,034 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57896, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1846022805_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755566_14742, duration(ns): 25311843 2025-07-19 09:12:50,034 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755566_14742, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-19 09:12:51,277 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755566_14742 replica FinalizedReplica, blk_1073755566_14742, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755566 for deletion 2025-07-19 09:12:51,278 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755566_14742 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755566 2025-07-19 09:13:50,004 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755567_14743 src: /192.168.158.9:56986 dest: /192.168.158.4:9866 2025-07-19 09:13:50,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56986, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1518537684_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755567_14743, duration(ns): 18642772 2025-07-19 09:13:50,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755567_14743, type=LAST_IN_PIPELINE terminating 2025-07-19 09:13:54,279 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755567_14743 replica FinalizedReplica, blk_1073755567_14743, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755567 for deletion 2025-07-19 09:13:54,280 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755567_14743 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755567 2025-07-19 09:16:50,015 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755570_14746 src: /192.168.158.7:33000 dest: /192.168.158.4:9866 2025-07-19 09:16:50,034 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33000, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1813274673_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755570_14746, duration(ns): 17461057 2025-07-19 09:16:50,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755570_14746, type=LAST_IN_PIPELINE terminating 2025-07-19 09:16:51,287 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755570_14746 replica FinalizedReplica, blk_1073755570_14746, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755570 for deletion 2025-07-19 09:16:51,288 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755570_14746 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755570 2025-07-19 09:17:55,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755571_14747 src: /192.168.158.5:47728 dest: /192.168.158.4:9866 2025-07-19 09:17:55,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47728, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1771098326_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073755571_14747, duration(ns): 16545387 2025-07-19 09:17:55,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755571_14747, type=LAST_IN_PIPELINE terminating 2025-07-19 09:18:00,293 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755571_14747 replica FinalizedReplica, blk_1073755571_14747, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755571 for deletion 2025-07-19 09:18:00,294 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755571_14747 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755571 2025-07-19 09:19:55,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755573_14749 src: /192.168.158.7:39468 dest: /192.168.158.4:9866 2025-07-19 09:19:55,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39468, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-279978293_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755573_14749, duration(ns): 19691428 2025-07-19 09:19:55,037 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755573_14749, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-19 09:20:00,296 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755573_14749 replica 
FinalizedReplica, blk_1073755573_14749, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755573 for deletion 2025-07-19 09:20:00,298 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755573_14749 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755573 2025-07-19 09:20:55,014 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755574_14750 src: /192.168.158.9:57300 dest: /192.168.158.4:9866 2025-07-19 09:20:55,032 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57300, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-139956378_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755574_14750, duration(ns): 15740295 2025-07-19 09:20:55,032 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755574_14750, type=LAST_IN_PIPELINE terminating 2025-07-19 09:21:00,297 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755574_14750 replica FinalizedReplica, blk_1073755574_14750, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755574 for deletion 2025-07-19 09:21:00,298 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755574_14750 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755574 2025-07-19 09:23:55,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755577_14753 src: /192.168.158.5:50962 dest: /192.168.158.4:9866 2025-07-19 09:23:55,058 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50962, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-643978109_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755577_14753, duration(ns): 21382964 2025-07-19 09:23:55,058 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755577_14753, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-19 09:23:57,302 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755577_14753 replica FinalizedReplica, blk_1073755577_14753, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755577 for deletion 2025-07-19 09:23:57,303 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755577_14753 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755577 2025-07-19 09:26:55,034 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755580_14756 src: /192.168.158.1:45606 dest: /192.168.158.4:9866 2025-07-19 09:26:55,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:45606, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2027773991_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755580_14756, duration(ns): 24501029 2025-07-19 09:26:55,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755580_14756, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-19 09:26:57,312 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755580_14756 replica FinalizedReplica, blk_1073755580_14756, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755580 for deletion 2025-07-19 09:26:57,313 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755580_14756 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755580 2025-07-19 09:27:55,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755581_14757 src: /192.168.158.9:33812 dest: /192.168.158.4:9866 2025-07-19 09:27:55,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33812, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_543222734_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755581_14757, duration(ns): 19830966 2025-07-19 09:27:55,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073755581_14757, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-19 09:27:57,312 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755581_14757 replica FinalizedReplica, blk_1073755581_14757, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755581 for deletion 2025-07-19 09:27:57,314 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755581_14757 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755581 2025-07-19 09:30:55,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755584_14760 src: /192.168.158.5:54430 dest: /192.168.158.4:9866 2025-07-19 09:30:55,065 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54430, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1462231767_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755584_14760, duration(ns): 16967639 2025-07-19 09:30:55,065 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755584_14760, type=LAST_IN_PIPELINE terminating 2025-07-19 09:30:57,322 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755584_14760 replica FinalizedReplica, blk_1073755584_14760, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755584 for deletion 2025-07-19 09:30:57,323 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755584_14760 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755584 2025-07-19 09:31:55,047 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755585_14761 src: /192.168.158.1:41524 dest: /192.168.158.4:9866 2025-07-19 09:31:55,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41524, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1909337926_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755585_14761, duration(ns): 23356667 2025-07-19 09:31:55,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755585_14761, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-19 09:32:00,323 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755585_14761 replica FinalizedReplica, blk_1073755585_14761, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755585 for deletion 2025-07-19 09:32:00,324 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755585_14761 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755585 2025-07-19 09:32:55,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755586_14762 src: /192.168.158.1:50318 dest: /192.168.158.4:9866 2025-07-19 09:32:55,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50318, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-14208433_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755586_14762, duration(ns): 26025630 2025-07-19 09:32:55,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755586_14762, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-19 09:33:00,324 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755586_14762 replica FinalizedReplica, blk_1073755586_14762, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755586 for deletion 2025-07-19 09:33:00,326 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755586_14762 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755586 2025-07-19 09:33:55,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755587_14763 src: /192.168.158.5:59270 dest: /192.168.158.4:9866 2025-07-19 09:33:55,074 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.5:59270, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2141055610_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755587_14763, duration(ns): 18997546
2025-07-19 09:33:55,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755587_14763, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 09:33:57,326 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755587_14763 replica FinalizedReplica, blk_1073755587_14763, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755587 for deletion
2025-07-19 09:33:57,327 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755587_14763 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755587
2025-07-19 09:37:00,059 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755590_14766 src: /192.168.158.8:47686 dest: /192.168.158.4:9866
2025-07-19 09:37:00,081 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47686, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_277608582_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755590_14766, duration(ns): 20308593
2025-07-19 09:37:00,082 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755590_14766, type=LAST_IN_PIPELINE terminating
2025-07-19 09:37:03,332 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755590_14766 replica FinalizedReplica, blk_1073755590_14766, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755590 for deletion
2025-07-19 09:37:03,334 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755590_14766 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755590
2025-07-19 09:39:10,051 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755592_14768 src: /192.168.158.1:49086 dest: /192.168.158.4:9866
2025-07-19 09:39:10,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49086, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-678612519_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755592_14768, duration(ns): 23612331
2025-07-19 09:39:10,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755592_14768, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-19 09:39:15,336 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755592_14768 replica FinalizedReplica, blk_1073755592_14768, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755592 for deletion
2025-07-19 09:39:15,338 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755592_14768 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755592
2025-07-19 09:40:10,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755593_14769 src: /192.168.158.1:44866 dest: /192.168.158.4:9866
2025-07-19 09:40:10,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44866, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-671870041_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755593_14769, duration(ns): 27021431
2025-07-19 09:40:10,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755593_14769, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-19 09:40:12,339 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755593_14769 replica FinalizedReplica, blk_1073755593_14769, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755593 for deletion
2025-07-19 09:40:12,340 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755593_14769 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755593
2025-07-19 09:41:10,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755594_14770 src: /192.168.158.1:45448 dest: /192.168.158.4:9866
2025-07-19 09:41:10,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45448, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1829863765_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755594_14770, duration(ns): 24877197
2025-07-19 09:41:10,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755594_14770, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-19 09:41:12,343 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755594_14770 replica FinalizedReplica, blk_1073755594_14770, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755594 for deletion
2025-07-19 09:41:12,344 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755594_14770 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755594
2025-07-19 09:42:15,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755595_14771 src: /192.168.158.9:43974 dest: /192.168.158.4:9866
2025-07-19 09:42:15,079 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43974, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1970372811_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755595_14771, duration(ns): 20611727
2025-07-19 09:42:15,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755595_14771, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 09:42:21,345 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755595_14771 replica FinalizedReplica, blk_1073755595_14771, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755595 for deletion
2025-07-19 09:42:21,346 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755595_14771 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755595
2025-07-19 09:45:15,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755598_14774 src: /192.168.158.9:58788 dest: /192.168.158.4:9866
2025-07-19 09:45:15,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58788, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1665721124_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755598_14774, duration(ns): 21231509
2025-07-19 09:45:15,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755598_14774, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 09:45:21,355 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755598_14774 replica FinalizedReplica, blk_1073755598_14774, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755598 for deletion
2025-07-19 09:45:21,357 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755598_14774 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755598
2025-07-19 09:46:15,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755599_14775 src: /192.168.158.5:54610 dest: /192.168.158.4:9866
2025-07-19 09:46:15,086 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54610, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1915709727_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755599_14775, duration(ns): 18123328
2025-07-19 09:46:15,087 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755599_14775, type=LAST_IN_PIPELINE terminating
2025-07-19 09:46:18,358 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755599_14775 replica FinalizedReplica, blk_1073755599_14775, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755599 for deletion
2025-07-19 09:46:18,359 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755599_14775 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755599
2025-07-19 09:49:25,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755602_14778 src: /192.168.158.7:47844 dest: /192.168.158.4:9866
2025-07-19 09:49:25,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47844, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1674766070_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755602_14778, duration(ns): 16102037
2025-07-19 09:49:25,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755602_14778, type=LAST_IN_PIPELINE terminating
2025-07-19 09:49:30,364 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755602_14778 replica FinalizedReplica, blk_1073755602_14778, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755602 for deletion
2025-07-19 09:49:30,365 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755602_14778 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755602
2025-07-19 09:54:35,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755607_14783 src: /192.168.158.1:54982 dest: /192.168.158.4:9866
2025-07-19 09:54:35,101 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54982, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_811998445_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755607_14783, duration(ns): 25711184
2025-07-19 09:54:35,102 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755607_14783, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-19 09:54:39,373 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755607_14783 replica FinalizedReplica, blk_1073755607_14783, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755607 for deletion
2025-07-19 09:54:39,374 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755607_14783 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755607
2025-07-19 09:56:40,074 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755609_14785 src: /192.168.158.9:53444 dest: /192.168.158.4:9866
2025-07-19 09:56:40,100 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53444, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1307223804_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755609_14785, duration(ns): 20063977
2025-07-19 09:56:40,100 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755609_14785, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-19 09:56:42,376 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755609_14785 replica FinalizedReplica, blk_1073755609_14785, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755609 for deletion
2025-07-19 09:56:42,377 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755609_14785 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755609
2025-07-19 09:59:18,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f4d, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 5 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-19 09:59:18,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-19 09:59:45,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755612_14788 src: /192.168.158.8:42950 dest: /192.168.158.4:9866
2025-07-19 09:59:45,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42950, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1337245113_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755612_14788, duration(ns): 20687744
2025-07-19 09:59:45,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755612_14788, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-19 09:59:48,381 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755612_14788 replica FinalizedReplica, blk_1073755612_14788, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755612 for deletion
2025-07-19 09:59:48,382 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755612_14788 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755612
2025-07-19 10:03:50,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755616_14792 src: /192.168.158.5:55252 dest: /192.168.158.4:9866
2025-07-19 10:03:50,108 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55252, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1413611235_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755616_14792, duration(ns): 20644173
2025-07-19 10:03:50,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755616_14792, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 10:03:51,389 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755616_14792 replica FinalizedReplica, blk_1073755616_14792, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755616 for deletion
2025-07-19 10:03:51,390 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755616_14792 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755616
2025-07-19 10:05:55,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755618_14794 src: /192.168.158.6:37350 dest: /192.168.158.4:9866
2025-07-19 10:05:55,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37350, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1011980075_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755618_14794, duration(ns): 20474965
2025-07-19 10:05:55,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755618_14794, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 10:06:00,390 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755618_14794 replica FinalizedReplica, blk_1073755618_14794, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755618 for deletion
2025-07-19 10:06:00,391 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755618_14794 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755618
2025-07-19 10:08:55,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755621_14797 src: /192.168.158.8:43256 dest: /192.168.158.4:9866
2025-07-19 10:08:55,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43256, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2100917036_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755621_14797, duration(ns): 19269459
2025-07-19 10:08:55,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755621_14797, type=LAST_IN_PIPELINE terminating
2025-07-19 10:09:00,397 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755621_14797 replica FinalizedReplica, blk_1073755621_14797, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755621 for deletion
2025-07-19 10:09:00,398 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755621_14797 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755621
2025-07-19 10:10:55,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755623_14799 src: /192.168.158.9:33608 dest: /192.168.158.4:9866
2025-07-19 10:10:55,112 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33608, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1447129959_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755623_14799, duration(ns): 15485386
2025-07-19 10:10:55,112 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755623_14799, type=LAST_IN_PIPELINE terminating
2025-07-19 10:11:00,401 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755623_14799 replica FinalizedReplica, blk_1073755623_14799, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755623 for deletion
2025-07-19 10:11:00,402 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755623_14799 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755623
2025-07-19 10:12:00,095 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755624_14800 src: /192.168.158.8:56740 dest: /192.168.158.4:9866
2025-07-19 10:12:00,123 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56740, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_645484720_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755624_14800, duration(ns): 22349378
2025-07-19 10:12:00,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755624_14800, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-19 10:12:03,403 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755624_14800 replica FinalizedReplica, blk_1073755624_14800, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755624 for deletion
2025-07-19 10:12:03,404 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755624_14800 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755624
2025-07-19 10:13:00,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755625_14801 src: /192.168.158.5:56524 dest: /192.168.158.4:9866
2025-07-19 10:13:00,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56524, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1727575736_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755625_14801, duration(ns): 16993912
2025-07-19 10:13:00,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755625_14801, type=LAST_IN_PIPELINE terminating
2025-07-19 10:13:03,407 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755625_14801 replica FinalizedReplica, blk_1073755625_14801, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755625 for deletion
2025-07-19 10:13:03,408 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755625_14801 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755625
2025-07-19 10:15:00,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755627_14803 src: /192.168.158.1:58074 dest: /192.168.158.4:9866
2025-07-19 10:15:00,123 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58074, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1799352357_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755627_14803, duration(ns): 22616338
2025-07-19 10:15:00,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755627_14803, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-19 10:15:06,409 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755627_14803 replica FinalizedReplica, blk_1073755627_14803, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755627 for deletion
2025-07-19 10:15:06,410 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755627_14803 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755627
2025-07-19 10:17:00,101 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755629_14805 src: /192.168.158.5:34314 dest: /192.168.158.4:9866
2025-07-19 10:17:00,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34314, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1928774398_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755629_14805, duration(ns): 20498728
2025-07-19 10:17:00,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755629_14805, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 10:17:06,411 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755629_14805 replica FinalizedReplica, blk_1073755629_14805, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755629 for deletion
2025-07-19 10:17:06,412 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755629_14805 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755629
2025-07-19 10:19:00,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755631_14807 src: /192.168.158.7:48404 dest: /192.168.158.4:9866
2025-07-19 10:19:00,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48404, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-355737374_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755631_14807, duration(ns): 21197567
2025-07-19 10:19:00,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755631_14807, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 10:19:03,415 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755631_14807 replica FinalizedReplica, blk_1073755631_14807, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755631 for deletion
2025-07-19 10:19:03,416 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755631_14807 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755631
2025-07-19 10:28:05,117 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755640_14816 src: /192.168.158.1:46744 dest: /192.168.158.4:9866
2025-07-19 10:28:05,149 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46744, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1349473880_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755640_14816, duration(ns): 23568058
2025-07-19 10:28:05,149 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755640_14816, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-19 10:28:09,428 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755640_14816 replica FinalizedReplica, blk_1073755640_14816, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755640 for deletion
2025-07-19 10:28:09,429 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755640_14816 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755640
2025-07-19 10:29:10,140 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755641_14817 src: /192.168.158.1:53256 dest: /192.168.158.4:9866
2025-07-19 10:29:10,173 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53256, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2093465483_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755641_14817, duration(ns): 23918154
2025-07-19 10:29:10,173 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755641_14817, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-19 10:29:12,428 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755641_14817 replica FinalizedReplica, blk_1073755641_14817, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755641 for deletion
2025-07-19 10:29:12,429 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755641_14817 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755641
2025-07-19 10:30:10,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755642_14818 src: /192.168.158.5:33214 dest: /192.168.158.4:9866
2025-07-19 10:30:10,145 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33214, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-127499686_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755642_14818, duration(ns): 17203682
2025-07-19 10:30:10,145 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755642_14818, type=LAST_IN_PIPELINE terminating
2025-07-19 10:30:12,431 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755642_14818 replica FinalizedReplica, blk_1073755642_14818, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755642 for deletion
2025-07-19 10:30:12,432 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755642_14818 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755642
2025-07-19 10:32:15,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755644_14820 src: /192.168.158.9:34130 dest: /192.168.158.4:9866
2025-07-19 10:32:15,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34130, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_16411519_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755644_14820, duration(ns): 18036762
2025-07-19 10:32:15,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755644_14820, type=LAST_IN_PIPELINE terminating
2025-07-19 10:32:15,433 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755644_14820 replica FinalizedReplica, blk_1073755644_14820, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755644 for deletion
2025-07-19 10:32:15,434 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755644_14820 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755644
2025-07-19 10:34:15,128 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755646_14822 src: /192.168.158.6:36996 dest: /192.168.158.4:9866
2025-07-19 10:34:15,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36996, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1240079459_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755646_14822, duration(ns): 20774621
2025-07-19 10:34:15,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755646_14822, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 10:34:15,436 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755646_14822 replica FinalizedReplica, blk_1073755646_14822, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755646 for deletion
2025-07-19 10:34:15,437 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755646_14822 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir21/blk_1073755646
2025-07-19 10:36:20,129 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755648_14824 src: /192.168.158.9:50390 dest: /192.168.158.4:9866
2025-07-19 10:36:20,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: 
src: /192.168.158.9:50390, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_677460422_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755648_14824, duration(ns): 21754788 2025-07-19 10:36:20,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755648_14824, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-19 10:36:21,441 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755648_14824 replica FinalizedReplica, blk_1073755648_14824, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755648 for deletion 2025-07-19 10:36:21,442 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755648_14824 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755648 2025-07-19 10:37:20,132 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755649_14825 src: /192.168.158.1:59772 dest: /192.168.158.4:9866 2025-07-19 10:37:20,163 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59772, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-775117710_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755649_14825, duration(ns): 21995620 2025-07-19 10:37:20,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755649_14825, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-19 10:37:24,443 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755649_14825 replica FinalizedReplica, blk_1073755649_14825, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755649 for deletion 2025-07-19 10:37:24,444 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755649_14825 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755649 2025-07-19 10:39:20,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755651_14827 src: /192.168.158.7:60728 dest: /192.168.158.4:9866 2025-07-19 10:39:20,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60728, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1900616322_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755651_14827, duration(ns): 15801628 2025-07-19 10:39:20,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755651_14827, type=LAST_IN_PIPELINE terminating 2025-07-19 10:39:21,447 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755651_14827 replica FinalizedReplica, blk_1073755651_14827, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755651 for deletion 2025-07-19 10:39:21,448 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755651_14827 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755651 2025-07-19 10:40:20,140 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755652_14828 src: /192.168.158.8:37512 dest: /192.168.158.4:9866 2025-07-19 10:40:20,166 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37512, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-390997342_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755652_14828, duration(ns): 20496123 2025-07-19 10:40:20,166 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755652_14828, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-19 10:40:24,449 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755652_14828 replica FinalizedReplica, blk_1073755652_14828, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755652 for deletion 2025-07-19 10:40:24,450 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755652_14828 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755652 2025-07-19 10:42:25,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755654_14830 src: /192.168.158.9:32932 dest: /192.168.158.4:9866 2025-07-19 10:42:25,168 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:32932, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1849910529_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755654_14830, duration(ns): 19876091 2025-07-19 10:42:25,168 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755654_14830, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-19 10:42:30,454 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755654_14830 replica FinalizedReplica, blk_1073755654_14830, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755654 for deletion 2025-07-19 10:42:30,455 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755654_14830 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755654 2025-07-19 10:45:30,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755657_14833 src: /192.168.158.9:43404 dest: /192.168.158.4:9866 2025-07-19 10:45:30,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.9:43404, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-120393166_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755657_14833, duration(ns): 16899131 2025-07-19 10:45:30,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755657_14833, type=LAST_IN_PIPELINE terminating 2025-07-19 10:45:30,461 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755657_14833 replica FinalizedReplica, blk_1073755657_14833, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755657 for deletion 2025-07-19 10:45:30,462 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755657_14833 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755657 2025-07-19 10:52:30,153 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755664_14840 src: /192.168.158.8:47032 dest: /192.168.158.4:9866 2025-07-19 10:52:30,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47032, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-935420882_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755664_14840, duration(ns): 22404434 2025-07-19 10:52:30,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755664_14840, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=1:[192.168.158.9:9866] terminating 2025-07-19 10:52:30,471 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755664_14840 replica FinalizedReplica, blk_1073755664_14840, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755664 for deletion 2025-07-19 10:52:30,472 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755664_14840 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755664 2025-07-19 10:53:30,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755665_14841 src: /192.168.158.1:34516 dest: /192.168.158.4:9866 2025-07-19 10:53:30,183 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34516, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_585861311_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755665_14841, duration(ns): 22763741 2025-07-19 10:53:30,183 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755665_14841, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-19 10:53:30,474 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755665_14841 replica FinalizedReplica, blk_1073755665_14841, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755665 for deletion 2025-07-19 10:53:30,475 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755665_14841 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755665 2025-07-19 10:55:30,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755667_14843 src: /192.168.158.5:48934 dest: /192.168.158.4:9866 2025-07-19 10:55:30,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48934, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1436904786_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755667_14843, duration(ns): 17781764 2025-07-19 10:55:30,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755667_14843, type=LAST_IN_PIPELINE terminating 2025-07-19 10:55:30,479 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755667_14843 replica FinalizedReplica, blk_1073755667_14843, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755667 for deletion 2025-07-19 10:55:30,480 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755667_14843 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755667 2025-07-19 10:56:30,160 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755668_14844 src: /192.168.158.1:34074 dest: /192.168.158.4:9866 2025-07-19 10:56:30,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34074, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1370215363_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755668_14844, duration(ns): 24482445 2025-07-19 10:56:30,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755668_14844, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-19 10:56:30,481 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755668_14844 replica FinalizedReplica, blk_1073755668_14844, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755668 for deletion 2025-07-19 10:56:30,482 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755668_14844 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755668 2025-07-19 10:57:30,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755669_14845 src: /192.168.158.1:59458 dest: /192.168.158.4:9866 2025-07-19 10:57:30,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59458, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_358132613_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755669_14845, duration(ns): 24217359 2025-07-19 10:57:30,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755669_14845, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-19 10:57:33,482 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755669_14845 replica FinalizedReplica, blk_1073755669_14845, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755669 for deletion 2025-07-19 10:57:33,483 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755669_14845 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755669 2025-07-19 11:00:35,173 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755672_14848 src: /192.168.158.7:56452 dest: /192.168.158.4:9866 2025-07-19 11:00:35,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56452, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1639870499_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755672_14848, duration(ns): 21859419 2025-07-19 11:00:35,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755672_14848, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-19 11:00:36,488 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755672_14848 replica FinalizedReplica, blk_1073755672_14848, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755672 for deletion 2025-07-19 11:00:36,490 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755672_14848 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755672 2025-07-19 11:02:35,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755674_14850 src: /192.168.158.7:37006 dest: /192.168.158.4:9866 2025-07-19 11:02:35,192 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37006, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2416152_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755674_14850, duration(ns): 16609187 2025-07-19 11:02:35,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755674_14850, type=LAST_IN_PIPELINE terminating 2025-07-19 11:02:39,493 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755674_14850 replica FinalizedReplica, blk_1073755674_14850, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755674 for deletion 2025-07-19 11:02:39,494 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755674_14850 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755674 2025-07-19 11:03:40,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755675_14851 src: /192.168.158.8:46450 dest: /192.168.158.4:9866 2025-07-19 11:03:40,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46450, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1192524895_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755675_14851, duration(ns): 21227347 2025-07-19 11:03:40,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755675_14851, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-19 11:03:42,494 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755675_14851 replica FinalizedReplica, blk_1073755675_14851, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755675 for deletion 2025-07-19 11:03:42,495 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755675_14851 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755675 2025-07-19 11:04:40,175 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755676_14852 src: 
/192.168.158.7:50978 dest: /192.168.158.4:9866 2025-07-19 11:04:40,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50978, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1531617275_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755676_14852, duration(ns): 20061492 2025-07-19 11:04:40,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755676_14852, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-19 11:04:42,497 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755676_14852 replica FinalizedReplica, blk_1073755676_14852, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755676 for deletion 2025-07-19 11:04:42,497 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755676_14852 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755676 2025-07-19 11:05:40,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755677_14853 src: /192.168.158.8:41930 dest: /192.168.158.4:9866 2025-07-19 11:05:40,203 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41930, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1347942520_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755677_14853, duration(ns): 19908048 2025-07-19 
11:05:40,203 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755677_14853, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-19 11:05:45,499 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755677_14853 replica FinalizedReplica, blk_1073755677_14853, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755677 for deletion 2025-07-19 11:05:45,501 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755677_14853 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755677 2025-07-19 11:06:40,176 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755678_14854 src: /192.168.158.1:40814 dest: /192.168.158.4:9866 2025-07-19 11:06:40,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40814, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1008111288_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755678_14854, duration(ns): 24559361 2025-07-19 11:06:40,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755678_14854, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-19 11:06:42,502 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755678_14854 replica FinalizedReplica, blk_1073755678_14854, 
FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755678 for deletion
2025-07-19 11:06:42,504 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755678_14854 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755678
2025-07-19 11:07:40,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755679_14855 src: /192.168.158.1:37308 dest: /192.168.158.4:9866
2025-07-19 11:07:40,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37308, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1408025080_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755679_14855, duration(ns): 27899430
2025-07-19 11:07:40,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755679_14855, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-19 11:07:45,508 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755679_14855 replica FinalizedReplica, blk_1073755679_14855, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755679 for deletion
2025-07-19 11:07:45,509 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755679_14855 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755679
2025-07-19 11:08:40,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755680_14856 src: /192.168.158.5:54338 dest: /192.168.158.4:9866
2025-07-19 11:08:40,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54338, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1349492021_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755680_14856, duration(ns): 21738139
2025-07-19 11:08:40,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755680_14856, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 11:08:42,509 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755680_14856 replica FinalizedReplica, blk_1073755680_14856, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755680 for deletion
2025-07-19 11:08:42,510 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755680_14856 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755680
2025-07-19 11:09:40,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755681_14857 src: /192.168.158.5:56446 dest: /192.168.158.4:9866
2025-07-19 11:09:40,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56446, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-379870621_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755681_14857, duration(ns): 21815470
2025-07-19 11:09:40,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755681_14857, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 11:09:42,512 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755681_14857 replica FinalizedReplica, blk_1073755681_14857, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755681 for deletion
2025-07-19 11:09:42,513 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755681_14857 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755681
2025-07-19 11:10:45,188 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755682_14858 src: /192.168.158.7:34860 dest: /192.168.158.4:9866
2025-07-19 11:10:45,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34860, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_673976865_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755682_14858, duration(ns): 19196679
2025-07-19 11:10:45,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755682_14858, type=LAST_IN_PIPELINE terminating
2025-07-19 11:10:48,512 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755682_14858 replica FinalizedReplica, blk_1073755682_14858, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755682 for deletion
2025-07-19 11:10:48,513 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755682_14858 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755682
2025-07-19 11:12:50,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755684_14860 src: /192.168.158.1:39726 dest: /192.168.158.4:9866
2025-07-19 11:12:50,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39726, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1463226690_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755684_14860, duration(ns): 22756515
2025-07-19 11:12:50,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755684_14860, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-19 11:12:51,517 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755684_14860 replica FinalizedReplica, blk_1073755684_14860, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755684 for deletion
2025-07-19 11:12:51,518 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755684_14860 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755684
2025-07-19 11:14:55,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755686_14862 src: /192.168.158.7:52124 dest: /192.168.158.4:9866
2025-07-19 11:14:55,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52124, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-795242731_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755686_14862, duration(ns): 19397556
2025-07-19 11:14:55,215 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755686_14862, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-19 11:14:57,520 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755686_14862 replica FinalizedReplica, blk_1073755686_14862, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755686 for deletion
2025-07-19 11:14:57,521 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755686_14862 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755686
2025-07-19 11:15:55,191 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755687_14863 src: /192.168.158.6:52782 dest: /192.168.158.4:9866
2025-07-19 11:15:55,216 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52782, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-785962487_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755687_14863, duration(ns): 19248407
2025-07-19 11:15:55,217 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755687_14863, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 11:15:57,522 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755687_14863 replica FinalizedReplica, blk_1073755687_14863, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755687 for deletion
2025-07-19 11:15:57,523 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755687_14863 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755687
2025-07-19 11:16:55,192 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755688_14864 src: /192.168.158.7:50824 dest: /192.168.158.4:9866
2025-07-19 11:16:55,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50824, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-291040600_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755688_14864, duration(ns): 16543622
2025-07-19 11:16:55,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755688_14864, type=LAST_IN_PIPELINE terminating
2025-07-19 11:16:57,523 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755688_14864 replica FinalizedReplica, blk_1073755688_14864, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755688 for deletion
2025-07-19 11:16:57,525 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755688_14864 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755688
2025-07-19 11:19:55,191 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755691_14867 src: /192.168.158.7:55346 dest: /192.168.158.4:9866
2025-07-19 11:19:55,216 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55346, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_126021651_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755691_14867, duration(ns): 19866277
2025-07-19 11:19:55,216 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755691_14867, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 11:19:57,527 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755691_14867 replica FinalizedReplica, blk_1073755691_14867, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755691 for deletion
2025-07-19 11:19:57,528 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755691_14867 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755691
2025-07-19 11:25:10,206 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755696_14872 src: /192.168.158.1:44448 dest: /192.168.158.4:9866
2025-07-19 11:25:10,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44448, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1084411563_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755696_14872, duration(ns): 23465727
2025-07-19 11:25:10,240 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755696_14872, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-19 11:25:12,535 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755696_14872 replica FinalizedReplica, blk_1073755696_14872, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755696 for deletion
2025-07-19 11:25:12,536 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755696_14872 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755696
2025-07-19 11:26:15,221 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755697_14873 src: /192.168.158.9:52446 dest: /192.168.158.4:9866
2025-07-19 11:26:15,241 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52446, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1937438923_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755697_14873, duration(ns): 18326875
2025-07-19 11:26:15,241 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755697_14873, type=LAST_IN_PIPELINE terminating
2025-07-19 11:26:15,537 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755697_14873 replica FinalizedReplica, blk_1073755697_14873, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755697 for deletion
2025-07-19 11:26:15,538 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755697_14873 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755697
2025-07-19 11:30:15,232 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755701_14877 src: /192.168.158.9:48696 dest: /192.168.158.4:9866
2025-07-19 11:30:15,259 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48696, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_853168818_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755701_14877, duration(ns): 21580140
2025-07-19 11:30:15,259 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755701_14877, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 11:30:18,548 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755701_14877 replica FinalizedReplica, blk_1073755701_14877, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755701 for deletion
2025-07-19 11:30:18,549 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755701_14877 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755701
2025-07-19 11:33:15,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755704_14880 src: /192.168.158.9:56384 dest: /192.168.158.4:9866
2025-07-19 11:33:15,253 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56384, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_50471482_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755704_14880, duration(ns): 20252373
2025-07-19 11:33:15,253 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755704_14880, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-19 11:33:18,554 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755704_14880 replica FinalizedReplica, blk_1073755704_14880, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755704 for deletion
2025-07-19 11:33:18,556 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755704_14880 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755704
2025-07-19 11:34:20,224 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755705_14881 src: /192.168.158.8:56044 dest: /192.168.158.4:9866
2025-07-19 11:34:20,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56044, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1728581457_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755705_14881, duration(ns): 20519246
2025-07-19 11:34:20,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755705_14881, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 11:34:24,558 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755705_14881 replica FinalizedReplica, blk_1073755705_14881, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755705 for deletion
2025-07-19 11:34:24,559 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755705_14881 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755705
2025-07-19 11:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-19 11:38:25,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755709_14885 src: /192.168.158.6:48756 dest: /192.168.158.4:9866
2025-07-19 11:38:25,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48756, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_432223745_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755709_14885, duration(ns): 21939981
2025-07-19 11:38:25,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755709_14885, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-19 11:38:30,564 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755709_14885 replica FinalizedReplica, blk_1073755709_14885, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755709 for deletion
2025-07-19 11:38:30,565 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755709_14885 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755709
2025-07-19 11:39:25,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755710_14886 src: /192.168.158.6:43514 dest: /192.168.158.4:9866
2025-07-19 11:39:25,248 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43514, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_267655924_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755710_14886, duration(ns): 18693339
2025-07-19 11:39:25,249 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755710_14886, type=LAST_IN_PIPELINE terminating
2025-07-19 11:39:27,565 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755710_14886 replica FinalizedReplica, blk_1073755710_14886, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755710 for deletion
2025-07-19 11:39:27,566 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755710_14886 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755710
2025-07-19 11:40:30,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755711_14887 src: /192.168.158.7:38102 dest: /192.168.158.4:9866
2025-07-19 11:40:30,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38102, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1299296709_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755711_14887, duration(ns): 17858557
2025-07-19 11:40:30,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755711_14887, type=LAST_IN_PIPELINE terminating
2025-07-19 11:40:30,567 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755711_14887 replica FinalizedReplica, blk_1073755711_14887, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755711 for deletion
2025-07-19 11:40:30,568 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755711_14887 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755711
2025-07-19 11:41:30,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755712_14888 src: /192.168.158.6:46292 dest: /192.168.158.4:9866
2025-07-19 11:41:30,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46292, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1571188249_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755712_14888, duration(ns): 16660635
2025-07-19 11:41:30,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755712_14888, type=LAST_IN_PIPELINE terminating
2025-07-19 11:41:36,570 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755712_14888 replica FinalizedReplica, blk_1073755712_14888, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755712 for deletion
2025-07-19 11:41:36,571 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755712_14888 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755712
2025-07-19 11:43:35,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755714_14890 src: /192.168.158.8:35128 dest: /192.168.158.4:9866
2025-07-19 11:43:35,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35128, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1694485839_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755714_14890, duration(ns): 18985412
2025-07-19 11:43:35,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755714_14890, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 11:43:39,575 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755714_14890 replica FinalizedReplica, blk_1073755714_14890, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755714 for deletion
2025-07-19 11:43:39,576 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755714_14890 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755714
2025-07-19 11:45:35,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755716_14892 src: /192.168.158.6:37760 dest: /192.168.158.4:9866
2025-07-19 11:45:35,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37760, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1822271152_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755716_14892, duration(ns): 20057005
2025-07-19 11:45:35,265 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755716_14892, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 11:45:42,579 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755716_14892 replica FinalizedReplica, blk_1073755716_14892, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755716 for deletion
2025-07-19 11:45:42,580 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755716_14892 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755716
2025-07-19 11:47:35,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755718_14894 src: /192.168.158.5:44068 dest: /192.168.158.4:9866
2025-07-19 11:47:35,259 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44068, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1278245440_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755718_14894, duration(ns): 16638755
2025-07-19 11:47:35,259 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755718_14894, type=LAST_IN_PIPELINE terminating
2025-07-19 11:47:39,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755718_14894 replica FinalizedReplica, blk_1073755718_14894, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755718 for deletion
2025-07-19 11:47:39,584 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755718_14894 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755718
2025-07-19 11:48:35,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755719_14895 src: /192.168.158.8:54002 dest: /192.168.158.4:9866
2025-07-19 11:48:35,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54002, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1678547944_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755719_14895, duration(ns): 18028900
2025-07-19 11:48:35,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755719_14895, type=LAST_IN_PIPELINE terminating
2025-07-19 11:48:39,584 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755719_14895 replica FinalizedReplica, blk_1073755719_14895, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755719 for deletion
2025-07-19 11:48:39,586 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755719_14895 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755719
2025-07-19 11:50:35,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755721_14897 src: /192.168.158.6:51638 dest: /192.168.158.4:9866
2025-07-19 11:50:35,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51638, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1248410657_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755721_14897, duration(ns): 20015750
2025-07-19 11:50:35,263 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755721_14897, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 11:50:39,589 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755721_14897 replica FinalizedReplica, blk_1073755721_14897, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755721 for deletion
2025-07-19 11:50:39,590 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755721_14897 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755721
2025-07-19 11:51:40,248 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755722_14898 src: /192.168.158.1:38506 dest: /192.168.158.4:9866
2025-07-19 11:51:40,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38506, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-449765731_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755722_14898, duration(ns): 25248241
2025-07-19 11:51:40,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755722_14898, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-19 11:51:45,591 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755722_14898 replica FinalizedReplica, blk_1073755722_14898, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755722 for deletion
2025-07-19 11:51:45,592 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755722_14898 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755722
2025-07-19 11:52:40,301 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755723_14899 src: /192.168.158.9:54890 dest: /192.168.158.4:9866
2025-07-19 11:52:40,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54890, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-668481872_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755723_14899, duration(ns): 21235512
2025-07-19 11:52:40,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755723_14899, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 11:52:45,595 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755723_14899 replica FinalizedReplica, blk_1073755723_14899, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755723 for deletion
2025-07-19 11:52:45,596 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755723_14899 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755723
2025-07-19 11:54:40,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755725_14901 src: /192.168.158.7:41966 dest: /192.168.158.4:9866
2025-07-19 11:54:40,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41966, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1083095869_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755725_14901, duration(ns): 16758840
2025-07-19 11:54:40,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755725_14901, type=LAST_IN_PIPELINE terminating
2025-07-19 11:54:48,599 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755725_14901 replica FinalizedReplica, blk_1073755725_14901, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755725 for deletion
2025-07-19 11:54:48,600 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755725_14901 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755725
2025-07-19 11:55:40,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755726_14902 src: /192.168.158.7:51926 dest: /192.168.158.4:9866
2025-07-19 11:55:40,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51926, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_163257101_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755726_14902, duration(ns): 19491863 2025-07-19 11:55:40,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755726_14902, type=LAST_IN_PIPELINE terminating 2025-07-19 11:55:45,601 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755726_14902 replica FinalizedReplica, blk_1073755726_14902, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755726 for deletion 2025-07-19 11:55:45,602 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755726_14902 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755726 2025-07-19 11:56:40,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755727_14903 src: /192.168.158.8:44366 dest: /192.168.158.4:9866 2025-07-19 11:56:40,270 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44366, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_307826956_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755727_14903, duration(ns): 17377640 2025-07-19 11:56:40,270 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755727_14903, type=LAST_IN_PIPELINE terminating 2025-07-19 11:56:48,600 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755727_14903 replica FinalizedReplica, blk_1073755727_14903, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755727 for deletion 2025-07-19 11:56:48,602 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755727_14903 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755727 2025-07-19 12:02:50,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755733_14909 src: /192.168.158.1:35592 dest: /192.168.158.4:9866 2025-07-19 12:02:50,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35592, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_863076899_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755733_14909, duration(ns): 25333753 2025-07-19 12:02:50,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755733_14909, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-19 12:02:54,609 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755733_14909 replica FinalizedReplica, blk_1073755733_14909, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755733 for deletion 
2025-07-19 12:02:54,610 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755733_14909 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755733 2025-07-19 12:03:55,268 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755734_14910 src: /192.168.158.1:55500 dest: /192.168.158.4:9866 2025-07-19 12:03:55,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55500, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1004259650_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755734_14910, duration(ns): 25770954 2025-07-19 12:03:55,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755734_14910, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-19 12:04:03,610 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755734_14910 replica FinalizedReplica, blk_1073755734_14910, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755734 for deletion 2025-07-19 12:04:03,612 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755734_14910 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755734 2025-07-19 12:06:00,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073755736_14912 src: /192.168.158.9:43532 dest: /192.168.158.4:9866 2025-07-19 12:06:00,276 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43532, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_762667528_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755736_14912, duration(ns): 16861835 2025-07-19 12:06:00,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755736_14912, type=LAST_IN_PIPELINE terminating 2025-07-19 12:06:03,620 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755736_14912 replica FinalizedReplica, blk_1073755736_14912, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755736 for deletion 2025-07-19 12:06:03,621 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755736_14912 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755736 2025-07-19 12:07:00,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755737_14913 src: /192.168.158.9:55176 dest: /192.168.158.4:9866 2025-07-19 12:07:00,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55176, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-804286891_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755737_14913, duration(ns): 22182147 
2025-07-19 12:07:00,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755737_14913, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-19 12:07:06,624 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755737_14913 replica FinalizedReplica, blk_1073755737_14913, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755737 for deletion 2025-07-19 12:07:06,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755737_14913 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755737 2025-07-19 12:10:00,272 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755740_14916 src: /192.168.158.6:59600 dest: /192.168.158.4:9866 2025-07-19 12:10:00,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59600, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1836998667_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755740_14916, duration(ns): 16692831 2025-07-19 12:10:00,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755740_14916, type=LAST_IN_PIPELINE terminating 2025-07-19 12:10:03,629 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755740_14916 replica FinalizedReplica, blk_1073755740_14916, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755740 for deletion 2025-07-19 12:10:03,630 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755740_14916 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755740 2025-07-19 12:11:00,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755741_14917 src: /192.168.158.8:56880 dest: /192.168.158.4:9866 2025-07-19 12:11:00,286 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56880, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2072388009_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755741_14917, duration(ns): 17582977 2025-07-19 12:11:00,286 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755741_14917, type=LAST_IN_PIPELINE terminating 2025-07-19 12:11:03,630 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755741_14917 replica FinalizedReplica, blk_1073755741_14917, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755741 for deletion 2025-07-19 12:11:03,631 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755741_14917 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755741 2025-07-19 12:12:05,286 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755742_14918 src: /192.168.158.1:54992 dest: /192.168.158.4:9866 2025-07-19 12:12:05,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54992, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_571144809_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755742_14918, duration(ns): 25756352 2025-07-19 12:12:05,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755742_14918, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-19 12:12:09,630 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755742_14918 replica FinalizedReplica, blk_1073755742_14918, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755742 for deletion 2025-07-19 12:12:09,632 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755742_14918 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755742 2025-07-19 12:15:05,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755745_14921 src: /192.168.158.8:56106 dest: /192.168.158.4:9866 2025-07-19 12:15:05,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.8:56106, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1229120106_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755745_14921, duration(ns): 16991020 2025-07-19 12:15:05,307 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755745_14921, type=LAST_IN_PIPELINE terminating 2025-07-19 12:15:09,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755745_14921 replica FinalizedReplica, blk_1073755745_14921, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755745 for deletion 2025-07-19 12:15:09,638 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755745_14921 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755745 2025-07-19 12:20:10,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755750_14926 src: /192.168.158.8:35780 dest: /192.168.158.4:9866 2025-07-19 12:20:10,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1470372317_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755750_14926, duration(ns): 23379471 2025-07-19 12:20:10,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755750_14926, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=1:[192.168.158.7:9866] terminating 2025-07-19 12:20:15,651 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755750_14926 replica FinalizedReplica, blk_1073755750_14926, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755750 for deletion 2025-07-19 12:20:15,652 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755750_14926 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755750 2025-07-19 12:22:10,289 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755752_14928 src: /192.168.158.7:38226 dest: /192.168.158.4:9866 2025-07-19 12:22:10,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38226, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-557305637_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755752_14928, duration(ns): 16943716 2025-07-19 12:22:10,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755752_14928, type=LAST_IN_PIPELINE terminating 2025-07-19 12:22:15,653 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755752_14928 replica FinalizedReplica, blk_1073755752_14928, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755752 for 
deletion 2025-07-19 12:22:15,654 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755752_14928 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755752 2025-07-19 12:24:10,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755754_14930 src: /192.168.158.1:43872 dest: /192.168.158.4:9866 2025-07-19 12:24:10,314 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43872, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-442101687_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755754_14930, duration(ns): 24482389 2025-07-19 12:24:10,315 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755754_14930, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-19 12:24:15,659 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755754_14930 replica FinalizedReplica, blk_1073755754_14930, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755754 for deletion 2025-07-19 12:24:15,661 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755754_14930 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755754 2025-07-19 12:25:10,288 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073755755_14931 src: /192.168.158.8:56858 dest: /192.168.158.4:9866 2025-07-19 12:25:10,314 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56858, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-890898634_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755755_14931, duration(ns): 20397513 2025-07-19 12:25:10,314 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755755_14931, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-19 12:25:15,661 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755755_14931 replica FinalizedReplica, blk_1073755755_14931, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755755 for deletion 2025-07-19 12:25:15,663 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755755_14931 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755755 2025-07-19 12:27:10,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755757_14933 src: /192.168.158.9:39948 dest: /192.168.158.4:9866 2025-07-19 12:27:10,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39948, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1113678523_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073755757_14933, duration(ns): 22428189 2025-07-19 12:27:10,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755757_14933, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-19 12:27:15,667 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755757_14933 replica FinalizedReplica, blk_1073755757_14933, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755757 for deletion 2025-07-19 12:27:15,668 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755757_14933 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755757 2025-07-19 12:28:15,290 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755758_14934 src: /192.168.158.8:33562 dest: /192.168.158.4:9866 2025-07-19 12:28:15,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33562, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1372093457_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755758_14934, duration(ns): 18314774 2025-07-19 12:28:15,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755758_14934, type=LAST_IN_PIPELINE terminating 2025-07-19 12:28:21,667 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755758_14934 replica 
FinalizedReplica, blk_1073755758_14934, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755758 for deletion 2025-07-19 12:28:21,669 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755758_14934 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755758 2025-07-19 12:29:15,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755759_14935 src: /192.168.158.5:57690 dest: /192.168.158.4:9866 2025-07-19 12:29:15,318 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57690, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1160930108_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755759_14935, duration(ns): 20780706 2025-07-19 12:29:15,319 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755759_14935, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-19 12:29:18,667 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755759_14935 replica FinalizedReplica, blk_1073755759_14935, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755759 for deletion 2025-07-19 12:29:18,668 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073755759_14935 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755759
2025-07-19 12:30:15,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755760_14936 src: /192.168.158.1:38024 dest: /192.168.158.4:9866
2025-07-19 12:30:15,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38024, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_613793835_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755760_14936, duration(ns): 27063365
2025-07-19 12:30:15,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755760_14936, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-19 12:30:18,671 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755760_14936 replica FinalizedReplica, blk_1073755760_14936, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755760 for deletion
2025-07-19 12:30:18,672 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755760_14936 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755760
2025-07-19 12:32:15,289 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755762_14938 src: /192.168.158.1:43614 dest: /192.168.158.4:9866
2025-07-19 12:32:15,324 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43614, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1973906981_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755762_14938, duration(ns): 25527457
2025-07-19 12:32:15,324 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755762_14938, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-19 12:32:18,678 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755762_14938 replica FinalizedReplica, blk_1073755762_14938, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755762 for deletion
2025-07-19 12:32:18,679 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755762_14938 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755762
2025-07-19 12:36:15,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755766_14942 src: /192.168.158.1:57190 dest: /192.168.158.4:9866
2025-07-19 12:36:15,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57190, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1937799244_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755766_14942, duration(ns): 23359525
2025-07-19 12:36:15,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755766_14942, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-19 12:36:18,682 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755766_14942 replica FinalizedReplica, blk_1073755766_14942, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755766 for deletion
2025-07-19 12:36:18,683 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755766_14942 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755766
2025-07-19 12:37:15,315 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755767_14943 src: /192.168.158.1:36612 dest: /192.168.158.4:9866
2025-07-19 12:37:15,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36612, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-283456114_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755767_14943, duration(ns): 24967051
2025-07-19 12:37:15,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755767_14943, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-19 12:37:18,685 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755767_14943 replica FinalizedReplica, blk_1073755767_14943, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755767 for deletion
2025-07-19 12:37:18,686 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755767_14943 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755767
2025-07-19 12:38:15,317 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755768_14944 src: /192.168.158.5:38946 dest: /192.168.158.4:9866
2025-07-19 12:38:15,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38946, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_594373831_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755768_14944, duration(ns): 17057778
2025-07-19 12:38:15,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755768_14944, type=LAST_IN_PIPELINE terminating
2025-07-19 12:38:18,687 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755768_14944 replica FinalizedReplica, blk_1073755768_14944, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755768 for deletion
2025-07-19 12:38:18,688 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755768_14944 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755768
2025-07-19 12:42:20,320 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755772_14948 src: /192.168.158.8:59084 dest: /192.168.158.4:9866
2025-07-19 12:42:20,347 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59084, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1966089757_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755772_14948, duration(ns): 21074083
2025-07-19 12:42:20,347 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755772_14948, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 12:42:24,698 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755772_14948 replica FinalizedReplica, blk_1073755772_14948, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755772 for deletion
2025-07-19 12:42:24,699 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755772_14948 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755772
2025-07-19 12:46:20,326 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755776_14952 src: /192.168.158.8:50762 dest: /192.168.158.4:9866
2025-07-19 12:46:20,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50762, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1248706468_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755776_14952, duration(ns): 16914607
2025-07-19 12:46:20,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755776_14952, type=LAST_IN_PIPELINE terminating
2025-07-19 12:46:27,705 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755776_14952 replica FinalizedReplica, blk_1073755776_14952, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755776 for deletion
2025-07-19 12:46:27,706 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755776_14952 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755776
2025-07-19 12:49:20,323 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755779_14955 src: /192.168.158.1:53414 dest: /192.168.158.4:9866
2025-07-19 12:49:20,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53414, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1010705930_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755779_14955, duration(ns): 23539500
2025-07-19 12:49:20,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755779_14955, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-19 12:49:24,717 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755779_14955 replica FinalizedReplica, blk_1073755779_14955, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755779 for deletion
2025-07-19 12:49:24,718 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755779_14955 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755779
2025-07-19 12:50:20,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755780_14956 src: /192.168.158.9:34248 dest: /192.168.158.4:9866
2025-07-19 12:50:20,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34248, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1628121867_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755780_14956, duration(ns): 19799662
2025-07-19 12:50:20,353 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755780_14956, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-19 12:50:24,718 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755780_14956 replica FinalizedReplica, blk_1073755780_14956, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755780 for deletion
2025-07-19 12:50:24,720 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755780_14956 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755780
2025-07-19 12:52:20,328 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755782_14958 src: /192.168.158.1:35102 dest: /192.168.158.4:9866
2025-07-19 12:52:20,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35102, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_758360047_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755782_14958, duration(ns): 25053795
2025-07-19 12:52:20,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755782_14958, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-19 12:52:24,725 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755782_14958 replica FinalizedReplica, blk_1073755782_14958, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755782 for deletion
2025-07-19 12:52:24,726 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755782_14958 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755782
2025-07-19 12:57:20,335 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755787_14963 src: /192.168.158.1:49824 dest: /192.168.158.4:9866
2025-07-19 12:57:20,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49824, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2034433396_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755787_14963, duration(ns): 24434581
2025-07-19 12:57:20,369 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755787_14963, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-19 12:57:27,736 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755787_14963 replica FinalizedReplica, blk_1073755787_14963, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755787 for deletion
2025-07-19 12:57:27,737 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755787_14963 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755787
2025-07-19 13:01:30,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755791_14967 src: /192.168.158.1:47152 dest: /192.168.158.4:9866
2025-07-19 13:01:30,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47152, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1605853527_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755791_14967, duration(ns): 23825028
2025-07-19 13:01:30,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755791_14967, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-19 13:01:33,745 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755791_14967 replica FinalizedReplica, blk_1073755791_14967, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755791 for deletion
2025-07-19 13:01:33,746 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755791_14967 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755791
2025-07-19 13:04:35,348 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755794_14970 src: /192.168.158.1:50720 dest: /192.168.158.4:9866
2025-07-19 13:04:35,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50720, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_901199267_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755794_14970, duration(ns): 26446908
2025-07-19 13:04:35,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755794_14970, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-19 13:04:39,750 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755794_14970 replica FinalizedReplica, blk_1073755794_14970, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755794 for deletion
2025-07-19 13:04:39,751 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755794_14970 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755794
2025-07-19 13:06:35,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755796_14972 src: /192.168.158.7:42090 dest: /192.168.158.4:9866
2025-07-19 13:06:35,380 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42090, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1377518739_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755796_14972, duration(ns): 22528303
2025-07-19 13:06:35,380 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755796_14972, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 13:06:39,753 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755796_14972 replica FinalizedReplica, blk_1073755796_14972, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755796 for deletion
2025-07-19 13:06:39,754 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755796_14972 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755796
2025-07-19 13:07:35,353 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755797_14973 src: /192.168.158.5:44862 dest: /192.168.158.4:9866
2025-07-19 13:07:35,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44862, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1461605686_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755797_14973, duration(ns): 20754812
2025-07-19 13:07:35,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755797_14973, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 13:07:42,756 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755797_14973 replica FinalizedReplica, blk_1073755797_14973, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755797 for deletion
2025-07-19 13:07:42,757 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755797_14973 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755797
2025-07-19 13:12:45,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755802_14978 src: /192.168.158.8:56900 dest: /192.168.158.4:9866
2025-07-19 13:12:45,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56900, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1763117394_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755802_14978, duration(ns): 20485679
2025-07-19 13:12:45,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755802_14978, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-19 13:12:48,768 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755802_14978 replica FinalizedReplica, blk_1073755802_14978, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755802 for deletion
2025-07-19 13:12:48,769 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755802_14978 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755802
2025-07-19 13:16:45,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755806_14982 src: /192.168.158.8:35682 dest: /192.168.158.4:9866
2025-07-19 13:16:45,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35682, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_628104286_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755806_14982, duration(ns): 19760980
2025-07-19 13:16:45,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755806_14982, type=LAST_IN_PIPELINE terminating
2025-07-19 13:16:48,780 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755806_14982 replica FinalizedReplica, blk_1073755806_14982, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755806 for deletion
2025-07-19 13:16:48,781 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755806_14982 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755806
2025-07-19 13:17:45,377 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755807_14983 src: /192.168.158.8:57200 dest: /192.168.158.4:9866
2025-07-19 13:17:45,408 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57200, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1422730402_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755807_14983, duration(ns): 24751113
2025-07-19 13:17:45,408 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755807_14983, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-19 13:17:48,781 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755807_14983 replica FinalizedReplica, blk_1073755807_14983, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755807 for deletion
2025-07-19 13:17:48,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755807_14983 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755807
2025-07-19 13:18:45,376 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755808_14984 src: /192.168.158.1:42258 dest: /192.168.158.4:9866
2025-07-19 13:18:45,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42258, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_883167546_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755808_14984, duration(ns): 24727740
2025-07-19 13:18:45,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755808_14984, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-19 13:18:48,780 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755808_14984 replica FinalizedReplica, blk_1073755808_14984, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755808 for deletion
2025-07-19 13:18:48,782 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755808_14984 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755808
2025-07-19 13:21:45,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755811_14987 src: /192.168.158.6:38086 dest: /192.168.158.4:9866
2025-07-19 13:21:45,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38086, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1613294177_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755811_14987, duration(ns): 18300393
2025-07-19 13:21:45,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755811_14987, type=LAST_IN_PIPELINE terminating
2025-07-19 13:21:48,784 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755811_14987 replica FinalizedReplica, blk_1073755811_14987, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755811 for deletion
2025-07-19 13:21:48,786 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755811_14987 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755811
2025-07-19 13:22:45,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755812_14988 src: /192.168.158.6:60976 dest: /192.168.158.4:9866
2025-07-19 13:22:45,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60976, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1462040743_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755812_14988, duration(ns): 18966505
2025-07-19 13:22:45,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755812_14988, type=LAST_IN_PIPELINE terminating
2025-07-19 13:22:48,787 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755812_14988 replica FinalizedReplica, blk_1073755812_14988, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755812 for deletion
2025-07-19 13:22:48,788 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755812_14988 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755812
2025-07-19 13:23:45,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755813_14989 src: /192.168.158.7:43622 dest: /192.168.158.4:9866
2025-07-19 13:23:45,407 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43622, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-526280044_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755813_14989, duration(ns): 19703573
2025-07-19 13:23:45,407 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755813_14989, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-19 13:23:48,787 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755813_14989 replica FinalizedReplica, blk_1073755813_14989, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755813 for deletion
2025-07-19 13:23:48,788 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755813_14989 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755813
2025-07-19 13:24:45,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755814_14990 src: /192.168.158.6:56456 dest: /192.168.158.4:9866
2025-07-19 13:24:45,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56456, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1755814517_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755814_14990, duration(ns): 16696724
2025-07-19 13:24:45,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755814_14990, type=LAST_IN_PIPELINE terminating
2025-07-19 13:24:48,790 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755814_14990 replica FinalizedReplica, blk_1073755814_14990, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755814 for deletion
2025-07-19 13:24:48,791 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755814_14990 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755814
2025-07-19 13:25:45,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755815_14991 src: /192.168.158.7:60302 dest: /192.168.158.4:9866
2025-07-19 13:25:45,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60302, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_334467283_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755815_14991, duration(ns): 17018633
2025-07-19 13:25:45,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755815_14991, type=LAST_IN_PIPELINE terminating
2025-07-19 13:25:48,792 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755815_14991 replica FinalizedReplica, blk_1073755815_14991, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755815 for deletion
2025-07-19 13:25:48,794 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755815_14991 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755815
2025-07-19 13:27:45,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755817_14993 src: /192.168.158.6:39506 dest: /192.168.158.4:9866
2025-07-19 13:27:45,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39506, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1000246398_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755817_14993, duration(ns): 16367539
2025-07-19 13:27:45,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755817_14993, type=LAST_IN_PIPELINE terminating
2025-07-19 13:27:48,794 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755817_14993 replica FinalizedReplica, blk_1073755817_14993, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755817 for deletion
2025-07-19 13:27:48,795 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755817_14993 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755817
2025-07-19 13:29:45,391 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755819_14995 src: /192.168.158.5:50324 dest: /192.168.158.4:9866
2025-07-19 13:29:45,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50324, dest: /192.168.158.4:9866, bytes: 56, op:
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1507505317_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755819_14995, duration(ns): 17952433 2025-07-19 13:29:45,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755819_14995, type=LAST_IN_PIPELINE terminating 2025-07-19 13:29:48,796 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755819_14995 replica FinalizedReplica, blk_1073755819_14995, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755819 for deletion 2025-07-19 13:29:48,798 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755819_14995 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755819 2025-07-19 13:30:50,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755820_14996 src: /192.168.158.1:42664 dest: /192.168.158.4:9866 2025-07-19 13:30:50,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42664, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-714876584_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755820_14996, duration(ns): 24614226 2025-07-19 13:30:50,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755820_14996, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-19 
13:30:57,799 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755820_14996 replica FinalizedReplica, blk_1073755820_14996, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755820 for deletion 2025-07-19 13:30:57,800 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755820_14996 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755820 2025-07-19 13:32:56,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755822_14998 src: /192.168.158.7:56856 dest: /192.168.158.4:9866 2025-07-19 13:32:56,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56856, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_466912326_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755822_14998, duration(ns): 20002418 2025-07-19 13:32:56,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755822_14998, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-19 13:33:00,800 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755822_14998 replica FinalizedReplica, blk_1073755822_14998, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755822 for deletion 
2025-07-19 13:33:00,801 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755822_14998 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755822 2025-07-19 13:35:55,407 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755825_15001 src: /192.168.158.5:33954 dest: /192.168.158.4:9866 2025-07-19 13:35:55,428 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33954, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1623693385_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755825_15001, duration(ns): 19223794 2025-07-19 13:35:55,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755825_15001, type=LAST_IN_PIPELINE terminating 2025-07-19 13:36:00,805 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755825_15001 replica FinalizedReplica, blk_1073755825_15001, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755825 for deletion 2025-07-19 13:36:00,806 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755825_15001 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755825 2025-07-19 13:36:55,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755826_15002 src: /192.168.158.5:53608 
dest: /192.168.158.4:9866 2025-07-19 13:36:55,422 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53608, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2135627703_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755826_15002, duration(ns): 21946614 2025-07-19 13:36:55,422 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755826_15002, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-19 13:37:03,808 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755826_15002 replica FinalizedReplica, blk_1073755826_15002, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755826 for deletion 2025-07-19 13:37:03,810 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755826_15002 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755826 2025-07-19 13:40:00,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755829_15005 src: /192.168.158.1:60050 dest: /192.168.158.4:9866 2025-07-19 13:40:00,434 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60050, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-133828389_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755829_15005, duration(ns): 24729240 2025-07-19 13:40:00,438 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755829_15005, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-19 13:40:03,818 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755829_15005 replica FinalizedReplica, blk_1073755829_15005, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755829 for deletion 2025-07-19 13:40:03,819 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755829_15005 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755829 2025-07-19 13:43:10,423 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755832_15008 src: /192.168.158.8:37864 dest: /192.168.158.4:9866 2025-07-19 13:43:10,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37864, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-671159113_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755832_15008, duration(ns): 23338806 2025-07-19 13:43:10,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755832_15008, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-19 13:43:15,823 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755832_15008 replica FinalizedReplica, blk_1073755832_15008, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755832 for deletion 2025-07-19 13:43:15,824 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755832_15008 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755832 2025-07-19 13:44:10,407 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755833_15009 src: /192.168.158.1:57192 dest: /192.168.158.4:9866 2025-07-19 13:44:10,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57192, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2077869489_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755833_15009, duration(ns): 28254388 2025-07-19 13:44:10,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755833_15009, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-19 13:44:15,826 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755833_15009 replica FinalizedReplica, blk_1073755833_15009, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755833 for deletion 2025-07-19 13:44:15,827 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 
blk_1073755833_15009 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755833 2025-07-19 13:46:10,408 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755835_15011 src: /192.168.158.1:38668 dest: /192.168.158.4:9866 2025-07-19 13:46:10,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38668, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_543072247_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755835_15011, duration(ns): 26478744 2025-07-19 13:46:10,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755835_15011, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-19 13:46:15,828 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755835_15011 replica FinalizedReplica, blk_1073755835_15011, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755835 for deletion 2025-07-19 13:46:15,829 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755835_15011 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755835 2025-07-19 13:48:15,423 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755837_15013 src: /192.168.158.5:47264 dest: /192.168.158.4:9866 2025-07-19 13:48:15,449 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47264, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1778954603_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755837_15013, duration(ns): 20388806 2025-07-19 13:48:15,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755837_15013, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-19 13:48:18,830 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755837_15013 replica FinalizedReplica, blk_1073755837_15013, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755837 for deletion 2025-07-19 13:48:18,831 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755837_15013 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755837 2025-07-19 13:50:15,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755839_15015 src: /192.168.158.5:52102 dest: /192.168.158.4:9866 2025-07-19 13:50:15,457 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52102, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1032519348_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755839_15015, duration(ns): 22227161 2025-07-19 13:50:15,457 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073755839_15015, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-19 13:50:18,834 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755839_15015 replica FinalizedReplica, blk_1073755839_15015, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755839 for deletion 2025-07-19 13:50:18,835 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755839_15015 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755839 2025-07-19 13:51:15,423 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755840_15016 src: /192.168.158.9:59726 dest: /192.168.158.4:9866 2025-07-19 13:51:15,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59726, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1793801120_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755840_15016, duration(ns): 16982349 2025-07-19 13:51:15,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755840_15016, type=LAST_IN_PIPELINE terminating 2025-07-19 13:51:21,835 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755840_15016 replica FinalizedReplica, blk_1073755840_15016, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755840 for deletion 2025-07-19 13:51:21,836 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755840_15016 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755840 2025-07-19 13:53:15,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755842_15018 src: /192.168.158.1:35856 dest: /192.168.158.4:9866 2025-07-19 13:53:15,454 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35856, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_408625104_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755842_15018, duration(ns): 25142073 2025-07-19 13:53:15,455 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755842_15018, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-19 13:53:18,841 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755842_15018 replica FinalizedReplica, blk_1073755842_15018, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755842 for deletion 2025-07-19 13:53:18,842 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755842_15018 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755842 2025-07-19 13:54:15,420 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755843_15019 src: /192.168.158.5:49296 dest: /192.168.158.4:9866 2025-07-19 13:54:15,446 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49296, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1037331683_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755843_15019, duration(ns): 20405230 2025-07-19 13:54:15,447 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755843_15019, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-19 13:54:18,844 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755843_15019 replica FinalizedReplica, blk_1073755843_15019, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755843 for deletion 2025-07-19 13:54:18,845 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755843_15019 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755843 2025-07-19 13:58:20,431 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755847_15023 src: /192.168.158.5:43084 dest: /192.168.158.4:9866 2025-07-19 13:58:20,457 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.5:43084, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1377691655_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755847_15023, duration(ns): 20530493 2025-07-19 13:58:20,457 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755847_15023, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-19 13:58:27,849 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755847_15023 replica FinalizedReplica, blk_1073755847_15023, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755847 for deletion 2025-07-19 13:58:27,851 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755847_15023 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755847 2025-07-19 13:59:25,423 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755848_15024 src: /192.168.158.1:48992 dest: /192.168.158.4:9866 2025-07-19 13:59:25,457 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48992, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1853757273_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755848_15024, duration(ns): 24514899 2025-07-19 13:59:25,457 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755848_15024, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-19 13:59:33,852 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755848_15024 replica FinalizedReplica, blk_1073755848_15024, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755848 for deletion 2025-07-19 13:59:33,853 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755848_15024 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755848 2025-07-19 14:00:25,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755849_15025 src: /192.168.158.6:54124 dest: /192.168.158.4:9866 2025-07-19 14:00:25,456 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54124, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1355207441_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755849_15025, duration(ns): 16350782 2025-07-19 14:00:25,456 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755849_15025, type=LAST_IN_PIPELINE terminating 2025-07-19 14:00:30,854 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755849_15025 replica FinalizedReplica, blk_1073755849_15025, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755849 for deletion 2025-07-19 14:00:30,855 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755849_15025 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755849 2025-07-19 14:02:25,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755851_15027 src: /192.168.158.9:53034 dest: /192.168.158.4:9866 2025-07-19 14:02:25,456 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53034, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-429734685_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755851_15027, duration(ns): 16033197 2025-07-19 14:02:25,456 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755851_15027, type=LAST_IN_PIPELINE terminating 2025-07-19 14:02:30,857 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755851_15027 replica FinalizedReplica, blk_1073755851_15027, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755851 for deletion 2025-07-19 14:02:30,858 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755851_15027 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755851 2025-07-19 14:09:35,446 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755858_15034 src: /192.168.158.7:36126 dest: /192.168.158.4:9866
2025-07-19 14:09:35,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36126, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_709401987_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755858_15034, duration(ns): 16236457
2025-07-19 14:09:35,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755858_15034, type=LAST_IN_PIPELINE terminating
2025-07-19 14:09:42,872 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755858_15034 replica FinalizedReplica, blk_1073755858_15034, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755858 for deletion
2025-07-19 14:09:42,873 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755858_15034 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755858
2025-07-19 14:10:35,446 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755859_15035 src: /192.168.158.7:52070 dest: /192.168.158.4:9866
2025-07-19 14:10:35,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52070, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1165377314_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755859_15035, duration(ns): 21769424
2025-07-19 14:10:35,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755859_15035, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 14:10:42,876 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755859_15035 replica FinalizedReplica, blk_1073755859_15035, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755859 for deletion
2025-07-19 14:10:42,877 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755859_15035 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755859
2025-07-19 14:13:35,453 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755862_15038 src: /192.168.158.7:60846 dest: /192.168.158.4:9866
2025-07-19 14:13:35,472 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60846, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-656282776_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755862_15038, duration(ns): 16636112
2025-07-19 14:13:35,472 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755862_15038, type=LAST_IN_PIPELINE terminating
2025-07-19 14:13:39,877 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755862_15038 replica FinalizedReplica, blk_1073755862_15038, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755862 for deletion
2025-07-19 14:13:39,878 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755862_15038 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755862
2025-07-19 14:17:45,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755866_15042 src: /192.168.158.8:42874 dest: /192.168.158.4:9866
2025-07-19 14:17:45,492 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42874, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_953072374_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755866_15042, duration(ns): 24367024
2025-07-19 14:17:45,492 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755866_15042, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-19 14:17:48,891 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755866_15042 replica FinalizedReplica, blk_1073755866_15042, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755866 for deletion
2025-07-19 14:17:48,892 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755866_15042 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755866
2025-07-19 14:19:50,477 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755868_15044 src: /192.168.158.7:59550 dest: /192.168.158.4:9866
2025-07-19 14:19:50,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59550, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1967921133_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755868_15044, duration(ns): 18661008
2025-07-19 14:19:50,499 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755868_15044, type=LAST_IN_PIPELINE terminating
2025-07-19 14:19:54,895 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755868_15044 replica FinalizedReplica, blk_1073755868_15044, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755868 for deletion
2025-07-19 14:19:54,896 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755868_15044 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755868
2025-07-19 14:20:50,471 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755869_15045 src: /192.168.158.8:43234 dest: /192.168.158.4:9866
2025-07-19 14:20:50,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43234, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1166263351_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755869_15045, duration(ns): 17752377
2025-07-19 14:20:50,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755869_15045, type=LAST_IN_PIPELINE terminating
2025-07-19 14:20:57,896 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755869_15045 replica FinalizedReplica, blk_1073755869_15045, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755869 for deletion
2025-07-19 14:20:57,897 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755869_15045 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755869
2025-07-19 14:21:50,471 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755870_15046 src: /192.168.158.5:34566 dest: /192.168.158.4:9866
2025-07-19 14:21:50,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34566, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-641653234_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755870_15046, duration(ns): 20247296
2025-07-19 14:21:50,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755870_15046, type=LAST_IN_PIPELINE terminating
2025-07-19 14:21:54,900 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755870_15046 replica FinalizedReplica, blk_1073755870_15046, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755870 for deletion
2025-07-19 14:21:54,901 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755870_15046 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755870
2025-07-19 14:22:50,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755871_15047 src: /192.168.158.5:41010 dest: /192.168.158.4:9866
2025-07-19 14:22:50,524 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41010, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_934461818_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755871_15047, duration(ns): 18856925
2025-07-19 14:22:50,524 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755871_15047, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 14:22:54,904 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755871_15047 replica FinalizedReplica, blk_1073755871_15047, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755871 for deletion
2025-07-19 14:22:54,905 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755871_15047 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755871
2025-07-19 14:24:50,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755873_15049 src: /192.168.158.1:56804 dest: /192.168.158.4:9866
2025-07-19 14:24:50,499 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56804, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1442273697_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755873_15049, duration(ns): 23715580
2025-07-19 14:24:50,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755873_15049, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-19 14:24:57,906 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755873_15049 replica FinalizedReplica, blk_1073755873_15049, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755873 for deletion
2025-07-19 14:24:57,907 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755873_15049 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755873
2025-07-19 14:25:50,472 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755874_15050 src: /192.168.158.1:47474 dest: /192.168.158.4:9866
2025-07-19 14:25:50,507 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47474, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_89236146_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755874_15050, duration(ns): 25641119
2025-07-19 14:25:50,508 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755874_15050, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-19 14:25:54,908 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755874_15050 replica FinalizedReplica, blk_1073755874_15050, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755874 for deletion
2025-07-19 14:25:54,909 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755874_15050 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755874
2025-07-19 14:27:50,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755876_15052 src: /192.168.158.9:50768 dest: /192.168.158.4:9866
2025-07-19 14:27:50,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50768, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-944781980_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755876_15052, duration(ns): 18478124
2025-07-19 14:27:50,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755876_15052, type=LAST_IN_PIPELINE terminating
2025-07-19 14:27:54,911 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755876_15052 replica FinalizedReplica, blk_1073755876_15052, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755876 for deletion
2025-07-19 14:27:54,912 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755876_15052 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755876
2025-07-19 14:30:55,479 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755879_15055 src: /192.168.158.9:43962 dest: /192.168.158.4:9866
2025-07-19 14:30:55,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43962, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-567142505_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755879_15055, duration(ns): 17099481
2025-07-19 14:30:55,499 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755879_15055, type=LAST_IN_PIPELINE terminating
2025-07-19 14:30:57,917 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755879_15055 replica FinalizedReplica, blk_1073755879_15055, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755879 for deletion
2025-07-19 14:30:57,918 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755879_15055 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755879
2025-07-19 14:33:55,481 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755882_15058 src: /192.168.158.6:58478 dest: /192.168.158.4:9866
2025-07-19 14:33:55,508 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58478, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1467227425_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755882_15058, duration(ns): 21868850
2025-07-19 14:33:55,509 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755882_15058, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 14:33:57,923 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755882_15058 replica FinalizedReplica, blk_1073755882_15058, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755882 for deletion
2025-07-19 14:33:57,924 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755882_15058 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755882
2025-07-19 14:36:00,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755884_15060 src: /192.168.158.9:37792 dest: /192.168.158.4:9866
2025-07-19 14:36:00,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37792, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1367951240_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755884_15060, duration(ns): 17188307
2025-07-19 14:36:00,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755884_15060, type=LAST_IN_PIPELINE terminating
2025-07-19 14:36:03,926 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755884_15060 replica FinalizedReplica, blk_1073755884_15060, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755884 for deletion
2025-07-19 14:36:03,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755884_15060 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755884
2025-07-19 14:37:00,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755885_15061 src: /192.168.158.1:47172 dest: /192.168.158.4:9866
2025-07-19 14:37:00,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47172, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1217021659_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755885_15061, duration(ns): 25511795
2025-07-19 14:37:00,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755885_15061, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-19 14:37:03,931 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755885_15061 replica FinalizedReplica, blk_1073755885_15061, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755885 for deletion
2025-07-19 14:37:03,932 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755885_15061 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755885
2025-07-19 14:38:00,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755886_15062 src: /192.168.158.1:48702 dest: /192.168.158.4:9866
2025-07-19 14:38:00,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48702, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1236238296_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755886_15062, duration(ns): 26376645
2025-07-19 14:38:00,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755886_15062, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-19 14:38:03,930 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755886_15062 replica FinalizedReplica, blk_1073755886_15062, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755886 for deletion
2025-07-19 14:38:03,932 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755886_15062 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755886
2025-07-19 14:41:10,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755889_15065 src: /192.168.158.5:42778 dest: /192.168.158.4:9866
2025-07-19 14:41:10,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42778, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-960806871_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755889_15065, duration(ns): 18651637
2025-07-19 14:41:10,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755889_15065, type=LAST_IN_PIPELINE terminating
2025-07-19 14:41:12,941 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755889_15065 replica FinalizedReplica, blk_1073755889_15065, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755889 for deletion
2025-07-19 14:41:12,942 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755889_15065 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755889
2025-07-19 14:42:10,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755890_15066 src: /192.168.158.7:35134 dest: /192.168.158.4:9866
2025-07-19 14:42:10,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35134, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1038097929_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755890_15066, duration(ns): 18888385
2025-07-19 14:42:10,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755890_15066, type=LAST_IN_PIPELINE terminating
2025-07-19 14:42:12,942 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755890_15066 replica FinalizedReplica, blk_1073755890_15066, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755890 for deletion
2025-07-19 14:42:12,944 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755890_15066 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755890
2025-07-19 14:44:10,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755892_15068 src: /192.168.158.5:54510 dest: /192.168.158.4:9866
2025-07-19 14:44:10,524 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54510, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-309676412_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755892_15068, duration(ns): 22314378
2025-07-19 14:44:10,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755892_15068, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 14:44:15,942 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755892_15068 replica FinalizedReplica, blk_1073755892_15068, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755892 for deletion
2025-07-19 14:44:15,944 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755892_15068 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755892
2025-07-19 14:46:15,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755894_15070 src: /192.168.158.6:59132 dest: /192.168.158.4:9866
2025-07-19 14:46:15,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59132, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_551921845_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755894_15070, duration(ns): 20593075
2025-07-19 14:46:15,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755894_15070, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-19 14:46:18,946 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755894_15070 replica FinalizedReplica, blk_1073755894_15070, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755894 for deletion
2025-07-19 14:46:18,947 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755894_15070 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755894
2025-07-19 14:50:15,528 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755898_15074 src: /192.168.158.8:43386 dest: /192.168.158.4:9866
2025-07-19 14:50:15,553 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43386, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2127455509_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755898_15074, duration(ns): 20071539
2025-07-19 14:50:15,554 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755898_15074, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-19 14:50:18,952 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755898_15074 replica FinalizedReplica, blk_1073755898_15074, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755898 for deletion
2025-07-19 14:50:18,953 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755898_15074 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755898
2025-07-19 14:52:15,509 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755900_15076 src: /192.168.158.6:57664 dest: /192.168.158.4:9866
2025-07-19 14:52:15,528 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57664, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1224362967_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755900_15076, duration(ns): 16471759
2025-07-19 14:52:15,528 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755900_15076, type=LAST_IN_PIPELINE terminating
2025-07-19 14:52:18,954 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755900_15076 replica FinalizedReplica, blk_1073755900_15076, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755900 for deletion
2025-07-19 14:52:18,955 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755900_15076 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755900
2025-07-19 14:53:15,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755901_15077 src: /192.168.158.9:56150 dest: /192.168.158.4:9866
2025-07-19 14:53:15,531 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56150, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_241148670_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755901_15077, duration(ns): 16566550
2025-07-19 14:53:15,531 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755901_15077, type=LAST_IN_PIPELINE terminating
2025-07-19 14:53:18,957 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755901_15077 replica FinalizedReplica, blk_1073755901_15077, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755901 for deletion
2025-07-19 14:53:18,958 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755901_15077 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755901
2025-07-19 14:54:15,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755902_15078 src: /192.168.158.6:50746 dest: /192.168.158.4:9866
2025-07-19 14:54:15,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50746, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_713852277_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755902_15078, duration(ns): 16640818
2025-07-19 14:54:15,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755902_15078, type=LAST_IN_PIPELINE terminating
2025-07-19 14:54:18,957 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755902_15078 replica FinalizedReplica, blk_1073755902_15078, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755902 for deletion
2025-07-19 14:54:18,959 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755902_15078 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir22/blk_1073755902
2025-07-19 14:59:20,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755907_15083 src: /192.168.158.7:58174 dest: /192.168.158.4:9866
2025-07-19 14:59:20,540 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58174, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-631573634_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755907_15083, duration(ns): 17304252
2025-07-19 14:59:20,540 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755907_15083, type=LAST_IN_PIPELINE terminating
2025-07-19 14:59:27,968 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755907_15083 replica FinalizedReplica, blk_1073755907_15083, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755907 for deletion
2025-07-19 14:59:27,969 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755907_15083 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755907
2025-07-19 15:02:20,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755910_15086 src: /192.168.158.1:56376 dest: /192.168.158.4:9866
2025-07-19 15:02:20,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56376, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-742563542_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755910_15086, duration(ns): 28032994
2025-07-19 15:02:20,557 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755910_15086, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-19 15:02:27,974 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755910_15086 replica FinalizedReplica, blk_1073755910_15086, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755910 for deletion
2025-07-19 15:02:27,976 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755910_15086 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755910
2025-07-19 15:06:25,538 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755914_15090 src: /192.168.158.6:37358 dest: /192.168.158.4:9866
2025-07-19 15:06:25,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37358, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-399811935_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755914_15090, duration(ns): 20102796
2025-07-19 15:06:25,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755914_15090, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 15:06:27,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755914_15090 replica FinalizedReplica, blk_1073755914_15090, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755914 for deletion
2025-07-19 15:06:27,982 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755914_15090 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755914 2025-07-19 15:07:25,546 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755915_15091 src: /192.168.158.8:41910 dest: /192.168.158.4:9866 2025-07-19 15:07:25,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41910, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_391811609_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755915_15091, duration(ns): 17209869 2025-07-19 15:07:25,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755915_15091, type=LAST_IN_PIPELINE terminating 2025-07-19 15:07:30,984 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755915_15091 replica FinalizedReplica, blk_1073755915_15091, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755915 for deletion 2025-07-19 15:07:30,986 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755915_15091 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755915 2025-07-19 15:08:30,534 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755916_15092 src: /192.168.158.1:41572 dest: /192.168.158.4:9866 2025-07-19 15:08:30,569 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41572, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_835465707_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755916_15092, duration(ns): 23941923 2025-07-19 15:08:30,569 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755916_15092, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-19 15:08:36,987 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755916_15092 replica FinalizedReplica, blk_1073755916_15092, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755916 for deletion 2025-07-19 15:08:36,988 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755916_15092 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755916 2025-07-19 15:11:30,548 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755919_15095 src: /192.168.158.8:49046 dest: /192.168.158.4:9866 2025-07-19 15:11:30,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49046, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1980810640_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755919_15095, duration(ns): 17035746 2025-07-19 15:11:30,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755919_15095, type=LAST_IN_PIPELINE terminating 2025-07-19 
15:11:33,992 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755919_15095 replica FinalizedReplica, blk_1073755919_15095, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755919 for deletion 2025-07-19 15:11:33,993 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755919_15095 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755919 2025-07-19 15:12:30,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755920_15096 src: /192.168.158.6:40972 dest: /192.168.158.4:9866 2025-07-19 15:12:30,570 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40972, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_609371578_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755920_15096, duration(ns): 17832155 2025-07-19 15:12:30,570 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755920_15096, type=LAST_IN_PIPELINE terminating 2025-07-19 15:12:33,996 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755920_15096 replica FinalizedReplica, blk_1073755920_15096, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755920 for deletion 2025-07-19 15:12:33,998 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755920_15096 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755920 2025-07-19 15:19:45,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755927_15103 src: /192.168.158.1:51854 dest: /192.168.158.4:9866 2025-07-19 15:19:45,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51854, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-725392943_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755927_15103, duration(ns): 25173562 2025-07-19 15:19:45,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755927_15103, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-19 15:19:52,012 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755927_15103 replica FinalizedReplica, blk_1073755927_15103, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755927 for deletion 2025-07-19 15:19:52,013 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755927_15103 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755927 2025-07-19 15:21:45,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073755929_15105 src: /192.168.158.1:49504 dest: /192.168.158.4:9866 2025-07-19 15:21:45,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49504, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1078008827_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755929_15105, duration(ns): 23070657 2025-07-19 15:21:45,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755929_15105, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-19 15:21:49,016 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755929_15105 replica FinalizedReplica, blk_1073755929_15105, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755929 for deletion 2025-07-19 15:21:49,017 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755929_15105 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755929 2025-07-19 15:24:50,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755932_15108 src: /192.168.158.9:53370 dest: /192.168.158.4:9866 2025-07-19 15:24:50,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53370, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1976143792_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073755932_15108, duration(ns): 17572904 2025-07-19 15:24:50,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755932_15108, type=LAST_IN_PIPELINE terminating 2025-07-19 15:24:58,021 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755932_15108 replica FinalizedReplica, blk_1073755932_15108, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755932 for deletion 2025-07-19 15:24:58,022 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755932_15108 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755932 2025-07-19 15:26:55,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755934_15110 src: /192.168.158.7:53402 dest: /192.168.158.4:9866 2025-07-19 15:26:55,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53402, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1378447838_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755934_15110, duration(ns): 20842571 2025-07-19 15:26:55,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755934_15110, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-19 15:26:58,022 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755934_15110 replica 
FinalizedReplica, blk_1073755934_15110, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755934 for deletion 2025-07-19 15:26:58,023 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755934_15110 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755934 2025-07-19 15:27:55,574 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755935_15111 src: /192.168.158.6:44050 dest: /192.168.158.4:9866 2025-07-19 15:27:55,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44050, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1551302923_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755935_15111, duration(ns): 23050696 2025-07-19 15:27:55,603 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755935_15111, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-19 15:28:01,023 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755935_15111 replica FinalizedReplica, blk_1073755935_15111, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755935 for deletion 2025-07-19 15:28:01,024 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073755935_15111 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755935 2025-07-19 15:28:55,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755936_15112 src: /192.168.158.6:52206 dest: /192.168.158.4:9866 2025-07-19 15:28:55,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52206, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1370584674_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755936_15112, duration(ns): 21143371 2025-07-19 15:28:55,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755936_15112, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-19 15:28:58,023 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755936_15112 replica FinalizedReplica, blk_1073755936_15112, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755936 for deletion 2025-07-19 15:28:58,024 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755936_15112 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755936 2025-07-19 15:31:00,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755938_15114 src: /192.168.158.1:53900 dest: /192.168.158.4:9866 2025-07-19 15:31:00,600 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53900, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1261673316_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755938_15114, duration(ns): 25392996 2025-07-19 15:31:00,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755938_15114, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-19 15:31:04,027 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755938_15114 replica FinalizedReplica, blk_1073755938_15114, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755938 for deletion 2025-07-19 15:31:04,028 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755938_15114 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755938 2025-07-19 15:32:00,570 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755939_15115 src: /192.168.158.5:45910 dest: /192.168.158.4:9866 2025-07-19 15:32:00,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45910, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1260168760_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755939_15115, duration(ns): 20834150 2025-07-19 15:32:00,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755939_15115, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-19 15:32:04,030 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755939_15115 replica FinalizedReplica, blk_1073755939_15115, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755939 for deletion 2025-07-19 15:32:04,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755939_15115 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755939 2025-07-19 15:34:00,584 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755941_15117 src: /192.168.158.1:54060 dest: /192.168.158.4:9866 2025-07-19 15:34:00,617 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54060, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1415606195_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755941_15117, duration(ns): 23720804 2025-07-19 15:34:00,617 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755941_15117, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-19 15:34:04,035 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755941_15117 replica FinalizedReplica, blk_1073755941_15117, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755941 for deletion 2025-07-19 15:34:04,036 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755941_15117 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755941 2025-07-19 15:35:05,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755942_15118 src: /192.168.158.9:34364 dest: /192.168.158.4:9866 2025-07-19 15:35:05,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34364, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_415253423_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755942_15118, duration(ns): 17100982 2025-07-19 15:35:05,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755942_15118, type=LAST_IN_PIPELINE terminating 2025-07-19 15:35:10,036 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755942_15118 replica FinalizedReplica, blk_1073755942_15118, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755942 for deletion 2025-07-19 15:35:10,037 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755942_15118 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755942 2025-07-19 15:37:10,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755944_15120 src: /192.168.158.5:35030 dest: /192.168.158.4:9866 2025-07-19 15:37:10,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35030, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1104949900_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755944_15120, duration(ns): 17362644 2025-07-19 15:37:10,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755944_15120, type=LAST_IN_PIPELINE terminating 2025-07-19 15:37:16,041 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755944_15120 replica FinalizedReplica, blk_1073755944_15120, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755944 for deletion 2025-07-19 15:37:16,042 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755944_15120 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755944 2025-07-19 15:38:10,609 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755945_15121 src: /192.168.158.6:40394 dest: /192.168.158.4:9866 2025-07-19 15:38:10,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40394, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-541315639_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755945_15121, duration(ns): 17940969 2025-07-19 15:38:10,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755945_15121, type=LAST_IN_PIPELINE terminating 2025-07-19 15:38:16,045 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755945_15121 replica FinalizedReplica, blk_1073755945_15121, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755945 for deletion 2025-07-19 15:38:16,046 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755945_15121 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755945 2025-07-19 15:42:20,603 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755949_15125 src: /192.168.158.1:46204 dest: /192.168.158.4:9866 2025-07-19 15:42:20,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46204, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_781500295_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755949_15125, duration(ns): 22124229 2025-07-19 15:42:20,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755949_15125, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-19 
15:42:28,053 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755949_15125 replica FinalizedReplica, blk_1073755949_15125, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755949 for deletion
2025-07-19 15:42:28,054 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755949_15125 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755949
2025-07-19 15:43:20,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755950_15126 src: /192.168.158.1:43448 dest: /192.168.158.4:9866
2025-07-19 15:43:20,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43448, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1015869311_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755950_15126, duration(ns): 25138688
2025-07-19 15:43:20,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755950_15126, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-19 15:43:25,053 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755950_15126 replica FinalizedReplica, blk_1073755950_15126, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755950 for deletion
2025-07-19 15:43:25,054 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755950_15126 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755950
2025-07-19 15:47:20,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755954_15130 src: /192.168.158.5:52342 dest: /192.168.158.4:9866
2025-07-19 15:47:20,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52342, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1257016732_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755954_15130, duration(ns): 21138559
2025-07-19 15:47:20,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755954_15130, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 15:47:25,061 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755954_15130 replica FinalizedReplica, blk_1073755954_15130, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755954 for deletion
2025-07-19 15:47:25,062 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755954_15130 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755954
2025-07-19 15:48:20,610 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755955_15131 src: /192.168.158.1:55780 dest: /192.168.158.4:9866
2025-07-19 15:48:20,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-300302088_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755955_15131, duration(ns): 25992439
2025-07-19 15:48:20,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755955_15131, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-19 15:48:25,064 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755955_15131 replica FinalizedReplica, blk_1073755955_15131, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755955 for deletion
2025-07-19 15:48:25,065 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755955_15131 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755955
2025-07-19 15:50:25,610 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755957_15133 src: /192.168.158.1:34602 dest: /192.168.158.4:9866
2025-07-19 15:50:25,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34602, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1553524644_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755957_15133, duration(ns): 23823869
2025-07-19 15:50:25,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755957_15133, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-19 15:50:28,067 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755957_15133 replica FinalizedReplica, blk_1073755957_15133, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755957 for deletion
2025-07-19 15:50:28,068 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755957_15133 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755957
2025-07-19 15:52:25,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755959_15135 src: /192.168.158.8:59164 dest: /192.168.158.4:9866
2025-07-19 15:52:25,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59164, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-638405206_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755959_15135, duration(ns): 16587956
2025-07-19 15:52:25,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755959_15135, type=LAST_IN_PIPELINE terminating
2025-07-19 15:52:28,072 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755959_15135 replica FinalizedReplica, blk_1073755959_15135, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755959 for deletion
2025-07-19 15:52:28,073 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755959_15135 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755959
2025-07-19 15:56:35,623 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755963_15139 src: /192.168.158.1:41034 dest: /192.168.158.4:9866
2025-07-19 15:56:35,660 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41034, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1984616009_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755963_15139, duration(ns): 27877832
2025-07-19 15:56:35,660 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755963_15139, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-19 15:56:40,082 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755963_15139 replica FinalizedReplica, blk_1073755963_15139, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755963 for deletion
2025-07-19 15:56:40,083 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755963_15139 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755963
2025-07-19 15:59:19,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f4e, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-19 15:59:19,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-19 15:59:35,638 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755966_15142 src: /192.168.158.5:51736 dest: /192.168.158.4:9866
2025-07-19 15:59:35,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51736, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1478612977_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755966_15142, duration(ns): 18845910
2025-07-19 15:59:35,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755966_15142, type=LAST_IN_PIPELINE terminating
2025-07-19 15:59:40,086 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755966_15142 replica FinalizedReplica, blk_1073755966_15142, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755966 for deletion
2025-07-19 15:59:40,087 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755966_15142 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755966
2025-07-19 16:00:35,657 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755967_15143 src: /192.168.158.5:44512 dest: /192.168.158.4:9866
2025-07-19 16:00:35,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44512, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-57719217_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755967_15143, duration(ns): 20516879
2025-07-19 16:00:35,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755967_15143, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 16:00:40,090 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755967_15143 replica FinalizedReplica, blk_1073755967_15143, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755967 for deletion
2025-07-19 16:00:40,091 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755967_15143 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755967
2025-07-19 16:01:40,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755968_15144 src: /192.168.158.9:58164 dest: /192.168.158.4:9866
2025-07-19 16:01:40,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58164, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1096987486_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755968_15144, duration(ns): 16711459
2025-07-19 16:01:40,663 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755968_15144, type=LAST_IN_PIPELINE terminating
2025-07-19 16:01:43,093 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755968_15144 replica FinalizedReplica, blk_1073755968_15144, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755968 for deletion
2025-07-19 16:01:43,094 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755968_15144 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755968
2025-07-19 16:07:50,641 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755974_15150 src: /192.168.158.1:58216 dest: /192.168.158.4:9866
2025-07-19 16:07:50,675 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58216, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_261989202_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755974_15150, duration(ns): 24785306
2025-07-19 16:07:50,675 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755974_15150, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-19 16:07:58,105 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755974_15150 replica FinalizedReplica, blk_1073755974_15150, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755974 for deletion
2025-07-19 16:07:58,107 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755974_15150 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755974
2025-07-19 16:08:50,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755975_15151 src: /192.168.158.1:58956 dest: /192.168.158.4:9866
2025-07-19 16:08:50,685 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58956, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1536332542_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755975_15151, duration(ns): 24512792
2025-07-19 16:08:50,685 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755975_15151, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-19 16:08:58,109 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755975_15151 replica FinalizedReplica, blk_1073755975_15151, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755975 for deletion
2025-07-19 16:08:58,111 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755975_15151 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755975
2025-07-19 16:15:00,661 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755981_15157 src: /192.168.158.7:60394 dest: /192.168.158.4:9866
2025-07-19 16:15:00,688 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60394, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1783175062_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755981_15157, duration(ns): 21825156
2025-07-19 16:15:00,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755981_15157, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 16:15:07,122 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755981_15157 replica FinalizedReplica, blk_1073755981_15157, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755981 for deletion
2025-07-19 16:15:07,123 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755981_15157 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755981
2025-07-19 16:16:05,657 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755982_15158 src: /192.168.158.6:40394 dest: /192.168.158.4:9866
2025-07-19 16:16:05,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40394, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1076885941_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755982_15158, duration(ns): 21463723
2025-07-19 16:16:05,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755982_15158, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-19 16:16:10,124 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755982_15158 replica FinalizedReplica, blk_1073755982_15158, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755982 for deletion
2025-07-19 16:16:10,125 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755982_15158 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755982
2025-07-19 16:17:05,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755983_15159 src: /192.168.158.7:33778 dest: /192.168.158.4:9866
2025-07-19 16:17:05,698 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33778, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-581313166_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755983_15159, duration(ns): 16189328
2025-07-19 16:17:05,698 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755983_15159, type=LAST_IN_PIPELINE terminating
2025-07-19 16:17:10,125 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755983_15159 replica FinalizedReplica, blk_1073755983_15159, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755983 for deletion
2025-07-19 16:17:10,127 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755983_15159 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755983
2025-07-19 16:18:05,665 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755984_15160 src: /192.168.158.8:58530 dest: /192.168.158.4:9866
2025-07-19 16:18:05,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58530, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-735960937_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755984_15160, duration(ns): 17215020
2025-07-19 16:18:05,685 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755984_15160, type=LAST_IN_PIPELINE terminating
2025-07-19 16:18:13,126 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755984_15160 replica FinalizedReplica, blk_1073755984_15160, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755984 for deletion
2025-07-19 16:18:13,127 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755984_15160 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755984
2025-07-19 16:20:10,672 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755986_15162 src: /192.168.158.7:38664 dest: /192.168.158.4:9866
2025-07-19 16:20:10,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38664, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1263247464_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755986_15162, duration(ns): 18663276
2025-07-19 16:20:10,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755986_15162, type=LAST_IN_PIPELINE terminating
2025-07-19 16:20:13,130 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755986_15162 replica FinalizedReplica, blk_1073755986_15162, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755986 for deletion
2025-07-19 16:20:13,131 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755986_15162 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755986
2025-07-19 16:21:10,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755987_15163 src: /192.168.158.6:43086 dest: /192.168.158.4:9866
2025-07-19 16:21:10,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43086, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1193369170_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755987_15163, duration(ns): 16693555
2025-07-19 16:21:10,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755987_15163, type=LAST_IN_PIPELINE terminating
2025-07-19 16:21:13,132 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755987_15163 replica FinalizedReplica, blk_1073755987_15163, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755987 for deletion
2025-07-19 16:21:13,133 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755987_15163 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755987
2025-07-19 16:22:10,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755988_15164 src: /192.168.158.1:59272 dest: /192.168.158.4:9866
2025-07-19 16:22:10,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59272, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1297028728_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755988_15164, duration(ns): 22973590
2025-07-19 16:22:10,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755988_15164, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-19 16:22:13,133 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755988_15164 replica FinalizedReplica, blk_1073755988_15164, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755988 for deletion
2025-07-19 16:22:13,135 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755988_15164 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755988
2025-07-19 16:23:10,746 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755989_15165 src: /192.168.158.1:36476 dest: /192.168.158.4:9866
2025-07-19 16:23:10,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36476, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-391423940_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755989_15165, duration(ns): 24428513
2025-07-19 16:23:10,780 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755989_15165, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-19 16:23:13,134 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755989_15165 replica FinalizedReplica, blk_1073755989_15165, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755989 for deletion
2025-07-19 16:23:13,135 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755989_15165 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755989
2025-07-19 16:24:10,661 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755990_15166 src: /192.168.158.9:57896 dest: /192.168.158.4:9866
2025-07-19 16:24:10,690 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57896, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-762551498_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755990_15166, duration(ns): 22303273
2025-07-19 16:24:10,690 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755990_15166, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 16:24:16,137 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755990_15166 replica FinalizedReplica, blk_1073755990_15166, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755990 for deletion
2025-07-19 16:24:16,138 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755990_15166 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755990
2025-07-19 16:26:10,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755992_15168 src: /192.168.158.8:58584 dest: /192.168.158.4:9866
2025-07-19 16:26:10,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58584, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1573625689_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755992_15168, duration(ns): 21994843
2025-07-19 16:26:10,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755992_15168, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 16:26:13,138 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755992_15168 replica FinalizedReplica, blk_1073755992_15168, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755992 for deletion
2025-07-19 16:26:13,139 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755992_15168 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755992
2025-07-19 16:28:10,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755994_15170 src: /192.168.158.8:49196 dest: /192.168.158.4:9866
2025-07-19 16:28:10,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49196, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2018410487_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755994_15170, duration(ns): 17428899
2025-07-19 16:28:10,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755994_15170, type=LAST_IN_PIPELINE terminating
2025-07-19 16:28:13,140 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755994_15170 replica FinalizedReplica, blk_1073755994_15170, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755994 for deletion
2025-07-19 16:28:13,141 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755994_15170 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755994
2025-07-19 16:29:10,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755995_15171 src: /192.168.158.8:41966 dest: /192.168.158.4:9866
2025-07-19 16:29:10,708 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41966, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_550506539_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755995_15171, duration(ns): 22987457
2025-07-19 16:29:10,708 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755995_15171, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 16:29:16,141 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755995_15171 replica FinalizedReplica, blk_1073755995_15171, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755995 for deletion
2025-07-19 16:29:16,142 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755995_15171 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755995
2025-07-19 16:32:20,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755998_15174 src: /192.168.158.6:50006 dest: /192.168.158.4:9866
2025-07-19 16:32:20,718 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50006, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_695906327_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755998_15174, duration(ns): 18653048
2025-07-19 16:32:20,719 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755998_15174, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-19 16:32:25,150 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755998_15174 replica FinalizedReplica, blk_1073755998_15174, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755998 for deletion
2025-07-19 16:32:25,151 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755998_15174 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755998
2025-07-19 16:33:25,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073755999_15175 src: /192.168.158.1:59872 dest: /192.168.158.4:9866
2025-07-19 16:33:25,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59872, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_436598087_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073755999_15175, duration(ns): 22362312
2025-07-19 16:33:25,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073755999_15175, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-19 16:33:28,153 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073755999_15175 replica FinalizedReplica, blk_1073755999_15175, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755999 for deletion
2025-07-19 16:33:28,154 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073755999_15175 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073755999
2025-07-19 16:35:30,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756001_15177 src: /192.168.158.9:47444 dest: /192.168.158.4:9866
2025-07-19 16:35:30,728 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47444, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1224741277_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756001_15177, duration(ns): 22358731
2025-07-19 16:35:30,728 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756001_15177, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-19 16:35:37,158 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756001_15177 replica FinalizedReplica, blk_1073756001_15177, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 
getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756001 for deletion 2025-07-19 16:35:37,159 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756001_15177 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756001 2025-07-19 16:40:30,705 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756006_15182 src: /192.168.158.7:57962 dest: /192.168.158.4:9866 2025-07-19 16:40:30,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57962, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1726756727_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756006_15182, duration(ns): 21731133 2025-07-19 16:40:30,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756006_15182, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-19 16:40:34,165 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756006_15182 replica FinalizedReplica, blk_1073756006_15182, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756006 for deletion 2025-07-19 16:40:34,166 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756006_15182 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756006 2025-07-19 16:44:35,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756010_15186 src: /192.168.158.1:33672 dest: /192.168.158.4:9866 2025-07-19 16:44:35,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1072447107_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756010_15186, duration(ns): 25861713 2025-07-19 16:44:35,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756010_15186, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-19 16:44:40,174 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756010_15186 replica FinalizedReplica, blk_1073756010_15186, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756010 for deletion 2025-07-19 16:44:40,175 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756010_15186 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756010 2025-07-19 16:45:35,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756011_15187 src: /192.168.158.9:59508 dest: /192.168.158.4:9866 2025-07-19 16:45:35,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: 
src: /192.168.158.9:59508, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1467330237_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756011_15187, duration(ns): 18246828 2025-07-19 16:45:35,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756011_15187, type=LAST_IN_PIPELINE terminating 2025-07-19 16:45:40,174 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756011_15187 replica FinalizedReplica, blk_1073756011_15187, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756011 for deletion 2025-07-19 16:45:40,175 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756011_15187 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756011 2025-07-19 16:46:35,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756012_15188 src: /192.168.158.8:41878 dest: /192.168.158.4:9866 2025-07-19 16:46:35,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41878, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1658321055_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756012_15188, duration(ns): 21227125 2025-07-19 16:46:35,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756012_15188, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=1:[192.168.158.5:9866] terminating 2025-07-19 16:46:40,174 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756012_15188 replica FinalizedReplica, blk_1073756012_15188, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756012 for deletion 2025-07-19 16:46:40,175 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756012_15188 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756012 2025-07-19 16:49:35,719 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756015_15191 src: /192.168.158.7:41552 dest: /192.168.158.4:9866 2025-07-19 16:49:35,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41552, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1827015021_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756015_15191, duration(ns): 22767107 2025-07-19 16:49:35,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756015_15191, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-19 16:49:43,175 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756015_15191 replica FinalizedReplica, blk_1073756015_15191, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756015 for deletion 2025-07-19 16:49:43,176 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756015_15191 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756015 2025-07-19 16:50:35,703 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756016_15192 src: /192.168.158.1:45628 dest: /192.168.158.4:9866 2025-07-19 16:50:35,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45628, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-35252006_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756016_15192, duration(ns): 27002205 2025-07-19 16:50:35,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756016_15192, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-19 16:50:40,177 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756016_15192 replica FinalizedReplica, blk_1073756016_15192, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756016 for deletion 2025-07-19 16:50:40,178 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756016_15192 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756016 2025-07-19 16:51:40,704 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756017_15193 src: /192.168.158.8:56512 dest: /192.168.158.4:9866 2025-07-19 16:51:40,730 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56512, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1132206795_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756017_15193, duration(ns): 20709336 2025-07-19 16:51:40,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756017_15193, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-19 16:51:43,178 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756017_15193 replica FinalizedReplica, blk_1073756017_15193, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756017 for deletion 2025-07-19 16:51:43,179 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756017_15193 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756017 2025-07-19 16:52:40,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756018_15194 src: /192.168.158.9:40490 dest: /192.168.158.4:9866 2025-07-19 16:52:40,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.9:40490, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1418998760_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756018_15194, duration(ns): 21249425 2025-07-19 16:52:40,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756018_15194, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-19 16:52:43,183 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756018_15194 replica FinalizedReplica, blk_1073756018_15194, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756018 for deletion 2025-07-19 16:52:43,184 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756018_15194 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756018 2025-07-19 16:53:40,703 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756019_15195 src: /192.168.158.1:44652 dest: /192.168.158.4:9866 2025-07-19 16:53:40,736 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44652, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-94953678_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756019_15195, duration(ns): 23819851 2025-07-19 16:53:40,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756019_15195, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-19 16:53:43,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756019_15195 replica FinalizedReplica, blk_1073756019_15195, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756019 for deletion 2025-07-19 16:53:43,187 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756019_15195 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756019 2025-07-19 16:54:40,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756020_15196 src: /192.168.158.1:33586 dest: /192.168.158.4:9866 2025-07-19 16:54:40,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33586, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1015798499_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756020_15196, duration(ns): 25642024 2025-07-19 16:54:40,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756020_15196, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-19 16:54:43,185 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756020_15196 replica FinalizedReplica, blk_1073756020_15196, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756020 for deletion 2025-07-19 16:54:43,187 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756020_15196 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756020 2025-07-19 16:56:50,711 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756022_15198 src: /192.168.158.1:55016 dest: /192.168.158.4:9866 2025-07-19 16:56:50,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55016, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1266594051_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756022_15198, duration(ns): 24045008 2025-07-19 16:56:50,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756022_15198, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-19 16:56:55,190 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756022_15198 replica FinalizedReplica, blk_1073756022_15198, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756022 for deletion 2025-07-19 16:56:55,191 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756022_15198 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756022 2025-07-19 16:58:50,716 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756024_15200 src: /192.168.158.9:44432 dest: /192.168.158.4:9866 2025-07-19 16:58:50,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44432, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-130234354_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756024_15200, duration(ns): 16331955 2025-07-19 16:58:50,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756024_15200, type=LAST_IN_PIPELINE terminating 2025-07-19 16:58:58,193 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756024_15200 replica FinalizedReplica, blk_1073756024_15200, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756024 for deletion 2025-07-19 16:58:58,195 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756024_15200 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756024 2025-07-19 16:59:50,708 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756025_15201 src: /192.168.158.1:39278 dest: /192.168.158.4:9866 2025-07-19 16:59:50,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39278, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1313108257_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756025_15201, duration(ns): 25385790 2025-07-19 16:59:50,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756025_15201, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-19 16:59:55,198 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756025_15201 replica FinalizedReplica, blk_1073756025_15201, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756025 for deletion 2025-07-19 16:59:55,199 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756025_15201 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756025 2025-07-19 17:03:55,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756029_15205 src: /192.168.158.1:42076 dest: /192.168.158.4:9866 2025-07-19 17:03:55,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42076, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-467015916_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756029_15205, duration(ns): 22661493 2025-07-19 17:03:55,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756029_15205, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-19 17:03:58,205 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756029_15205 replica FinalizedReplica, blk_1073756029_15205, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756029 for deletion 2025-07-19 17:03:58,207 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756029_15205 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756029 2025-07-19 17:04:55,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756030_15206 src: /192.168.158.9:44578 dest: /192.168.158.4:9866 2025-07-19 17:04:55,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44578, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1848356496_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756030_15206, duration(ns): 18202748 2025-07-19 17:04:55,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756030_15206, type=LAST_IN_PIPELINE terminating 2025-07-19 17:04:58,207 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756030_15206 replica FinalizedReplica, blk_1073756030_15206, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756030 for deletion 2025-07-19 17:04:58,209 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756030_15206 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756030 2025-07-19 17:05:55,719 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756031_15207 src: /192.168.158.8:34034 dest: /192.168.158.4:9866 2025-07-19 17:05:55,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34034, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-801219011_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756031_15207, duration(ns): 18850767 2025-07-19 17:05:55,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756031_15207, type=LAST_IN_PIPELINE terminating 2025-07-19 17:06:01,208 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756031_15207 replica FinalizedReplica, blk_1073756031_15207, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756031 for deletion 2025-07-19 17:06:01,209 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756031_15207 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756031 2025-07-19 17:14:00,738 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756039_15215 src: /192.168.158.1:38984 dest: /192.168.158.4:9866 2025-07-19 17:14:00,770 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38984, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1029258521_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756039_15215, duration(ns): 23361315 2025-07-19 17:14:00,770 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756039_15215, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-19 17:14:04,220 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756039_15215 replica FinalizedReplica, blk_1073756039_15215, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756039 for deletion 2025-07-19 17:14:04,221 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756039_15215 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756039 2025-07-19 17:16:00,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756041_15217 src: /192.168.158.1:40664 dest: /192.168.158.4:9866 2025-07-19 17:16:00,781 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40664, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2109679426_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756041_15217, duration(ns): 25095079
2025-07-19 17:16:00,781 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756041_15217, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-19 17:16:07,223 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756041_15217 replica FinalizedReplica, blk_1073756041_15217, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756041 for deletion
2025-07-19 17:16:07,224 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756041_15217 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756041
2025-07-19 17:17:00,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756042_15218 src: /192.168.158.1:32768 dest: /192.168.158.4:9866
2025-07-19 17:17:00,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:32768, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1164605956_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756042_15218, duration(ns): 24927643
2025-07-19 17:17:00,778 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756042_15218, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-19 17:17:07,226 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756042_15218 replica FinalizedReplica, blk_1073756042_15218, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756042 for deletion
2025-07-19 17:17:07,227 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756042_15218 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756042
2025-07-19 17:21:00,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756046_15222 src: /192.168.158.1:41266 dest: /192.168.158.4:9866
2025-07-19 17:21:00,790 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41266, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_610182109_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756046_15222, duration(ns): 23852914
2025-07-19 17:21:00,790 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756046_15222, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-19 17:21:04,232 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756046_15222 replica FinalizedReplica, blk_1073756046_15222, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756046 for deletion
2025-07-19 17:21:04,233 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756046_15222 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756046
2025-07-19 17:24:00,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756049_15225 src: /192.168.158.7:55920 dest: /192.168.158.4:9866
2025-07-19 17:24:00,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55920, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_254709898_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756049_15225, duration(ns): 21691022
2025-07-19 17:24:00,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756049_15225, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 17:24:07,238 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756049_15225 replica FinalizedReplica, blk_1073756049_15225, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756049 for deletion
2025-07-19 17:24:07,239 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756049_15225 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756049
2025-07-19 17:25:00,761 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756050_15226 src: /192.168.158.1:49942 dest: /192.168.158.4:9866
2025-07-19 17:25:00,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49942, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1327444873_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756050_15226, duration(ns): 26225580
2025-07-19 17:25:00,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756050_15226, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-19 17:25:04,240 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756050_15226 replica FinalizedReplica, blk_1073756050_15226, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756050 for deletion
2025-07-19 17:25:04,241 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756050_15226 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756050
2025-07-19 17:29:00,764 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756054_15230 src: /192.168.158.1:50522 dest: /192.168.158.4:9866
2025-07-19 17:29:00,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50522, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_120720051_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756054_15230, duration(ns): 25393022
2025-07-19 17:29:00,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756054_15230, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-19 17:29:04,245 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756054_15230 replica FinalizedReplica, blk_1073756054_15230, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756054 for deletion
2025-07-19 17:29:04,246 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756054_15230 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756054
2025-07-19 17:30:00,765 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756055_15231 src: /192.168.158.8:34848 dest: /192.168.158.4:9866
2025-07-19 17:30:00,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34848, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1529865412_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756055_15231, duration(ns): 20725041
2025-07-19 17:30:00,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756055_15231, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 17:30:04,247 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756055_15231 replica FinalizedReplica, blk_1073756055_15231, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756055 for deletion
2025-07-19 17:30:04,248 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756055_15231 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756055
2025-07-19 17:32:00,774 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756057_15233 src: /192.168.158.1:55910 dest: /192.168.158.4:9866
2025-07-19 17:32:00,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55910, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1181775179_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756057_15233, duration(ns): 29034869
2025-07-19 17:32:00,814 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756057_15233, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-19 17:32:04,249 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756057_15233 replica FinalizedReplica, blk_1073756057_15233, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756057 for deletion
2025-07-19 17:32:04,250 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756057_15233 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756057
2025-07-19 17:33:00,774 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756058_15234 src: /192.168.158.8:39096 dest: /192.168.158.4:9866
2025-07-19 17:33:00,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39096, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-684606895_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756058_15234, duration(ns): 19192468
2025-07-19 17:33:00,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756058_15234, type=LAST_IN_PIPELINE terminating
2025-07-19 17:33:04,252 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756058_15234 replica FinalizedReplica, blk_1073756058_15234, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756058 for deletion
2025-07-19 17:33:04,253 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756058_15234 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756058
2025-07-19 17:34:00,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756059_15235 src: /192.168.158.1:55866 dest: /192.168.158.4:9866
2025-07-19 17:34:00,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55866, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_73692516_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756059_15235, duration(ns): 24197024
2025-07-19 17:34:00,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756059_15235, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-19 17:34:07,253 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756059_15235 replica FinalizedReplica, blk_1073756059_15235, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756059 for deletion
2025-07-19 17:34:07,254 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756059_15235 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756059
2025-07-19 17:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-19 17:38:00,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756063_15239 src: /192.168.158.1:53930 dest: /192.168.158.4:9866
2025-07-19 17:38:00,817 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53930, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-173620677_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756063_15239, duration(ns): 23177293
2025-07-19 17:38:00,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756063_15239, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-19 17:38:04,260 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756063_15239 replica FinalizedReplica, blk_1073756063_15239, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756063 for deletion
2025-07-19 17:38:04,262 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756063_15239 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756063
2025-07-19 17:41:05,795 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756066_15242 src: /192.168.158.1:52516 dest: /192.168.158.4:9866
2025-07-19 17:41:05,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52516, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-748841315_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756066_15242, duration(ns): 23220133
2025-07-19 17:41:05,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756066_15242, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-19 17:41:07,270 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756066_15242 replica FinalizedReplica, blk_1073756066_15242, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756066 for deletion
2025-07-19 17:41:07,272 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756066_15242 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756066
2025-07-19 17:42:05,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756067_15243 src: /192.168.158.6:43738 dest: /192.168.158.4:9866
2025-07-19 17:42:05,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43738, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-712560995_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756067_15243, duration(ns): 23168418
2025-07-19 17:42:05,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756067_15243, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 17:42:07,273 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756067_15243 replica FinalizedReplica, blk_1073756067_15243, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756067 for deletion
2025-07-19 17:42:07,274 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756067_15243 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756067
2025-07-19 17:44:10,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756069_15245 src: /192.168.158.9:41298 dest: /192.168.158.4:9866
2025-07-19 17:44:10,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41298, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-642959449_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756069_15245, duration(ns): 16362312
2025-07-19 17:44:10,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756069_15245, type=LAST_IN_PIPELINE terminating
2025-07-19 17:44:13,275 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756069_15245 replica FinalizedReplica, blk_1073756069_15245, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756069 for deletion
2025-07-19 17:44:13,276 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756069_15245 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756069
2025-07-19 17:45:15,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756070_15246 src: /192.168.158.9:46214 dest: /192.168.158.4:9866
2025-07-19 17:45:15,823 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46214, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-276843772_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756070_15246, duration(ns): 18796470
2025-07-19 17:45:15,824 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756070_15246, type=LAST_IN_PIPELINE terminating
2025-07-19 17:45:22,277 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756070_15246 replica FinalizedReplica, blk_1073756070_15246, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756070 for deletion
2025-07-19 17:45:22,278 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756070_15246 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756070
2025-07-19 17:47:20,821 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756072_15248 src: /192.168.158.8:54518 dest: /192.168.158.4:9866
2025-07-19 17:47:20,840 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54518, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-953565901_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756072_15248, duration(ns): 17601879
2025-07-19 17:47:20,841 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756072_15248, type=LAST_IN_PIPELINE terminating
2025-07-19 17:47:22,280 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756072_15248 replica FinalizedReplica, blk_1073756072_15248, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756072 for deletion
2025-07-19 17:47:22,282 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756072_15248 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756072
2025-07-19 17:48:20,806 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756073_15249 src: /192.168.158.1:59228 dest: /192.168.158.4:9866
2025-07-19 17:48:20,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59228, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-946892784_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756073_15249, duration(ns): 24143656
2025-07-19 17:48:20,840 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756073_15249, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-19 17:48:25,282 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756073_15249 replica FinalizedReplica, blk_1073756073_15249, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756073 for deletion
2025-07-19 17:48:25,283 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756073_15249 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756073
2025-07-19 17:49:20,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756074_15250 src: /192.168.158.1:40300 dest: /192.168.158.4:9866
2025-07-19 17:49:20,835 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40300, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1023385299_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756074_15250, duration(ns): 24116722
2025-07-19 17:49:20,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756074_15250, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-19 17:49:22,282 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756074_15250 replica FinalizedReplica, blk_1073756074_15250, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756074 for deletion
2025-07-19 17:49:22,284 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756074_15250 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756074
2025-07-19 17:50:20,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756075_15251 src: /192.168.158.5:55470 dest: /192.168.158.4:9866
2025-07-19 17:50:20,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55470, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1022281748_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756075_15251, duration(ns): 21141177
2025-07-19 17:50:20,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756075_15251, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-19 17:50:25,285 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756075_15251 replica FinalizedReplica, blk_1073756075_15251, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756075 for deletion
2025-07-19 17:50:25,286 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756075_15251 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756075
2025-07-19 17:52:20,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756077_15253 src: /192.168.158.6:44480 dest: /192.168.158.4:9866
2025-07-19 17:52:20,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44480, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_433076466_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756077_15253, duration(ns): 21486707
2025-07-19 17:52:20,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756077_15253, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-19 17:52:22,290 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756077_15253 replica FinalizedReplica, blk_1073756077_15253, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756077 for deletion
2025-07-19 17:52:22,291 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756077_15253 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756077
2025-07-19 17:53:20,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756078_15254 src: /192.168.158.1:40220 dest: /192.168.158.4:9866
2025-07-19 17:53:20,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40220, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_360054139_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756078_15254, duration(ns): 24378633
2025-07-19 17:53:20,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756078_15254, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-19 17:53:22,290 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756078_15254 replica FinalizedReplica, blk_1073756078_15254, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756078 for deletion
2025-07-19 17:53:22,291 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756078_15254 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756078
2025-07-19 17:55:25,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756080_15256 src: /192.168.158.7:36726 dest: /192.168.158.4:9866
2025-07-19 17:55:25,838 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36726, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1302943536_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756080_15256, duration(ns): 16643317
2025-07-19 17:55:25,838 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756080_15256, type=LAST_IN_PIPELINE terminating
2025-07-19 17:55:28,293 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756080_15256 replica FinalizedReplica, blk_1073756080_15256, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756080 for deletion
2025-07-19 17:55:28,294 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756080_15256 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756080
2025-07-19 17:57:25,823 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756082_15258 src: /192.168.158.8:40354 dest: /192.168.158.4:9866
2025-07-19 17:57:25,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40354, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_658365279_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756082_15258, duration(ns): 16118726
2025-07-19 17:57:25,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756082_15258, type=LAST_IN_PIPELINE terminating
2025-07-19 17:57:28,296 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756082_15258 replica FinalizedReplica, blk_1073756082_15258, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756082 for deletion
2025-07-19 17:57:28,297 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756082_15258 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756082
2025-07-19 17:59:25,865 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756084_15260 src: /192.168.158.8:58140 dest: /192.168.158.4:9866
2025-07-19 17:59:25,885 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58140, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-422713932_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756084_15260, duration(ns): 17506692
2025-07-19 17:59:25,885 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756084_15260, type=LAST_IN_PIPELINE terminating
2025-07-19 17:59:28,300 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756084_15260 replica FinalizedReplica, blk_1073756084_15260, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756084 for deletion
2025-07-19 17:59:28,301 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756084_15260 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756084
2025-07-19 18:00:25,814 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756085_15261 src: /192.168.158.6:38816 dest: /192.168.158.4:9866
2025-07-19 18:00:25,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38816, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_996094462_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756085_15261, duration(ns): 16401395
2025-07-19 18:00:25,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756085_15261, type=LAST_IN_PIPELINE terminating
2025-07-19 18:00:28,303 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756085_15261 replica FinalizedReplica, blk_1073756085_15261, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756085 for deletion
2025-07-19 18:00:28,304 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756085_15261 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756085
2025-07-19 18:04:30,817 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756089_15265 src: /192.168.158.1:58436 dest: /192.168.158.4:9866
2025-07-19 18:04:30,851 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58436, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1461110704_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756089_15265, duration(ns): 25784530
2025-07-19 18:04:30,852 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756089_15265, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-19 
18:04:37,313 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756089_15265 replica FinalizedReplica, blk_1073756089_15265, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756089 for deletion 2025-07-19 18:04:37,314 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756089_15265 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756089 2025-07-19 18:05:30,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756090_15266 src: /192.168.158.1:50386 dest: /192.168.158.4:9866 2025-07-19 18:05:30,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50386, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-437289900_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756090_15266, duration(ns): 24896649 2025-07-19 18:05:30,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756090_15266, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-19 18:05:34,315 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756090_15266 replica FinalizedReplica, blk_1073756090_15266, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756090 for deletion 2025-07-19 18:05:34,317 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756090_15266 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756090 2025-07-19 18:06:35,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756091_15267 src: /192.168.158.7:50802 dest: /192.168.158.4:9866 2025-07-19 18:06:35,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50802, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-437421073_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756091_15267, duration(ns): 22911781 2025-07-19 18:06:35,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756091_15267, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-19 18:06:40,316 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756091_15267 replica FinalizedReplica, blk_1073756091_15267, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756091 for deletion 2025-07-19 18:06:40,317 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756091_15267 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756091 2025-07-19 18:09:35,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756094_15270 src: /192.168.158.6:34552 dest: /192.168.158.4:9866 2025-07-19 18:09:35,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34552, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-196791138_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756094_15270, duration(ns): 24056668 2025-07-19 18:09:35,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756094_15270, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-19 18:09:37,319 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756094_15270 replica FinalizedReplica, blk_1073756094_15270, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756094 for deletion 2025-07-19 18:09:37,320 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756094_15270 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756094 2025-07-19 18:11:40,835 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756096_15272 src: /192.168.158.1:38450 dest: /192.168.158.4:9866 2025-07-19 18:11:40,867 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:38450, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_290692160_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756096_15272, duration(ns): 23868316 2025-07-19 18:11:40,867 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756096_15272, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-19 18:11:43,321 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756096_15272 replica FinalizedReplica, blk_1073756096_15272, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756096 for deletion 2025-07-19 18:11:43,322 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756096_15272 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756096 2025-07-19 18:12:40,856 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756097_15273 src: /192.168.158.8:37700 dest: /192.168.158.4:9866 2025-07-19 18:12:40,880 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37700, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_948462626_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756097_15273, duration(ns): 21489166 2025-07-19 18:12:40,880 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073756097_15273, type=LAST_IN_PIPELINE terminating 2025-07-19 18:12:43,324 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756097_15273 replica FinalizedReplica, blk_1073756097_15273, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756097 for deletion 2025-07-19 18:12:43,325 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756097_15273 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756097 2025-07-19 18:14:40,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756099_15275 src: /192.168.158.5:47526 dest: /192.168.158.4:9866 2025-07-19 18:14:40,865 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47526, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2147399009_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756099_15275, duration(ns): 18468407 2025-07-19 18:14:40,865 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756099_15275, type=LAST_IN_PIPELINE terminating 2025-07-19 18:14:46,328 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756099_15275 replica FinalizedReplica, blk_1073756099_15275, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756099 for deletion 2025-07-19 18:14:46,329 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756099_15275 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756099 2025-07-19 18:15:40,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756100_15276 src: /192.168.158.1:33694 dest: /192.168.158.4:9866 2025-07-19 18:15:40,876 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33694, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-685924776_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756100_15276, duration(ns): 25839280 2025-07-19 18:15:40,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756100_15276, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-19 18:15:43,330 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756100_15276 replica FinalizedReplica, blk_1073756100_15276, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756100 for deletion 2025-07-19 18:15:43,331 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756100_15276 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756100 2025-07-19 18:17:45,883 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756102_15278 src: /192.168.158.6:59198 dest: /192.168.158.4:9866 2025-07-19 18:17:45,909 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59198, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-85967773_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756102_15278, duration(ns): 20283807 2025-07-19 18:17:45,909 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756102_15278, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-19 18:17:49,335 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756102_15278 replica FinalizedReplica, blk_1073756102_15278, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756102 for deletion 2025-07-19 18:17:49,336 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756102_15278 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756102 2025-07-19 18:19:45,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756104_15280 src: /192.168.158.5:49608 dest: /192.168.158.4:9866 2025-07-19 18:19:45,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.5:49608, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1648008465_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756104_15280, duration(ns): 20713155 2025-07-19 18:19:45,870 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756104_15280, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-19 18:19:52,340 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756104_15280 replica FinalizedReplica, blk_1073756104_15280, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756104 for deletion 2025-07-19 18:19:52,341 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756104_15280 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756104 2025-07-19 18:23:45,859 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756108_15284 src: /192.168.158.1:42852 dest: /192.168.158.4:9866 2025-07-19 18:23:45,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42852, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-95770611_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756108_15284, duration(ns): 24323636 2025-07-19 18:23:45,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756108_15284, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-19 18:23:49,348 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756108_15284 replica FinalizedReplica, blk_1073756108_15284, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756108 for deletion 2025-07-19 18:23:49,349 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756108_15284 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756108 2025-07-19 18:24:45,847 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756109_15285 src: /192.168.158.5:41212 dest: /192.168.158.4:9866 2025-07-19 18:24:45,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41212, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1948174022_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756109_15285, duration(ns): 20590296 2025-07-19 18:24:45,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756109_15285, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-19 18:24:49,351 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756109_15285 replica FinalizedReplica, blk_1073756109_15285, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756109 for deletion 2025-07-19 18:24:49,352 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756109_15285 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756109 2025-07-19 18:28:45,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756113_15289 src: /192.168.158.8:60870 dest: /192.168.158.4:9866 2025-07-19 18:28:45,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60870, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2099930960_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756113_15289, duration(ns): 16611377 2025-07-19 18:28:45,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756113_15289, type=LAST_IN_PIPELINE terminating 2025-07-19 18:28:49,358 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756113_15289 replica FinalizedReplica, blk_1073756113_15289, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756113 for deletion 2025-07-19 18:28:49,359 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756113_15289 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756113 2025-07-19 18:29:45,851 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756114_15290 src: /192.168.158.1:37330 dest: /192.168.158.4:9866 2025-07-19 18:29:45,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37330, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1204558410_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756114_15290, duration(ns): 24268330 2025-07-19 18:29:45,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756114_15290, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-19 18:29:49,363 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756114_15290 replica FinalizedReplica, blk_1073756114_15290, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756114 for deletion 2025-07-19 18:29:49,364 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756114_15290 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756114 2025-07-19 18:35:45,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756120_15296 src: /192.168.158.5:38128 dest: /192.168.158.4:9866 2025-07-19 18:35:45,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38128, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1877680963_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756120_15296, duration(ns): 20960290 2025-07-19 18:35:45,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756120_15296, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-19 18:35:49,378 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756120_15296 replica FinalizedReplica, blk_1073756120_15296, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756120 for deletion 2025-07-19 18:35:49,379 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756120_15296 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756120 2025-07-19 18:36:45,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756121_15297 src: /192.168.158.9:36160 dest: /192.168.158.4:9866 2025-07-19 18:36:45,894 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36160, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_893958236_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756121_15297, duration(ns): 22068754 2025-07-19 18:36:45,894 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756121_15297, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-19 18:36:49,382 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756121_15297 replica FinalizedReplica, blk_1073756121_15297, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756121 for deletion 2025-07-19 18:36:49,383 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756121_15297 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756121 2025-07-19 18:38:45,865 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756123_15299 src: /192.168.158.1:50946 dest: /192.168.158.4:9866 2025-07-19 18:38:45,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50946, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2141488979_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756123_15299, duration(ns): 25525749 2025-07-19 18:38:45,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756123_15299, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-19 18:38:52,385 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756123_15299 replica FinalizedReplica, blk_1073756123_15299, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756123 for deletion 
2025-07-19 18:38:52,386 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756123_15299 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756123 2025-07-19 18:42:50,875 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756127_15303 src: /192.168.158.7:57494 dest: /192.168.158.4:9866 2025-07-19 18:42:50,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57494, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1867131532_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756127_15303, duration(ns): 19500445 2025-07-19 18:42:50,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756127_15303, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-19 18:42:52,389 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756127_15303 replica FinalizedReplica, blk_1073756127_15303, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756127 for deletion 2025-07-19 18:42:52,390 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756127_15303 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756127 2025-07-19 18:44:50,883 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073756129_15305 src: /192.168.158.9:55980 dest: /192.168.158.4:9866 2025-07-19 18:44:50,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55980, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1333294179_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756129_15305, duration(ns): 18247871 2025-07-19 18:44:50,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756129_15305, type=LAST_IN_PIPELINE terminating 2025-07-19 18:44:52,392 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756129_15305 replica FinalizedReplica, blk_1073756129_15305, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756129 for deletion 2025-07-19 18:44:52,393 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756129_15305 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756129 2025-07-19 18:47:55,883 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756132_15308 src: /192.168.158.1:39588 dest: /192.168.158.4:9866 2025-07-19 18:47:55,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39588, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-257117697_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756132_15308, duration(ns): 23487020 
2025-07-19 18:47:55,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756132_15308, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-19 18:48:01,397 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756132_15308 replica FinalizedReplica, blk_1073756132_15308, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756132 for deletion
2025-07-19 18:48:01,398 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756132_15308 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756132
2025-07-19 18:48:55,888 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756133_15309 src: /192.168.158.7:39282 dest: /192.168.158.4:9866
2025-07-19 18:48:55,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39282, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1159367612_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756133_15309, duration(ns): 22229713
2025-07-19 18:48:55,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756133_15309, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 18:48:58,399 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756133_15309 replica FinalizedReplica, blk_1073756133_15309, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756133 for deletion
2025-07-19 18:48:58,400 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756133_15309 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756133
2025-07-19 18:49:55,888 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756134_15310 src: /192.168.158.6:49044 dest: /192.168.158.4:9866
2025-07-19 18:49:55,907 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49044, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-145706492_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756134_15310, duration(ns): 17075068
2025-07-19 18:49:55,908 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756134_15310, type=LAST_IN_PIPELINE terminating
2025-07-19 18:50:01,402 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756134_15310 replica FinalizedReplica, blk_1073756134_15310, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756134 for deletion
2025-07-19 18:50:01,403 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756134_15310 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756134
2025-07-19 18:53:00,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756137_15313 src: /192.168.158.7:54460 dest: /192.168.158.4:9866
2025-07-19 18:53:00,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54460, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-921388279_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756137_15313, duration(ns): 22083852
2025-07-19 18:53:00,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756137_15313, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 18:53:04,404 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756137_15313 replica FinalizedReplica, blk_1073756137_15313, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756137 for deletion
2025-07-19 18:53:04,405 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756137_15313 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756137
2025-07-19 18:56:05,906 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756140_15316 src: /192.168.158.6:49990 dest: /192.168.158.4:9866
2025-07-19 18:56:05,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49990, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1017292400_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756140_15316, duration(ns): 17725698
2025-07-19 18:56:05,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756140_15316, type=LAST_IN_PIPELINE terminating
2025-07-19 18:56:10,410 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756140_15316 replica FinalizedReplica, blk_1073756140_15316, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756140 for deletion
2025-07-19 18:56:10,411 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756140_15316 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756140
2025-07-19 18:58:10,913 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756142_15318 src: /192.168.158.1:43502 dest: /192.168.158.4:9866
2025-07-19 18:58:10,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43502, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-362416914_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756142_15318, duration(ns): 23943554
2025-07-19 18:58:10,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756142_15318, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-19 18:58:13,415 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756142_15318 replica FinalizedReplica, blk_1073756142_15318, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756142 for deletion
2025-07-19 18:58:13,416 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756142_15318 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756142
2025-07-19 18:59:10,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756143_15319 src: /192.168.158.7:38670 dest: /192.168.158.4:9866
2025-07-19 18:59:10,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38670, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-178446180_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756143_15319, duration(ns): 18507366
2025-07-19 18:59:10,942 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756143_15319, type=LAST_IN_PIPELINE terminating
2025-07-19 18:59:13,416 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756143_15319 replica FinalizedReplica, blk_1073756143_15319, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756143 for deletion
2025-07-19 18:59:13,418 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756143_15319 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756143
2025-07-19 19:00:15,950 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756144_15320 src: /192.168.158.1:56878 dest: /192.168.158.4:9866
2025-07-19 19:00:15,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56878, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1214635872_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756144_15320, duration(ns): 23708523
2025-07-19 19:00:15,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756144_15320, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-19 19:00:19,417 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756144_15320 replica FinalizedReplica, blk_1073756144_15320, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756144 for deletion
2025-07-19 19:00:19,418 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756144_15320 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756144
2025-07-19 19:02:15,922 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756146_15322 src: /192.168.158.9:49934 dest: /192.168.158.4:9866
2025-07-19 19:02:15,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49934, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1524389710_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756146_15322, duration(ns): 19495355
2025-07-19 19:02:15,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756146_15322, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 19:02:22,422 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756146_15322 replica FinalizedReplica, blk_1073756146_15322, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756146 for deletion
2025-07-19 19:02:22,423 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756146_15322 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756146
2025-07-19 19:04:15,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756148_15324 src: /192.168.158.6:49830 dest: /192.168.158.4:9866
2025-07-19 19:04:15,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49830, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1917213265_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756148_15324, duration(ns): 16267460
2025-07-19 19:04:15,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756148_15324, type=LAST_IN_PIPELINE terminating
2025-07-19 19:04:19,428 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756148_15324 replica FinalizedReplica, blk_1073756148_15324, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756148 for deletion
2025-07-19 19:04:19,429 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756148_15324 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756148
2025-07-19 19:05:15,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756149_15325 src: /192.168.158.8:46062 dest: /192.168.158.4:9866
2025-07-19 19:05:15,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46062, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-556364942_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756149_15325, duration(ns): 20988330
2025-07-19 19:05:15,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756149_15325, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 19:05:19,431 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756149_15325 replica FinalizedReplica, blk_1073756149_15325, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756149 for deletion
2025-07-19 19:05:19,432 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756149_15325 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756149
2025-07-19 19:09:20,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756153_15329 src: /192.168.158.8:37434 dest: /192.168.158.4:9866
2025-07-19 19:09:20,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37434, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1505356665_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756153_15329, duration(ns): 16853763
2025-07-19 19:09:20,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756153_15329, type=LAST_IN_PIPELINE terminating
2025-07-19 19:09:25,442 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756153_15329 replica FinalizedReplica, blk_1073756153_15329, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756153 for deletion
2025-07-19 19:09:25,443 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756153_15329 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756153
2025-07-19 19:10:20,922 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756154_15330 src: /192.168.158.1:58566 dest: /192.168.158.4:9866
2025-07-19 19:10:20,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58566, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-545910476_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756154_15330, duration(ns): 23565785
2025-07-19 19:10:20,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756154_15330, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-19 19:10:25,443 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756154_15330 replica FinalizedReplica, blk_1073756154_15330, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756154 for deletion
2025-07-19 19:10:25,445 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756154_15330 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756154
2025-07-19 19:12:25,931 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756156_15332 src: /192.168.158.7:59354 dest: /192.168.158.4:9866
2025-07-19 19:12:25,949 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59354, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1906749181_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756156_15332, duration(ns): 16030865
2025-07-19 19:12:25,950 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756156_15332, type=LAST_IN_PIPELINE terminating
2025-07-19 19:12:28,446 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756156_15332 replica FinalizedReplica, blk_1073756156_15332, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756156 for deletion
2025-07-19 19:12:28,447 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756156_15332 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756156
2025-07-19 19:13:30,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756157_15333 src: /192.168.158.1:47342 dest: /192.168.158.4:9866
2025-07-19 19:13:30,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47342, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1351625441_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756157_15333, duration(ns): 25504321
2025-07-19 19:13:30,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756157_15333, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-19 19:13:37,446 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756157_15333 replica FinalizedReplica, blk_1073756157_15333, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756157 for deletion
2025-07-19 19:13:37,447 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756157_15333 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir23/blk_1073756157
2025-07-19 19:17:30,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756161_15337 src: /192.168.158.5:36112 dest: /192.168.158.4:9866
2025-07-19 19:17:30,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36112, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_88928559_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756161_15337, duration(ns): 20445631
2025-07-19 19:17:30,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756161_15337, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 19:17:34,455 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756161_15337 replica FinalizedReplica, blk_1073756161_15337, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756161 for deletion
2025-07-19 19:17:34,456 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756161_15337 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756161
2025-07-19 19:19:30,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756163_15339 src: /192.168.158.8:51312 dest: /192.168.158.4:9866
2025-07-19 19:19:30,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51312, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1592069533_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756163_15339, duration(ns): 22610965
2025-07-19 19:19:30,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756163_15339, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-19 19:19:34,457 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756163_15339 replica FinalizedReplica, blk_1073756163_15339, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756163 for deletion
2025-07-19 19:19:34,458 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756163_15339 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756163
2025-07-19 19:21:35,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756165_15341 src: /192.168.158.8:35790 dest: /192.168.158.4:9866
2025-07-19 19:21:35,964 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35790, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1296464962_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756165_15341, duration(ns): 15849381
2025-07-19 19:21:35,964 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756165_15341, type=LAST_IN_PIPELINE terminating
2025-07-19 19:21:37,461 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756165_15341 replica FinalizedReplica, blk_1073756165_15341, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756165 for deletion
2025-07-19 19:21:37,463 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756165_15341 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756165
2025-07-19 19:23:41,102 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756167_15343 src: /192.168.158.7:33992 dest: /192.168.158.4:9866
2025-07-19 19:23:41,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33992, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_968005637_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756167_15343, duration(ns): 19857720
2025-07-19 19:23:41,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756167_15343, type=LAST_IN_PIPELINE terminating
2025-07-19 19:23:43,465 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756167_15343 replica FinalizedReplica, blk_1073756167_15343, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756167 for deletion
2025-07-19 19:23:43,466 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756167_15343 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756167
2025-07-19 19:24:40,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756168_15344 src: /192.168.158.7:45190 dest: /192.168.158.4:9866
2025-07-19 19:24:40,975 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45190, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-789109075_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756168_15344, duration(ns): 23125424
2025-07-19 19:24:40,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756168_15344, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-19 19:24:43,464 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756168_15344 replica FinalizedReplica, blk_1073756168_15344, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756168 for deletion
2025-07-19 19:24:43,465 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756168_15344 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756168
2025-07-19 19:25:40,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756169_15345 src: /192.168.158.1:43882 dest: /192.168.158.4:9866
2025-07-19 19:25:40,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43882, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1340484309_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756169_15345, duration(ns): 24327806
2025-07-19 19:25:40,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756169_15345, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-19 19:25:43,465 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756169_15345 replica FinalizedReplica, blk_1073756169_15345, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756169 for deletion
2025-07-19 19:25:43,468 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756169_15345 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756169
2025-07-19 19:26:40,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756170_15346 src: /192.168.158.9:56624 dest: /192.168.158.4:9866
2025-07-19 19:26:40,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56624, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_694898687_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756170_15346, duration(ns): 23053849
2025-07-19 19:26:40,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756170_15346, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-19 19:26:43,466 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756170_15346 replica FinalizedReplica, blk_1073756170_15346, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756170 for deletion
2025-07-19 19:26:43,468 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756170_15346 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756170
2025-07-19 19:28:45,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756172_15348 src: /192.168.158.6:52306 dest: /192.168.158.4:9866 2025-07-19 19:28:45,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52306, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_739207971_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756172_15348, duration(ns): 21489022 2025-07-19 19:28:45,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756172_15348, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-19 19:28:52,471 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756172_15348 replica FinalizedReplica, blk_1073756172_15348, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756172 for deletion 2025-07-19 19:28:52,472 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756172_15348 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756172 2025-07-19 19:29:50,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756173_15349 src: /192.168.158.9:50644 dest: /192.168.158.4:9866 2025-07-19 19:29:50,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50644, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-876173915_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756173_15349, duration(ns): 16648119 2025-07-19 19:29:50,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756173_15349, type=LAST_IN_PIPELINE terminating 2025-07-19 19:29:55,474 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756173_15349 replica FinalizedReplica, blk_1073756173_15349, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756173 for deletion 2025-07-19 19:29:55,475 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756173_15349 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756173 2025-07-19 19:34:55,960 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756178_15354 src: /192.168.158.1:55914 dest: /192.168.158.4:9866 2025-07-19 19:34:55,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55914, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-746777837_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756178_15354, duration(ns): 29386842 2025-07-19 19:34:55,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756178_15354, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-19 19:35:01,482 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756178_15354 replica FinalizedReplica, blk_1073756178_15354, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756178 for deletion 2025-07-19 19:35:01,483 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756178_15354 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756178 2025-07-19 19:35:55,965 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756179_15355 src: /192.168.158.1:47570 dest: /192.168.158.4:9866 2025-07-19 19:35:55,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47570, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-893122385_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756179_15355, duration(ns): 24599646 2025-07-19 19:35:56,000 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756179_15355, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-19 19:36:01,482 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756179_15355 replica FinalizedReplica, blk_1073756179_15355, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756179 for deletion 
2025-07-19 19:36:01,484 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756179_15355 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756179 2025-07-19 19:38:00,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756181_15357 src: /192.168.158.5:38294 dest: /192.168.158.4:9866 2025-07-19 19:38:00,996 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38294, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2039527683_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756181_15357, duration(ns): 22360585 2025-07-19 19:38:00,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756181_15357, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-19 19:38:04,489 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756181_15357 replica FinalizedReplica, blk_1073756181_15357, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756181 for deletion 2025-07-19 19:38:04,490 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756181_15357 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756181 2025-07-19 19:42:00,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073756185_15361 src: /192.168.158.8:42928 dest: /192.168.158.4:9866 2025-07-19 19:42:01,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42928, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1469719855_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756185_15361, duration(ns): 17059911 2025-07-19 19:42:01,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756185_15361, type=LAST_IN_PIPELINE terminating 2025-07-19 19:42:04,499 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756185_15361 replica FinalizedReplica, blk_1073756185_15361, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756185 for deletion 2025-07-19 19:42:04,500 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756185_15361 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756185 2025-07-19 19:44:00,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756187_15363 src: /192.168.158.6:59830 dest: /192.168.158.4:9866 2025-07-19 19:44:01,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59830, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1946615193_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756187_15363, duration(ns): 
17807331 2025-07-19 19:44:01,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756187_15363, type=LAST_IN_PIPELINE terminating 2025-07-19 19:44:04,501 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756187_15363 replica FinalizedReplica, blk_1073756187_15363, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756187 for deletion 2025-07-19 19:44:04,502 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756187_15363 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756187 2025-07-19 19:46:00,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756189_15365 src: /192.168.158.6:56508 dest: /192.168.158.4:9866 2025-07-19 19:46:01,010 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56508, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2694770_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756189_15365, duration(ns): 21092307 2025-07-19 19:46:01,010 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756189_15365, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-19 19:46:07,507 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756189_15365 replica FinalizedReplica, blk_1073756189_15365, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 
56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756189 for deletion 2025-07-19 19:46:07,508 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756189_15365 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756189 2025-07-19 19:48:05,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756191_15367 src: /192.168.158.5:60462 dest: /192.168.158.4:9866 2025-07-19 19:48:06,009 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60462, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2101253301_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756191_15367, duration(ns): 16722074 2025-07-19 19:48:06,010 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756191_15367, type=LAST_IN_PIPELINE terminating 2025-07-19 19:48:10,509 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756191_15367 replica FinalizedReplica, blk_1073756191_15367, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756191 for deletion 2025-07-19 19:48:10,510 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756191_15367 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756191 2025-07-19 19:49:05,988 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756192_15368 src: /192.168.158.5:40108 dest: /192.168.158.4:9866 2025-07-19 19:49:06,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40108, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-212696872_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756192_15368, duration(ns): 21009979 2025-07-19 19:49:06,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756192_15368, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-19 19:49:07,510 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756192_15368 replica FinalizedReplica, blk_1073756192_15368, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756192 for deletion 2025-07-19 19:49:07,511 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756192_15368 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756192 2025-07-19 19:51:10,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756194_15370 src: /192.168.158.9:34952 dest: /192.168.158.4:9866 2025-07-19 19:51:11,010 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.9:34952, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1873459903_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756194_15370, duration(ns): 17035512 2025-07-19 19:51:11,010 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756194_15370, type=LAST_IN_PIPELINE terminating 2025-07-19 19:51:13,515 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756194_15370 replica FinalizedReplica, blk_1073756194_15370, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756194 for deletion 2025-07-19 19:51:13,516 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756194_15370 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756194 2025-07-19 19:53:16,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756196_15372 src: /192.168.158.1:54230 dest: /192.168.158.4:9866 2025-07-19 19:53:16,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54230, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-512343105_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756196_15372, duration(ns): 24449861 2025-07-19 19:53:16,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756196_15372, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-19 19:53:19,520 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756196_15372 replica FinalizedReplica, blk_1073756196_15372, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756196 for deletion 2025-07-19 19:53:19,521 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756196_15372 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756196 2025-07-19 19:54:16,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756197_15373 src: /192.168.158.8:49766 dest: /192.168.158.4:9866 2025-07-19 19:54:16,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49766, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_732089610_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756197_15373, duration(ns): 22406105 2025-07-19 19:54:16,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756197_15373, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-19 19:54:22,520 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756197_15373 replica FinalizedReplica, blk_1073756197_15373, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756197 for deletion 2025-07-19 19:54:22,521 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756197_15373 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756197 2025-07-19 19:55:21,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756198_15374 src: /192.168.158.7:54374 dest: /192.168.158.4:9866 2025-07-19 19:55:21,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54374, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_272367268_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756198_15374, duration(ns): 21943527 2025-07-19 19:55:21,034 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756198_15374, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-19 19:55:22,521 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756198_15374 replica FinalizedReplica, blk_1073756198_15374, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756198 for deletion 2025-07-19 19:55:22,523 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756198_15374 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756198 
2025-07-19 19:56:21,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756199_15375 src: /192.168.158.1:52922 dest: /192.168.158.4:9866 2025-07-19 19:56:21,078 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52922, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-398835677_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756199_15375, duration(ns): 23157334 2025-07-19 19:56:21,078 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756199_15375, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-19 19:56:22,524 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756199_15375 replica FinalizedReplica, blk_1073756199_15375, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756199 for deletion 2025-07-19 19:56:22,525 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756199_15375 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756199 2025-07-19 19:57:21,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756200_15376 src: /192.168.158.1:35066 dest: /192.168.158.4:9866 2025-07-19 19:57:21,056 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35066, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_166448307_236, 
offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756200_15376, duration(ns): 26166883 2025-07-19 19:57:21,056 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756200_15376, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-19 19:57:25,527 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756200_15376 replica FinalizedReplica, blk_1073756200_15376, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756200 for deletion 2025-07-19 19:57:25,528 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756200_15376 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756200 2025-07-19 19:59:26,017 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756202_15378 src: /192.168.158.9:38078 dest: /192.168.158.4:9866 2025-07-19 19:59:26,044 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38078, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1439386166_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756202_15378, duration(ns): 21518895 2025-07-19 19:59:26,044 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756202_15378, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-19 19:59:28,531 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756202_15378 replica FinalizedReplica, blk_1073756202_15378, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756202 for deletion 2025-07-19 19:59:28,532 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756202_15378 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756202 2025-07-19 20:00:26,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756203_15379 src: /192.168.158.8:52720 dest: /192.168.158.4:9866 2025-07-19 20:00:26,053 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52720, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_235213188_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756203_15379, duration(ns): 21523550 2025-07-19 20:00:26,053 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756203_15379, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-19 20:00:31,531 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756203_15379 replica FinalizedReplica, blk_1073756203_15379, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756203 for deletion 2025-07-19 
20:00:31,532 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756203_15379 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756203 2025-07-19 20:04:31,001 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756207_15383 src: /192.168.158.1:50540 dest: /192.168.158.4:9866 2025-07-19 20:04:31,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50540, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_16025581_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756207_15383, duration(ns): 25959292 2025-07-19 20:04:31,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756207_15383, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-19 20:04:34,538 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756207_15383 replica FinalizedReplica, blk_1073756207_15383, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756207 for deletion 2025-07-19 20:04:34,539 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756207_15383 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756207 2025-07-19 20:05:31,010 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073756208_15384 src: /192.168.158.9:42726 dest: /192.168.158.4:9866
2025-07-19 20:05:31,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42726, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1495097472_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756208_15384, duration(ns): 20484402
2025-07-19 20:05:31,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756208_15384, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 20:05:34,540 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756208_15384 replica FinalizedReplica, blk_1073756208_15384, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756208 for deletion
2025-07-19 20:05:34,541 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756208_15384 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756208
2025-07-19 20:07:31,012 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756210_15386 src: /192.168.158.9:45084 dest: /192.168.158.4:9866
2025-07-19 20:07:31,034 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45084, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2099382896_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756210_15386, duration(ns): 19836900
2025-07-19 20:07:31,034 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756210_15386, type=LAST_IN_PIPELINE terminating
2025-07-19 20:07:37,542 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756210_15386 replica FinalizedReplica, blk_1073756210_15386, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756210 for deletion
2025-07-19 20:07:37,543 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756210_15386 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756210
2025-07-19 20:09:31,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756212_15388 src: /192.168.158.8:49308 dest: /192.168.158.4:9866
2025-07-19 20:09:31,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49308, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2019006156_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756212_15388, duration(ns): 17417882
2025-07-19 20:09:31,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756212_15388, type=LAST_IN_PIPELINE terminating
2025-07-19 20:09:34,547 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756212_15388 replica FinalizedReplica, blk_1073756212_15388, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756212 for deletion
2025-07-19 20:09:34,548 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756212_15388 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756212
2025-07-19 20:11:36,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756214_15390 src: /192.168.158.5:53356 dest: /192.168.158.4:9866
2025-07-19 20:11:36,047 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53356, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-529123227_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756214_15390, duration(ns): 23067115
2025-07-19 20:11:36,048 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756214_15390, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-19 20:11:37,552 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756214_15390 replica FinalizedReplica, blk_1073756214_15390, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756214 for deletion
2025-07-19 20:11:37,553 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756214_15390 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756214
2025-07-19 20:14:41,018 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756217_15393 src: /192.168.158.6:45588 dest: /192.168.158.4:9866
2025-07-19 20:14:41,048 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45588, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2046849637_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756217_15393, duration(ns): 23902296
2025-07-19 20:14:41,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756217_15393, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 20:14:46,559 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756217_15393 replica FinalizedReplica, blk_1073756217_15393, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756217 for deletion
2025-07-19 20:14:46,560 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756217_15393 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756217
2025-07-19 20:15:46,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756218_15394 src: /192.168.158.1:52700 dest: /192.168.158.4:9866
2025-07-19 20:15:46,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52700, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_465268795_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756218_15394, duration(ns): 24316818
2025-07-19 20:15:46,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756218_15394, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-19 20:15:46,559 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756218_15394 replica FinalizedReplica, blk_1073756218_15394, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756218 for deletion
2025-07-19 20:15:46,560 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756218_15394 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756218
2025-07-19 20:16:46,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756219_15395 src: /192.168.158.9:36000 dest: /192.168.158.4:9866
2025-07-19 20:16:46,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36000, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_60405294_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756219_15395, duration(ns): 20388736
2025-07-19 20:16:46,047 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756219_15395, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-19 20:16:46,560 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756219_15395 replica FinalizedReplica, blk_1073756219_15395, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756219 for deletion
2025-07-19 20:16:46,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756219_15395 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756219
2025-07-19 20:17:46,021 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756220_15396 src: /192.168.158.9:37856 dest: /192.168.158.4:9866
2025-07-19 20:17:46,048 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37856, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2667491_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756220_15396, duration(ns): 20784104
2025-07-19 20:17:46,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756220_15396, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-19 20:17:46,561 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756220_15396 replica FinalizedReplica, blk_1073756220_15396, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756220 for deletion
2025-07-19 20:17:46,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756220_15396 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756220
2025-07-19 20:24:46,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756227_15403 src: /192.168.158.9:48184 dest: /192.168.158.4:9866
2025-07-19 20:24:46,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48184, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1329767780_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756227_15403, duration(ns): 20578740
2025-07-19 20:24:46,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756227_15403, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 20:24:49,581 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756227_15403 replica FinalizedReplica, blk_1073756227_15403, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756227 for deletion
2025-07-19 20:24:49,582 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756227_15403 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756227
2025-07-19 20:26:46,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756229_15405 src: /192.168.158.8:33600 dest: /192.168.158.4:9866
2025-07-19 20:26:46,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33600, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-400383818_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756229_15405, duration(ns): 20451606
2025-07-19 20:26:46,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756229_15405, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 20:26:46,585 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756229_15405 replica FinalizedReplica, blk_1073756229_15405, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756229 for deletion
2025-07-19 20:26:46,586 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756229_15405 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756229
2025-07-19 20:29:46,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756232_15408 src: /192.168.158.8:52566 dest: /192.168.158.4:9866
2025-07-19 20:29:46,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52566, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_13593608_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756232_15408, duration(ns): 17886993
2025-07-19 20:29:46,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756232_15408, type=LAST_IN_PIPELINE terminating
2025-07-19 20:29:46,596 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756232_15408 replica FinalizedReplica, blk_1073756232_15408, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756232 for deletion
2025-07-19 20:29:46,597 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756232_15408 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756232
2025-07-19 20:31:51,055 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756234_15410 src: /192.168.158.5:58012 dest: /192.168.158.4:9866
2025-07-19 20:31:51,081 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58012, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2044628852_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756234_15410, duration(ns): 21129563
2025-07-19 20:31:51,082 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756234_15410, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 20:31:55,600 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756234_15410 replica FinalizedReplica, blk_1073756234_15410, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756234 for deletion
2025-07-19 20:31:55,601 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756234_15410 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756234
2025-07-19 20:32:51,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756235_15411 src: /192.168.158.8:44598 dest: /192.168.158.4:9866
2025-07-19 20:32:51,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44598, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-958826651_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756235_15411, duration(ns): 21939599
2025-07-19 20:32:51,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756235_15411, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 20:32:52,604 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756235_15411 replica FinalizedReplica, blk_1073756235_15411, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756235 for deletion
2025-07-19 20:32:52,605 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756235_15411 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756235
2025-07-19 20:33:51,081 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756236_15412 src: /192.168.158.1:40468 dest: /192.168.158.4:9866
2025-07-19 20:33:51,117 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40468, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1243844276_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756236_15412, duration(ns): 26541479
2025-07-19 20:33:51,117 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756236_15412, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-19 20:33:55,606 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756236_15412 replica FinalizedReplica, blk_1073756236_15412, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756236 for deletion
2025-07-19 20:33:55,607 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756236_15412 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756236
2025-07-19 20:34:51,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756237_15413 src: /192.168.158.1:49458 dest: /192.168.158.4:9866
2025-07-19 20:34:51,100 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49458, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-111342720_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756237_15413, duration(ns): 28493555
2025-07-19 20:34:51,101 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756237_15413, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-19 20:34:55,610 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756237_15413 replica FinalizedReplica, blk_1073756237_15413, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756237 for deletion
2025-07-19 20:34:55,611 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756237_15413 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756237
2025-07-19 20:36:51,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756239_15415 src: /192.168.158.1:58064 dest: /192.168.158.4:9866
2025-07-19 20:36:51,096 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58064, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-309034257_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756239_15415, duration(ns): 24355735
2025-07-19 20:36:51,096 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756239_15415, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-19 20:36:52,613 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756239_15415 replica FinalizedReplica, blk_1073756239_15415, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756239 for deletion
2025-07-19 20:36:52,614 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756239_15415 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756239
2025-07-19 20:37:51,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756240_15416 src: /192.168.158.8:42688 dest: /192.168.158.4:9866
2025-07-19 20:37:51,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42688, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1558307394_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756240_15416, duration(ns): 20232862
2025-07-19 20:37:51,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756240_15416, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-19 20:37:52,614 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756240_15416 replica FinalizedReplica, blk_1073756240_15416, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756240 for deletion
2025-07-19 20:37:52,615 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756240_15416 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756240
2025-07-19 20:38:56,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756241_15417 src: /192.168.158.6:48030 dest: /192.168.158.4:9866
2025-07-19 20:38:56,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48030, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_630194308_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756241_15417, duration(ns): 19127703
2025-07-19 20:38:56,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756241_15417, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-19 20:38:58,615 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756241_15417 replica FinalizedReplica, blk_1073756241_15417, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756241 for deletion
2025-07-19 20:38:58,616 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756241_15417 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756241
2025-07-19 20:43:56,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756246_15422 src: /192.168.158.5:60746 dest: /192.168.158.4:9866
2025-07-19 20:43:56,095 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60746, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1246898526_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756246_15422, duration(ns): 22121120
2025-07-19 20:43:56,095 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756246_15422, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 20:43:58,628 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756246_15422 replica FinalizedReplica, blk_1073756246_15422, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756246 for deletion
2025-07-19 20:43:58,629 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756246_15422 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756246
2025-07-19 20:44:56,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756247_15423 src: /192.168.158.1:59928 dest: /192.168.158.4:9866
2025-07-19 20:44:56,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59928, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_912276506_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756247_15423, duration(ns): 23558400
2025-07-19 20:44:56,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756247_15423, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-19 20:44:58,628 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756247_15423 replica FinalizedReplica, blk_1073756247_15423, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756247 for deletion
2025-07-19 20:44:58,629 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756247_15423 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756247
2025-07-19 20:45:56,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756248_15424 src: /192.168.158.8:35000 dest: /192.168.158.4:9866
2025-07-19 20:45:56,098 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35000, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1630037348_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756248_15424, duration(ns): 16732707
2025-07-19 20:45:56,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756248_15424, type=LAST_IN_PIPELINE terminating
2025-07-19 20:45:58,630 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756248_15424 replica FinalizedReplica, blk_1073756248_15424, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756248 for deletion
2025-07-19 20:45:58,633 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756248_15424 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756248
2025-07-19 20:48:01,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756250_15426 src: /192.168.158.1:34032 dest: /192.168.158.4:9866
2025-07-19 20:48:01,121 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34032, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1045720751_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756250_15426, duration(ns): 27107807
2025-07-19 20:48:01,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756250_15426, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-19 20:48:01,634 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756250_15426 replica FinalizedReplica, blk_1073756250_15426, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756250 for deletion
2025-07-19 20:48:01,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756250_15426 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756250
2025-07-19 20:51:01,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756253_15429 src: /192.168.158.7:50996 dest: /192.168.158.4:9866
2025-07-19 20:51:01,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50996, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-714995833_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756253_15429, duration(ns): 16162653
2025-07-19 20:51:01,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756253_15429, type=LAST_IN_PIPELINE terminating
2025-07-19 20:51:01,641 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756253_15429 replica FinalizedReplica, blk_1073756253_15429, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756253 for deletion
2025-07-19 20:51:01,642 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756253_15429 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756253
2025-07-19 20:53:11,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756255_15431 src: /192.168.158.5:37588 dest: /192.168.158.4:9866
2025-07-19 20:53:11,121 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37588, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1682190136_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756255_15431, duration(ns): 22079643
2025-07-19 20:53:11,121 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756255_15431, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 20:53:16,645 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756255_15431 replica FinalizedReplica, blk_1073756255_15431, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756255 for deletion
2025-07-19 20:53:16,646 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756255_15431 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756255
2025-07-19 20:54:11,101 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756256_15432 src: /192.168.158.8:53228 dest: /192.168.158.4:9866
2025-07-19 20:54:11,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53228, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1165114979_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756256_15432, duration(ns): 18715472
2025-07-19 20:54:11,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756256_15432, type=LAST_IN_PIPELINE terminating
2025-07-19 20:54:13,647 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756256_15432 replica FinalizedReplica, blk_1073756256_15432, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756256 for deletion
2025-07-19 20:54:13,648 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756256_15432 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756256
2025-07-19 20:55:11,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756257_15433 src: /192.168.158.9:56226 dest: /192.168.158.4:9866
2025-07-19 20:55:11,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56226, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1420031262_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756257_15433, duration(ns): 21156781
2025-07-19 20:55:11,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756257_15433, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 20:55:13,648 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756257_15433 replica FinalizedReplica, blk_1073756257_15433, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756257 for deletion
2025-07-19 20:55:13,650 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756257_15433 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756257
2025-07-19 20:56:11,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756258_15434 src: /192.168.158.7:39398 dest: /192.168.158.4:9866
2025-07-19 20:56:11,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39398, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-76511074_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756258_15434, duration(ns): 17282910
2025-07-19 20:56:11,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756258_15434, type=LAST_IN_PIPELINE terminating
2025-07-19 20:56:13,649 INFO
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756258_15434 replica FinalizedReplica, blk_1073756258_15434, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756258 for deletion 2025-07-19 20:56:13,651 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756258_15434 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756258 2025-07-19 20:59:21,100 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756261_15437 src: /192.168.158.7:46266 dest: /192.168.158.4:9866 2025-07-19 20:59:21,119 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46266, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_436136919_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756261_15437, duration(ns): 17115735 2025-07-19 20:59:21,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756261_15437, type=LAST_IN_PIPELINE terminating 2025-07-19 20:59:22,657 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756261_15437 replica FinalizedReplica, blk_1073756261_15437, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756261 for deletion 2025-07-19 20:59:22,658 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756261_15437 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756261 2025-07-19 21:00:21,104 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756262_15438 src: /192.168.158.9:35956 dest: /192.168.158.4:9866 2025-07-19 21:00:21,123 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35956, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-424258955_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756262_15438, duration(ns): 16691569 2025-07-19 21:00:21,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756262_15438, type=LAST_IN_PIPELINE terminating 2025-07-19 21:00:22,657 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756262_15438 replica FinalizedReplica, blk_1073756262_15438, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756262 for deletion 2025-07-19 21:00:22,659 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756262_15438 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756262 2025-07-19 21:04:36,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756266_15442 src: /192.168.158.1:53350 dest: /192.168.158.4:9866 
2025-07-19 21:04:36,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53350, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1951775507_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756266_15442, duration(ns): 24689116 2025-07-19 21:04:36,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756266_15442, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-19 21:04:37,667 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756266_15442 replica FinalizedReplica, blk_1073756266_15442, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756266 for deletion 2025-07-19 21:04:37,668 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756266_15442 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756266 2025-07-19 21:07:46,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756269_15445 src: /192.168.158.5:38092 dest: /192.168.158.4:9866 2025-07-19 21:07:46,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38092, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1156455190_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756269_15445, duration(ns): 16475779 2025-07-19 21:07:46,126 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756269_15445, type=LAST_IN_PIPELINE terminating 2025-07-19 21:07:46,675 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756269_15445 replica FinalizedReplica, blk_1073756269_15445, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756269 for deletion 2025-07-19 21:07:46,676 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756269_15445 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756269 2025-07-19 21:10:51,114 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756272_15448 src: /192.168.158.6:44424 dest: /192.168.158.4:9866 2025-07-19 21:10:51,134 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44424, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1856682120_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756272_15448, duration(ns): 17374277 2025-07-19 21:10:51,134 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756272_15448, type=LAST_IN_PIPELINE terminating 2025-07-19 21:10:52,684 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756272_15448 replica FinalizedReplica, blk_1073756272_15448, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756272 for deletion 2025-07-19 21:10:52,685 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756272_15448 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756272 2025-07-19 21:12:56,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756274_15450 src: /192.168.158.1:38758 dest: /192.168.158.4:9866 2025-07-19 21:12:56,134 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38758, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_703957384_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756274_15450, duration(ns): 26336683 2025-07-19 21:12:56,134 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756274_15450, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-19 21:13:01,692 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756274_15450 replica FinalizedReplica, blk_1073756274_15450, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756274 for deletion 2025-07-19 21:13:01,693 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756274_15450 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756274 2025-07-19 21:14:56,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756276_15452 src: /192.168.158.5:36044 dest: /192.168.158.4:9866 2025-07-19 21:14:56,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36044, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-984685003_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756276_15452, duration(ns): 21119227 2025-07-19 21:14:56,132 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756276_15452, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-19 21:14:58,698 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756276_15452 replica FinalizedReplica, blk_1073756276_15452, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756276 for deletion 2025-07-19 21:14:58,699 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756276_15452 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756276 2025-07-19 21:15:56,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756277_15453 src: /192.168.158.1:57488 dest: /192.168.158.4:9866 2025-07-19 21:15:56,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:57488, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1597911213_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756277_15453, duration(ns): 25041335 2025-07-19 21:15:56,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756277_15453, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-19 21:15:58,699 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756277_15453 replica FinalizedReplica, blk_1073756277_15453, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756277 for deletion 2025-07-19 21:15:58,700 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756277_15453 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756277 2025-07-19 21:16:56,112 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756278_15454 src: /192.168.158.6:35770 dest: /192.168.158.4:9866 2025-07-19 21:16:56,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35770, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_221149832_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756278_15454, duration(ns): 20464854 2025-07-19 21:16:56,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073756278_15454, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-19 21:17:01,701 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756278_15454 replica FinalizedReplica, blk_1073756278_15454, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756278 for deletion 2025-07-19 21:17:01,702 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756278_15454 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756278 2025-07-19 21:17:56,119 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756279_15455 src: /192.168.158.9:52814 dest: /192.168.158.4:9866 2025-07-19 21:17:56,139 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52814, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1046667586_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756279_15455, duration(ns): 17419868 2025-07-19 21:17:56,139 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756279_15455, type=LAST_IN_PIPELINE terminating 2025-07-19 21:18:01,704 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756279_15455 replica FinalizedReplica, blk_1073756279_15455, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756279 for deletion 2025-07-19 21:18:01,705 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756279_15455 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756279 2025-07-19 21:18:56,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756280_15456 src: /192.168.158.5:49404 dest: /192.168.158.4:9866 2025-07-19 21:18:56,145 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49404, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_144367968_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756280_15456, duration(ns): 19959723 2025-07-19 21:18:56,145 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756280_15456, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-19 21:18:58,706 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756280_15456 replica FinalizedReplica, blk_1073756280_15456, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756280 for deletion 2025-07-19 21:18:58,707 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756280_15456 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756280 
2025-07-19 21:19:56,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756281_15457 src: /192.168.158.1:42072 dest: /192.168.158.4:9866 2025-07-19 21:19:56,149 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42072, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1943045109_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756281_15457, duration(ns): 24789854 2025-07-19 21:19:56,149 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756281_15457, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-19 21:19:58,707 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756281_15457 replica FinalizedReplica, blk_1073756281_15457, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756281 for deletion 2025-07-19 21:19:58,709 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756281_15457 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756281 2025-07-19 21:20:56,119 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756282_15458 src: /192.168.158.7:36582 dest: /192.168.158.4:9866 2025-07-19 21:20:56,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36582, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: 
DFSClient_NONMAPREDUCE_864644780_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756282_15458, duration(ns): 21983512 2025-07-19 21:20:56,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756282_15458, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-19 21:21:01,709 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756282_15458 replica FinalizedReplica, blk_1073756282_15458, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756282 for deletion 2025-07-19 21:21:01,710 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756282_15458 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756282 2025-07-19 21:21:56,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756283_15459 src: /192.168.158.5:55854 dest: /192.168.158.4:9866 2025-07-19 21:21:56,150 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55854, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-936842132_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756283_15459, duration(ns): 19895802 2025-07-19 21:21:56,150 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756283_15459, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 
2025-07-19 21:22:01,710 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756283_15459 replica FinalizedReplica, blk_1073756283_15459, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756283 for deletion 2025-07-19 21:22:01,711 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756283_15459 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756283 2025-07-19 21:23:56,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756285_15461 src: /192.168.158.6:47592 dest: /192.168.158.4:9866 2025-07-19 21:23:56,149 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47592, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1824299074_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756285_15461, duration(ns): 16255473 2025-07-19 21:23:56,149 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756285_15461, type=LAST_IN_PIPELINE terminating 2025-07-19 21:23:58,714 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756285_15461 replica FinalizedReplica, blk_1073756285_15461, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756285 for deletion 2025-07-19 21:23:58,715 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756285_15461 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756285 2025-07-19 21:27:01,139 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756288_15464 src: /192.168.158.8:59040 dest: /192.168.158.4:9866 2025-07-19 21:27:01,163 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59040, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1944393029_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756288_15464, duration(ns): 21242411 2025-07-19 21:27:01,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756288_15464, type=LAST_IN_PIPELINE terminating 2025-07-19 21:27:04,721 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756288_15464 replica FinalizedReplica, blk_1073756288_15464, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756288 for deletion 2025-07-19 21:27:04,722 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756288_15464 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756288 2025-07-19 21:28:01,140 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756289_15465 src: /192.168.158.8:52526 dest: /192.168.158.4:9866 
2025-07-19 21:28:01,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52526, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-174994550_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756289_15465, duration(ns): 19676548 2025-07-19 21:28:01,162 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756289_15465, type=LAST_IN_PIPELINE terminating 2025-07-19 21:28:01,725 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756289_15465 replica FinalizedReplica, blk_1073756289_15465, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756289 for deletion 2025-07-19 21:28:01,726 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756289_15465 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756289 2025-07-19 21:29:01,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756290_15466 src: /192.168.158.5:50990 dest: /192.168.158.4:9866 2025-07-19 21:29:01,196 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50990, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1258535077_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756290_15466, duration(ns): 23628443 2025-07-19 21:29:01,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073756290_15466, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 21:29:01,798 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756290_15466 replica FinalizedReplica, blk_1073756290_15466, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756290 for deletion
2025-07-19 21:29:01,799 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756290_15466 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756290
2025-07-19 21:31:01,140 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756292_15468 src: /192.168.158.8:50918 dest: /192.168.158.4:9866
2025-07-19 21:31:01,166 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50918, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1232526718_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756292_15468, duration(ns): 20301702
2025-07-19 21:31:01,166 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756292_15468, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-19 21:31:04,737 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756292_15468 replica FinalizedReplica, blk_1073756292_15468, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756292 for deletion
2025-07-19 21:31:04,738 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756292_15468 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756292
2025-07-19 21:33:06,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756294_15470 src: /192.168.158.9:41462 dest: /192.168.158.4:9866
2025-07-19 21:33:06,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41462, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1486260803_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756294_15470, duration(ns): 15795501
2025-07-19 21:33:06,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756294_15470, type=LAST_IN_PIPELINE terminating
2025-07-19 21:33:07,741 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756294_15470 replica FinalizedReplica, blk_1073756294_15470, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756294 for deletion
2025-07-19 21:33:07,742 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756294_15470 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756294
2025-07-19 21:34:06,150 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756295_15471 src: /192.168.158.8:48050 dest: /192.168.158.4:9866
2025-07-19 21:34:06,169 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48050, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1589904365_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756295_15471, duration(ns): 17006672
2025-07-19 21:34:06,170 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756295_15471, type=LAST_IN_PIPELINE terminating
2025-07-19 21:34:07,744 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756295_15471 replica FinalizedReplica, blk_1073756295_15471, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756295 for deletion
2025-07-19 21:34:07,745 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756295_15471 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756295
2025-07-19 21:35:11,148 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756296_15472 src: /192.168.158.9:47120 dest: /192.168.158.4:9866
2025-07-19 21:35:11,169 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47120, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1743171071_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756296_15472, duration(ns): 18805556
2025-07-19 21:35:11,169 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756296_15472, type=LAST_IN_PIPELINE terminating
2025-07-19 21:35:13,745 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756296_15472 replica FinalizedReplica, blk_1073756296_15472, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756296 for deletion
2025-07-19 21:35:13,746 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756296_15472 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756296
2025-07-19 21:36:16,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756297_15473 src: /192.168.158.1:53546 dest: /192.168.158.4:9866
2025-07-19 21:36:16,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53546, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_894147232_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756297_15473, duration(ns): 23492109
2025-07-19 21:36:16,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756297_15473, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-19 21:36:19,747 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756297_15473 replica FinalizedReplica, blk_1073756297_15473, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756297 for deletion
2025-07-19 21:36:19,748 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756297_15473 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756297
2025-07-19 21:41:21,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756302_15478 src: /192.168.158.6:57056 dest: /192.168.158.4:9866
2025-07-19 21:41:21,175 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57056, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_649530152_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756302_15478, duration(ns): 16817833
2025-07-19 21:41:21,175 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756302_15478, type=LAST_IN_PIPELINE terminating
2025-07-19 21:41:25,759 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756302_15478 replica FinalizedReplica, blk_1073756302_15478, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756302 for deletion
2025-07-19 21:41:25,760 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756302_15478 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756302
2025-07-19 21:43:21,158 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756304_15480 src: /192.168.158.8:48160 dest: /192.168.158.4:9866
2025-07-19 21:43:21,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48160, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_443962338_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756304_15480, duration(ns): 20179377
2025-07-19 21:43:21,185 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756304_15480, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-19 21:43:22,768 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756304_15480 replica FinalizedReplica, blk_1073756304_15480, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756304 for deletion
2025-07-19 21:43:22,769 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756304_15480 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756304
2025-07-19 21:44:26,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756305_15481 src: /192.168.158.1:42636 dest: /192.168.158.4:9866
2025-07-19 21:44:26,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42636, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1193366666_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756305_15481, duration(ns): 24317066
2025-07-19 21:44:26,188 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756305_15481, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-19 21:44:28,771 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756305_15481 replica FinalizedReplica, blk_1073756305_15481, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756305 for deletion
2025-07-19 21:44:28,772 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756305_15481 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756305
2025-07-19 21:46:31,165 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756307_15483 src: /192.168.158.6:44940 dest: /192.168.158.4:9866
2025-07-19 21:46:31,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44940, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1733135849_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756307_15483, duration(ns): 19285513
2025-07-19 21:46:31,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756307_15483, type=LAST_IN_PIPELINE terminating
2025-07-19 21:46:31,779 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756307_15483 replica FinalizedReplica, blk_1073756307_15483, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756307 for deletion
2025-07-19 21:46:31,780 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756307_15483 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756307
2025-07-19 21:47:31,169 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756308_15484 src: /192.168.158.9:37152 dest: /192.168.158.4:9866
2025-07-19 21:47:31,196 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37152, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_664782116_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756308_15484, duration(ns): 20969342
2025-07-19 21:47:31,196 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756308_15484, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 21:47:31,782 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756308_15484 replica FinalizedReplica, blk_1073756308_15484, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756308 for deletion
2025-07-19 21:47:31,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756308_15484 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756308
2025-07-19 21:50:36,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756311_15487 src: /192.168.158.5:49080 dest: /192.168.158.4:9866
2025-07-19 21:50:36,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49080, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1628841246_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756311_15487, duration(ns): 16988289
2025-07-19 21:50:36,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756311_15487, type=LAST_IN_PIPELINE terminating
2025-07-19 21:50:37,789 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756311_15487 replica FinalizedReplica, blk_1073756311_15487, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756311 for deletion
2025-07-19 21:50:37,790 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756311_15487 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756311
2025-07-19 21:51:36,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756312_15488 src: /192.168.158.5:44688 dest: /192.168.158.4:9866
2025-07-19 21:51:36,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44688, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1977470989_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756312_15488, duration(ns): 19018913
2025-07-19 21:51:36,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756312_15488, type=LAST_IN_PIPELINE terminating
2025-07-19 21:51:37,790 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756312_15488 replica FinalizedReplica, blk_1073756312_15488, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756312 for deletion
2025-07-19 21:51:37,792 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756312_15488 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756312
2025-07-19 21:54:36,175 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756315_15491 src: /192.168.158.7:60052 dest: /192.168.158.4:9866
2025-07-19 21:54:36,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60052, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_118165705_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756315_15491, duration(ns): 21038426
2025-07-19 21:54:36,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756315_15491, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-19 21:54:37,799 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756315_15491 replica FinalizedReplica, blk_1073756315_15491, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756315 for deletion
2025-07-19 21:54:37,800 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756315_15491 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756315
2025-07-19 21:56:36,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756317_15493 src: /192.168.158.7:50562 dest: /192.168.158.4:9866
2025-07-19 21:56:36,216 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50562, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-654002234_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756317_15493, duration(ns): 24294791
2025-07-19 21:56:36,216 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756317_15493, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-19 21:56:40,804 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756317_15493 replica FinalizedReplica, blk_1073756317_15493, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756317 for deletion
2025-07-19 21:56:40,805 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756317_15493 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756317
2025-07-19 21:57:36,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756318_15494 src: /192.168.158.5:38052 dest: /192.168.158.4:9866
2025-07-19 21:57:36,202 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38052, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-422985074_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756318_15494, duration(ns): 16904034
2025-07-19 21:57:36,203 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756318_15494, type=LAST_IN_PIPELINE terminating
2025-07-19 21:57:37,807 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756318_15494 replica FinalizedReplica, blk_1073756318_15494, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756318 for deletion
2025-07-19 21:57:37,808 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756318_15494 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756318
2025-07-19 21:59:19,812 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f4f, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 1 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-19 21:59:19,812 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-19 21:59:36,224 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756320_15496 src: /192.168.158.6:39544 dest: /192.168.158.4:9866
2025-07-19 21:59:36,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39544, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2093275274_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756320_15496, duration(ns): 16801355
2025-07-19 21:59:36,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756320_15496, type=LAST_IN_PIPELINE terminating
2025-07-19 21:59:40,809 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756320_15496 replica FinalizedReplica, blk_1073756320_15496, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756320 for deletion
2025-07-19 21:59:40,810 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756320_15496 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756320
2025-07-19 22:00:41,192 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756321_15497 src: /192.168.158.8:53252 dest: /192.168.158.4:9866
2025-07-19 22:00:41,213 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53252, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_507331243_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756321_15497, duration(ns): 19113162
2025-07-19 22:00:41,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756321_15497, type=LAST_IN_PIPELINE terminating
2025-07-19 22:00:43,814 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756321_15497 replica FinalizedReplica, blk_1073756321_15497, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756321 for deletion
2025-07-19 22:00:43,815 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756321_15497 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756321
2025-07-19 22:01:41,196 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756322_15498 src: /192.168.158.5:56756 dest: /192.168.158.4:9866
2025-07-19 22:01:41,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56756, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-936484137_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756322_15498, duration(ns): 24371670
2025-07-19 22:01:41,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756322_15498, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 22:01:43,819 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756322_15498 replica FinalizedReplica, blk_1073756322_15498, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756322 for deletion
2025-07-19 22:01:43,820 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756322_15498 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756322
2025-07-19 22:02:41,195 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756323_15499 src: /192.168.158.1:48104 dest: /192.168.158.4:9866
2025-07-19 22:02:41,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48104, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_350712434_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756323_15499, duration(ns): 24959084
2025-07-19 22:02:41,230 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756323_15499, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-19 22:02:46,824 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756323_15499 replica FinalizedReplica, blk_1073756323_15499, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756323 for deletion
2025-07-19 22:02:46,825 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756323_15499 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756323
2025-07-19 22:03:46,199 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756324_15500 src: /192.168.158.9:60754 dest: /192.168.158.4:9866
2025-07-19 22:03:46,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60754, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_102068570_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756324_15500, duration(ns): 17585663
2025-07-19 22:03:46,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756324_15500, type=LAST_IN_PIPELINE terminating
2025-07-19 22:03:46,828 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756324_15500 replica FinalizedReplica, blk_1073756324_15500, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756324 for deletion
2025-07-19 22:03:46,829 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756324_15500 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756324
2025-07-19 22:04:46,194 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756325_15501 src: /192.168.158.1:35318 dest: /192.168.158.4:9866
2025-07-19 22:04:46,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35318, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1744125299_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756325_15501, duration(ns): 23492023
2025-07-19 22:04:46,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756325_15501, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-19 22:04:46,829 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756325_15501 replica FinalizedReplica, blk_1073756325_15501, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756325 for deletion
2025-07-19 22:04:46,831 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756325_15501 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756325
2025-07-19 22:08:51,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756329_15505 src: /192.168.158.9:43236 dest: /192.168.158.4:9866
2025-07-19 22:08:51,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43236, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-875059410_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756329_15505, duration(ns): 18151249
2025-07-19 22:08:51,228 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756329_15505, type=LAST_IN_PIPELINE terminating
2025-07-19 22:08:52,839 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756329_15505 replica FinalizedReplica, blk_1073756329_15505, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756329 for deletion
2025-07-19 22:08:52,840 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756329_15505 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756329
2025-07-19 22:12:56,221 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756333_15509 src: /192.168.158.8:45540 dest: /192.168.158.4:9866
2025-07-19 22:12:56,241 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45540, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1703697027_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756333_15509, duration(ns): 17739878
2025-07-19 22:12:56,241 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756333_15509, type=LAST_IN_PIPELINE terminating
2025-07-19 22:13:01,853 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756333_15509 replica FinalizedReplica, blk_1073756333_15509, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756333 for deletion
2025-07-19 22:13:01,854 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756333_15509 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756333
2025-07-19 22:15:56,244 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756336_15512 src: /192.168.158.1:37470 dest: /192.168.158.4:9866
2025-07-19 22:15:56,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37470, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-358976772_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756336_15512, duration(ns): 24511364
2025-07-19 22:15:56,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756336_15512, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-19 22:16:01,863 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756336_15512 replica FinalizedReplica, blk_1073756336_15512, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756336 for deletion
2025-07-19 22:16:01,864 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756336_15512 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756336
2025-07-19 22:16:56,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756337_15513 src: /192.168.158.8:46764 dest: /192.168.158.4:9866
2025-07-19 22:16:56,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46764, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1377265128_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756337_15513, duration(ns): 21333502
2025-07-19 22:16:56,263 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756337_15513, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-19 22:16:58,866 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756337_15513 replica FinalizedReplica, blk_1073756337_15513, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756337 for deletion
2025-07-19 22:16:58,867 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756337_15513 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756337 2025-07-19 22:20:11,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756340_15516 src: /192.168.158.5:51160 dest: /192.168.158.4:9866 2025-07-19 22:20:11,255 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51160, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-539749406_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756340_15516, duration(ns): 21622942 2025-07-19 22:20:11,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756340_15516, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-19 22:20:13,873 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756340_15516 replica FinalizedReplica, blk_1073756340_15516, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756340 for deletion 2025-07-19 22:20:13,874 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756340_15516 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756340 2025-07-19 22:23:16,213 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073756343_15519 src: /192.168.158.5:51662 dest: /192.168.158.4:9866 2025-07-19 22:23:16,238 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51662, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-747998517_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756343_15519, duration(ns): 19386039 2025-07-19 22:23:16,238 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756343_15519, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-19 22:23:19,881 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756343_15519 replica FinalizedReplica, blk_1073756343_15519, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756343 for deletion 2025-07-19 22:23:19,882 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756343_15519 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756343 2025-07-19 22:30:16,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756350_15526 src: /192.168.158.5:56836 dest: /192.168.158.4:9866 2025-07-19 22:30:16,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56836, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_977049048_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073756350_15526, duration(ns): 19559524 2025-07-19 22:30:16,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756350_15526, type=LAST_IN_PIPELINE terminating 2025-07-19 22:30:16,896 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756350_15526 replica FinalizedReplica, blk_1073756350_15526, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756350 for deletion 2025-07-19 22:30:16,897 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756350_15526 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756350 2025-07-19 22:33:16,235 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756353_15529 src: /192.168.158.1:51540 dest: /192.168.158.4:9866 2025-07-19 22:33:16,272 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51540, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1084912240_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756353_15529, duration(ns): 27766120 2025-07-19 22:33:16,272 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756353_15529, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-19 22:33:19,905 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling 
blk_1073756353_15529 replica FinalizedReplica, blk_1073756353_15529, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756353 for deletion 2025-07-19 22:33:19,906 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756353_15529 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756353 2025-07-19 22:37:16,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756357_15533 src: /192.168.158.7:41294 dest: /192.168.158.4:9866 2025-07-19 22:37:16,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41294, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-815809159_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756357_15533, duration(ns): 15714320 2025-07-19 22:37:16,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756357_15533, type=LAST_IN_PIPELINE terminating 2025-07-19 22:37:16,912 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756357_15533 replica FinalizedReplica, blk_1073756357_15533, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756357 for deletion 2025-07-19 22:37:16,913 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 
blk_1073756357_15533 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756357 2025-07-19 22:41:16,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756361_15537 src: /192.168.158.5:47258 dest: /192.168.158.4:9866 2025-07-19 22:41:16,289 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47258, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2063281288_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756361_15537, duration(ns): 16655586 2025-07-19 22:41:16,290 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756361_15537, type=LAST_IN_PIPELINE terminating 2025-07-19 22:41:16,924 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756361_15537 replica FinalizedReplica, blk_1073756361_15537, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756361 for deletion 2025-07-19 22:41:16,925 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756361_15537 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756361 2025-07-19 22:42:21,268 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756362_15538 src: /192.168.158.5:52140 dest: /192.168.158.4:9866 2025-07-19 22:42:21,295 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52140, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1652322323_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756362_15538, duration(ns): 21349866 2025-07-19 22:42:21,295 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756362_15538, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-19 22:42:28,925 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756362_15538 replica FinalizedReplica, blk_1073756362_15538, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756362 for deletion 2025-07-19 22:42:28,926 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756362_15538 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756362 2025-07-19 22:43:21,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756363_15539 src: /192.168.158.8:33780 dest: /192.168.158.4:9866 2025-07-19 22:43:21,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_765893881_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756363_15539, duration(ns): 17156352 2025-07-19 22:43:21,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756363_15539, type=LAST_IN_PIPELINE terminating 
2025-07-19 22:43:28,929 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756363_15539 replica FinalizedReplica, blk_1073756363_15539, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756363 for deletion 2025-07-19 22:43:28,930 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756363_15539 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756363 2025-07-19 22:44:21,255 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756364_15540 src: /192.168.158.9:41076 dest: /192.168.158.4:9866 2025-07-19 22:44:21,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41076, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1067718113_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756364_15540, duration(ns): 21193341 2025-07-19 22:44:21,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756364_15540, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-19 22:44:28,933 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756364_15540 replica FinalizedReplica, blk_1073756364_15540, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756364 for 
deletion 2025-07-19 22:44:28,934 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756364_15540 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756364 2025-07-19 22:45:21,261 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756365_15541 src: /192.168.158.8:45690 dest: /192.168.158.4:9866 2025-07-19 22:45:21,280 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45690, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_907142768_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756365_15541, duration(ns): 17380271 2025-07-19 22:45:21,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756365_15541, type=LAST_IN_PIPELINE terminating 2025-07-19 22:45:28,933 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756365_15541 replica FinalizedReplica, blk_1073756365_15541, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756365 for deletion 2025-07-19 22:45:28,934 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756365_15541 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756365 2025-07-19 22:47:26,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756367_15543 src: 
/192.168.158.6:55692 dest: /192.168.158.4:9866 2025-07-19 22:47:26,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55692, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-958824573_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756367_15543, duration(ns): 16147412 2025-07-19 22:47:26,304 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756367_15543, type=LAST_IN_PIPELINE terminating 2025-07-19 22:47:31,934 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756367_15543 replica FinalizedReplica, blk_1073756367_15543, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756367 for deletion 2025-07-19 22:47:31,936 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756367_15543 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756367 2025-07-19 22:49:31,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756369_15545 src: /192.168.158.9:37904 dest: /192.168.158.4:9866 2025-07-19 22:49:31,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37904, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1258970989_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756369_15545, duration(ns): 20739859 2025-07-19 22:49:31,308 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756369_15545, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-19 22:49:34,939 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756369_15545 replica FinalizedReplica, blk_1073756369_15545, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756369 for deletion 2025-07-19 22:49:34,940 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756369_15545 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756369 2025-07-19 22:50:31,293 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756370_15546 src: /192.168.158.8:60266 dest: /192.168.158.4:9866 2025-07-19 22:50:31,315 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60266, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1881194818_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756370_15546, duration(ns): 19136645 2025-07-19 22:50:31,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756370_15546, type=LAST_IN_PIPELINE terminating 2025-07-19 22:50:34,940 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756370_15546 replica FinalizedReplica, blk_1073756370_15546, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 
getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756370 for deletion 2025-07-19 22:50:34,941 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756370_15546 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756370 2025-07-19 22:51:31,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756371_15547 src: /192.168.158.7:60322 dest: /192.168.158.4:9866 2025-07-19 22:51:31,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60322, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1028934041_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756371_15547, duration(ns): 17680479 2025-07-19 22:51:31,312 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756371_15547, type=LAST_IN_PIPELINE terminating 2025-07-19 22:51:34,942 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756371_15547 replica FinalizedReplica, blk_1073756371_15547, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756371 for deletion 2025-07-19 22:51:34,944 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756371_15547 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756371 2025-07-19 22:56:31,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756376_15552 src: /192.168.158.1:60506 dest: /192.168.158.4:9866 2025-07-19 22:56:31,357 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60506, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_321506390_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756376_15552, duration(ns): 27035300 2025-07-19 22:56:31,357 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756376_15552, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-19 22:56:37,956 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756376_15552 replica FinalizedReplica, blk_1073756376_15552, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756376 for deletion 2025-07-19 22:56:37,957 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756376_15552 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756376 2025-07-19 22:58:31,326 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756378_15554 src: /192.168.158.1:39128 dest: /192.168.158.4:9866 2025-07-19 22:58:31,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:39128, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1707274312_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756378_15554, duration(ns): 24696004 2025-07-19 22:58:31,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756378_15554, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-19 22:58:34,958 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756378_15554 replica FinalizedReplica, blk_1073756378_15554, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756378 for deletion 2025-07-19 22:58:34,959 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756378_15554 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756378 2025-07-19 22:59:31,328 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756379_15555 src: /192.168.158.7:34026 dest: /192.168.158.4:9866 2025-07-19 22:59:31,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34026, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-527965005_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756379_15555, duration(ns): 20387693 2025-07-19 22:59:31,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073756379_15555, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-19 22:59:34,961 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756379_15555 replica FinalizedReplica, blk_1073756379_15555, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756379 for deletion 2025-07-19 22:59:34,962 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756379_15555 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756379 2025-07-19 23:00:31,330 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756380_15556 src: /192.168.158.7:39404 dest: /192.168.158.4:9866 2025-07-19 23:00:31,348 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39404, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_11694528_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756380_15556, duration(ns): 14976967 2025-07-19 23:00:31,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756380_15556, type=LAST_IN_PIPELINE terminating 2025-07-19 23:00:34,963 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756380_15556 replica FinalizedReplica, blk_1073756380_15556, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756380 for deletion 2025-07-19 23:00:34,964 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756380_15556 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756380 2025-07-19 23:02:31,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756382_15558 src: /192.168.158.1:34402 dest: /192.168.158.4:9866 2025-07-19 23:02:31,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34402, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1758562076_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756382_15558, duration(ns): 21175754 2025-07-19 23:02:31,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756382_15558, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-19 23:02:37,968 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756382_15558 replica FinalizedReplica, blk_1073756382_15558, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756382 for deletion 2025-07-19 23:02:37,969 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756382_15558 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756382 2025-07-19 23:06:31,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756386_15562 src: /192.168.158.7:43876 dest: /192.168.158.4:9866 2025-07-19 23:06:31,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43876, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1474571422_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756386_15562, duration(ns): 20546255 2025-07-19 23:06:31,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756386_15562, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-19 23:06:34,978 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756386_15562 replica FinalizedReplica, blk_1073756386_15562, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756386 for deletion 2025-07-19 23:06:34,979 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756386_15562 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756386 2025-07-19 23:10:36,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756390_15566 src: /192.168.158.1:34700 dest: /192.168.158.4:9866 2025-07-19 23:10:36,387 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:34700, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1711831814_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756390_15566, duration(ns): 26341929 2025-07-19 23:10:36,387 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756390_15566, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-19 23:10:43,984 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756390_15566 replica FinalizedReplica, blk_1073756390_15566, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756390 for deletion 2025-07-19 23:10:43,985 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756390_15566 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756390 2025-07-19 23:17:46,348 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756397_15573 src: /192.168.158.8:56496 dest: /192.168.158.4:9866 2025-07-19 23:17:46,367 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56496, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-686503177_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756397_15573, duration(ns): 16930768 2025-07-19 23:17:46,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073756397_15573, type=LAST_IN_PIPELINE terminating 2025-07-19 23:17:49,996 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756397_15573 replica FinalizedReplica, blk_1073756397_15573, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756397 for deletion 2025-07-19 23:17:49,997 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756397_15573 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756397 2025-07-19 23:18:46,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756398_15574 src: /192.168.158.5:50444 dest: /192.168.158.4:9866 2025-07-19 23:18:46,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50444, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1621398098_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756398_15574, duration(ns): 18766812 2025-07-19 23:18:46,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756398_15574, type=LAST_IN_PIPELINE terminating 2025-07-19 23:18:49,999 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756398_15574 replica FinalizedReplica, blk_1073756398_15574, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756398 for deletion 2025-07-19 23:18:50,000 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756398_15574 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756398 2025-07-19 23:20:56,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756400_15576 src: /192.168.158.5:42906 dest: /192.168.158.4:9866 2025-07-19 23:20:56,387 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42906, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2049025804_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756400_15576, duration(ns): 21903000 2025-07-19 23:20:56,387 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756400_15576, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-19 23:21:05,002 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756400_15576 replica FinalizedReplica, blk_1073756400_15576, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756400 for deletion 2025-07-19 23:21:05,003 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756400_15576 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756400 2025-07-19 23:24:01,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756403_15579 src: /192.168.158.5:59496 dest: /192.168.158.4:9866 2025-07-19 23:24:01,371 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59496, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_451300040_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756403_15579, duration(ns): 17451959 2025-07-19 23:24:01,371 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756403_15579, type=LAST_IN_PIPELINE terminating 2025-07-19 23:24:08,007 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756403_15579 replica FinalizedReplica, blk_1073756403_15579, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756403 for deletion 2025-07-19 23:24:08,008 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756403_15579 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756403 2025-07-19 23:26:01,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756405_15581 src: /192.168.158.8:33348 dest: /192.168.158.4:9866 2025-07-19 23:26:01,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33348, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_502004125_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756405_15581, duration(ns): 23493641 2025-07-19 23:26:01,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756405_15581, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-19 23:26:08,010 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756405_15581 replica FinalizedReplica, blk_1073756405_15581, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756405 for deletion 2025-07-19 23:26:08,011 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756405_15581 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756405 2025-07-19 23:29:06,374 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756408_15584 src: /192.168.158.6:54768 dest: /192.168.158.4:9866 2025-07-19 23:29:06,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54768, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1454549495_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756408_15584, duration(ns): 16310975 2025-07-19 23:29:06,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756408_15584, type=LAST_IN_PIPELINE terminating 2025-07-19 23:29:11,017 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756408_15584 replica FinalizedReplica, blk_1073756408_15584, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756408 for deletion 2025-07-19 23:29:11,018 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756408_15584 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756408 2025-07-19 23:35:11,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756414_15590 src: /192.168.158.7:47382 dest: /192.168.158.4:9866 2025-07-19 23:35:11,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47382, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1623754934_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756414_15590, duration(ns): 17349766 2025-07-19 23:35:11,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756414_15590, type=LAST_IN_PIPELINE terminating 2025-07-19 23:35:20,033 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756414_15590 replica FinalizedReplica, blk_1073756414_15590, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756414 for deletion 2025-07-19 23:35:20,034 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756414_15590 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir24/blk_1073756414 2025-07-19 23:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0 2025-07-19 23:37:16,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756416_15592 src: /192.168.158.6:56598 dest: /192.168.158.4:9866 2025-07-19 23:37:16,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56598, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1488730961_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756416_15592, duration(ns): 20651591 2025-07-19 23:37:16,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756416_15592, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-19 23:37:20,034 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756416_15592 replica FinalizedReplica, blk_1073756416_15592, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756416 for deletion 2025-07-19 23:37:20,035 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756416_15592 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756416 2025-07-19 23:38:21,361 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756417_15593 src: /192.168.158.1:57356 dest: /192.168.158.4:9866 2025-07-19 23:38:21,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57356, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-900639801_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756417_15593, duration(ns): 23980521 2025-07-19 23:38:21,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756417_15593, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-19 23:38:29,036 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756417_15593 replica FinalizedReplica, blk_1073756417_15593, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756417 for deletion 2025-07-19 23:38:29,037 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756417_15593 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756417 2025-07-19 23:39:21,371 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756418_15594 src: /192.168.158.6:44340 dest: /192.168.158.4:9866 2025-07-19 23:39:21,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.6:44340, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1295078280_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756418_15594, duration(ns): 20972748 2025-07-19 23:39:21,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756418_15594, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-19 23:39:29,036 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756418_15594 replica FinalizedReplica, blk_1073756418_15594, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756418 for deletion 2025-07-19 23:39:29,037 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756418_15594 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756418 2025-07-19 23:40:26,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756419_15595 src: /192.168.158.1:51246 dest: /192.168.158.4:9866 2025-07-19 23:40:26,412 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51246, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-988266138_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756419_15595, duration(ns): 24579183 2025-07-19 23:40:26,412 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756419_15595, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-19 23:40:35,037 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756419_15595 replica FinalizedReplica, blk_1073756419_15595, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756419 for deletion 2025-07-19 23:40:35,038 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756419_15595 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756419 2025-07-19 23:41:26,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756420_15596 src: /192.168.158.6:59906 dest: /192.168.158.4:9866 2025-07-19 23:41:26,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59906, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1329130911_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756420_15596, duration(ns): 16642946 2025-07-19 23:41:26,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756420_15596, type=LAST_IN_PIPELINE terminating 2025-07-19 23:41:35,040 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756420_15596 replica FinalizedReplica, blk_1073756420_15596, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756420 for deletion 2025-07-19 23:41:35,041 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756420_15596 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756420 2025-07-19 23:42:26,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756421_15597 src: /192.168.158.6:39508 dest: /192.168.158.4:9866 2025-07-19 23:42:26,420 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39508, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1751349449_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756421_15597, duration(ns): 21144639 2025-07-19 23:42:26,421 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756421_15597, type=LAST_IN_PIPELINE terminating 2025-07-19 23:42:32,043 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756421_15597 replica FinalizedReplica, blk_1073756421_15597, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756421 for deletion 2025-07-19 23:42:32,044 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756421_15597 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756421 2025-07-19 23:43:26,394 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756422_15598 src: /192.168.158.5:33334 dest: /192.168.158.4:9866 2025-07-19 23:43:26,412 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33334, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-414051083_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756422_15598, duration(ns): 16551298 2025-07-19 23:43:26,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756422_15598, type=LAST_IN_PIPELINE terminating 2025-07-19 23:43:35,046 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756422_15598 replica FinalizedReplica, blk_1073756422_15598, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756422 for deletion 2025-07-19 23:43:35,047 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756422_15598 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756422 2025-07-19 23:46:26,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756425_15601 src: /192.168.158.5:59414 dest: /192.168.158.4:9866 2025-07-19 23:46:26,412 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59414, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-682363221_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073756425_15601, duration(ns): 16431114 2025-07-19 23:46:26,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756425_15601, type=LAST_IN_PIPELINE terminating 2025-07-19 23:46:35,051 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756425_15601 replica FinalizedReplica, blk_1073756425_15601, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756425 for deletion 2025-07-19 23:46:35,052 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756425_15601 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756425 2025-07-19 23:47:31,390 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756426_15602 src: /192.168.158.6:60996 dest: /192.168.158.4:9866 2025-07-19 23:47:31,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60996, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-293271554_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756426_15602, duration(ns): 20168723 2025-07-19 23:47:31,417 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756426_15602, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-19 23:47:35,054 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756426_15602 replica 
FinalizedReplica, blk_1073756426_15602, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756426 for deletion 2025-07-19 23:47:35,055 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756426_15602 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756426 2025-07-19 23:48:31,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756427_15603 src: /192.168.158.5:56780 dest: /192.168.158.4:9866 2025-07-19 23:48:31,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1351583867_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756427_15603, duration(ns): 22051985 2025-07-19 23:48:31,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756427_15603, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-19 23:48:38,056 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756427_15603 replica FinalizedReplica, blk_1073756427_15603, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756427 for deletion 2025-07-19 23:48:38,057 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073756427_15603 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756427
2025-07-19 23:49:31,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756428_15604 src: /192.168.158.7:35266 dest: /192.168.158.4:9866
2025-07-19 23:49:31,408 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35266, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1940709950_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756428_15604, duration(ns): 20302232
2025-07-19 23:49:31,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756428_15604, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-19 23:49:35,058 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756428_15604 replica FinalizedReplica, blk_1073756428_15604, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756428 for deletion
2025-07-19 23:49:35,059 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756428_15604 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756428
2025-07-19 23:50:36,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756429_15605 src: /192.168.158.7:54018 dest: /192.168.158.4:9866
2025-07-19 23:50:36,423 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54018, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1718147999_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756429_15605, duration(ns): 19476651
2025-07-19 23:50:36,424 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756429_15605, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 23:50:41,060 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756429_15605 replica FinalizedReplica, blk_1073756429_15605, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756429 for deletion
2025-07-19 23:50:41,061 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756429_15605 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756429
2025-07-19 23:51:36,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756430_15606 src: /192.168.158.8:51304 dest: /192.168.158.4:9866
2025-07-19 23:51:36,423 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51304, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_5944573_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756430_15606, duration(ns): 18178397
2025-07-19 23:51:36,424 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756430_15606, type=LAST_IN_PIPELINE terminating
2025-07-19 23:51:41,063 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756430_15606 replica FinalizedReplica, blk_1073756430_15606, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756430 for deletion
2025-07-19 23:51:41,064 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756430_15606 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756430
2025-07-19 23:52:36,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756431_15607 src: /192.168.158.1:40972 dest: /192.168.158.4:9866
2025-07-19 23:52:36,437 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40972, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2117195409_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756431_15607, duration(ns): 25746798
2025-07-19 23:52:36,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756431_15607, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-19 23:52:41,066 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756431_15607 replica FinalizedReplica, blk_1073756431_15607, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756431 for deletion
2025-07-19 23:52:41,067 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756431_15607 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756431
2025-07-19 23:54:41,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756433_15609 src: /192.168.158.9:47888 dest: /192.168.158.4:9866
2025-07-19 23:54:41,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47888, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_674688551_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756433_15609, duration(ns): 17798537
2025-07-19 23:54:41,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756433_15609, type=LAST_IN_PIPELINE terminating
2025-07-19 23:54:47,071 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756433_15609 replica FinalizedReplica, blk_1073756433_15609, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756433 for deletion
2025-07-19 23:54:47,072 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756433_15609 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756433
2025-07-19 23:57:51,412 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756436_15612 src: /192.168.158.9:47998 dest: /192.168.158.4:9866
2025-07-19 23:57:51,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47998, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_900053903_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756436_15612, duration(ns): 21612911
2025-07-19 23:57:51,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756436_15612, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-19 23:57:56,079 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756436_15612 replica FinalizedReplica, blk_1073756436_15612, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756436 for deletion
2025-07-19 23:57:56,080 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756436_15612 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756436
2025-07-19 23:58:51,417 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756437_15613 src: /192.168.158.8:52710 dest: /192.168.158.4:9866
2025-07-19 23:58:51,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52710, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_266602213_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756437_15613, duration(ns): 21404964
2025-07-19 23:58:51,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756437_15613, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-19 23:58:56,082 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756437_15613 replica FinalizedReplica, blk_1073756437_15613, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756437 for deletion
2025-07-19 23:58:56,083 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756437_15613 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756437
2025-07-20 00:00:51,422 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756439_15615 src: /192.168.158.1:35122 dest: /192.168.158.4:9866
2025-07-20 00:00:51,455 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35122, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_395024408_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756439_15615, duration(ns): 23572515
2025-07-20 00:00:51,455 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756439_15615, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-20 00:00:56,086 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756439_15615 replica FinalizedReplica, blk_1073756439_15615, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756439 for deletion
2025-07-20 00:00:56,087 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756439_15615 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756439
2025-07-20 00:01:56,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756440_15616 src: /192.168.158.6:56512 dest: /192.168.158.4:9866
2025-07-20 00:01:56,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56512, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1290598320_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756440_15616, duration(ns): 21675038
2025-07-20 00:01:56,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756440_15616, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-20 00:01:59,089 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756440_15616 replica FinalizedReplica, blk_1073756440_15616, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756440 for deletion
2025-07-20 00:01:59,090 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756440_15616 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756440
2025-07-20 00:04:56,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756443_15619 src: /192.168.158.8:60620 dest: /192.168.158.4:9866
2025-07-20 00:04:56,461 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60620, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1498748481_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756443_15619, duration(ns): 17424789
2025-07-20 00:04:56,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756443_15619, type=LAST_IN_PIPELINE terminating
2025-07-20 00:04:59,097 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756443_15619 replica FinalizedReplica, blk_1073756443_15619, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756443 for deletion
2025-07-20 00:04:59,098 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756443_15619 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756443
2025-07-20 00:05:56,458 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756444_15620 src: /192.168.158.8:58004 dest: /192.168.158.4:9866
2025-07-20 00:05:56,478 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58004, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-375873834_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756444_15620, duration(ns): 17974538
2025-07-20 00:05:56,478 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756444_15620, type=LAST_IN_PIPELINE terminating
2025-07-20 00:06:02,098 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756444_15620 replica FinalizedReplica, blk_1073756444_15620, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756444 for deletion
2025-07-20 00:06:02,100 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756444_15620 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756444
2025-07-20 00:08:01,466 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756446_15622 src: /192.168.158.5:47730 dest: /192.168.158.4:9866
2025-07-20 00:08:01,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47730, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1579769315_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756446_15622, duration(ns): 20643991
2025-07-20 00:08:01,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756446_15622, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-20 00:08:08,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756446_15622 replica FinalizedReplica, blk_1073756446_15622, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756446 for deletion
2025-07-20 00:08:08,104 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756446_15622 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756446
2025-07-20 00:13:06,463 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756451_15627 src: /192.168.158.8:57344 dest: /192.168.158.4:9866
2025-07-20 00:13:06,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57344, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1283513699_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756451_15627, duration(ns): 20519371
2025-07-20 00:13:06,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756451_15627, type=LAST_IN_PIPELINE terminating
2025-07-20 00:13:11,111 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756451_15627 replica FinalizedReplica, blk_1073756451_15627, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756451 for deletion
2025-07-20 00:13:11,112 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756451_15627 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756451
2025-07-20 00:14:06,461 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756452_15628 src: /192.168.158.5:38496 dest: /192.168.158.4:9866
2025-07-20 00:14:06,488 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38496, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1077475611_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756452_15628, duration(ns): 21355309
2025-07-20 00:14:06,489 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756452_15628, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-20 00:14:11,115 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756452_15628 replica FinalizedReplica, blk_1073756452_15628, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756452 for deletion
2025-07-20 00:14:11,116 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756452_15628 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756452
2025-07-20 00:15:06,459 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756453_15629 src: /192.168.158.1:56922 dest: /192.168.158.4:9866
2025-07-20 00:15:06,496 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56922, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-786226747_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756453_15629, duration(ns): 26905171
2025-07-20 00:15:06,496 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756453_15629, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-20 00:15:14,116 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756453_15629 replica FinalizedReplica, blk_1073756453_15629, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756453 for deletion
2025-07-20 00:15:14,117 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756453_15629 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756453
2025-07-20 00:22:11,474 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756460_15636 src: /192.168.158.1:51824 dest: /192.168.158.4:9866
2025-07-20 00:22:11,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51824, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-245094576_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756460_15636, duration(ns): 30291043
2025-07-20 00:22:11,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756460_15636, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-20 00:22:14,130 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756460_15636 replica FinalizedReplica, blk_1073756460_15636, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756460 for deletion
2025-07-20 00:22:14,131 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756460_15636 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756460
2025-07-20 00:24:11,458 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756462_15638 src: /192.168.158.1:57166 dest: /192.168.158.4:9866
2025-07-20 00:24:11,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57166, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1386408312_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756462_15638, duration(ns): 24097815
2025-07-20 00:24:11,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756462_15638, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-20 00:24:14,134 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756462_15638 replica FinalizedReplica, blk_1073756462_15638, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756462 for deletion
2025-07-20 00:24:14,135 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756462_15638 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756462
2025-07-20 00:26:16,475 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756464_15640 src: /192.168.158.5:41104 dest: /192.168.158.4:9866
2025-07-20 00:26:16,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41104, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_440580002_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756464_15640, duration(ns): 16820532
2025-07-20 00:26:16,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756464_15640, type=LAST_IN_PIPELINE terminating
2025-07-20 00:26:20,137 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756464_15640 replica FinalizedReplica, blk_1073756464_15640, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756464 for deletion
2025-07-20 00:26:20,138 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756464_15640 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756464
2025-07-20 00:30:16,511 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756468_15644 src: /192.168.158.1:41882 dest: /192.168.158.4:9866
2025-07-20 00:30:16,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41882, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1101537399_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756468_15644, duration(ns): 23297303
2025-07-20 00:30:16,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756468_15644, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-20 00:30:23,148 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756468_15644 replica FinalizedReplica, blk_1073756468_15644, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756468 for deletion
2025-07-20 00:30:23,149 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756468_15644 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756468
2025-07-20 00:33:21,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756471_15647 src: /192.168.158.6:53440 dest: /192.168.158.4:9866
2025-07-20 00:33:21,524 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53440, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-144207774_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756471_15647, duration(ns): 20599928
2025-07-20 00:33:21,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756471_15647, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 00:33:26,149 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756471_15647 replica FinalizedReplica, blk_1073756471_15647, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756471 for deletion
2025-07-20 00:33:26,151 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756471_15647 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756471
2025-07-20 00:38:26,478 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756476_15652 src: /192.168.158.5:55780 dest: /192.168.158.4:9866
2025-07-20 00:38:26,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-30865353_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756476_15652, duration(ns): 20270642
2025-07-20 00:38:26,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756476_15652, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-20 00:38:29,165 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756476_15652 replica FinalizedReplica, blk_1073756476_15652, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756476 for deletion
2025-07-20 00:38:29,166 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756476_15652 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756476
2025-07-20 00:39:26,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756477_15653 src: /192.168.158.8:40976 dest: /192.168.158.4:9866
2025-07-20 00:39:26,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40976, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1417727981_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756477_15653, duration(ns): 17734084
2025-07-20 00:39:26,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756477_15653, type=LAST_IN_PIPELINE terminating
2025-07-20 00:39:29,165 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756477_15653 replica FinalizedReplica, blk_1073756477_15653, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756477 for deletion
2025-07-20 00:39:29,166 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756477_15653 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756477
2025-07-20 00:40:26,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756478_15654 src: /192.168.158.9:57284 dest: /192.168.158.4:9866
2025-07-20 00:40:26,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57284, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-802752195_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756478_15654, duration(ns): 17050106
2025-07-20 00:40:26,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756478_15654, type=LAST_IN_PIPELINE terminating
2025-07-20 00:40:29,170 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756478_15654 replica FinalizedReplica, blk_1073756478_15654, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756478 for deletion
2025-07-20 00:40:29,171 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756478_15654 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756478
2025-07-20 00:41:26,487 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756479_15655 src: /192.168.158.8:60772 dest: /192.168.158.4:9866
2025-07-20 00:41:26,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60772, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-385883328_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756479_15655, duration(ns): 23209620
2025-07-20 00:41:26,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756479_15655, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-20 00:41:32,171 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756479_15655 replica FinalizedReplica, blk_1073756479_15655, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756479 for deletion
2025-07-20 00:41:32,172 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756479_15655 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756479
2025-07-20 00:44:31,546 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756482_15658 src: /192.168.158.8:45100 dest: /192.168.158.4:9866
2025-07-20 00:44:31,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45100, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194099093_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756482_15658, duration(ns): 15587729
2025-07-20 00:44:31,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756482_15658, type=LAST_IN_PIPELINE terminating
2025-07-20 00:44:38,175 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756482_15658 replica FinalizedReplica, blk_1073756482_15658, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756482 for deletion
2025-07-20 00:44:38,176 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756482_15658 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756482
2025-07-20 00:48:31,530 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756486_15662 src: /192.168.158.1:60398 dest: /192.168.158.4:9866
2025-07-20 00:48:31,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60398, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-401628328_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756486_15662, duration(ns): 24052751
2025-07-20 00:48:31,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756486_15662, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-20 00:48:35,180 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756486_15662 replica FinalizedReplica, blk_1073756486_15662, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756486 for deletion 2025-07-20 00:48:35,181 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756486_15662 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756486 2025-07-20 00:49:31,546 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756487_15663 src: /192.168.158.9:54866 dest: /192.168.158.4:9866 2025-07-20 00:49:31,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54866, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1909733519_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756487_15663, duration(ns): 20813680 2025-07-20 00:49:31,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756487_15663, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-20 00:49:35,183 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756487_15663 replica FinalizedReplica, 
blk_1073756487_15663, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756487 for deletion 2025-07-20 00:49:35,184 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756487_15663 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756487 2025-07-20 00:50:31,524 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756488_15664 src: /192.168.158.1:48806 dest: /192.168.158.4:9866 2025-07-20 00:50:31,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48806, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1113532769_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756488_15664, duration(ns): 30524055 2025-07-20 00:50:31,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756488_15664, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-20 00:50:38,183 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756488_15664 replica FinalizedReplica, blk_1073756488_15664, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756488 for deletion 2025-07-20 00:50:38,184 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073756488_15664 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756488 2025-07-20 00:51:31,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756489_15665 src: /192.168.158.5:36050 dest: /192.168.158.4:9866 2025-07-20 00:51:31,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36050, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1936140303_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756489_15665, duration(ns): 17314821 2025-07-20 00:51:31,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756489_15665, type=LAST_IN_PIPELINE terminating 2025-07-20 00:51:35,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756489_15665 replica FinalizedReplica, blk_1073756489_15665, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756489 for deletion 2025-07-20 00:51:35,187 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756489_15665 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756489 2025-07-20 00:55:36,524 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756493_15669 src: /192.168.158.1:43452 dest: /192.168.158.4:9866 2025-07-20 00:55:36,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: 
src: /192.168.158.1:43452, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1399838391_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756493_15669, duration(ns): 24847276 2025-07-20 00:55:36,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756493_15669, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-20 00:55:44,195 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756493_15669 replica FinalizedReplica, blk_1073756493_15669, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756493 for deletion 2025-07-20 00:55:44,197 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756493_15669 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756493 2025-07-20 00:58:41,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756496_15672 src: /192.168.158.1:60836 dest: /192.168.158.4:9866 2025-07-20 00:58:41,557 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60836, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_254324668_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756496_15672, duration(ns): 27513112 2025-07-20 00:58:41,557 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073756496_15672, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-20 00:58:44,204 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756496_15672 replica FinalizedReplica, blk_1073756496_15672, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756496 for deletion 2025-07-20 00:58:44,205 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756496_15672 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756496 2025-07-20 00:59:41,546 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756497_15673 src: /192.168.158.7:37782 dest: /192.168.158.4:9866 2025-07-20 00:59:41,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37782, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_147017082_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756497_15673, duration(ns): 16728389 2025-07-20 00:59:41,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756497_15673, type=LAST_IN_PIPELINE terminating 2025-07-20 00:59:47,204 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756497_15673 replica FinalizedReplica, blk_1073756497_15673, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756497 for deletion 2025-07-20 00:59:47,205 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756497_15673 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756497 2025-07-20 01:00:41,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756498_15674 src: /192.168.158.6:42274 dest: /192.168.158.4:9866 2025-07-20 01:00:41,552 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42274, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-647692490_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756498_15674, duration(ns): 17285889 2025-07-20 01:00:41,552 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756498_15674, type=LAST_IN_PIPELINE terminating 2025-07-20 01:00:44,205 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756498_15674 replica FinalizedReplica, blk_1073756498_15674, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756498 for deletion 2025-07-20 01:00:44,206 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756498_15674 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756498 2025-07-20 01:01:41,542 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756499_15675 src: /192.168.158.1:44712 dest: /192.168.158.4:9866 2025-07-20 01:01:41,576 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-446668977_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756499_15675, duration(ns): 24012711 2025-07-20 01:01:41,577 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756499_15675, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-20 01:01:47,207 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756499_15675 replica FinalizedReplica, blk_1073756499_15675, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756499 for deletion 2025-07-20 01:01:47,209 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756499_15675 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756499 2025-07-20 01:04:41,548 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756502_15678 src: /192.168.158.7:42354 dest: /192.168.158.4:9866 2025-07-20 01:04:41,576 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42354, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1991779069_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756502_15678, duration(ns): 21652592 2025-07-20 01:04:41,577 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756502_15678, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-20 01:04:44,210 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756502_15678 replica FinalizedReplica, blk_1073756502_15678, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756502 for deletion 2025-07-20 01:04:44,211 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756502_15678 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756502 2025-07-20 01:06:41,546 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756504_15680 src: /192.168.158.1:51812 dest: /192.168.158.4:9866 2025-07-20 01:06:41,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51812, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1807808263_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756504_15680, duration(ns): 24737386 2025-07-20 01:06:41,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756504_15680, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-20 01:06:47,214 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756504_15680 replica FinalizedReplica, blk_1073756504_15680, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756504 for deletion 2025-07-20 01:06:47,215 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756504_15680 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756504 2025-07-20 01:07:41,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756505_15681 src: /192.168.158.1:47862 dest: /192.168.158.4:9866 2025-07-20 01:07:41,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47862, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1501676374_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756505_15681, duration(ns): 25595469 2025-07-20 01:07:41,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756505_15681, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-20 01:07:44,215 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756505_15681 replica FinalizedReplica, blk_1073756505_15681, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756505 for deletion 
2025-07-20 01:07:44,216 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756505_15681 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756505 2025-07-20 01:08:46,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756506_15682 src: /192.168.158.1:52392 dest: /192.168.158.4:9866 2025-07-20 01:08:46,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52392, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1867295715_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756506_15682, duration(ns): 23584680 2025-07-20 01:08:46,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756506_15682, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-20 01:08:50,218 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756506_15682 replica FinalizedReplica, blk_1073756506_15682, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756506 for deletion 2025-07-20 01:08:50,219 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756506_15682 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756506 2025-07-20 01:09:46,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073756507_15683 src: /192.168.158.5:42938 dest: /192.168.158.4:9866 2025-07-20 01:09:46,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42938, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-375281791_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756507_15683, duration(ns): 20616711 2025-07-20 01:09:46,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756507_15683, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-20 01:09:50,219 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756507_15683 replica FinalizedReplica, blk_1073756507_15683, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756507 for deletion 2025-07-20 01:09:50,221 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756507_15683 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756507 2025-07-20 01:10:46,553 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756508_15684 src: /192.168.158.9:45574 dest: /192.168.158.4:9866 2025-07-20 01:10:46,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45574, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-961849804_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073756508_15684, duration(ns): 20095365 2025-07-20 01:10:46,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756508_15684, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-20 01:10:50,221 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756508_15684 replica FinalizedReplica, blk_1073756508_15684, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756508 for deletion 2025-07-20 01:10:50,223 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756508_15684 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756508 2025-07-20 01:11:46,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756509_15685 src: /192.168.158.7:36950 dest: /192.168.158.4:9866 2025-07-20 01:11:46,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36950, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1780617942_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756509_15685, duration(ns): 16216596 2025-07-20 01:11:46,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756509_15685, type=LAST_IN_PIPELINE terminating 2025-07-20 01:11:50,223 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756509_15685 replica 
FinalizedReplica, blk_1073756509_15685, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756509 for deletion 2025-07-20 01:11:50,225 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756509_15685 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756509 2025-07-20 01:13:51,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756511_15687 src: /192.168.158.1:43106 dest: /192.168.158.4:9866 2025-07-20 01:13:51,599 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43106, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1894709710_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756511_15687, duration(ns): 24613025 2025-07-20 01:13:51,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756511_15687, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-20 01:13:59,233 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756511_15687 replica FinalizedReplica, blk_1073756511_15687, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756511 for deletion 2025-07-20 01:13:59,234 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073756511_15687 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756511 2025-07-20 01:14:51,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756512_15688 src: /192.168.158.1:42044 dest: /192.168.158.4:9866 2025-07-20 01:14:51,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42044, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_481567698_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756512_15688, duration(ns): 23469483 2025-07-20 01:14:51,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756512_15688, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-20 01:14:56,236 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756512_15688 replica FinalizedReplica, blk_1073756512_15688, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756512 for deletion 2025-07-20 01:14:56,237 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756512_15688 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756512 2025-07-20 01:16:56,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756514_15690 src: /192.168.158.1:46978 dest: /192.168.158.4:9866 2025-07-20 01:16:56,607 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46978, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1631927172_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756514_15690, duration(ns): 25065092
2025-07-20 01:16:56,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756514_15690, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-20 01:16:59,239 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756514_15690 replica FinalizedReplica, blk_1073756514_15690, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756514 for deletion
2025-07-20 01:16:59,240 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756514_15690 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756514
2025-07-20 01:18:01,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756515_15691 src: /192.168.158.5:39082 dest: /192.168.158.4:9866
2025-07-20 01:18:01,594 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39082, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1212163928_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756515_15691, duration(ns): 20405925
2025-07-20 01:18:01,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756515_15691, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-20 01:18:05,239 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756515_15691 replica FinalizedReplica, blk_1073756515_15691, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756515 for deletion
2025-07-20 01:18:05,240 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756515_15691 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756515
2025-07-20 01:19:01,576 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756516_15692 src: /192.168.158.6:46046 dest: /192.168.158.4:9866
2025-07-20 01:19:01,604 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46046, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-596364686_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756516_15692, duration(ns): 21551188
2025-07-20 01:19:01,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756516_15692, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-20 01:19:08,239 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756516_15692 replica FinalizedReplica, blk_1073756516_15692, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756516 for deletion
2025-07-20 01:19:08,240 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756516_15692 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756516
2025-07-20 01:21:01,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756518_15694 src: /192.168.158.5:60504 dest: /192.168.158.4:9866
2025-07-20 01:21:01,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60504, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_805424456_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756518_15694, duration(ns): 23374627
2025-07-20 01:21:01,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756518_15694, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-20 01:21:08,243 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756518_15694 replica FinalizedReplica, blk_1073756518_15694, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756518 for deletion
2025-07-20 01:21:08,244 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756518_15694 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756518
2025-07-20 01:23:01,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756520_15696 src: /192.168.158.5:33124 dest: /192.168.158.4:9866
2025-07-20 01:23:01,609 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33124, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1927975201_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756520_15696, duration(ns): 20477278
2025-07-20 01:23:01,609 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756520_15696, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 01:23:05,247 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756520_15696 replica FinalizedReplica, blk_1073756520_15696, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756520 for deletion
2025-07-20 01:23:05,248 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756520_15696 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756520
2025-07-20 01:24:01,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756521_15697 src: /192.168.158.8:45740 dest: /192.168.158.4:9866
2025-07-20 01:24:01,610 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45740, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1017929184_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756521_15697, duration(ns): 20709774
2025-07-20 01:24:01,610 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756521_15697, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-20 01:24:05,248 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756521_15697 replica FinalizedReplica, blk_1073756521_15697, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756521 for deletion
2025-07-20 01:24:05,250 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756521_15697 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756521
2025-07-20 01:27:01,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756524_15700 src: /192.168.158.8:51084 dest: /192.168.158.4:9866
2025-07-20 01:27:01,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51084, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-310000509_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756524_15700, duration(ns): 20374220
2025-07-20 01:27:01,617 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756524_15700, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-20 01:27:08,255 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756524_15700 replica FinalizedReplica, blk_1073756524_15700, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756524 for deletion
2025-07-20 01:27:08,256 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756524_15700 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756524
2025-07-20 01:28:06,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756525_15701 src: /192.168.158.1:36652 dest: /192.168.158.4:9866
2025-07-20 01:28:06,611 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36652, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1657559722_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756525_15701, duration(ns): 23359434
2025-07-20 01:28:06,611 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756525_15701, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-20 01:28:11,258 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756525_15701 replica FinalizedReplica, blk_1073756525_15701, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756525 for deletion
2025-07-20 01:28:11,259 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756525_15701 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756525
2025-07-20 01:29:06,594 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756526_15702 src: /192.168.158.8:42006 dest: /192.168.158.4:9866
2025-07-20 01:29:06,619 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42006, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1456492518_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756526_15702, duration(ns): 19012219
2025-07-20 01:29:06,619 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756526_15702, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-20 01:29:14,260 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756526_15702 replica FinalizedReplica, blk_1073756526_15702, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756526 for deletion
2025-07-20 01:29:14,261 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756526_15702 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756526
2025-07-20 01:35:21,619 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756532_15708 src: /192.168.158.1:51124 dest: /192.168.158.4:9866
2025-07-20 01:35:21,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51124, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-153547923_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756532_15708, duration(ns): 24415163
2025-07-20 01:35:21,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756532_15708, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-20 01:35:26,273 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756532_15708 replica FinalizedReplica, blk_1073756532_15708, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756532 for deletion
2025-07-20 01:35:26,275 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756532_15708 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756532
2025-07-20 01:36:21,599 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756533_15709 src: /192.168.158.6:36050 dest: /192.168.158.4:9866
2025-07-20 01:36:21,619 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36050, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1928405308_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756533_15709, duration(ns): 17454975
2025-07-20 01:36:21,619 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756533_15709, type=LAST_IN_PIPELINE terminating
2025-07-20 01:36:26,275 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756533_15709 replica FinalizedReplica, blk_1073756533_15709, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756533 for deletion
2025-07-20 01:36:26,277 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756533_15709 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756533
2025-07-20 01:42:36,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756539_15715 src: /192.168.158.5:52848 dest: /192.168.158.4:9866
2025-07-20 01:42:36,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52848, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-266168087_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756539_15715, duration(ns): 14622531
2025-07-20 01:42:36,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756539_15715, type=LAST_IN_PIPELINE terminating
2025-07-20 01:42:41,287 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756539_15715 replica FinalizedReplica, blk_1073756539_15715, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756539 for deletion
2025-07-20 01:42:41,288 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756539_15715 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756539
2025-07-20 01:45:41,609 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756542_15718 src: /192.168.158.6:54750 dest: /192.168.158.4:9866
2025-07-20 01:45:41,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54750, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-947756895_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756542_15718, duration(ns): 17704261
2025-07-20 01:45:41,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756542_15718, type=LAST_IN_PIPELINE terminating
2025-07-20 01:45:44,291 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756542_15718 replica FinalizedReplica, blk_1073756542_15718, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756542 for deletion
2025-07-20 01:45:44,292 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756542_15718 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756542
2025-07-20 01:47:41,624 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756544_15720 src: /192.168.158.8:46572 dest: /192.168.158.4:9866
2025-07-20 01:47:41,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46572, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-445049964_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756544_15720, duration(ns): 19966397
2025-07-20 01:47:41,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756544_15720, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 01:47:44,296 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756544_15720 replica FinalizedReplica, blk_1073756544_15720, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756544 for deletion
2025-07-20 01:47:44,298 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756544_15720 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756544
2025-07-20 01:49:41,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756546_15722 src: /192.168.158.9:36514 dest: /192.168.158.4:9866
2025-07-20 01:49:41,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36514, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2012021679_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756546_15722, duration(ns): 18943901
2025-07-20 01:49:41,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756546_15722, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 01:49:44,303 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756546_15722 replica FinalizedReplica, blk_1073756546_15722, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756546 for deletion
2025-07-20 01:49:44,304 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756546_15722 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756546
2025-07-20 01:50:41,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756547_15723 src: /192.168.158.1:60306 dest: /192.168.158.4:9866
2025-07-20 01:50:41,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60306, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1702416410_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756547_15723, duration(ns): 22747461
2025-07-20 01:50:41,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756547_15723, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-20 01:50:47,304 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756547_15723 replica FinalizedReplica, blk_1073756547_15723, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756547 for deletion
2025-07-20 01:50:47,305 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756547_15723 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756547
2025-07-20 01:53:46,628 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756550_15726 src: /192.168.158.1:38406 dest: /192.168.158.4:9866
2025-07-20 01:53:46,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38406, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_40266393_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756550_15726, duration(ns): 27265904
2025-07-20 01:53:46,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756550_15726, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-20 01:53:53,311 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756550_15726 replica FinalizedReplica, blk_1073756550_15726, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756550 for deletion
2025-07-20 01:53:53,312 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756550_15726 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756550
2025-07-20 01:54:51,642 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756551_15727 src: /192.168.158.9:44466 dest: /192.168.158.4:9866
2025-07-20 01:54:51,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44466, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_779936843_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756551_15727, duration(ns): 18587838
2025-07-20 01:54:51,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756551_15727, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-20 01:54:56,313 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756551_15727 replica FinalizedReplica, blk_1073756551_15727, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756551 for deletion
2025-07-20 01:54:56,315 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756551_15727 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756551
2025-07-20 01:55:51,656 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756552_15728 src: /192.168.158.6:48304 dest: /192.168.158.4:9866
2025-07-20 01:55:51,676 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48304, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1857569680_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756552_15728, duration(ns): 17886474
2025-07-20 01:55:51,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756552_15728, type=LAST_IN_PIPELINE terminating
2025-07-20 01:55:56,318 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756552_15728 replica FinalizedReplica, blk_1073756552_15728, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756552 for deletion
2025-07-20 01:55:56,319 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756552_15728 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756552
2025-07-20 01:57:56,624 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756554_15730 src: /192.168.158.8:54998 dest: /192.168.158.4:9866
2025-07-20 01:57:56,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54998, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1828563072_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756554_15730, duration(ns): 19167358
2025-07-20 01:57:56,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756554_15730, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-20 01:57:59,323 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756554_15730 replica FinalizedReplica, blk_1073756554_15730, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756554 for deletion
2025-07-20 01:57:59,324 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756554_15730 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756554
2025-07-20 02:00:01,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756556_15732 src: /192.168.158.1:52968 dest: /192.168.158.4:9866
2025-07-20 02:00:01,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52968, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2078712724_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756556_15732, duration(ns): 23688067
2025-07-20 02:00:01,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756556_15732, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-20 02:00:05,327 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756556_15732 replica FinalizedReplica, blk_1073756556_15732, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756556 for deletion
2025-07-20 02:00:05,328 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756556_15732 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756556
2025-07-20 02:04:06,647 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756560_15736 src: /192.168.158.5:56112 dest: /192.168.158.4:9866
2025-07-20 02:04:06,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56112, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1725844930_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756560_15736, duration(ns): 16398095
2025-07-20 02:04:06,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756560_15736, type=LAST_IN_PIPELINE terminating
2025-07-20 02:04:11,339 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756560_15736 replica FinalizedReplica, blk_1073756560_15736, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756560 for deletion
2025-07-20 02:04:11,340 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756560_15736 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756560
2025-07-20 02:05:06,647 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756561_15737 src: /192.168.158.9:46318 dest: /192.168.158.4:9866
2025-07-20 02:05:06,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46318, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-778366878_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756561_15737, duration(ns): 20146819
2025-07-20 02:05:06,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756561_15737, type=LAST_IN_PIPELINE terminating
2025-07-20 02:05:11,340 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756561_15737 replica FinalizedReplica, blk_1073756561_15737, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756561 for deletion
2025-07-20 02:05:11,342 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756561_15737 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756561
2025-07-20 02:06:11,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756562_15738 src: /192.168.158.5:44706 dest: /192.168.158.4:9866
2025-07-20 02:06:11,672 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44706, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_317686153_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756562_15738, duration(ns): 16793860
2025-07-20 02:06:11,672 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756562_15738, type=LAST_IN_PIPELINE terminating
2025-07-20 02:06:14,340 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756562_15738 replica FinalizedReplica, blk_1073756562_15738, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756562 for deletion
2025-07-20 02:06:14,341 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756562_15738 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756562
2025-07-20 02:11:16,655 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756567_15743 src: /192.168.158.9:39860 dest: /192.168.158.4:9866
2025-07-20 02:11:16,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39860, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_322199254_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756567_15743, duration(ns): 16961684
2025-07-20 02:11:16,675 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756567_15743, type=LAST_IN_PIPELINE terminating
2025-07-20 02:11:20,352 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756567_15743 replica FinalizedReplica, blk_1073756567_15743, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756567 for deletion
2025-07-20 02:11:20,354 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756567_15743 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756567
2025-07-20 02:13:16,658 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756569_15745 src: /192.168.158.5:36696 dest: /192.168.158.4:9866
2025-07-20 02:13:16,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36696, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1943733506_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756569_15745, duration(ns): 18321625
2025-07-20 02:13:16,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756569_15745, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-20 02:13:20,359 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756569_15745 replica FinalizedReplica, blk_1073756569_15745, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756569 for deletion
2025-07-20 02:13:20,360 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756569_15745 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756569
2025-07-20 02:14:21,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756570_15746 src: /192.168.158.1:49408 dest: /192.168.158.4:9866
2025-07-20 02:14:21,698 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49408, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1354661602_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756570_15746, duration(ns): 24354335
2025-07-20 02:14:21,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756570_15746, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-20 02:14:26,361 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756570_15746 replica FinalizedReplica, blk_1073756570_15746, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756570 for deletion
2025-07-20 02:14:26,362 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073756570_15746 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756570 2025-07-20 02:17:31,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756573_15749 src: /192.168.158.1:34660 dest: /192.168.158.4:9866 2025-07-20 02:17:31,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34660, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_182322649_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756573_15749, duration(ns): 25594337 2025-07-20 02:17:31,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756573_15749, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-20 02:17:35,365 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756573_15749 replica FinalizedReplica, blk_1073756573_15749, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756573 for deletion 2025-07-20 02:17:35,366 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756573_15749 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756573 2025-07-20 02:18:31,673 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756574_15750 src: /192.168.158.8:41930 dest: /192.168.158.4:9866 2025-07-20 02:18:31,701 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41930, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_518319295_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756574_15750, duration(ns): 22480142 2025-07-20 02:18:31,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756574_15750, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-20 02:18:38,368 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756574_15750 replica FinalizedReplica, blk_1073756574_15750, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756574 for deletion 2025-07-20 02:18:38,369 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756574_15750 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756574 2025-07-20 02:20:36,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756576_15752 src: /192.168.158.5:46594 dest: /192.168.158.4:9866 2025-07-20 02:20:36,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46594, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1228451105_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756576_15752, duration(ns): 20128084 2025-07-20 02:20:36,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073756576_15752, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-20 02:20:41,371 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756576_15752 replica FinalizedReplica, blk_1073756576_15752, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756576 for deletion 2025-07-20 02:20:41,372 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756576_15752 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756576 2025-07-20 02:21:36,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756577_15753 src: /192.168.158.7:59430 dest: /192.168.158.4:9866 2025-07-20 02:21:36,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59430, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-190424859_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756577_15753, duration(ns): 18194429 2025-07-20 02:21:36,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756577_15753, type=LAST_IN_PIPELINE terminating 2025-07-20 02:21:44,376 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756577_15753 replica FinalizedReplica, blk_1073756577_15753, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756577 for deletion 2025-07-20 02:21:44,377 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756577_15753 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756577 2025-07-20 02:24:41,710 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756580_15756 src: /192.168.158.9:36462 dest: /192.168.158.4:9866 2025-07-20 02:24:41,733 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36462, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1248767482_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756580_15756, duration(ns): 19373169 2025-07-20 02:24:41,733 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756580_15756, type=LAST_IN_PIPELINE terminating 2025-07-20 02:24:47,381 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756580_15756 replica FinalizedReplica, blk_1073756580_15756, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756580 for deletion 2025-07-20 02:24:47,383 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756580_15756 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756580 2025-07-20 02:26:46,687 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756582_15758 src: /192.168.158.1:44486 dest: /192.168.158.4:9866 2025-07-20 02:26:46,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44486, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_183487701_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756582_15758, duration(ns): 25095695 2025-07-20 02:26:46,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756582_15758, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-20 02:26:53,384 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756582_15758 replica FinalizedReplica, blk_1073756582_15758, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756582 for deletion 2025-07-20 02:26:53,385 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756582_15758 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756582 2025-07-20 02:27:51,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756583_15759 src: /192.168.158.6:35056 dest: /192.168.158.4:9866 2025-07-20 02:27:51,851 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35056, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1249837039_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756583_15759, duration(ns): 16776235 2025-07-20 02:27:51,851 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756583_15759, type=LAST_IN_PIPELINE terminating 2025-07-20 02:27:56,387 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756583_15759 replica FinalizedReplica, blk_1073756583_15759, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756583 for deletion 2025-07-20 02:27:56,388 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756583_15759 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756583 2025-07-20 02:29:51,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756585_15761 src: /192.168.158.6:41714 dest: /192.168.158.4:9866 2025-07-20 02:29:51,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41714, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1702605256_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756585_15761, duration(ns): 19874490 2025-07-20 02:29:51,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756585_15761, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-20 02:29:56,392 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073756585_15761 replica FinalizedReplica, blk_1073756585_15761, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756585 for deletion 2025-07-20 02:29:56,393 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756585_15761 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756585 2025-07-20 02:32:51,681 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756588_15764 src: /192.168.158.8:44372 dest: /192.168.158.4:9866 2025-07-20 02:32:51,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44372, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_657892756_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756588_15764, duration(ns): 21920312 2025-07-20 02:32:51,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756588_15764, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-20 02:32:56,397 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756588_15764 replica FinalizedReplica, blk_1073756588_15764, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756588 for deletion 2025-07-20 02:32:56,398 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756588_15764 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756588 2025-07-20 02:33:51,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756589_15765 src: /192.168.158.8:54746 dest: /192.168.158.4:9866 2025-07-20 02:33:51,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54746, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-657372034_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756589_15765, duration(ns): 20281007 2025-07-20 02:33:51,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756589_15765, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-20 02:33:56,398 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756589_15765 replica FinalizedReplica, blk_1073756589_15765, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756589 for deletion 2025-07-20 02:33:56,400 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756589_15765 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756589 2025-07-20 02:34:51,691 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756590_15766 src: 
/192.168.158.8:39254 dest: /192.168.158.4:9866 2025-07-20 02:34:51,711 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39254, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1369282474_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756590_15766, duration(ns): 18216465 2025-07-20 02:34:51,711 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756590_15766, type=LAST_IN_PIPELINE terminating 2025-07-20 02:34:56,400 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756590_15766 replica FinalizedReplica, blk_1073756590_15766, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756590 for deletion 2025-07-20 02:34:56,402 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756590_15766 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756590 2025-07-20 02:36:51,686 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756592_15768 src: /192.168.158.7:50352 dest: /192.168.158.4:9866 2025-07-20 02:36:51,712 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50352, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_659913870_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756592_15768, duration(ns): 20333773 2025-07-20 02:36:51,713 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756592_15768, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-20 02:36:59,401 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756592_15768 replica FinalizedReplica, blk_1073756592_15768, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756592 for deletion 2025-07-20 02:36:59,402 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756592_15768 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756592 2025-07-20 02:37:51,691 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756593_15769 src: /192.168.158.6:59554 dest: /192.168.158.4:9866 2025-07-20 02:37:51,710 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59554, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1932667571_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756593_15769, duration(ns): 16538502 2025-07-20 02:37:51,710 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756593_15769, type=LAST_IN_PIPELINE terminating 2025-07-20 02:37:56,404 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756593_15769 replica FinalizedReplica, blk_1073756593_15769, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 
getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756593 for deletion 2025-07-20 02:37:56,405 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756593_15769 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756593 2025-07-20 02:38:56,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756594_15770 src: /192.168.158.7:54238 dest: /192.168.158.4:9866 2025-07-20 02:38:56,720 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54238, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-808140489_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756594_15770, duration(ns): 22078113 2025-07-20 02:38:56,720 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756594_15770, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-20 02:39:02,407 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756594_15770 replica FinalizedReplica, blk_1073756594_15770, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756594 for deletion 2025-07-20 02:39:02,408 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756594_15770 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756594 2025-07-20 02:41:06,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756596_15772 src: /192.168.158.1:55494 dest: /192.168.158.4:9866 2025-07-20 02:41:06,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55494, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1383282742_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756596_15772, duration(ns): 23671975 2025-07-20 02:41:06,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756596_15772, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-20 02:41:11,409 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756596_15772 replica FinalizedReplica, blk_1073756596_15772, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756596 for deletion 2025-07-20 02:41:11,410 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756596_15772 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756596 2025-07-20 02:42:06,708 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756597_15773 src: /192.168.158.8:38822 dest: /192.168.158.4:9866 2025-07-20 02:42:06,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: 
src: /192.168.158.8:38822, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1896608781_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756597_15773, duration(ns): 23431364 2025-07-20 02:42:06,739 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756597_15773, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-20 02:42:11,410 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756597_15773 replica FinalizedReplica, blk_1073756597_15773, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756597 for deletion 2025-07-20 02:42:11,413 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756597_15773 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756597 2025-07-20 02:43:06,703 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756598_15774 src: /192.168.158.9:33704 dest: /192.168.158.4:9866 2025-07-20 02:43:06,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33704, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1658359151_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756598_15774, duration(ns): 17425114 2025-07-20 02:43:06,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073756598_15774, type=LAST_IN_PIPELINE terminating 2025-07-20 02:43:11,412 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756598_15774 replica FinalizedReplica, blk_1073756598_15774, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756598 for deletion 2025-07-20 02:43:11,413 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756598_15774 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756598 2025-07-20 02:44:06,719 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756599_15775 src: /192.168.158.5:42880 dest: /192.168.158.4:9866 2025-07-20 02:44:06,746 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42880, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-769580313_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756599_15775, duration(ns): 21417209 2025-07-20 02:44:06,746 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756599_15775, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-20 02:44:14,412 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756599_15775 replica FinalizedReplica, blk_1073756599_15775, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756599 for deletion 2025-07-20 02:44:14,413 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756599_15775 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756599 2025-07-20 02:45:06,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756600_15776 src: /192.168.158.1:52822 dest: /192.168.158.4:9866 2025-07-20 02:45:06,733 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52822, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1572001801_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756600_15776, duration(ns): 23336023 2025-07-20 02:45:06,734 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756600_15776, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-20 02:45:11,411 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756600_15776 replica FinalizedReplica, blk_1073756600_15776, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756600 for deletion 2025-07-20 02:45:11,413 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756600_15776 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756600 2025-07-20 02:46:06,719 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756601_15777 src: /192.168.158.5:46990 dest: /192.168.158.4:9866 2025-07-20 02:46:06,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46990, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_162510880_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756601_15777, duration(ns): 23315246 2025-07-20 02:46:06,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756601_15777, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-20 02:46:11,415 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756601_15777 replica FinalizedReplica, blk_1073756601_15777, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756601 for deletion 2025-07-20 02:46:11,416 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756601_15777 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756601 2025-07-20 02:47:06,708 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756602_15778 src: /192.168.158.6:57142 dest: /192.168.158.4:9866 2025-07-20 02:47:06,736 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.6:57142, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1900946409_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756602_15778, duration(ns): 22123167 2025-07-20 02:47:06,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756602_15778, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-20 02:47:11,418 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756602_15778 replica FinalizedReplica, blk_1073756602_15778, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756602 for deletion 2025-07-20 02:47:11,419 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756602_15778 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756602 2025-07-20 02:50:06,719 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756605_15781 src: /192.168.158.1:45704 dest: /192.168.158.4:9866 2025-07-20 02:50:06,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45704, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_357864759_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756605_15781, duration(ns): 23522254 2025-07-20 02:50:06,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756605_15781, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-20 02:50:11,422 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756605_15781 replica FinalizedReplica, blk_1073756605_15781, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756605 for deletion 2025-07-20 02:50:11,423 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756605_15781 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756605 2025-07-20 02:53:06,736 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756608_15784 src: /192.168.158.5:34302 dest: /192.168.158.4:9866 2025-07-20 02:53:06,755 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34302, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_998059689_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756608_15784, duration(ns): 16966848 2025-07-20 02:53:06,755 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756608_15784, type=LAST_IN_PIPELINE terminating 2025-07-20 02:53:14,432 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756608_15784 replica FinalizedReplica, blk_1073756608_15784, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756608 for deletion 2025-07-20 02:53:14,433 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756608_15784 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756608 2025-07-20 02:55:06,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756610_15786 src: /192.168.158.5:43434 dest: /192.168.158.4:9866 2025-07-20 02:55:06,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:43434, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_458919008_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756610_15786, duration(ns): 19347369 2025-07-20 02:55:06,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756610_15786, type=LAST_IN_PIPELINE terminating 2025-07-20 02:55:14,435 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756610_15786 replica FinalizedReplica, blk_1073756610_15786, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756610 for deletion 2025-07-20 02:55:14,436 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756610_15786 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756610 2025-07-20 02:57:11,718 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756612_15788 src: /192.168.158.7:46088 dest: /192.168.158.4:9866 2025-07-20 02:57:11,739 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46088, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-39418975_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756612_15788, duration(ns): 18234591 2025-07-20 02:57:11,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756612_15788, type=LAST_IN_PIPELINE terminating 2025-07-20 02:57:17,439 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756612_15788 replica FinalizedReplica, blk_1073756612_15788, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756612 for deletion 2025-07-20 02:57:17,440 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756612_15788 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756612 2025-07-20 02:58:16,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756613_15789 src: /192.168.158.1:47740 dest: /192.168.158.4:9866 2025-07-20 02:58:16,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47740, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-481300675_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073756613_15789, duration(ns): 24978161 2025-07-20 02:58:16,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756613_15789, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-20 02:58:20,441 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756613_15789 replica FinalizedReplica, blk_1073756613_15789, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756613 for deletion 2025-07-20 02:58:20,443 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756613_15789 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756613 2025-07-20 03:00:16,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756615_15791 src: /192.168.158.1:48984 dest: /192.168.158.4:9866 2025-07-20 03:00:16,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48984, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-635721726_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756615_15791, duration(ns): 24241970 2025-07-20 03:00:16,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756615_15791, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-20 03:00:23,445 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756615_15791 replica FinalizedReplica, blk_1073756615_15791, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756615 for deletion 2025-07-20 03:00:23,447 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756615_15791 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756615 2025-07-20 03:01:16,708 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756616_15792 src: /192.168.158.7:40478 dest: /192.168.158.4:9866 2025-07-20 03:01:16,734 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40478, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2001875648_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756616_15792, duration(ns): 20377204 2025-07-20 03:01:16,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756616_15792, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-20 03:01:20,448 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756616_15792 replica FinalizedReplica, blk_1073756616_15792, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756616 for deletion 2025-07-20 03:01:20,449 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756616_15792 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756616 2025-07-20 03:02:16,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756617_15793 src: /192.168.158.1:56444 dest: /192.168.158.4:9866 2025-07-20 03:02:16,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56444, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-457551058_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756617_15793, duration(ns): 24533992 2025-07-20 03:02:16,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756617_15793, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-20 03:02:20,453 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756617_15793 replica FinalizedReplica, blk_1073756617_15793, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756617 for deletion 2025-07-20 03:02:20,454 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756617_15793 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756617 2025-07-20 03:04:26,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073756619_15795 src: /192.168.158.8:35892 dest: /192.168.158.4:9866 2025-07-20 03:04:26,733 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35892, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1446093721_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756619_15795, duration(ns): 16755169 2025-07-20 03:04:26,734 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756619_15795, type=LAST_IN_PIPELINE terminating 2025-07-20 03:04:32,459 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756619_15795 replica FinalizedReplica, blk_1073756619_15795, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756619 for deletion 2025-07-20 03:04:32,460 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756619_15795 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756619 2025-07-20 03:05:26,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756620_15796 src: /192.168.158.9:35384 dest: /192.168.158.4:9866 2025-07-20 03:05:26,762 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35384, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1828463161_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756620_15796, duration(ns): 22182299 
2025-07-20 03:05:26,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756620_15796, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-20 03:05:29,460 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756620_15796 replica FinalizedReplica, blk_1073756620_15796, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756620 for deletion 2025-07-20 03:05:29,462 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756620_15796 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756620 2025-07-20 03:06:26,739 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756621_15797 src: /192.168.158.6:33826 dest: /192.168.158.4:9866 2025-07-20 03:06:26,765 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33826, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_616800433_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756621_15797, duration(ns): 21014275 2025-07-20 03:06:26,765 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756621_15797, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-20 03:06:32,461 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756621_15797 replica FinalizedReplica, blk_1073756621_15797, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756621 for deletion 2025-07-20 03:06:32,462 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756621_15797 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756621 2025-07-20 03:07:26,716 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756622_15798 src: /192.168.158.7:43624 dest: /192.168.158.4:9866 2025-07-20 03:07:26,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43624, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1885172625_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756622_15798, duration(ns): 19265225 2025-07-20 03:07:26,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756622_15798, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-20 03:07:29,462 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756622_15798 replica FinalizedReplica, blk_1073756622_15798, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756622 for deletion 2025-07-20 03:07:29,464 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756622_15798 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756622 2025-07-20 03:09:26,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756624_15800 src: /192.168.158.7:49856 dest: /192.168.158.4:9866 2025-07-20 03:09:26,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49856, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1177131294_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756624_15800, duration(ns): 17035209 2025-07-20 03:09:26,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756624_15800, type=LAST_IN_PIPELINE terminating 2025-07-20 03:09:32,464 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756624_15800 replica FinalizedReplica, blk_1073756624_15800, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756624 for deletion 2025-07-20 03:09:32,466 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756624_15800 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756624 2025-07-20 03:12:26,728 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756627_15803 src: /192.168.158.1:43620 dest: /192.168.158.4:9866 2025-07-20 03:12:26,762 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43620, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_610984729_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756627_15803, duration(ns): 23720657 2025-07-20 03:12:26,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756627_15803, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-20 03:12:29,470 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756627_15803 replica FinalizedReplica, blk_1073756627_15803, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756627 for deletion 2025-07-20 03:12:29,472 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756627_15803 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756627 2025-07-20 03:15:26,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756630_15806 src: /192.168.158.6:58960 dest: /192.168.158.4:9866 2025-07-20 03:15:26,759 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58960, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1862030860_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756630_15806, duration(ns): 19421480 2025-07-20 03:15:26,759 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756630_15806, type=LAST_IN_PIPELINE terminating 2025-07-20 
03:15:29,478 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756630_15806 replica FinalizedReplica, blk_1073756630_15806, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756630 for deletion 2025-07-20 03:15:29,479 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756630_15806 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756630 2025-07-20 03:19:36,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756634_15810 src: /192.168.158.9:45462 dest: /192.168.158.4:9866 2025-07-20 03:19:36,764 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45462, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-971125431_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756634_15810, duration(ns): 17741878 2025-07-20 03:19:36,764 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756634_15810, type=LAST_IN_PIPELINE terminating 2025-07-20 03:19:38,481 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756634_15810 replica FinalizedReplica, blk_1073756634_15810, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756634 for deletion 2025-07-20 03:19:38,482 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756634_15810 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756634 2025-07-20 03:20:36,747 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756635_15811 src: /192.168.158.7:60240 dest: /192.168.158.4:9866 2025-07-20 03:20:36,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60240, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_574883092_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756635_15811, duration(ns): 17515507 2025-07-20 03:20:36,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756635_15811, type=LAST_IN_PIPELINE terminating 2025-07-20 03:20:38,484 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756635_15811 replica FinalizedReplica, blk_1073756635_15811, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756635 for deletion 2025-07-20 03:20:38,485 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756635_15811 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756635 2025-07-20 03:21:36,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756636_15812 src: /192.168.158.6:47052 dest: /192.168.158.4:9866 
2025-07-20 03:21:36,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47052, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1255672714_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756636_15812, duration(ns): 22374435
2025-07-20 03:21:36,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756636_15812, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 03:21:38,488 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756636_15812 replica FinalizedReplica, blk_1073756636_15812, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756636 for deletion
2025-07-20 03:21:38,489 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756636_15812 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756636
2025-07-20 03:26:46,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756641_15817 src: /192.168.158.1:51054 dest: /192.168.158.4:9866
2025-07-20 03:26:46,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51054, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1809684651_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756641_15817, duration(ns): 23749631
2025-07-20 03:26:46,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756641_15817, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-20 03:26:50,495 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756641_15817 replica FinalizedReplica, blk_1073756641_15817, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756641 for deletion
2025-07-20 03:26:50,497 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756641_15817 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756641
2025-07-20 03:29:46,756 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756644_15820 src: /192.168.158.6:43478 dest: /192.168.158.4:9866
2025-07-20 03:29:46,784 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43478, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_926415627_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756644_15820, duration(ns): 22336935
2025-07-20 03:29:46,784 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756644_15820, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-20 03:29:50,499 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756644_15820 replica FinalizedReplica, blk_1073756644_15820, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756644 for deletion
2025-07-20 03:29:50,500 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756644_15820 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756644
2025-07-20 03:37:46,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756652_15828 src: /192.168.158.8:53556 dest: /192.168.158.4:9866
2025-07-20 03:37:46,793 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53556, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_669876690_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756652_15828, duration(ns): 21382376
2025-07-20 03:37:46,793 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756652_15828, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-20 03:37:50,517 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756652_15828 replica FinalizedReplica, blk_1073756652_15828, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756652 for deletion
2025-07-20 03:37:50,518 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756652_15828 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756652
2025-07-20 03:39:56,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756654_15830 src: /192.168.158.1:48748 dest: /192.168.158.4:9866
2025-07-20 03:39:56,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48748, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1988659940_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756654_15830, duration(ns): 24792287
2025-07-20 03:39:56,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756654_15830, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-20 03:39:59,522 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756654_15830 replica FinalizedReplica, blk_1073756654_15830, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756654 for deletion
2025-07-20 03:39:59,523 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756654_15830 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756654
2025-07-20 03:40:56,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756655_15831 src: /192.168.158.1:37324 dest: /192.168.158.4:9866
2025-07-20 03:40:56,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37324, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1259476529_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756655_15831, duration(ns): 26780424
2025-07-20 03:40:56,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756655_15831, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-20 03:41:02,523 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756655_15831 replica FinalizedReplica, blk_1073756655_15831, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756655 for deletion
2025-07-20 03:41:02,524 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756655_15831 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756655
2025-07-20 03:43:56,783 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756658_15834 src: /192.168.158.6:48888 dest: /192.168.158.4:9866
2025-07-20 03:43:56,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48888, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1132571586_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756658_15834, duration(ns): 18117269
2025-07-20 03:43:56,805 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756658_15834, type=LAST_IN_PIPELINE terminating
2025-07-20 03:43:59,530 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756658_15834 replica FinalizedReplica, blk_1073756658_15834, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756658 for deletion
2025-07-20 03:43:59,532 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756658_15834 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756658
2025-07-20 03:46:01,784 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756660_15836 src: /192.168.158.9:55586 dest: /192.168.158.4:9866
2025-07-20 03:46:01,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55586, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1449399540_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756660_15836, duration(ns): 17468487
2025-07-20 03:46:01,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756660_15836, type=LAST_IN_PIPELINE terminating
2025-07-20 03:46:08,536 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756660_15836 replica FinalizedReplica, blk_1073756660_15836, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756660 for deletion
2025-07-20 03:46:08,537 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756660_15836 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756660
2025-07-20 03:48:06,780 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756662_15838 src: /192.168.158.9:47904 dest: /192.168.158.4:9866
2025-07-20 03:48:06,805 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47904, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-381072609_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756662_15838, duration(ns): 20065043
2025-07-20 03:48:06,806 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756662_15838, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-20 03:48:11,541 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756662_15838 replica FinalizedReplica, blk_1073756662_15838, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756662 for deletion
2025-07-20 03:48:11,542 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756662_15838 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756662
2025-07-20 03:52:16,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756666_15842 src: /192.168.158.7:43154 dest: /192.168.158.4:9866
2025-07-20 03:52:16,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43154, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-481455555_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756666_15842, duration(ns): 17606891
2025-07-20 03:52:16,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756666_15842, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-20 03:52:20,551 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756666_15842 replica FinalizedReplica, blk_1073756666_15842, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756666 for deletion
2025-07-20 03:52:20,552 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756666_15842 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756666
2025-07-20 03:53:16,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756667_15843 src: /192.168.158.5:60076 dest: /192.168.158.4:9866
2025-07-20 03:53:16,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60076, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-156395278_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756667_15843, duration(ns): 20300624
2025-07-20 03:53:16,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756667_15843, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-20 03:53:23,553 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756667_15843 replica FinalizedReplica, blk_1073756667_15843, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756667 for deletion
2025-07-20 03:53:23,554 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756667_15843 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756667
2025-07-20 03:54:16,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756668_15844 src: /192.168.158.5:60500 dest: /192.168.158.4:9866
2025-07-20 03:54:16,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60500, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1847614451_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756668_15844, duration(ns): 22560076
2025-07-20 03:54:16,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756668_15844, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 03:54:20,553 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756668_15844 replica FinalizedReplica, blk_1073756668_15844, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756668 for deletion
2025-07-20 03:54:20,554 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756668_15844 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir25/blk_1073756668
2025-07-20 03:58:26,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756672_15848 src: /192.168.158.1:58544 dest: /192.168.158.4:9866
2025-07-20 03:58:26,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58544, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_142746000_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756672_15848, duration(ns): 24226674
2025-07-20 03:58:26,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756672_15848, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-20 03:58:29,559 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756672_15848 replica FinalizedReplica, blk_1073756672_15848, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756672 for deletion
2025-07-20 03:58:29,560 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756672_15848 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756672
2025-07-20 03:59:17,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f50, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-20 03:59:17,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-20 03:59:26,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756673_15849 src: /192.168.158.7:38768 dest: /192.168.158.4:9866
2025-07-20 03:59:26,812 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38768, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_376678812_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756673_15849, duration(ns): 18233640
2025-07-20 03:59:26,812 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756673_15849, type=LAST_IN_PIPELINE terminating
2025-07-20 03:59:29,560 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756673_15849 replica FinalizedReplica, blk_1073756673_15849, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756673 for deletion
2025-07-20 03:59:29,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756673_15849 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756673
2025-07-20 04:00:26,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756674_15850 src: /192.168.158.1:33522 dest: /192.168.158.4:9866
2025-07-20 04:00:26,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33522, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2027298042_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756674_15850, duration(ns): 25333941
2025-07-20 04:00:26,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756674_15850, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-20 04:00:29,561 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756674_15850 replica FinalizedReplica, blk_1073756674_15850, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756674 for deletion
2025-07-20 04:00:29,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756674_15850 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756674
2025-07-20 04:03:31,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756677_15853 src: /192.168.158.5:56230 dest: /192.168.158.4:9866
2025-07-20 04:03:31,826 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56230, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1924376782_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756677_15853, duration(ns): 21789295
2025-07-20 04:03:31,826 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756677_15853, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-20 04:03:35,561 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756677_15853 replica FinalizedReplica, blk_1073756677_15853, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756677 for deletion
2025-07-20 04:03:35,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756677_15853 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756677
2025-07-20 04:04:31,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756678_15854 src: /192.168.158.5:44262 dest: /192.168.158.4:9866
2025-07-20 04:04:31,823 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44262, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2044599573_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756678_15854, duration(ns): 21580359
2025-07-20 04:04:31,824 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756678_15854, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-20 04:04:38,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756678_15854 replica FinalizedReplica, blk_1073756678_15854, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756678 for deletion
2025-07-20 04:04:38,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756678_15854 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756678
2025-07-20 04:05:31,805 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756679_15855 src: /192.168.158.1:37882 dest: /192.168.158.4:9866
2025-07-20 04:05:31,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37882, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1436235078_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756679_15855, duration(ns): 27074864
2025-07-20 04:05:31,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756679_15855, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-20 04:05:35,564 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756679_15855 replica FinalizedReplica, blk_1073756679_15855, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756679 for deletion
2025-07-20 04:05:35,565 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756679_15855 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756679
2025-07-20 04:06:36,807 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756680_15856 src: /192.168.158.9:53528 dest: /192.168.158.4:9866
2025-07-20 04:06:36,835 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53528, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2078704669_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756680_15856, duration(ns): 21826816
2025-07-20 04:06:36,835 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756680_15856, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 04:06:41,566 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756680_15856 replica FinalizedReplica, blk_1073756680_15856, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756680 for deletion
2025-07-20 04:06:41,567 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756680_15856 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756680
2025-07-20 04:07:41,814 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756681_15857 src: /192.168.158.1:42076 dest: /192.168.158.4:9866
2025-07-20 04:07:41,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42076, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_793153967_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756681_15857, duration(ns): 24326347
2025-07-20 04:07:41,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756681_15857, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-20 04:07:47,567 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756681_15857 replica FinalizedReplica, blk_1073756681_15857, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756681 for deletion
2025-07-20 04:07:47,568 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756681_15857 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756681
2025-07-20 04:08:41,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756682_15858 src: /192.168.158.1:44056 dest: /192.168.158.4:9866
2025-07-20 04:08:41,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44056, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-909342450_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756682_15858, duration(ns): 25918196
2025-07-20 04:08:41,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756682_15858, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-20 04:08:44,567 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756682_15858 replica FinalizedReplica, blk_1073756682_15858, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756682 for deletion
2025-07-20 04:08:44,568 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756682_15858 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756682
2025-07-20 04:09:41,801 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756683_15859 src: /192.168.158.6:39442 dest: /192.168.158.4:9866
2025-07-20 04:09:41,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39442, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1740487342_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756683_15859, duration(ns): 21384653
2025-07-20 04:09:41,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756683_15859, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-20 04:09:44,571 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756683_15859 replica FinalizedReplica, blk_1073756683_15859, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756683 for deletion
2025-07-20 04:09:44,572 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756683_15859 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756683
2025-07-20 04:14:41,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756688_15864 src: /192.168.158.1:38480 dest: /192.168.158.4:9866
2025-07-20 04:14:41,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38480, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2077700119_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756688_15864, duration(ns): 28027898
2025-07-20 04:14:41,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756688_15864, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-20 04:14:47,579 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756688_15864 replica FinalizedReplica, blk_1073756688_15864, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756688 for deletion
2025-07-20 04:14:47,580 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756688_15864 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756688
2025-07-20 04:15:41,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756689_15865 src: /192.168.158.7:60948 dest: /192.168.158.4:9866
2025-07-20 04:15:41,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60948, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_750840228_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756689_15865, duration(ns): 17874523
2025-07-20 04:15:41,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756689_15865, type=LAST_IN_PIPELINE terminating
2025-07-20 04:15:44,581 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756689_15865 replica FinalizedReplica, blk_1073756689_15865, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756689 for deletion
2025-07-20 04:15:44,582 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756689_15865 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756689
2025-07-20 04:16:46,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756690_15866 src: /192.168.158.7:60402 dest: /192.168.158.4:9866
2025-07-20 04:16:46,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60402, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_656617331_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756690_15866, duration(ns): 22246057
2025-07-20 04:16:46,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756690_15866, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-20 04:16:53,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756690_15866 replica FinalizedReplica, blk_1073756690_15866, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756690 for deletion
2025-07-20 04:16:53,584 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756690_15866 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756690
2025-07-20 04:17:51,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756691_15867 src: /192.168.158.1:55062 dest: /192.168.158.4:9866 2025-07-20 04:17:51,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55062, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-627612885_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756691_15867, duration(ns): 27230556 2025-07-20 04:17:51,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756691_15867, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-20 04:17:53,587 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756691_15867 replica FinalizedReplica, blk_1073756691_15867, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756691 for deletion 2025-07-20 04:17:53,588 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756691_15867 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756691 2025-07-20 04:23:06,812 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756696_15872 src: /192.168.158.1:37736 dest: /192.168.158.4:9866 2025-07-20 04:23:06,851 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37736, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: 
DFSClient_NONMAPREDUCE_1132508012_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756696_15872, duration(ns): 28053230 2025-07-20 04:23:06,851 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756696_15872, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-20 04:23:08,597 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756696_15872 replica FinalizedReplica, blk_1073756696_15872, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756696 for deletion 2025-07-20 04:23:08,598 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756696_15872 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756696 2025-07-20 04:25:06,821 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756698_15874 src: /192.168.158.6:51704 dest: /192.168.158.4:9866 2025-07-20 04:25:06,852 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51704, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1028254102_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756698_15874, duration(ns): 24327673 2025-07-20 04:25:06,852 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756698_15874, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] 
terminating 2025-07-20 04:25:08,601 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756698_15874 replica FinalizedReplica, blk_1073756698_15874, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756698 for deletion 2025-07-20 04:25:08,602 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756698_15874 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756698 2025-07-20 04:26:11,821 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756699_15875 src: /192.168.158.8:48986 dest: /192.168.158.4:9866 2025-07-20 04:26:11,840 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48986, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_685775261_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756699_15875, duration(ns): 16842014 2025-07-20 04:26:11,840 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756699_15875, type=LAST_IN_PIPELINE terminating 2025-07-20 04:26:17,603 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756699_15875 replica FinalizedReplica, blk_1073756699_15875, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756699 for deletion 2025-07-20 04:26:17,604 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756699_15875 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756699 2025-07-20 04:27:11,890 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756700_15876 src: /192.168.158.6:60038 dest: /192.168.158.4:9866 2025-07-20 04:27:11,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60038, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1674468699_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756700_15876, duration(ns): 20312395 2025-07-20 04:27:11,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756700_15876, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-20 04:27:14,603 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756700_15876 replica FinalizedReplica, blk_1073756700_15876, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756700 for deletion 2025-07-20 04:27:14,604 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756700_15876 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756700 2025-07-20 04:29:11,821 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756702_15878 src: 
/192.168.158.1:48032 dest: /192.168.158.4:9866 2025-07-20 04:29:11,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48032, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1316016578_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756702_15878, duration(ns): 22910623 2025-07-20 04:29:11,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756702_15878, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-20 04:29:14,608 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756702_15878 replica FinalizedReplica, blk_1073756702_15878, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756702 for deletion 2025-07-20 04:29:14,609 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756702_15878 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756702 2025-07-20 04:30:11,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756703_15879 src: /192.168.158.6:53968 dest: /192.168.158.4:9866 2025-07-20 04:30:11,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53968, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1536995176_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756703_15879, duration(ns): 17326428 
2025-07-20 04:30:11,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756703_15879, type=LAST_IN_PIPELINE terminating 2025-07-20 04:30:17,611 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756703_15879 replica FinalizedReplica, blk_1073756703_15879, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756703 for deletion 2025-07-20 04:30:17,612 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756703_15879 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756703 2025-07-20 04:31:11,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756704_15880 src: /192.168.158.6:49672 dest: /192.168.158.4:9866 2025-07-20 04:31:11,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-436772592_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756704_15880, duration(ns): 20750436 2025-07-20 04:31:11,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756704_15880, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-20 04:31:14,613 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756704_15880 replica FinalizedReplica, blk_1073756704_15880, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756704 for deletion 2025-07-20 04:31:14,614 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756704_15880 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756704 2025-07-20 04:32:11,834 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756705_15881 src: /192.168.158.6:44970 dest: /192.168.158.4:9866 2025-07-20 04:32:11,852 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44970, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1930400002_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756705_15881, duration(ns): 16533232 2025-07-20 04:32:11,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756705_15881, type=LAST_IN_PIPELINE terminating 2025-07-20 04:32:14,614 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756705_15881 replica FinalizedReplica, blk_1073756705_15881, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756705 for deletion 2025-07-20 04:32:14,616 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756705_15881 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756705 2025-07-20 04:33:11,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756706_15882 src: /192.168.158.1:48872 dest: /192.168.158.4:9866 2025-07-20 04:33:11,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48872, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-545987894_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756706_15882, duration(ns): 24053030 2025-07-20 04:33:11,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756706_15882, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-20 04:33:17,614 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756706_15882 replica FinalizedReplica, blk_1073756706_15882, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756706 for deletion 2025-07-20 04:33:17,615 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756706_15882 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756706 2025-07-20 04:34:11,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756707_15883 src: /192.168.158.6:54994 dest: /192.168.158.4:9866 2025-07-20 04:34:11,858 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.6:54994, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_77584513_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756707_15883, duration(ns): 20925309 2025-07-20 04:34:11,859 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756707_15883, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-20 04:34:17,615 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756707_15883 replica FinalizedReplica, blk_1073756707_15883, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756707 for deletion 2025-07-20 04:34:17,617 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756707_15883 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756707 2025-07-20 04:35:11,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756708_15884 src: /192.168.158.7:34580 dest: /192.168.158.4:9866 2025-07-20 04:35:11,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34580, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1084705869_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756708_15884, duration(ns): 20483569 2025-07-20 04:35:11,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756708_15884, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-20 04:35:17,618 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756708_15884 replica FinalizedReplica, blk_1073756708_15884, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756708 for deletion 2025-07-20 04:35:17,620 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756708_15884 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756708 2025-07-20 04:36:11,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756709_15885 src: /192.168.158.7:53336 dest: /192.168.158.4:9866 2025-07-20 04:36:11,862 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_472713818_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756709_15885, duration(ns): 18094829 2025-07-20 04:36:11,862 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756709_15885, type=LAST_IN_PIPELINE terminating 2025-07-20 04:36:14,620 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756709_15885 replica FinalizedReplica, blk_1073756709_15885, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756709 for deletion 2025-07-20 04:36:14,621 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756709_15885 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756709 2025-07-20 04:38:11,847 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756711_15887 src: /192.168.158.8:42090 dest: /192.168.158.4:9866 2025-07-20 04:38:11,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42090, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1393409570_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756711_15887, duration(ns): 19231048 2025-07-20 04:38:11,872 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756711_15887, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-20 04:38:17,624 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756711_15887 replica FinalizedReplica, blk_1073756711_15887, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756711 for deletion 2025-07-20 04:38:17,626 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756711_15887 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756711 2025-07-20 04:43:31,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756716_15892 src: /192.168.158.9:37924 dest: /192.168.158.4:9866 2025-07-20 04:43:31,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37924, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-926122296_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756716_15892, duration(ns): 16708650 2025-07-20 04:43:31,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756716_15892, type=LAST_IN_PIPELINE terminating 2025-07-20 04:43:35,638 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756716_15892 replica FinalizedReplica, blk_1073756716_15892, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756716 for deletion 2025-07-20 04:43:35,639 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756716_15892 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756716 2025-07-20 04:44:36,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756717_15893 src: /192.168.158.6:37038 dest: /192.168.158.4:9866 2025-07-20 04:44:36,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37038, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1478473838_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756717_15893, duration(ns): 19396673 2025-07-20 04:44:36,883 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756717_15893, type=LAST_IN_PIPELINE terminating 2025-07-20 04:44:38,640 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756717_15893 replica FinalizedReplica, blk_1073756717_15893, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756717 for deletion 2025-07-20 04:44:38,641 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756717_15893 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756717 2025-07-20 04:45:36,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756718_15894 src: /192.168.158.1:48928 dest: /192.168.158.4:9866 2025-07-20 04:45:36,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48928, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1849331435_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756718_15894, duration(ns): 22097798 2025-07-20 04:45:36,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756718_15894, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-20 
04:45:41,644 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756718_15894 replica FinalizedReplica, blk_1073756718_15894, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756718 for deletion 2025-07-20 04:45:41,645 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756718_15894 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756718 2025-07-20 04:46:36,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756719_15895 src: /192.168.158.6:40380 dest: /192.168.158.4:9866 2025-07-20 04:46:36,870 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40380, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1179778189_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756719_15895, duration(ns): 18689319 2025-07-20 04:46:36,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756719_15895, type=LAST_IN_PIPELINE terminating 2025-07-20 04:46:38,646 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756719_15895 replica FinalizedReplica, blk_1073756719_15895, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756719 for deletion 2025-07-20 04:46:38,647 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756719_15895 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756719
2025-07-20 04:47:36,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756720_15896 src: /192.168.158.1:54148 dest: /192.168.158.4:9866
2025-07-20 04:47:36,890 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54148, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1403565039_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756720_15896, duration(ns): 26507808
2025-07-20 04:47:36,891 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756720_15896, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-20 04:47:38,649 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756720_15896 replica FinalizedReplica, blk_1073756720_15896, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756720 for deletion
2025-07-20 04:47:38,650 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756720_15896 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756720
2025-07-20 04:52:36,862 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756725_15901 src: /192.168.158.1:58270 dest: /192.168.158.4:9866
2025-07-20 04:52:36,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58270, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1775946662_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756725_15901, duration(ns): 29415314
2025-07-20 04:52:36,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756725_15901, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-20 04:52:38,659 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756725_15901 replica FinalizedReplica, blk_1073756725_15901, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756725 for deletion
2025-07-20 04:52:38,660 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756725_15901 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756725
2025-07-20 04:54:36,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756727_15903 src: /192.168.158.1:48138 dest: /192.168.158.4:9866
2025-07-20 04:54:36,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48138, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1683652121_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756727_15903, duration(ns): 22758575
2025-07-20 04:54:36,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756727_15903, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-20 04:54:38,664 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756727_15903 replica FinalizedReplica, blk_1073756727_15903, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756727 for deletion
2025-07-20 04:54:38,665 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756727_15903 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756727
2025-07-20 04:56:36,865 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756729_15905 src: /192.168.158.7:44558 dest: /192.168.158.4:9866
2025-07-20 04:56:36,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44558, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-631925147_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756729_15905, duration(ns): 22972750
2025-07-20 04:56:36,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756729_15905, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-20 04:56:38,663 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756729_15905 replica FinalizedReplica, blk_1073756729_15905, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756729 for deletion
2025-07-20 04:56:38,665 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756729_15905 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756729
2025-07-20 04:58:36,879 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756731_15907 src: /192.168.158.8:35152 dest: /192.168.158.4:9866
2025-07-20 04:58:36,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35152, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-707099024_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756731_15907, duration(ns): 19908107
2025-07-20 04:58:36,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756731_15907, type=LAST_IN_PIPELINE terminating
2025-07-20 04:58:41,670 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756731_15907 replica FinalizedReplica, blk_1073756731_15907, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756731 for deletion
2025-07-20 04:58:41,671 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756731_15907 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756731
2025-07-20 04:59:36,880 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756732_15908 src: /192.168.158.9:51818 dest: /192.168.158.4:9866
2025-07-20 04:59:36,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51818, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_504439415_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756732_15908, duration(ns): 17233605
2025-07-20 04:59:36,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756732_15908, type=LAST_IN_PIPELINE terminating
2025-07-20 04:59:41,670 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756732_15908 replica FinalizedReplica, blk_1073756732_15908, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756732 for deletion
2025-07-20 04:59:41,671 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756732_15908 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756732
2025-07-20 05:02:36,887 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756735_15911 src: /192.168.158.5:57778 dest: /192.168.158.4:9866
2025-07-20 05:02:36,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57778, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1453758513_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756735_15911, duration(ns): 21928202
2025-07-20 05:02:36,915 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756735_15911, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-20 05:02:41,676 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756735_15911 replica FinalizedReplica, blk_1073756735_15911, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756735 for deletion
2025-07-20 05:02:41,677 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756735_15911 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756735
2025-07-20 05:03:36,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756736_15912 src: /192.168.158.6:55024 dest: /192.168.158.4:9866
2025-07-20 05:03:36,907 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55024, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1325943671_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756736_15912, duration(ns): 19841129
2025-07-20 05:03:36,908 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756736_15912, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-20 05:03:41,678 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756736_15912 replica FinalizedReplica, blk_1073756736_15912, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756736 for deletion
2025-07-20 05:03:41,679 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756736_15912 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756736
2025-07-20 05:05:36,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756738_15914 src: /192.168.158.9:48860 dest: /192.168.158.4:9866
2025-07-20 05:05:36,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48860, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_952008047_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756738_15914, duration(ns): 28557822
2025-07-20 05:05:36,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756738_15914, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 05:05:38,683 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756738_15914 replica FinalizedReplica, blk_1073756738_15914, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756738 for deletion
2025-07-20 05:05:38,684 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756738_15914 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756738
2025-07-20 05:06:41,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756739_15915 src: /192.168.158.1:58736 dest: /192.168.158.4:9866
2025-07-20 05:06:41,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58736, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_180806189_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756739_15915, duration(ns): 25300753
2025-07-20 05:06:41,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756739_15915, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-20 05:06:44,686 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756739_15915 replica FinalizedReplica, blk_1073756739_15915, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756739 for deletion
2025-07-20 05:06:44,687 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756739_15915 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756739
2025-07-20 05:07:46,883 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756740_15916 src: /192.168.158.5:37478 dest: /192.168.158.4:9866
2025-07-20 05:07:46,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37478, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-603489286_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756740_15916, duration(ns): 21447303
2025-07-20 05:07:46,911 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756740_15916, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-20 05:07:50,688 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756740_15916 replica FinalizedReplica, blk_1073756740_15916, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756740 for deletion
2025-07-20 05:07:50,689 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756740_15916 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756740
2025-07-20 05:10:51,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756743_15919 src: /192.168.158.1:50872 dest: /192.168.158.4:9866
2025-07-20 05:10:51,915 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50872, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-910356462_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756743_15919, duration(ns): 24762389
2025-07-20 05:10:51,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756743_15919, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-20 05:10:56,693 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756743_15919 replica FinalizedReplica, blk_1073756743_15919, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756743 for deletion
2025-07-20 05:10:56,694 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756743_15919 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756743
2025-07-20 05:11:56,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756744_15920 src: /192.168.158.5:39054 dest: /192.168.158.4:9866
2025-07-20 05:11:56,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39054, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1036683822_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756744_15920, duration(ns): 20748638
2025-07-20 05:11:56,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756744_15920, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-20 05:11:59,695 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756744_15920 replica FinalizedReplica, blk_1073756744_15920, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756744 for deletion
2025-07-20 05:11:59,696 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756744_15920 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756744
2025-07-20 05:12:56,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756745_15921 src: /192.168.158.9:54558 dest: /192.168.158.4:9866
2025-07-20 05:12:56,913 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54558, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1141426011_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756745_15921, duration(ns): 16132467
2025-07-20 05:12:56,913 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756745_15921, type=LAST_IN_PIPELINE terminating
2025-07-20 05:12:59,695 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756745_15921 replica FinalizedReplica, blk_1073756745_15921, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756745 for deletion
2025-07-20 05:12:59,696 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756745_15921 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756745
2025-07-20 05:18:06,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756750_15926 src: /192.168.158.6:34860 dest: /192.168.158.4:9866
2025-07-20 05:18:06,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34860, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1080757677_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756750_15926, duration(ns): 20486760
2025-07-20 05:18:06,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756750_15926, type=LAST_IN_PIPELINE terminating
2025-07-20 05:18:11,700 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756750_15926 replica FinalizedReplica, blk_1073756750_15926, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756750 for deletion
2025-07-20 05:18:11,701 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756750_15926 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756750
2025-07-20 05:22:21,911 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756754_15930 src: /192.168.158.9:33086 dest: /192.168.158.4:9866
2025-07-20 05:22:21,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33086, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-852913263_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756754_15930, duration(ns): 21391688
2025-07-20 05:22:21,939 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756754_15930, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 05:22:23,711 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756754_15930 replica FinalizedReplica, blk_1073756754_15930, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756754 for deletion
2025-07-20 05:22:23,712 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756754_15930 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756754
2025-07-20 05:23:21,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756755_15931 src: /192.168.158.7:36100 dest: /192.168.158.4:9866
2025-07-20 05:23:21,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36100, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1760588427_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756755_15931, duration(ns): 18106852
2025-07-20 05:23:21,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756755_15931, type=LAST_IN_PIPELINE terminating
2025-07-20 05:23:23,712 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756755_15931 replica FinalizedReplica, blk_1073756755_15931, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756755 for deletion
2025-07-20 05:23:23,714 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756755_15931 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756755
2025-07-20 05:26:21,907 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756758_15934 src: /192.168.158.1:40302 dest: /192.168.158.4:9866
2025-07-20 05:26:21,939 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40302, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2132731642_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756758_15934, duration(ns): 23120357
2025-07-20 05:26:21,939 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756758_15934, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-20 05:26:26,720 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756758_15934 replica FinalizedReplica, blk_1073756758_15934, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756758 for deletion
2025-07-20 05:26:26,721 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756758_15934 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756758
2025-07-20 05:30:26,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756762_15938 src: /192.168.158.1:36816 dest: /192.168.158.4:9866
2025-07-20 05:30:26,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36816, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1025719949_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756762_15938, duration(ns): 23456685
2025-07-20 05:30:26,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756762_15938, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-20 05:30:29,731 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756762_15938 replica FinalizedReplica, blk_1073756762_15938, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756762 for deletion
2025-07-20 05:30:29,733 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756762_15938 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756762
2025-07-20 05:33:36,919 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756765_15941 src: /192.168.158.8:43118 dest: /192.168.158.4:9866
2025-07-20 05:33:36,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43118, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-613127401_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756765_15941, duration(ns): 22892316
2025-07-20 05:33:36,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756765_15941, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-20 05:33:38,745 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756765_15941 replica FinalizedReplica, blk_1073756765_15941, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756765 for deletion
2025-07-20 05:33:38,746 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756765_15941 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756765
2025-07-20 05:35:36,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756767_15943 src: /192.168.158.1:33776 dest: /192.168.158.4:9866
2025-07-20 05:35:36,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33776, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1834310523_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756767_15943, duration(ns): 25020632
2025-07-20 05:35:36,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756767_15943, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-20 05:35:38,750 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756767_15943 replica FinalizedReplica, blk_1073756767_15943, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756767 for deletion
2025-07-20 05:35:38,751 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756767_15943 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756767
2025-07-20 05:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-20 05:36:36,931 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756768_15944 src: /192.168.158.1:34314 dest: /192.168.158.4:9866
2025-07-20 05:36:36,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34314, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1715377815_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756768_15944, duration(ns): 26095544
2025-07-20 05:36:36,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756768_15944, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-20 05:36:38,753 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756768_15944 replica FinalizedReplica, blk_1073756768_15944, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756768 for deletion
2025-07-20 05:36:38,754 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756768_15944 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756768
2025-07-20 05:37:36,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756769_15945 src: /192.168.158.1:48754 dest: /192.168.158.4:9866
2025-07-20 05:37:36,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48754, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1716250209_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756769_15945, duration(ns): 24558425
2025-07-20 05:37:36,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756769_15945, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-20 05:37:38,752 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756769_15945 replica FinalizedReplica, blk_1073756769_15945, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756769 for deletion
2025-07-20 05:37:38,754 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756769_15945 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756769
2025-07-20 05:38:41,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756770_15946 src: /192.168.158.7:58796 dest: /192.168.158.4:9866
2025-07-20 05:38:41,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58796, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1059325202_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756770_15946, duration(ns): 25159036
2025-07-20 05:38:41,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756770_15946, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-20 05:38:44,755 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756770_15946 replica FinalizedReplica, blk_1073756770_15946, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756770 for deletion
2025-07-20 05:38:44,756 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756770_15946 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756770
2025-07-20 05:39:46,924 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756771_15947 src: /192.168.158.8:40322 dest: /192.168.158.4:9866
2025-07-20 05:39:46,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40322, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_332718563_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756771_15947, duration(ns): 22122164
2025-07-20 05:39:46,952 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756771_15947, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 05:39:50,758 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756771_15947 replica FinalizedReplica, blk_1073756771_15947, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756771 for deletion
2025-07-20 05:39:50,760 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756771_15947 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756771
2025-07-20 05:41:51,927 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756773_15949 src: /192.168.158.1:60950 dest: /192.168.158.4:9866
2025-07-20 05:41:51,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60950, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_443982680_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756773_15949, duration(ns): 25896732
2025-07-20 05:41:51,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756773_15949, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-20 05:41:56,761 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756773_15949 replica FinalizedReplica, blk_1073756773_15949, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756773 for deletion
2025-07-20 05:41:56,763 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756773_15949 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756773
2025-07-20 05:43:51,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756775_15951 src: /192.168.158.1:33118 dest: /192.168.158.4:9866
2025-07-20 05:43:51,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:33118, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1542708377_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756775_15951, duration(ns): 23828211 2025-07-20 05:43:51,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756775_15951, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-20 05:43:53,765 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756775_15951 replica FinalizedReplica, blk_1073756775_15951, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756775 for deletion 2025-07-20 05:43:53,766 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756775_15951 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756775 2025-07-20 05:45:56,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756777_15953 src: /192.168.158.6:39190 dest: /192.168.158.4:9866 2025-07-20 05:45:56,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39190, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_587346861_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756777_15953, duration(ns): 15655342 2025-07-20 05:45:56,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073756777_15953, type=LAST_IN_PIPELINE terminating 2025-07-20 05:46:02,769 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756777_15953 replica FinalizedReplica, blk_1073756777_15953, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756777 for deletion 2025-07-20 05:46:02,770 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756777_15953 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756777 2025-07-20 05:50:06,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756781_15957 src: /192.168.158.7:52042 dest: /192.168.158.4:9866 2025-07-20 05:50:06,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52042, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_106189323_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756781_15957, duration(ns): 19741808 2025-07-20 05:50:06,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756781_15957, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-20 05:50:08,781 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756781_15957 replica FinalizedReplica, blk_1073756781_15957, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756781 for deletion 2025-07-20 05:50:08,782 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756781_15957 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756781 2025-07-20 05:52:06,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756783_15959 src: /192.168.158.1:60760 dest: /192.168.158.4:9866 2025-07-20 05:52:06,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60760, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_782610822_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756783_15959, duration(ns): 22971779 2025-07-20 05:52:06,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756783_15959, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-20 05:52:11,787 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756783_15959 replica FinalizedReplica, blk_1073756783_15959, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756783 for deletion 2025-07-20 05:52:11,788 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756783_15959 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756783 2025-07-20 05:55:06,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756786_15962 src: /192.168.158.8:34320 dest: /192.168.158.4:9866 2025-07-20 05:55:06,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34320, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1412550186_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756786_15962, duration(ns): 17716275 2025-07-20 05:55:06,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756786_15962, type=LAST_IN_PIPELINE terminating 2025-07-20 05:55:11,799 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756786_15962 replica FinalizedReplica, blk_1073756786_15962, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756786 for deletion 2025-07-20 05:55:11,800 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756786_15962 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756786 2025-07-20 05:56:06,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756787_15963 src: /192.168.158.5:45216 dest: /192.168.158.4:9866 2025-07-20 05:56:06,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45216, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1325978381_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756787_15963, duration(ns): 22000401 2025-07-20 05:56:06,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756787_15963, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-20 05:56:08,801 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756787_15963 replica FinalizedReplica, blk_1073756787_15963, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756787 for deletion 2025-07-20 05:56:08,802 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756787_15963 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756787 2025-07-20 05:57:06,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756788_15964 src: /192.168.158.1:59990 dest: /192.168.158.4:9866 2025-07-20 05:57:06,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59990, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679807790_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756788_15964, duration(ns): 24362175 2025-07-20 05:57:06,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756788_15964, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 
192.168.158.9:9866] terminating 2025-07-20 05:57:08,804 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756788_15964 replica FinalizedReplica, blk_1073756788_15964, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756788 for deletion 2025-07-20 05:57:08,805 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756788_15964 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756788 2025-07-20 05:58:06,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756789_15965 src: /192.168.158.9:55654 dest: /192.168.158.4:9866 2025-07-20 05:58:06,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55654, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1688224955_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756789_15965, duration(ns): 16666841 2025-07-20 05:58:06,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756789_15965, type=LAST_IN_PIPELINE terminating 2025-07-20 05:58:08,806 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756789_15965 replica FinalizedReplica, blk_1073756789_15965, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756789 for deletion 
2025-07-20 05:58:08,807 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756789_15965 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756789 2025-07-20 06:00:06,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756791_15967 src: /192.168.158.1:43206 dest: /192.168.158.4:9866 2025-07-20 06:00:06,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43206, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_333622975_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756791_15967, duration(ns): 26702965 2025-07-20 06:00:06,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756791_15967, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-20 06:00:11,810 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756791_15967 replica FinalizedReplica, blk_1073756791_15967, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756791 for deletion 2025-07-20 06:00:11,811 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756791_15967 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756791 2025-07-20 06:02:11,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073756793_15969 src: /192.168.158.9:58592 dest: /192.168.158.4:9866 2025-07-20 06:02:11,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58592, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_220128292_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756793_15969, duration(ns): 15548589 2025-07-20 06:02:11,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756793_15969, type=LAST_IN_PIPELINE terminating 2025-07-20 06:02:14,815 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756793_15969 replica FinalizedReplica, blk_1073756793_15969, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756793 for deletion 2025-07-20 06:02:14,816 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756793_15969 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756793 2025-07-20 06:03:11,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756794_15970 src: /192.168.158.9:57604 dest: /192.168.158.4:9866 2025-07-20 06:03:11,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57604, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1269356692_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756794_15970, duration(ns): 21218254 
2025-07-20 06:03:11,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756794_15970, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-20 06:03:14,819 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756794_15970 replica FinalizedReplica, blk_1073756794_15970, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756794 for deletion 2025-07-20 06:03:14,820 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756794_15970 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756794 2025-07-20 06:04:11,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756795_15971 src: /192.168.158.1:36352 dest: /192.168.158.4:9866 2025-07-20 06:04:11,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36352, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-367742981_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756795_15971, duration(ns): 24063839 2025-07-20 06:04:11,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756795_15971, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-20 06:04:14,823 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756795_15971 replica FinalizedReplica, 
blk_1073756795_15971, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756795 for deletion 2025-07-20 06:04:14,824 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756795_15971 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756795 2025-07-20 06:06:21,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756797_15973 src: /192.168.158.7:51752 dest: /192.168.158.4:9866 2025-07-20 06:06:21,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51752, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_940321148_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756797_15973, duration(ns): 23214588 2025-07-20 06:06:21,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756797_15973, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-20 06:06:23,828 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756797_15973 replica FinalizedReplica, blk_1073756797_15973, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756797 for deletion 2025-07-20 06:06:23,829 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 
blk_1073756797_15973 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756797 2025-07-20 06:07:26,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756798_15974 src: /192.168.158.6:38748 dest: /192.168.158.4:9866 2025-07-20 06:07:26,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38748, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1427737716_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756798_15974, duration(ns): 18373017 2025-07-20 06:07:26,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756798_15974, type=LAST_IN_PIPELINE terminating 2025-07-20 06:07:29,831 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756798_15974 replica FinalizedReplica, blk_1073756798_15974, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756798 for deletion 2025-07-20 06:07:29,832 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756798_15974 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756798 2025-07-20 06:08:26,965 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756799_15975 src: /192.168.158.5:49104 dest: /192.168.158.4:9866 2025-07-20 06:08:26,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49104, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-85508808_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756799_15975, duration(ns): 17134545 2025-07-20 06:08:26,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756799_15975, type=LAST_IN_PIPELINE terminating 2025-07-20 06:08:32,836 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756799_15975 replica FinalizedReplica, blk_1073756799_15975, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756799 for deletion 2025-07-20 06:08:32,838 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756799_15975 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756799 2025-07-20 06:09:26,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756800_15976 src: /192.168.158.1:57924 dest: /192.168.158.4:9866 2025-07-20 06:09:26,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57924, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2028517419_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756800_15976, duration(ns): 24465587 2025-07-20 06:09:26,996 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756800_15976, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 
192.168.158.9:9866] terminating 2025-07-20 06:09:29,839 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756800_15976 replica FinalizedReplica, blk_1073756800_15976, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756800 for deletion 2025-07-20 06:09:29,840 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756800_15976 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756800 2025-07-20 06:10:26,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756801_15977 src: /192.168.158.1:45246 dest: /192.168.158.4:9866 2025-07-20 06:10:27,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45246, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1932231377_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756801_15977, duration(ns): 27213224 2025-07-20 06:10:27,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756801_15977, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-20 06:10:32,842 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756801_15977 replica FinalizedReplica, blk_1073756801_15977, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756801 for deletion
2025-07-20 06:10:32,844 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756801_15977 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756801
2025-07-20 06:12:36,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756803_15979 src: /192.168.158.9:35724 dest: /192.168.158.4:9866
2025-07-20 06:12:36,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35724, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-284029194_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756803_15979, duration(ns): 18948727
2025-07-20 06:12:36,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756803_15979, type=LAST_IN_PIPELINE terminating
2025-07-20 06:12:38,850 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756803_15979 replica FinalizedReplica, blk_1073756803_15979, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756803 for deletion
2025-07-20 06:12:38,852 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756803_15979 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756803
2025-07-20 06:13:36,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756804_15980 src: /192.168.158.1:53304 dest: /192.168.158.4:9866
2025-07-20 06:13:37,000 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53304, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1094041260_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756804_15980, duration(ns): 27044159
2025-07-20 06:13:37,001 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756804_15980, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-20 06:13:38,853 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756804_15980 replica FinalizedReplica, blk_1073756804_15980, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756804 for deletion
2025-07-20 06:13:38,854 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756804_15980 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756804
2025-07-20 06:17:46,975 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756808_15984 src: /192.168.158.9:49622 dest: /192.168.158.4:9866
2025-07-20 06:17:47,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49622, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2102444944_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756808_15984, duration(ns): 21562153
2025-07-20 06:17:47,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756808_15984, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-20 06:17:50,858 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756808_15984 replica FinalizedReplica, blk_1073756808_15984, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756808 for deletion
2025-07-20 06:17:50,859 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756808_15984 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756808
2025-07-20 06:18:51,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756809_15985 src: /192.168.158.6:53792 dest: /192.168.158.4:9866
2025-07-20 06:18:51,998 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53792, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-687525757_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756809_15985, duration(ns): 21264192
2025-07-20 06:18:51,998 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756809_15985, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 06:18:53,863 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756809_15985 replica FinalizedReplica, blk_1073756809_15985, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756809 for deletion
2025-07-20 06:18:53,864 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756809_15985 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756809
2025-07-20 06:19:51,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756810_15986 src: /192.168.158.1:35400 dest: /192.168.158.4:9866
2025-07-20 06:19:52,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35400, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_746963519_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756810_15986, duration(ns): 24518156
2025-07-20 06:19:52,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756810_15986, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-20 06:19:56,868 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756810_15986 replica FinalizedReplica, blk_1073756810_15986, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756810 for deletion
2025-07-20 06:19:56,869 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756810_15986 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756810
2025-07-20 06:21:51,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756812_15988 src: /192.168.158.7:52150 dest: /192.168.158.4:9866
2025-07-20 06:21:52,008 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52150, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1854264825_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756812_15988, duration(ns): 21167463
2025-07-20 06:21:52,009 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756812_15988, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-20 06:21:53,870 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756812_15988 replica FinalizedReplica, blk_1073756812_15988, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756812 for deletion
2025-07-20 06:21:53,871 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756812_15988 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756812
2025-07-20 06:26:56,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756817_15993 src: /192.168.158.1:45634 dest: /192.168.158.4:9866
2025-07-20 06:26:57,027 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45634, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1722649843_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756817_15993, duration(ns): 25260123
2025-07-20 06:26:57,027 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756817_15993, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-20 06:26:59,882 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756817_15993 replica FinalizedReplica, blk_1073756817_15993, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756817 for deletion
2025-07-20 06:26:59,883 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756817_15993 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756817
2025-07-20 06:31:11,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756821_15997 src: /192.168.158.1:33684 dest: /192.168.158.4:9866
2025-07-20 06:31:12,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33684, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-259862328_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756821_15997, duration(ns): 25578290
2025-07-20 06:31:12,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756821_15997, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-20 06:31:17,891 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756821_15997 replica FinalizedReplica, blk_1073756821_15997, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756821 for deletion
2025-07-20 06:31:17,892 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756821_15997 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756821
2025-07-20 06:32:16,988 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756822_15998 src: /192.168.158.1:40468 dest: /192.168.158.4:9866
2025-07-20 06:32:17,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40468, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1058110615_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756822_15998, duration(ns): 25951923
2025-07-20 06:32:17,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756822_15998, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-20 06:32:20,893 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756822_15998 replica FinalizedReplica, blk_1073756822_15998, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756822 for deletion
2025-07-20 06:32:20,895 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756822_15998 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756822
2025-07-20 06:35:21,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756825_16001 src: /192.168.158.7:36280 dest: /192.168.158.4:9866
2025-07-20 06:35:22,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36280, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-411795827_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756825_16001, duration(ns): 16648792
2025-07-20 06:35:22,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756825_16001, type=LAST_IN_PIPELINE terminating
2025-07-20 06:35:23,898 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756825_16001 replica FinalizedReplica, blk_1073756825_16001, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756825 for deletion
2025-07-20 06:35:23,899 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756825_16001 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756825
2025-07-20 06:36:21,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756826_16002 src: /192.168.158.1:45742 dest: /192.168.158.4:9866
2025-07-20 06:36:22,034 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45742, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2121234394_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756826_16002, duration(ns): 28051645
2025-07-20 06:36:22,034 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756826_16002, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-20 06:36:26,901 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756826_16002 replica FinalizedReplica, blk_1073756826_16002, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756826 for deletion
2025-07-20 06:36:26,902 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756826_16002 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756826
2025-07-20 06:38:26,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756828_16004 src: /192.168.158.1:39604 dest: /192.168.158.4:9866
2025-07-20 06:38:27,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39604, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-123060000_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756828_16004, duration(ns): 24925900
2025-07-20 06:38:27,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756828_16004, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-20 06:38:32,906 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756828_16004 replica FinalizedReplica, blk_1073756828_16004, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756828 for deletion
2025-07-20 06:38:32,907 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756828_16004 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756828
2025-07-20 06:39:27,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756829_16005 src: /192.168.158.8:55824 dest: /192.168.158.4:9866
2025-07-20 06:39:27,026 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55824, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_626934830_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756829_16005, duration(ns): 17691320
2025-07-20 06:39:27,026 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756829_16005, type=LAST_IN_PIPELINE terminating
2025-07-20 06:39:29,907 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756829_16005 replica FinalizedReplica, blk_1073756829_16005, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756829 for deletion
2025-07-20 06:39:29,908 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756829_16005 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756829
2025-07-20 06:41:27,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756831_16007 src: /192.168.158.8:44132 dest: /192.168.158.4:9866
2025-07-20 06:41:27,034 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44132, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_995497776_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756831_16007, duration(ns): 20849310
2025-07-20 06:41:27,034 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756831_16007, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-20 06:41:29,910 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756831_16007 replica FinalizedReplica, blk_1073756831_16007, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756831 for deletion
2025-07-20 06:41:29,911 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756831_16007 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756831
2025-07-20 06:42:27,004 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756832_16008 src: /192.168.158.1:54398 dest: /192.168.158.4:9866
2025-07-20 06:42:27,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54398, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1524783272_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756832_16008, duration(ns): 27315184
2025-07-20 06:42:27,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756832_16008, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-20 06:42:32,915 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756832_16008 replica FinalizedReplica, blk_1073756832_16008, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756832 for deletion
2025-07-20 06:42:32,916 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756832_16008 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756832
2025-07-20 06:48:47,008 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756838_16014 src: /192.168.158.7:60080 dest: /192.168.158.4:9866
2025-07-20 06:48:47,027 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60080, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_771761097_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756838_16014, duration(ns): 17658889
2025-07-20 06:48:47,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756838_16014, type=LAST_IN_PIPELINE terminating
2025-07-20 06:48:53,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756838_16014 replica FinalizedReplica, blk_1073756838_16014, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756838 for deletion
2025-07-20 06:48:53,929 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756838_16014 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756838
2025-07-20 06:49:52,009 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756839_16015 src: /192.168.158.5:50846 dest: /192.168.158.4:9866
2025-07-20 06:49:52,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50846, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-11320775_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756839_16015, duration(ns): 17821941
2025-07-20 06:49:52,029 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756839_16015, type=LAST_IN_PIPELINE terminating
2025-07-20 06:49:56,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756839_16015 replica FinalizedReplica, blk_1073756839_16015, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756839 for deletion
2025-07-20 06:49:56,930 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756839_16015 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756839
2025-07-20 06:50:52,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756840_16016 src: /192.168.158.1:44720 dest: /192.168.158.4:9866
2025-07-20 06:50:52,039 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44720, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-89030833_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756840_16016, duration(ns): 24734445
2025-07-20 06:50:52,039 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756840_16016, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-20 06:50:53,929 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756840_16016 replica FinalizedReplica, blk_1073756840_16016, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756840 for deletion
2025-07-20 06:50:53,930 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756840_16016 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756840
2025-07-20 06:51:52,014 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756841_16017 src: /192.168.158.8:53800 dest: /192.168.158.4:9866
2025-07-20 06:51:52,034 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53800, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2145574869_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756841_16017, duration(ns): 17672463
2025-07-20 06:51:52,034 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756841_16017, type=LAST_IN_PIPELINE terminating
2025-07-20 06:51:53,932 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756841_16017 replica FinalizedReplica, blk_1073756841_16017, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756841 for deletion
2025-07-20 06:51:53,933 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756841_16017 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756841
2025-07-20 06:52:57,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756842_16018 src: /192.168.158.1:50136 dest: /192.168.158.4:9866
2025-07-20 06:52:57,040 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-994722964_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756842_16018, duration(ns): 24478067
2025-07-20 06:52:57,040 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756842_16018, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-20 06:52:59,931 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756842_16018 replica FinalizedReplica, blk_1073756842_16018, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756842 for deletion
2025-07-20 06:52:59,932 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756842_16018 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756842
2025-07-20 06:55:07,010 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756844_16020 src: /192.168.158.1:42404 dest: /192.168.158.4:9866
2025-07-20 06:55:07,047 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42404, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1534399676_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756844_16020, duration(ns): 27025340
2025-07-20 06:55:07,047 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756844_16020, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-20 06:55:08,933 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756844_16020 replica FinalizedReplica, blk_1073756844_16020, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756844 for deletion
2025-07-20 06:55:08,934 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756844_16020 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756844
2025-07-20 06:56:07,010 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756845_16021 src: /192.168.158.1:56830 dest: /192.168.158.4:9866
2025-07-20 06:56:07,045 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56830, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1935429050_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756845_16021, duration(ns): 25831438
2025-07-20 06:56:07,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756845_16021, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-20 06:56:08,937 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756845_16021 replica FinalizedReplica, blk_1073756845_16021, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756845 for deletion
2025-07-20 06:56:08,938 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756845_16021 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756845
2025-07-20 06:59:17,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756848_16024 src: /192.168.158.1:57428 dest: /192.168.158.4:9866
2025-07-20 06:59:17,044 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57428, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-464694267_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756848_16024, duration(ns): 24180400
2025-07-20 06:59:17,044 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756848_16024, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-20 06:59:23,942 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756848_16024 replica FinalizedReplica, blk_1073756848_16024, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756848 for deletion
2025-07-20 06:59:23,943 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756848_16024 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756848
2025-07-20 07:00:22,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756849_16025 src: /192.168.158.6:39230 dest: /192.168.158.4:9866
2025-07-20 07:00:22,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39230, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-977257058_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756849_16025, duration(ns): 19821559
2025-07-20 07:00:22,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756849_16025, type=LAST_IN_PIPELINE terminating
2025-07-20 07:00:23,941 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756849_16025 replica FinalizedReplica, blk_1073756849_16025, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756849 for deletion
2025-07-20 07:00:23,943 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756849_16025 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756849
2025-07-20 07:01:27,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756850_16026 src: /192.168.158.9:60078 dest: /192.168.158.4:9866
2025-07-20 07:01:27,038 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60078, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-34750615_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756850_16026, duration(ns): 16674469
2025-07-20 07:01:27,039 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756850_16026, type=LAST_IN_PIPELINE terminating
2025-07-20 07:01:29,947 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756850_16026 replica FinalizedReplica, blk_1073756850_16026, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756850 for deletion
2025-07-20 07:01:29,948 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756850_16026 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756850
2025-07-20 07:05:37,026 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756854_16030 src: /192.168.158.9:55114 dest: /192.168.158.4:9866
2025-07-20 07:05:37,053 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55114, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_706419300_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756854_16030, duration(ns): 21767963
2025-07-20 07:05:37,053 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756854_16030, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-20 07:05:38,956 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756854_16030 replica FinalizedReplica, blk_1073756854_16030, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756854 for deletion
2025-07-20 07:05:38,957 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756854_16030 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756854
2025-07-20 07:09:52,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756858_16034 src: /192.168.158.1:57268 dest: /192.168.158.4:9866
2025-07-20 07:09:52,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57268, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1631702797_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756858_16034, duration(ns): 26055317 2025-07-20 07:09:52,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756858_16034, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-20 07:09:53,964 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756858_16034 replica FinalizedReplica, blk_1073756858_16034, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756858 for deletion 2025-07-20 07:09:53,965 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756858_16034 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756858 2025-07-20 07:15:57,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756864_16040 src: /192.168.158.1:51260 dest: /192.168.158.4:9866 2025-07-20 07:15:57,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51260, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_120942163_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756864_16040, duration(ns): 24609423 2025-07-20 07:15:57,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756864_16040, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-20 07:16:02,975 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756864_16040 replica FinalizedReplica, blk_1073756864_16040, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756864 for deletion 2025-07-20 07:16:02,976 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756864_16040 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756864 2025-07-20 07:16:57,033 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756865_16041 src: /192.168.158.1:54808 dest: /192.168.158.4:9866 2025-07-20 07:16:57,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54808, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-925712763_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756865_16041, duration(ns): 25774322 2025-07-20 07:16:57,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756865_16041, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-20 07:17:02,979 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756865_16041 replica FinalizedReplica, blk_1073756865_16041, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756865 for 
deletion 2025-07-20 07:17:02,980 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756865_16041 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756865 2025-07-20 07:20:02,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756868_16044 src: /192.168.158.8:49120 dest: /192.168.158.4:9866 2025-07-20 07:20:02,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49120, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1371770505_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756868_16044, duration(ns): 22545571 2025-07-20 07:20:02,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756868_16044, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-20 07:20:02,980 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756868_16044 replica FinalizedReplica, blk_1073756868_16044, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756868 for deletion 2025-07-20 07:20:02,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756868_16044 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756868 2025-07-20 07:21:07,039 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073756869_16045 src: /192.168.158.9:47734 dest: /192.168.158.4:9866 2025-07-20 07:21:07,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47734, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-185779313_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756869_16045, duration(ns): 21250636 2025-07-20 07:21:07,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756869_16045, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-20 07:21:11,980 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756869_16045 replica FinalizedReplica, blk_1073756869_16045, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756869 for deletion 2025-07-20 07:21:11,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756869_16045 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756869 2025-07-20 07:22:07,053 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756870_16046 src: /192.168.158.9:60712 dest: /192.168.158.4:9866 2025-07-20 07:22:07,074 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_687689945_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073756870_16046, duration(ns): 19329532 2025-07-20 07:22:07,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756870_16046, type=LAST_IN_PIPELINE terminating 2025-07-20 07:22:11,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756870_16046 replica FinalizedReplica, blk_1073756870_16046, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756870 for deletion 2025-07-20 07:22:11,983 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756870_16046 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756870 2025-07-20 07:23:07,040 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756871_16047 src: /192.168.158.1:36900 dest: /192.168.158.4:9866 2025-07-20 07:23:07,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36900, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-558627562_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756871_16047, duration(ns): 22639322 2025-07-20 07:23:07,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756871_16047, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-20 07:23:08,983 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling 
blk_1073756871_16047 replica FinalizedReplica, blk_1073756871_16047, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756871 for deletion 2025-07-20 07:23:08,985 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756871_16047 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756871 2025-07-20 07:25:07,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756873_16049 src: /192.168.158.6:41592 dest: /192.168.158.4:9866 2025-07-20 07:25:07,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41592, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1492116279_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756873_16049, duration(ns): 15557959 2025-07-20 07:25:07,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756873_16049, type=LAST_IN_PIPELINE terminating 2025-07-20 07:25:08,990 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756873_16049 replica FinalizedReplica, blk_1073756873_16049, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756873 for deletion 2025-07-20 07:25:08,992 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 
blk_1073756873_16049 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756873 2025-07-20 07:26:07,044 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756874_16050 src: /192.168.158.1:40290 dest: /192.168.158.4:9866 2025-07-20 07:26:07,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40290, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_839294424_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756874_16050, duration(ns): 23891989 2025-07-20 07:26:07,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756874_16050, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-20 07:26:11,992 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756874_16050 replica FinalizedReplica, blk_1073756874_16050, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756874 for deletion 2025-07-20 07:26:11,993 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756874_16050 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756874 2025-07-20 07:31:07,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756879_16055 src: /192.168.158.7:35886 dest: /192.168.158.4:9866 2025-07-20 07:31:07,089 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35886, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-801334338_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756879_16055, duration(ns): 22261681 2025-07-20 07:31:07,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756879_16055, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-20 07:31:12,001 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756879_16055 replica FinalizedReplica, blk_1073756879_16055, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756879 for deletion 2025-07-20 07:31:12,002 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756879_16055 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756879 2025-07-20 07:33:12,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756881_16057 src: /192.168.158.8:47210 dest: /192.168.158.4:9866 2025-07-20 07:33:12,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47210, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_887610507_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756881_16057, duration(ns): 21494360 2025-07-20 07:33:12,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073756881_16057, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-20 07:33:15,008 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756881_16057 replica FinalizedReplica, blk_1073756881_16057, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756881 for deletion 2025-07-20 07:33:15,010 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756881_16057 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756881 2025-07-20 07:34:12,059 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756882_16058 src: /192.168.158.1:45702 dest: /192.168.158.4:9866 2025-07-20 07:34:12,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45702, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1040941354_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756882_16058, duration(ns): 24862923 2025-07-20 07:34:12,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756882_16058, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-20 07:34:15,013 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756882_16058 replica FinalizedReplica, blk_1073756882_16058, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 
getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756882 for deletion 2025-07-20 07:34:15,014 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756882_16058 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756882 2025-07-20 07:38:12,074 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756886_16062 src: /192.168.158.7:58162 dest: /192.168.158.4:9866 2025-07-20 07:38:12,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58162, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1310011977_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756886_16062, duration(ns): 17568624 2025-07-20 07:38:12,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756886_16062, type=LAST_IN_PIPELINE terminating 2025-07-20 07:38:15,020 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756886_16062 replica FinalizedReplica, blk_1073756886_16062, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756886 for deletion 2025-07-20 07:38:15,021 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756886_16062 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756886 2025-07-20 07:39:12,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756887_16063 src: /192.168.158.1:34774 dest: /192.168.158.4:9866 2025-07-20 07:39:12,095 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34774, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_40313616_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756887_16063, duration(ns): 23131773 2025-07-20 07:39:12,095 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756887_16063, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-20 07:39:15,024 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756887_16063 replica FinalizedReplica, blk_1073756887_16063, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756887 for deletion 2025-07-20 07:39:15,025 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756887_16063 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756887 2025-07-20 07:42:12,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756890_16066 src: /192.168.158.5:55016 dest: /192.168.158.4:9866 2025-07-20 07:42:12,114 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.5:55016, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_120776718_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756890_16066, duration(ns): 18854798 2025-07-20 07:42:12,114 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756890_16066, type=LAST_IN_PIPELINE terminating 2025-07-20 07:42:15,030 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756890_16066 replica FinalizedReplica, blk_1073756890_16066, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756890 for deletion 2025-07-20 07:42:15,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756890_16066 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756890 2025-07-20 07:44:17,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756892_16068 src: /192.168.158.1:45226 dest: /192.168.158.4:9866 2025-07-20 07:44:17,134 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45226, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_777077375_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756892_16068, duration(ns): 25621635 2025-07-20 07:44:17,135 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756892_16068, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-20 07:44:18,034 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756892_16068 replica FinalizedReplica, blk_1073756892_16068, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756892 for deletion 2025-07-20 07:44:18,035 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756892_16068 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756892 2025-07-20 07:46:17,128 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756894_16070 src: /192.168.158.1:55338 dest: /192.168.158.4:9866 2025-07-20 07:46:17,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55338, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1171946074_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756894_16070, duration(ns): 22934263 2025-07-20 07:46:17,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756894_16070, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-20 07:46:18,037 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756894_16070 replica FinalizedReplica, blk_1073756894_16070, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756894 for deletion
2025-07-20 07:46:18,039 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756894_16070 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756894
2025-07-20 07:47:17,101 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756895_16071 src: /192.168.158.6:48556 dest: /192.168.158.4:9866
2025-07-20 07:47:17,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48556, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1159510213_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756895_16071, duration(ns): 17454656
2025-07-20 07:47:17,121 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756895_16071, type=LAST_IN_PIPELINE terminating
2025-07-20 07:47:21,040 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756895_16071 replica FinalizedReplica, blk_1073756895_16071, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756895 for deletion
2025-07-20 07:47:21,041 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756895_16071 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756895
2025-07-20 07:50:17,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756898_16074 src: /192.168.158.5:47224 dest: /192.168.158.4:9866
2025-07-20 07:50:17,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47224, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1025979673_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756898_16074, duration(ns): 20070666
2025-07-20 07:50:17,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756898_16074, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-20 07:50:18,043 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756898_16074 replica FinalizedReplica, blk_1073756898_16074, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756898 for deletion
2025-07-20 07:50:18,044 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756898_16074 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756898
2025-07-20 07:54:17,095 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756902_16078 src: /192.168.158.1:55426 dest: /192.168.158.4:9866
2025-07-20 07:54:17,132 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55426, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-244140438_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756902_16078, duration(ns): 27747362
2025-07-20 07:54:17,132 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756902_16078, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-20 07:54:18,055 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756902_16078 replica FinalizedReplica, blk_1073756902_16078, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756902 for deletion
2025-07-20 07:54:18,056 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756902_16078 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756902
2025-07-20 07:56:22,102 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756904_16080 src: /192.168.158.5:38810 dest: /192.168.158.4:9866
2025-07-20 07:56:22,121 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38810, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1786356117_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756904_16080, duration(ns): 17667765
2025-07-20 07:56:22,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756904_16080, type=LAST_IN_PIPELINE terminating
2025-07-20 07:56:24,060 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756904_16080 replica FinalizedReplica, blk_1073756904_16080, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756904 for deletion
2025-07-20 07:56:24,061 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756904_16080 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756904
2025-07-20 07:57:22,101 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756905_16081 src: /192.168.158.8:33968 dest: /192.168.158.4:9866
2025-07-20 07:57:22,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33968, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_748000722_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756905_16081, duration(ns): 17376109
2025-07-20 07:57:22,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756905_16081, type=LAST_IN_PIPELINE terminating
2025-07-20 07:57:27,061 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756905_16081 replica FinalizedReplica, blk_1073756905_16081, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756905 for deletion
2025-07-20 07:57:27,062 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756905_16081 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756905
2025-07-20 07:58:27,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756906_16082 src: /192.168.158.8:39196 dest: /192.168.158.4:9866
2025-07-20 07:58:27,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39196, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2110290546_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756906_16082, duration(ns): 16870498
2025-07-20 07:58:27,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756906_16082, type=LAST_IN_PIPELINE terminating
2025-07-20 07:58:33,062 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756906_16082 replica FinalizedReplica, blk_1073756906_16082, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756906 for deletion
2025-07-20 07:58:33,063 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756906_16082 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756906
2025-07-20 07:59:27,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756907_16083 src: /192.168.158.7:39512 dest: /192.168.158.4:9866
2025-07-20 07:59:27,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39512, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1174922413_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756907_16083, duration(ns): 16452017
2025-07-20 07:59:27,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756907_16083, type=LAST_IN_PIPELINE terminating
2025-07-20 07:59:30,063 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756907_16083 replica FinalizedReplica, blk_1073756907_16083, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756907 for deletion
2025-07-20 07:59:30,064 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756907_16083 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756907
2025-07-20 08:01:27,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756909_16085 src: /192.168.158.6:37844 dest: /192.168.158.4:9866
2025-07-20 08:01:27,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37844, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1381835480_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756909_16085, duration(ns): 17083328
2025-07-20 08:01:27,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756909_16085, type=LAST_IN_PIPELINE terminating
2025-07-20 08:01:33,066 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756909_16085 replica FinalizedReplica, blk_1073756909_16085, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756909 for deletion
2025-07-20 08:01:33,067 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756909_16085 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756909
2025-07-20 08:02:27,101 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756910_16086 src: /192.168.158.1:55018 dest: /192.168.158.4:9866
2025-07-20 08:02:27,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55018, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_149049809_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756910_16086, duration(ns): 27330989
2025-07-20 08:02:27,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756910_16086, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-20 08:02:33,068 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756910_16086 replica FinalizedReplica, blk_1073756910_16086, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756910 for deletion
2025-07-20 08:02:33,069 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756910_16086 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756910
2025-07-20 08:03:27,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756911_16087 src: /192.168.158.6:35940 dest: /192.168.158.4:9866
2025-07-20 08:03:27,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35940, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_122698686_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756911_16087, duration(ns): 19968893
2025-07-20 08:03:27,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756911_16087, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-20 08:03:30,071 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756911_16087 replica FinalizedReplica, blk_1073756911_16087, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756911 for deletion
2025-07-20 08:03:30,072 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756911_16087 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756911
2025-07-20 08:05:27,111 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756913_16089 src: /192.168.158.8:46980 dest: /192.168.158.4:9866
2025-07-20 08:05:27,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46980, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_874486696_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756913_16089, duration(ns): 16198478
2025-07-20 08:05:27,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756913_16089, type=LAST_IN_PIPELINE terminating
2025-07-20 08:05:33,073 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756913_16089 replica FinalizedReplica, blk_1073756913_16089, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756913 for deletion
2025-07-20 08:05:33,074 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756913_16089 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756913
2025-07-20 08:07:32,108 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756915_16091 src: /192.168.158.1:57952 dest: /192.168.158.4:9866
2025-07-20 08:07:32,143 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57952, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-129331115_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756915_16091, duration(ns): 25787988
2025-07-20 08:07:32,143 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756915_16091, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-20 08:07:33,078 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756915_16091 replica FinalizedReplica, blk_1073756915_16091, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756915 for deletion
2025-07-20 08:07:33,079 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756915_16091 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756915
2025-07-20 08:08:32,110 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756916_16092 src: /192.168.158.1:35260 dest: /192.168.158.4:9866
2025-07-20 08:08:32,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35260, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-334353018_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756916_16092, duration(ns): 23377896
2025-07-20 08:08:32,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756916_16092, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-20 08:08:36,078 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756916_16092 replica FinalizedReplica, blk_1073756916_16092, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756916 for deletion
2025-07-20 08:08:36,079 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756916_16092 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756916
2025-07-20 08:12:37,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756920_16096 src: /192.168.158.6:59532 dest: /192.168.158.4:9866
2025-07-20 08:12:37,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59532, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2090114270_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756920_16096, duration(ns): 17689972
2025-07-20 08:12:37,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756920_16096, type=LAST_IN_PIPELINE terminating
2025-07-20 08:12:42,084 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756920_16096 replica FinalizedReplica, blk_1073756920_16096, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756920 for deletion
2025-07-20 08:12:42,085 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756920_16096 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756920
2025-07-20 08:13:37,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756921_16097 src: /192.168.158.1:49518 dest: /192.168.158.4:9866
2025-07-20 08:13:37,153 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49518, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-13645747_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756921_16097, duration(ns): 23579369
2025-07-20 08:13:37,153 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756921_16097, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-20 08:13:42,088 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756921_16097 replica FinalizedReplica, blk_1073756921_16097, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756921 for deletion
2025-07-20 08:13:42,089 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756921_16097 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756921
2025-07-20 08:14:37,128 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756922_16098 src: /192.168.158.5:41272 dest: /192.168.158.4:9866
2025-07-20 08:14:37,148 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41272, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1937692524_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756922_16098, duration(ns): 18239858
2025-07-20 08:14:37,149 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756922_16098, type=LAST_IN_PIPELINE terminating
2025-07-20 08:14:39,089 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756922_16098 replica FinalizedReplica, blk_1073756922_16098, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756922 for deletion
2025-07-20 08:14:39,090 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756922_16098 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756922
2025-07-20 08:16:37,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756924_16100 src: /192.168.158.5:47264 dest: /192.168.158.4:9866
2025-07-20 08:16:37,149 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47264, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_577339611_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756924_16100, duration(ns): 16430670
2025-07-20 08:16:37,150 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756924_16100, type=LAST_IN_PIPELINE terminating
2025-07-20 08:16:39,093 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756924_16100 replica FinalizedReplica, blk_1073756924_16100, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756924 for deletion
2025-07-20 08:16:39,094 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756924_16100 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756924
2025-07-20 08:17:37,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756925_16101 src: /192.168.158.1:49772 dest: /192.168.158.4:9866
2025-07-20 08:17:37,163 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49772, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1750962857_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756925_16101, duration(ns): 25910914
2025-07-20 08:17:37,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756925_16101, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-20 08:17:39,095 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756925_16101 replica FinalizedReplica, blk_1073756925_16101, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756925 for deletion
2025-07-20 08:17:39,097 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756925_16101 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756925
2025-07-20 08:19:42,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756927_16103 src: /192.168.158.1:42618 dest: /192.168.158.4:9866
2025-07-20 08:19:42,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42618, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-29736130_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756927_16103, duration(ns): 25197469
2025-07-20 08:19:42,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756927_16103, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-20 08:19:48,099 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756927_16103 replica FinalizedReplica, blk_1073756927_16103, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756927 for deletion
2025-07-20 08:19:48,101 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756927_16103 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir26/blk_1073756927
2025-07-20 08:21:42,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756929_16105 src: /192.168.158.1:49920 dest: /192.168.158.4:9866
2025-07-20 08:21:42,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49920, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_188836342_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756929_16105, duration(ns): 25033147
2025-07-20 08:21:42,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756929_16105, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-20 08:21:45,103 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756929_16105 replica FinalizedReplica, blk_1073756929_16105, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756929 for deletion
2025-07-20 08:21:45,104 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756929_16105 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756929
2025-07-20 08:22:42,139 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756930_16106 src: /192.168.158.9:39054 dest: /192.168.158.4:9866
2025-07-20 08:22:42,159 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39054, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_364936529_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756930_16106, duration(ns): 18700132
2025-07-20 08:22:42,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756930_16106, type=LAST_IN_PIPELINE terminating
2025-07-20 08:22:48,104 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756930_16106 replica FinalizedReplica, blk_1073756930_16106, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756930 for deletion
2025-07-20 08:22:48,105 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756930_16106 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756930
2025-07-20 08:23:42,139 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756931_16107 src: /192.168.158.5:46316 dest: /192.168.158.4:9866
2025-07-20 08:23:42,157 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46316, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_462689645_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756931_16107, duration(ns): 16276476
2025-07-20 08:23:42,158 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756931_16107, type=LAST_IN_PIPELINE terminating
2025-07-20 08:23:48,104 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756931_16107 replica FinalizedReplica, blk_1073756931_16107, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756931 for deletion
2025-07-20 08:23:48,106 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756931_16107 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756931
2025-07-20 08:24:42,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756932_16108 src: /192.168.158.1:48728 dest: /192.168.158.4:9866
2025-07-20 08:24:42,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48728, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1533339865_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756932_16108, duration(ns): 26721521
2025-07-20 08:24:42,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756932_16108, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-20 08:24:45,109 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756932_16108 replica FinalizedReplica, blk_1073756932_16108, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756932 for deletion
2025-07-20 08:24:45,110 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756932_16108 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756932
2025-07-20 08:27:47,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756935_16111 src: /192.168.158.7:41338 dest: /192.168.158.4:9866
2025-07-20 08:27:47,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41338, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-29419744_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756935_16111, duration(ns): 19319701
2025-07-20 08:27:47,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756935_16111, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-20 08:27:48,114 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756935_16111 replica FinalizedReplica, blk_1073756935_16111, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756935 for deletion
2025-07-20 08:27:48,116 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756935_16111 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756935
2025-07-20 08:28:47,148 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756936_16112 src: /192.168.158.1:52550 dest: /192.168.158.4:9866 2025-07-20 08:28:47,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52550, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2050852276_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756936_16112, duration(ns): 25440277 2025-07-20 08:28:47,183 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756936_16112, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-20 08:28:48,116 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756936_16112 replica FinalizedReplica, blk_1073756936_16112, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756936 for deletion 2025-07-20 08:28:48,118 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756936_16112 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756936 2025-07-20 08:29:47,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756937_16113 src: /192.168.158.9:34012 dest: /192.168.158.4:9866 2025-07-20 08:29:47,176 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34012, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_322960866_236, 
offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756937_16113, duration(ns): 17166412 2025-07-20 08:29:47,176 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756937_16113, type=LAST_IN_PIPELINE terminating 2025-07-20 08:29:51,119 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756937_16113 replica FinalizedReplica, blk_1073756937_16113, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756937 for deletion 2025-07-20 08:29:51,120 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756937_16113 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756937 2025-07-20 08:30:47,145 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756938_16114 src: /192.168.158.1:37764 dest: /192.168.158.4:9866 2025-07-20 08:30:47,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37764, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1852312133_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756938_16114, duration(ns): 23468053 2025-07-20 08:30:47,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756938_16114, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-20 08:30:48,121 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756938_16114 replica FinalizedReplica, blk_1073756938_16114, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756938 for deletion 2025-07-20 08:30:48,122 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756938_16114 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756938 2025-07-20 08:31:47,157 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756939_16115 src: /192.168.158.7:36178 dest: /192.168.158.4:9866 2025-07-20 08:31:47,176 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36178, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1696388058_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756939_16115, duration(ns): 17155799 2025-07-20 08:31:47,176 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756939_16115, type=LAST_IN_PIPELINE terminating 2025-07-20 08:31:51,123 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756939_16115 replica FinalizedReplica, blk_1073756939_16115, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756939 for deletion 2025-07-20 08:31:51,124 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756939_16115 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756939 2025-07-20 08:34:47,172 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756942_16118 src: /192.168.158.7:44452 dest: /192.168.158.4:9866 2025-07-20 08:34:47,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44452, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1731067410_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756942_16118, duration(ns): 18315551 2025-07-20 08:34:47,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756942_16118, type=LAST_IN_PIPELINE terminating 2025-07-20 08:34:48,129 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756942_16118 replica FinalizedReplica, blk_1073756942_16118, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756942 for deletion 2025-07-20 08:34:48,130 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756942_16118 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756942 2025-07-20 08:37:47,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756945_16121 src: /192.168.158.6:43780 dest: /192.168.158.4:9866 
2025-07-20 08:37:47,208 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_929428728_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756945_16121, duration(ns): 21195322 2025-07-20 08:37:47,208 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756945_16121, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-20 08:37:48,138 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756945_16121 replica FinalizedReplica, blk_1073756945_16121, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756945 for deletion 2025-07-20 08:37:48,140 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756945_16121 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756945 2025-07-20 08:38:47,180 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756946_16122 src: /192.168.158.6:48958 dest: /192.168.158.4:9866 2025-07-20 08:38:47,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48958, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-169223074_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756946_16122, duration(ns): 24877644 2025-07-20 08:38:47,210 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756946_16122, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-20 08:38:48,141 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756946_16122 replica FinalizedReplica, blk_1073756946_16122, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756946 for deletion 2025-07-20 08:38:48,142 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756946_16122 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756946 2025-07-20 08:40:47,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756948_16124 src: /192.168.158.5:40864 dest: /192.168.158.4:9866 2025-07-20 08:40:47,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40864, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_639756259_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756948_16124, duration(ns): 17113552 2025-07-20 08:40:47,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756948_16124, type=LAST_IN_PIPELINE terminating 2025-07-20 08:40:48,149 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756948_16124 replica FinalizedReplica, blk_1073756948_16124, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() 
= /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756948 for deletion 2025-07-20 08:40:48,150 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756948_16124 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756948 2025-07-20 08:44:52,191 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756952_16128 src: /192.168.158.8:51626 dest: /192.168.158.4:9866 2025-07-20 08:44:52,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51626, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_543670338_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756952_16128, duration(ns): 18406166 2025-07-20 08:44:52,212 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756952_16128, type=LAST_IN_PIPELINE terminating 2025-07-20 08:44:54,156 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756952_16128 replica FinalizedReplica, blk_1073756952_16128, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756952 for deletion 2025-07-20 08:44:54,158 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756952_16128 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756952 2025-07-20 
08:45:52,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756953_16129 src: /192.168.158.6:43638 dest: /192.168.158.4:9866 2025-07-20 08:45:52,208 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43638, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1100001986_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756953_16129, duration(ns): 16930958 2025-07-20 08:45:52,208 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756953_16129, type=LAST_IN_PIPELINE terminating 2025-07-20 08:45:54,159 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756953_16129 replica FinalizedReplica, blk_1073756953_16129, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756953 for deletion 2025-07-20 08:45:54,160 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756953_16129 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756953 2025-07-20 08:46:57,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756954_16130 src: /192.168.158.1:48210 dest: /192.168.158.4:9866 2025-07-20 08:46:57,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48210, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1262310141_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073756954_16130, duration(ns): 24063404 2025-07-20 08:46:57,215 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756954_16130, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-20 08:47:00,161 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756954_16130 replica FinalizedReplica, blk_1073756954_16130, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756954 for deletion 2025-07-20 08:47:00,162 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756954_16130 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756954 2025-07-20 08:48:02,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756955_16131 src: /192.168.158.1:35134 dest: /192.168.158.4:9866 2025-07-20 08:48:02,216 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35134, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_405575825_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756955_16131, duration(ns): 24842592 2025-07-20 08:48:02,216 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756955_16131, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-20 08:48:03,163 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756955_16131 replica FinalizedReplica, blk_1073756955_16131, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756955 for deletion 2025-07-20 08:48:03,164 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756955_16131 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756955 2025-07-20 08:50:02,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756957_16133 src: /192.168.158.6:40398 dest: /192.168.158.4:9866 2025-07-20 08:50:02,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40398, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-542734661_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756957_16133, duration(ns): 21211425 2025-07-20 08:50:02,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756957_16133, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-20 08:50:03,165 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756957_16133 replica FinalizedReplica, blk_1073756957_16133, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756957 for deletion 2025-07-20 08:50:03,167 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756957_16133 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756957 2025-07-20 08:51:07,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756958_16134 src: /192.168.158.1:56136 dest: /192.168.158.4:9866 2025-07-20 08:51:07,230 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2024120693_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756958_16134, duration(ns): 26311573 2025-07-20 08:51:07,230 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756958_16134, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-20 08:51:09,168 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756958_16134 replica FinalizedReplica, blk_1073756958_16134, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756958 for deletion 2025-07-20 08:51:09,169 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756958_16134 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756958 2025-07-20 08:52:07,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073756959_16135 src: /192.168.158.6:50734 dest: /192.168.158.4:9866 2025-07-20 08:52:07,223 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50734, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1771054597_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756959_16135, duration(ns): 20454526 2025-07-20 08:52:07,223 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756959_16135, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-20 08:52:09,169 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756959_16135 replica FinalizedReplica, blk_1073756959_16135, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756959 for deletion 2025-07-20 08:52:09,171 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756959_16135 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756959 2025-07-20 08:54:12,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756961_16137 src: /192.168.158.6:34904 dest: /192.168.158.4:9866 2025-07-20 08:54:12,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34904, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-522100657_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073756961_16137, duration(ns): 16584835 2025-07-20 08:54:12,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756961_16137, type=LAST_IN_PIPELINE terminating 2025-07-20 08:54:15,172 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756961_16137 replica FinalizedReplica, blk_1073756961_16137, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756961 for deletion 2025-07-20 08:54:15,173 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756961_16137 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756961 2025-07-20 08:56:17,203 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756963_16139 src: /192.168.158.9:48740 dest: /192.168.158.4:9866 2025-07-20 08:56:17,222 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48740, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-35135725_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756963_16139, duration(ns): 17035290 2025-07-20 08:56:17,223 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756963_16139, type=LAST_IN_PIPELINE terminating 2025-07-20 08:56:21,175 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756963_16139 replica FinalizedReplica, blk_1073756963_16139, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756963 for deletion 2025-07-20 08:56:21,176 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756963_16139 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756963 2025-07-20 09:02:32,203 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756969_16145 src: /192.168.158.5:35472 dest: /192.168.158.4:9866 2025-07-20 09:02:32,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_296765533_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756969_16145, duration(ns): 22332722 2025-07-20 09:02:32,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756969_16145, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-20 09:02:33,189 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756969_16145 replica FinalizedReplica, blk_1073756969_16145, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756969 for deletion 2025-07-20 09:02:33,190 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756969_16145 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756969 2025-07-20 09:04:37,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756971_16147 src: /192.168.158.8:56854 dest: /192.168.158.4:9866 2025-07-20 09:04:37,238 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56854, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1372388213_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756971_16147, duration(ns): 17423662 2025-07-20 09:04:37,238 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756971_16147, type=LAST_IN_PIPELINE terminating 2025-07-20 09:04:39,192 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756971_16147 replica FinalizedReplica, blk_1073756971_16147, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756971 for deletion 2025-07-20 09:04:39,193 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756971_16147 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756971 2025-07-20 09:05:37,212 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756972_16148 src: /192.168.158.1:52088 dest: /192.168.158.4:9866 2025-07-20 09:05:37,247 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52088, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1968744812_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756972_16148, duration(ns): 25416494
2025-07-20 09:05:37,247 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756972_16148, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-20 09:05:39,195 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756972_16148 replica FinalizedReplica, blk_1073756972_16148, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756972 for deletion
2025-07-20 09:05:39,196 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756972_16148 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756972
2025-07-20 09:06:37,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756973_16149 src: /192.168.158.1:58352 dest: /192.168.158.4:9866
2025-07-20 09:06:37,247 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58352, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2008591485_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756973_16149, duration(ns): 23762453
2025-07-20 09:06:37,247 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756973_16149, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-20 09:06:39,196 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756973_16149 replica FinalizedReplica, blk_1073756973_16149, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756973 for deletion
2025-07-20 09:06:39,198 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756973_16149 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756973
2025-07-20 09:08:37,221 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756975_16151 src: /192.168.158.1:48106 dest: /192.168.158.4:9866
2025-07-20 09:08:37,255 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48106, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1860376753_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756975_16151, duration(ns): 25135725
2025-07-20 09:08:37,255 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756975_16151, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-20 09:08:42,201 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756975_16151 replica FinalizedReplica, blk_1073756975_16151, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756975 for deletion
2025-07-20 09:08:42,202 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756975_16151 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756975
2025-07-20 09:09:42,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756976_16152 src: /192.168.158.5:38532 dest: /192.168.158.4:9866
2025-07-20 09:09:42,244 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38532, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1676581188_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756976_16152, duration(ns): 15848088
2025-07-20 09:09:42,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756976_16152, type=LAST_IN_PIPELINE terminating
2025-07-20 09:09:45,202 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756976_16152 replica FinalizedReplica, blk_1073756976_16152, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756976 for deletion
2025-07-20 09:09:45,203 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756976_16152 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756976
2025-07-20 09:10:42,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756977_16153 src: /192.168.158.1:59282 dest: /192.168.158.4:9866
2025-07-20 09:10:42,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59282, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1683763525_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756977_16153, duration(ns): 23405091
2025-07-20 09:10:42,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756977_16153, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-20 09:10:48,203 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756977_16153 replica FinalizedReplica, blk_1073756977_16153, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756977 for deletion
2025-07-20 09:10:48,204 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756977_16153 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756977
2025-07-20 09:11:42,221 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756978_16154 src: /192.168.158.1:38368 dest: /192.168.158.4:9866
2025-07-20 09:11:42,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38368, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_285524286_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756978_16154, duration(ns): 26098095
2025-07-20 09:11:42,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756978_16154, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-20 09:11:48,203 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756978_16154 replica FinalizedReplica, blk_1073756978_16154, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756978 for deletion
2025-07-20 09:11:48,205 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756978_16154 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756978
2025-07-20 09:14:47,228 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756981_16157 src: /192.168.158.1:40266 dest: /192.168.158.4:9866
2025-07-20 09:14:47,261 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40266, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_153299828_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756981_16157, duration(ns): 23951553
2025-07-20 09:14:47,261 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756981_16157, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-20 09:14:48,213 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756981_16157 replica FinalizedReplica, blk_1073756981_16157, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756981 for deletion
2025-07-20 09:14:48,214 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756981_16157 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756981
2025-07-20 09:15:52,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756982_16158 src: /192.168.158.7:47336 dest: /192.168.158.4:9866
2025-07-20 09:15:52,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-855873239_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756982_16158, duration(ns): 16234720
2025-07-20 09:15:52,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756982_16158, type=LAST_IN_PIPELINE terminating
2025-07-20 09:15:54,215 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756982_16158 replica FinalizedReplica, blk_1073756982_16158, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756982 for deletion
2025-07-20 09:15:54,216 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756982_16158 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756982
2025-07-20 09:17:57,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756984_16160 src: /192.168.158.1:46430 dest: /192.168.158.4:9866
2025-07-20 09:17:57,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46430, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1193772035_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756984_16160, duration(ns): 24422429
2025-07-20 09:17:57,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756984_16160, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-20 09:18:03,221 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756984_16160 replica FinalizedReplica, blk_1073756984_16160, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756984 for deletion
2025-07-20 09:18:03,222 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756984_16160 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756984
2025-07-20 09:18:57,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756985_16161 src: /192.168.158.1:37676 dest: /192.168.158.4:9866
2025-07-20 09:18:57,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37676, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1900906065_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756985_16161, duration(ns): 23601614
2025-07-20 09:18:57,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756985_16161, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-20 09:19:03,221 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756985_16161 replica FinalizedReplica, blk_1073756985_16161, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756985 for deletion
2025-07-20 09:19:03,222 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756985_16161 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756985
2025-07-20 09:21:02,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756987_16163 src: /192.168.158.5:56072 dest: /192.168.158.4:9866
2025-07-20 09:21:02,270 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56072, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-72689865_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756987_16163, duration(ns): 17859094
2025-07-20 09:21:02,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756987_16163, type=LAST_IN_PIPELINE terminating
2025-07-20 09:21:06,225 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756987_16163 replica FinalizedReplica, blk_1073756987_16163, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756987 for deletion
2025-07-20 09:21:06,226 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756987_16163 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756987
2025-07-20 09:23:02,247 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756989_16165 src: /192.168.158.1:38138 dest: /192.168.158.4:9866
2025-07-20 09:23:02,283 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38138, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-531794220_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756989_16165, duration(ns): 24956528
2025-07-20 09:23:02,283 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756989_16165, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-20 09:23:06,229 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756989_16165 replica FinalizedReplica, blk_1073756989_16165, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756989 for deletion
2025-07-20 09:23:06,230 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756989_16165 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756989
2025-07-20 09:27:12,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756993_16169 src: /192.168.158.7:51792 dest: /192.168.158.4:9866
2025-07-20 09:27:12,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51792, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1315879812_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756993_16169, duration(ns): 17720270
2025-07-20 09:27:12,278 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756993_16169, type=LAST_IN_PIPELINE terminating
2025-07-20 09:27:15,237 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756993_16169 replica FinalizedReplica, blk_1073756993_16169, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756993 for deletion
2025-07-20 09:27:15,238 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756993_16169 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756993
2025-07-20 09:31:12,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756997_16173 src: /192.168.158.5:49864 dest: /192.168.158.4:9866
2025-07-20 09:31:12,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49864, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1537212531_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756997_16173, duration(ns): 21903540
2025-07-20 09:31:12,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756997_16173, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-20 09:31:15,241 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756997_16173 replica FinalizedReplica, blk_1073756997_16173, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756997 for deletion
2025-07-20 09:31:15,242 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756997_16173 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756997
2025-07-20 09:32:12,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756998_16174 src: /192.168.158.7:46136 dest: /192.168.158.4:9866
2025-07-20 09:32:12,278 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1311077703_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756998_16174, duration(ns): 17166504
2025-07-20 09:32:12,278 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756998_16174, type=LAST_IN_PIPELINE terminating
2025-07-20 09:32:15,244 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756998_16174 replica FinalizedReplica, blk_1073756998_16174, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756998 for deletion
2025-07-20 09:32:15,245 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756998_16174 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756998
2025-07-20 09:33:12,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073756999_16175 src: /192.168.158.1:38130 dest: /192.168.158.4:9866
2025-07-20 09:33:12,293 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38130, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1685470974_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073756999_16175, duration(ns): 26696072
2025-07-20 09:33:12,293 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073756999_16175, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-20 09:33:18,245 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073756999_16175 replica FinalizedReplica, blk_1073756999_16175, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756999 for deletion
2025-07-20 09:33:18,247 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073756999_16175 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073756999
2025-07-20 09:41:17,270 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757007_16183 src: /192.168.158.1:48062 dest: /192.168.158.4:9866
2025-07-20 09:41:17,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48062, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2041862049_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757007_16183, duration(ns): 24817284
2025-07-20 09:41:17,304 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757007_16183, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-20 09:41:21,260 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757007_16183 replica FinalizedReplica, blk_1073757007_16183, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757007 for deletion
2025-07-20 09:41:21,261 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757007_16183 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757007
2025-07-20 09:42:17,286 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757008_16184 src: /192.168.158.8:36878 dest: /192.168.158.4:9866
2025-07-20 09:42:17,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36878, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1023029266_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757008_16184, duration(ns): 17149624
2025-07-20 09:42:17,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757008_16184, type=LAST_IN_PIPELINE terminating
2025-07-20 09:42:18,262 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757008_16184 replica FinalizedReplica, blk_1073757008_16184, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757008 for deletion
2025-07-20 09:42:18,263 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757008_16184 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757008
2025-07-20 09:44:22,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757010_16186 src: /192.168.158.5:40814 dest: /192.168.158.4:9866
2025-07-20 09:44:22,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40814, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_562378816_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757010_16186, duration(ns): 23290059
2025-07-20 09:44:22,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757010_16186, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-20 09:44:27,268 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757010_16186 replica FinalizedReplica, blk_1073757010_16186, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757010 for deletion
2025-07-20 09:44:27,269 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757010_16186 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757010
2025-07-20 09:45:22,278 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757011_16187 src: /192.168.158.8:37316 dest: /192.168.158.4:9866
2025-07-20 09:45:22,298 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37316, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1363132411_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757011_16187, duration(ns): 17615454
2025-07-20 09:45:22,298 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757011_16187, type=LAST_IN_PIPELINE terminating
2025-07-20 09:45:24,271 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757011_16187 replica FinalizedReplica, blk_1073757011_16187, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757011 for deletion
2025-07-20 09:45:24,272 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757011_16187 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757011
2025-07-20 09:49:22,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757015_16191 src: /192.168.158.7:39046 dest: /192.168.158.4:9866
2025-07-20 09:49:22,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39046, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-192753436_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757015_16191, duration(ns): 21283724
2025-07-20 09:49:22,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757015_16191, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-20 09:49:27,280 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757015_16191 replica FinalizedReplica, blk_1073757015_16191, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757015 for deletion
2025-07-20 09:49:27,281 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757015_16191 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757015
2025-07-20 09:54:32,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757020_16196 src: /192.168.158.1:49314 dest: /192.168.158.4:9866
2025-07-20 09:54:32,341 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49314, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_683281454_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757020_16196, duration(ns): 27287624
2025-07-20 09:54:32,341 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757020_16196, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-20 09:54:36,287 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757020_16196 replica FinalizedReplica, blk_1073757020_16196, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757020 for deletion
2025-07-20 09:54:36,288 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757020_16196 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757020
2025-07-20 09:55:37,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757021_16197 src: /192.168.158.7:49378 dest: /192.168.158.4:9866
2025-07-20 09:55:37,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49378, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1863920608_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757021_16197, duration(ns): 16463416
2025-07-20 09:55:37,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757021_16197, type=LAST_IN_PIPELINE terminating
2025-07-20 09:55:39,289 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757021_16197 replica FinalizedReplica, blk_1073757021_16197, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757021 for deletion
2025-07-20 09:55:39,290 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757021_16197 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757021
2025-07-20 09:56:42,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757022_16198 src: /192.168.158.8:39834 dest: /192.168.158.4:9866
2025-07-20 09:56:42,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39834, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1617467956_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757022_16198, duration(ns): 20098956
2025-07-20 09:56:42,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757022_16198, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-20 09:56:45,292 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757022_16198 replica FinalizedReplica, blk_1073757022_16198, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757022 for deletion
2025-07-20 09:56:45,293 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757022_16198 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757022
2025-07-20 09:57:42,319 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757023_16199 src: /192.168.158.7:48916 dest: /192.168.158.4:9866
2025-07-20 09:57:42,348 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48916, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2082209143_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757023_16199, duration(ns): 22988959
2025-07-20 09:57:42,348 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757023_16199, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-20 09:57:48,294 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757023_16199 replica FinalizedReplica, blk_1073757023_16199, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757023 for deletion
2025-07-20 09:57:48,295 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757023_16199 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757023
2025-07-20 09:59:18,304 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f51, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-20 09:59:18,304 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-20 10:02:42,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757028_16204 src: /192.168.158.1:47132 dest: /192.168.158.4:9866
2025-07-20 10:02:42,357 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47132, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1386448087_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757028_16204, duration(ns): 25705257
2025-07-20 10:02:42,357 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757028_16204, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-20 10:02:45,304 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757028_16204 replica FinalizedReplica, blk_1073757028_16204, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757028 for deletion
2025-07-20 10:02:45,305 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757028_16204 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757028
2025-07-20 10:04:42,332 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757030_16206 src: /192.168.158.8:51136 dest: /192.168.158.4:9866
2025-07-20 10:04:42,352 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2110873798_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757030_16206, duration(ns): 17790579 2025-07-20 10:04:42,353 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757030_16206, type=LAST_IN_PIPELINE terminating 2025-07-20 10:04:45,305 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757030_16206 replica FinalizedReplica, blk_1073757030_16206, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757030 for deletion 2025-07-20 10:04:45,307 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757030_16206 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757030 2025-07-20 10:06:42,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757032_16208 src: /192.168.158.8:52114 dest: /192.168.158.4:9866 2025-07-20 10:06:42,353 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52114, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1146224249_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757032_16208, duration(ns): 20902576 2025-07-20 10:06:42,353 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073757032_16208, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-20 10:06:45,312 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757032_16208 replica FinalizedReplica, blk_1073757032_16208, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757032 for deletion 2025-07-20 10:06:45,314 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757032_16208 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757032 2025-07-20 10:08:42,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757034_16210 src: /192.168.158.6:53894 dest: /192.168.158.4:9866 2025-07-20 10:08:42,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53894, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-945058841_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757034_16210, duration(ns): 21414530 2025-07-20 10:08:42,350 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757034_16210, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-20 10:08:48,316 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757034_16210 replica FinalizedReplica, blk_1073757034_16210, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn 
getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757034 for deletion 2025-07-20 10:08:48,317 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757034_16210 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757034 2025-07-20 10:10:42,340 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757036_16212 src: /192.168.158.1:56794 dest: /192.168.158.4:9866 2025-07-20 10:10:42,374 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56794, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1265420275_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757036_16212, duration(ns): 24777746 2025-07-20 10:10:42,374 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757036_16212, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-20 10:10:48,318 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757036_16212 replica FinalizedReplica, blk_1073757036_16212, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757036 for deletion 2025-07-20 10:10:48,319 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757036_16212 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757036 2025-07-20 10:16:42,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757042_16218 src: /192.168.158.1:57616 dest: /192.168.158.4:9866 2025-07-20 10:16:42,367 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57616, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1541967482_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757042_16218, duration(ns): 27008511 2025-07-20 10:16:42,367 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757042_16218, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-20 10:16:45,330 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757042_16218 replica FinalizedReplica, blk_1073757042_16218, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757042 for deletion 2025-07-20 10:16:45,331 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757042_16218 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757042 2025-07-20 10:18:42,338 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757044_16220 src: /192.168.158.1:40142 dest: /192.168.158.4:9866 2025-07-20 10:18:42,373 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: 
src: /192.168.158.1:40142, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-92654911_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757044_16220, duration(ns): 26343080 2025-07-20 10:18:42,374 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757044_16220, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-20 10:18:48,334 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757044_16220 replica FinalizedReplica, blk_1073757044_16220, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757044 for deletion 2025-07-20 10:18:48,335 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757044_16220 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757044 2025-07-20 10:20:42,341 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757046_16222 src: /192.168.158.6:37814 dest: /192.168.158.4:9866 2025-07-20 10:20:42,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37814, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1800966152_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757046_16222, duration(ns): 23560545 2025-07-20 10:20:42,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073757046_16222, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-20 10:20:45,337 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757046_16222 replica FinalizedReplica, blk_1073757046_16222, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757046 for deletion 2025-07-20 10:20:45,339 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757046_16222 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757046 2025-07-20 10:21:47,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757047_16223 src: /192.168.158.7:33044 dest: /192.168.158.4:9866 2025-07-20 10:21:47,380 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33044, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1263312247_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757047_16223, duration(ns): 20599172 2025-07-20 10:21:47,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757047_16223, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-20 10:21:51,341 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757047_16223 replica FinalizedReplica, blk_1073757047_16223, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn 
getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757047 for deletion 2025-07-20 10:21:51,342 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757047_16223 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757047 2025-07-20 10:25:52,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757051_16227 src: /192.168.158.8:42028 dest: /192.168.158.4:9866 2025-07-20 10:25:52,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42028, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-980349933_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757051_16227, duration(ns): 16407339 2025-07-20 10:25:52,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757051_16227, type=LAST_IN_PIPELINE terminating 2025-07-20 10:25:54,352 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757051_16227 replica FinalizedReplica, blk_1073757051_16227, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757051 for deletion 2025-07-20 10:25:54,353 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757051_16227 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757051 2025-07-20 10:26:52,360 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757052_16228 src: /192.168.158.6:40190 dest: /192.168.158.4:9866 2025-07-20 10:26:52,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40190, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1393123485_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757052_16228, duration(ns): 19005421 2025-07-20 10:26:52,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757052_16228, type=LAST_IN_PIPELINE terminating 2025-07-20 10:26:54,355 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757052_16228 replica FinalizedReplica, blk_1073757052_16228, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757052 for deletion 2025-07-20 10:26:54,356 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757052_16228 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757052 2025-07-20 10:29:57,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757055_16231 src: /192.168.158.9:45584 dest: /192.168.158.4:9866 2025-07-20 10:29:57,374 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45584, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2127570225_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073757055_16231, duration(ns): 15355937 2025-07-20 10:29:57,374 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757055_16231, type=LAST_IN_PIPELINE terminating 2025-07-20 10:30:00,361 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757055_16231 replica FinalizedReplica, blk_1073757055_16231, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757055 for deletion 2025-07-20 10:30:00,362 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757055_16231 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757055 2025-07-20 10:30:57,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757056_16232 src: /192.168.158.9:35108 dest: /192.168.158.4:9866 2025-07-20 10:30:57,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35108, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2139208786_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757056_16232, duration(ns): 19268947 2025-07-20 10:30:57,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757056_16232, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-20 10:31:03,362 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757056_16232 replica 
FinalizedReplica, blk_1073757056_16232, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757056 for deletion 2025-07-20 10:31:03,364 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757056_16232 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757056 2025-07-20 10:31:57,350 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757057_16233 src: /192.168.158.1:38712 dest: /192.168.158.4:9866 2025-07-20 10:31:57,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1500364254_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757057_16233, duration(ns): 23873422 2025-07-20 10:31:57,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757057_16233, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-20 10:32:03,367 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757057_16233 replica FinalizedReplica, blk_1073757057_16233, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757057 for deletion 2025-07-20 10:32:03,368 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073757057_16233 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757057 2025-07-20 10:33:02,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757058_16234 src: /192.168.158.1:42832 dest: /192.168.158.4:9866 2025-07-20 10:33:02,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42832, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1420113634_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757058_16234, duration(ns): 24167021 2025-07-20 10:33:02,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757058_16234, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-20 10:33:03,370 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757058_16234 replica FinalizedReplica, blk_1073757058_16234, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757058 for deletion 2025-07-20 10:33:03,371 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757058_16234 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757058 2025-07-20 10:34:02,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757059_16235 src: /192.168.158.7:55048 dest: /192.168.158.4:9866 2025-07-20 10:34:02,385 
INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55048, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_368077601_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757059_16235, duration(ns): 21309069 2025-07-20 10:34:02,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757059_16235, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-20 10:34:03,370 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757059_16235 replica FinalizedReplica, blk_1073757059_16235, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757059 for deletion 2025-07-20 10:34:03,371 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757059_16235 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757059 2025-07-20 10:35:02,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757060_16236 src: /192.168.158.6:59036 dest: /192.168.158.4:9866 2025-07-20 10:35:02,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59036, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-861213115_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757060_16236, duration(ns): 17831239 2025-07-20 10:35:02,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073757060_16236, type=LAST_IN_PIPELINE terminating 2025-07-20 10:35:03,373 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757060_16236 replica FinalizedReplica, blk_1073757060_16236, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757060 for deletion 2025-07-20 10:35:03,374 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757060_16236 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757060 2025-07-20 10:36:02,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757061_16237 src: /192.168.158.9:47844 dest: /192.168.158.4:9866 2025-07-20 10:36:02,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47844, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_276751044_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757061_16237, duration(ns): 16822035 2025-07-20 10:36:02,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757061_16237, type=LAST_IN_PIPELINE terminating 2025-07-20 10:36:06,376 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757061_16237 replica FinalizedReplica, blk_1073757061_16237, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757061 for deletion
2025-07-20 10:36:06,378 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757061_16237 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757061
2025-07-20 10:37:02,369 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757062_16238 src: /192.168.158.5:56834 dest: /192.168.158.4:9866
2025-07-20 10:37:02,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56834, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1627744115_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757062_16238, duration(ns): 20585515
2025-07-20 10:37:02,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757062_16238, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-20 10:37:06,379 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757062_16238 replica FinalizedReplica, blk_1073757062_16238, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757062 for deletion
2025-07-20 10:37:06,380 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757062_16238 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757062
2025-07-20 10:40:02,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757065_16241 src: /192.168.158.1:58892 dest: /192.168.158.4:9866
2025-07-20 10:40:02,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58892, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1504993752_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757065_16241, duration(ns): 24415781
2025-07-20 10:40:02,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757065_16241, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-20 10:40:06,387 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757065_16241 replica FinalizedReplica, blk_1073757065_16241, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757065 for deletion
2025-07-20 10:40:06,388 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757065_16241 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757065
2025-07-20 10:43:07,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757068_16244 src: /192.168.158.1:57182 dest: /192.168.158.4:9866
2025-07-20 10:43:07,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57182, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-613045700_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757068_16244, duration(ns): 25993857
2025-07-20 10:43:07,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757068_16244, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-20 10:43:12,392 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757068_16244 replica FinalizedReplica, blk_1073757068_16244, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757068 for deletion
2025-07-20 10:43:12,394 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757068_16244 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757068
2025-07-20 10:44:07,391 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757069_16245 src: /192.168.158.8:50250 dest: /192.168.158.4:9866
2025-07-20 10:44:07,412 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50250, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1217615686_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757069_16245, duration(ns): 19587576
2025-07-20 10:44:07,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757069_16245, type=LAST_IN_PIPELINE terminating
2025-07-20 10:44:09,395 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757069_16245 replica FinalizedReplica, blk_1073757069_16245, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757069 for deletion
2025-07-20 10:44:09,396 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757069_16245 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757069
2025-07-20 10:45:07,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757070_16246 src: /192.168.158.8:38626 dest: /192.168.158.4:9866
2025-07-20 10:45:07,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38626, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1925415257_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757070_16246, duration(ns): 22310390
2025-07-20 10:45:07,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757070_16246, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 10:45:09,399 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757070_16246 replica FinalizedReplica, blk_1073757070_16246, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757070 for deletion
2025-07-20 10:45:09,400 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757070_16246 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757070
2025-07-20 10:48:07,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757073_16249 src: /192.168.158.8:37314 dest: /192.168.158.4:9866
2025-07-20 10:48:07,412 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37314, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-81993570_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757073_16249, duration(ns): 16912242
2025-07-20 10:48:07,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757073_16249, type=LAST_IN_PIPELINE terminating
2025-07-20 10:48:12,406 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757073_16249 replica FinalizedReplica, blk_1073757073_16249, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757073 for deletion
2025-07-20 10:48:12,407 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757073_16249 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757073
2025-07-20 10:51:07,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757076_16252 src: /192.168.158.5:37954 dest: /192.168.158.4:9866
2025-07-20 10:51:07,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37954, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-199703614_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757076_16252, duration(ns): 16455182
2025-07-20 10:51:07,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757076_16252, type=LAST_IN_PIPELINE terminating
2025-07-20 10:51:09,416 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757076_16252 replica FinalizedReplica, blk_1073757076_16252, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757076 for deletion
2025-07-20 10:51:09,417 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757076_16252 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757076
2025-07-20 10:58:12,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757083_16259 src: /192.168.158.1:52050 dest: /192.168.158.4:9866
2025-07-20 10:58:12,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52050, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-763237580_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757083_16259, duration(ns): 27100392
2025-07-20 10:58:12,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757083_16259, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-20 10:58:15,436 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757083_16259 replica FinalizedReplica, blk_1073757083_16259, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757083 for deletion
2025-07-20 10:58:15,437 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757083_16259 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757083
2025-07-20 10:59:12,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757084_16260 src: /192.168.158.1:55554 dest: /192.168.158.4:9866
2025-07-20 10:59:12,440 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55554, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1552529408_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757084_16260, duration(ns): 22482027
2025-07-20 10:59:12,441 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757084_16260, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-20 10:59:15,440 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757084_16260 replica FinalizedReplica, blk_1073757084_16260, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757084 for deletion
2025-07-20 10:59:15,441 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757084_16260 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757084
2025-07-20 11:00:12,412 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757085_16261 src: /192.168.158.6:57554 dest: /192.168.158.4:9866
2025-07-20 11:00:12,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57554, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_661469326_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757085_16261, duration(ns): 20977490
2025-07-20 11:00:12,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757085_16261, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-20 11:00:15,441 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757085_16261 replica FinalizedReplica, blk_1073757085_16261, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757085 for deletion
2025-07-20 11:00:15,443 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757085_16261 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757085
2025-07-20 11:01:12,421 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757086_16262 src: /192.168.158.7:38276 dest: /192.168.158.4:9866
2025-07-20 11:01:12,441 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38276, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-118439616_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757086_16262, duration(ns): 17460780
2025-07-20 11:01:12,441 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757086_16262, type=LAST_IN_PIPELINE terminating
2025-07-20 11:01:18,445 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757086_16262 replica FinalizedReplica, blk_1073757086_16262, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757086 for deletion
2025-07-20 11:01:18,446 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757086_16262 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757086
2025-07-20 11:03:12,417 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757088_16264 src: /192.168.158.5:38682 dest: /192.168.158.4:9866
2025-07-20 11:03:12,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38682, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_141905697_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757088_16264, duration(ns): 24888784
2025-07-20 11:03:12,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757088_16264, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-20 11:03:15,451 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757088_16264 replica FinalizedReplica, blk_1073757088_16264, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757088 for deletion
2025-07-20 11:03:15,452 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757088_16264 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757088
2025-07-20 11:06:17,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757091_16267 src: /192.168.158.8:49172 dest: /192.168.158.4:9866
2025-07-20 11:06:17,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49172, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-879532398_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757091_16267, duration(ns): 18980144
2025-07-20 11:06:17,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757091_16267, type=LAST_IN_PIPELINE terminating
2025-07-20 11:06:18,453 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757091_16267 replica FinalizedReplica, blk_1073757091_16267, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757091 for deletion
2025-07-20 11:06:18,454 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757091_16267 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757091
2025-07-20 11:07:17,447 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757092_16268 src: /192.168.158.6:58200 dest: /192.168.158.4:9866
2025-07-20 11:07:17,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58200, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-904411411_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757092_16268, duration(ns): 17008181
2025-07-20 11:07:17,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757092_16268, type=LAST_IN_PIPELINE terminating
2025-07-20 11:07:18,456 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757092_16268 replica FinalizedReplica, blk_1073757092_16268, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757092 for deletion
2025-07-20 11:07:18,457 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757092_16268 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757092
2025-07-20 11:12:17,441 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757097_16273 src: /192.168.158.9:55122 dest: /192.168.158.4:9866
2025-07-20 11:12:17,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55122, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1680144928_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757097_16273, duration(ns): 20259738
2025-07-20 11:12:17,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757097_16273, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-20 11:12:18,459 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757097_16273 replica FinalizedReplica, blk_1073757097_16273, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757097 for deletion
2025-07-20 11:12:18,460 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757097_16273 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757097
2025-07-20 11:13:22,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757098_16274 src: /192.168.158.5:39022 dest: /192.168.158.4:9866
2025-07-20 11:13:22,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39022, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1460366717_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757098_16274, duration(ns): 19614927
2025-07-20 11:13:22,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757098_16274, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-20 11:13:24,463 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757098_16274 replica FinalizedReplica, blk_1073757098_16274, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757098 for deletion
2025-07-20 11:13:24,464 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757098_16274 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757098
2025-07-20 11:14:27,447 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757099_16275 src: /192.168.158.1:53488 dest: /192.168.158.4:9866
2025-07-20 11:14:27,481 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53488, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-868857187_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757099_16275, duration(ns): 25998876
2025-07-20 11:14:27,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757099_16275, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-20 11:14:30,464 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757099_16275 replica FinalizedReplica, blk_1073757099_16275, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757099 for deletion
2025-07-20 11:14:30,465 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757099_16275 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757099
2025-07-20 11:15:27,444 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757100_16276 src: /192.168.158.1:38726 dest: /192.168.158.4:9866
2025-07-20 11:15:27,479 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38726, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1883970269_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757100_16276, duration(ns): 26006949
2025-07-20 11:15:27,480 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757100_16276, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-20 11:15:30,466 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757100_16276 replica FinalizedReplica, blk_1073757100_16276, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757100 for deletion
2025-07-20 11:15:30,468 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757100_16276 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757100
2025-07-20 11:16:27,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757101_16277 src: /192.168.158.8:42332 dest: /192.168.158.4:9866
2025-07-20 11:16:27,471 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42332, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1556812153_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757101_16277, duration(ns): 17296639
2025-07-20 11:16:27,472 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757101_16277, type=LAST_IN_PIPELINE terminating
2025-07-20 11:16:33,469 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757101_16277 replica FinalizedReplica, blk_1073757101_16277, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757101 for deletion
2025-07-20 11:16:33,470 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757101_16277 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757101
2025-07-20 11:18:27,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757103_16279 src: /192.168.158.1:52084 dest: /192.168.158.4:9866
2025-07-20 11:18:27,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52084, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-437421778_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757103_16279, duration(ns): 23312800
2025-07-20 11:18:27,484 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757103_16279, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-20 11:18:33,473 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757103_16279 replica FinalizedReplica, blk_1073757103_16279, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757103 for deletion
2025-07-20 11:18:33,474 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757103_16279 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757103
2025-07-20 11:19:27,460 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757104_16280 src: /192.168.158.5:48732 dest: /192.168.158.4:9866
2025-07-20 11:19:27,479 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48732, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-272311718_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757104_16280, duration(ns): 17021919
2025-07-20 11:19:27,479 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757104_16280, type=LAST_IN_PIPELINE terminating
2025-07-20 11:19:33,476 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757104_16280 replica FinalizedReplica, blk_1073757104_16280, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757104 for deletion
2025-07-20 11:19:33,477 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757104_16280 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757104
2025-07-20 11:20:27,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757105_16281 src: /192.168.158.8:41590 dest: /192.168.158.4:9866
2025-07-20 11:20:27,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41590, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2140782873_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757105_16281, duration(ns): 18090067
2025-07-20 11:20:27,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757105_16281, type=LAST_IN_PIPELINE terminating
2025-07-20 11:20:33,481 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757105_16281 replica FinalizedReplica, blk_1073757105_16281, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757105 for deletion
2025-07-20 11:20:33,482 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757105_16281 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757105
2025-07-20 11:21:27,457 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757106_16282 src: /192.168.158.5:47974 dest: /192.168.158.4:9866
2025-07-20 11:21:27,478 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47974, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_416223771_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757106_16282, duration(ns): 18716135
2025-07-20 11:21:27,479 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757106_16282, type=LAST_IN_PIPELINE terminating
2025-07-20 11:21:33,483 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757106_16282 replica FinalizedReplica, blk_1073757106_16282, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757106 for deletion
2025-07-20 11:21:33,484 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757106_16282 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757106
2025-07-20 11:23:27,446 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757108_16284 src: /192.168.158.1:34948 dest: /192.168.158.4:9866
2025-07-20 11:23:27,481 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34948, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-837571010_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757108_16284, duration(ns): 23894678
2025-07-20 11:23:27,481 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757108_16284, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-20 11:23:33,487 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757108_16284 replica FinalizedReplica, blk_1073757108_16284, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757108 for deletion
2025-07-20 11:23:33,489 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757108_16284 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757108
2025-07-20 11:24:27,456 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757109_16285 src: /192.168.158.9:37324 dest: /192.168.158.4:9866
2025-07-20 11:24:27,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37324, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1947450810_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757109_16285, duration(ns): 15877851
2025-07-20 11:24:27,474 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757109_16285, type=LAST_IN_PIPELINE terminating
2025-07-20 11:24:33,490 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757109_16285 replica FinalizedReplica, blk_1073757109_16285, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757109 for deletion
2025-07-20 11:24:33,491 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757109_16285 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757109
2025-07-20 11:26:32,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757111_16287 src: /192.168.158.6:56304 
dest: /192.168.158.4:9866 2025-07-20 11:26:32,511 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56304, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1190274961_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757111_16287, duration(ns): 23797660 2025-07-20 11:26:32,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757111_16287, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-20 11:26:36,494 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757111_16287 replica FinalizedReplica, blk_1073757111_16287, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757111 for deletion 2025-07-20 11:26:36,495 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757111_16287 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757111 2025-07-20 11:28:32,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757113_16289 src: /192.168.158.8:54016 dest: /192.168.158.4:9866 2025-07-20 11:28:32,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54016, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1999346_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757113_16289, duration(ns): 16849917 2025-07-20 11:28:32,483 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757113_16289, type=LAST_IN_PIPELINE terminating 2025-07-20 11:28:36,498 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757113_16289 replica FinalizedReplica, blk_1073757113_16289, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757113 for deletion 2025-07-20 11:28:36,499 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757113_16289 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757113 2025-07-20 11:32:32,472 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757117_16293 src: /192.168.158.6:44724 dest: /192.168.158.4:9866 2025-07-20 11:32:32,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44724, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1679652856_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757117_16293, duration(ns): 19504084 2025-07-20 11:32:32,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757117_16293, type=LAST_IN_PIPELINE terminating 2025-07-20 11:32:36,507 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757117_16293 replica FinalizedReplica, blk_1073757117_16293, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757117 for deletion 2025-07-20 11:32:36,508 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757117_16293 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757117 2025-07-20 11:33:32,471 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757118_16294 src: /192.168.158.9:59918 dest: /192.168.158.4:9866 2025-07-20 11:33:32,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59918, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1243142570_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757118_16294, duration(ns): 20975161 2025-07-20 11:33:32,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757118_16294, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-20 11:33:39,509 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757118_16294 replica FinalizedReplica, blk_1073757118_16294, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757118 for deletion 2025-07-20 11:33:39,510 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757118_16294 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757118 2025-07-20 11:34:32,472 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757119_16295 src: /192.168.158.6:36872 dest: /192.168.158.4:9866 2025-07-20 11:34:32,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36872, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1739653049_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757119_16295, duration(ns): 20361478 2025-07-20 11:34:32,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757119_16295, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-20 11:34:39,510 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757119_16295 replica FinalizedReplica, blk_1073757119_16295, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757119 for deletion 2025-07-20 11:34:39,512 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757119_16295 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757119 2025-07-20 11:35:32,472 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757120_16296 src: /192.168.158.1:33886 dest: /192.168.158.4:9866 2025-07-20 11:35:32,509 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:33886, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-834506728_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757120_16296, duration(ns): 28094716 2025-07-20 11:35:32,510 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757120_16296, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-20 11:35:36,512 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757120_16296 replica FinalizedReplica, blk_1073757120_16296, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757120 for deletion 2025-07-20 11:35:36,513 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757120_16296 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757120 2025-07-20 11:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0 2025-07-20 11:40:32,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757125_16301 src: /192.168.158.1:36288 dest: /192.168.158.4:9866 2025-07-20 11:40:32,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36288, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1966779496_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757125_16301, duration(ns): 27437953 2025-07-20 11:40:32,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757125_16301, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-20 11:40:36,527 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757125_16301 replica FinalizedReplica, blk_1073757125_16301, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757125 for deletion 2025-07-20 11:40:36,528 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757125_16301 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757125 2025-07-20 11:41:32,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757126_16302 src: /192.168.158.1:34540 dest: /192.168.158.4:9866 2025-07-20 11:41:32,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34540, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-274087698_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757126_16302, duration(ns): 27102832 2025-07-20 11:41:32,534 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757126_16302, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-20 11:41:36,528 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757126_16302 replica FinalizedReplica, blk_1073757126_16302, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757126 for deletion 2025-07-20 11:41:36,530 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757126_16302 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757126 2025-07-20 11:43:32,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757128_16304 src: /192.168.158.7:36104 dest: /192.168.158.4:9866 2025-07-20 11:43:32,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36104, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1424053235_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757128_16304, duration(ns): 17292308 2025-07-20 11:43:32,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757128_16304, type=LAST_IN_PIPELINE terminating 2025-07-20 11:43:39,536 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757128_16304 replica FinalizedReplica, blk_1073757128_16304, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757128 for deletion 2025-07-20 11:43:39,537 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757128_16304 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757128 2025-07-20 11:44:32,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757129_16305 src: /192.168.158.5:52308 dest: /192.168.158.4:9866 2025-07-20 11:44:32,547 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52308, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1634615797_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757129_16305, duration(ns): 21123438 2025-07-20 11:44:32,547 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757129_16305, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-20 11:44:39,537 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757129_16305 replica FinalizedReplica, blk_1073757129_16305, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757129 for deletion 2025-07-20 11:44:39,538 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757129_16305 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757129 2025-07-20 11:45:32,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757130_16306 src: 
/192.168.158.1:42848 dest: /192.168.158.4:9866 2025-07-20 11:45:32,527 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42848, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_118017614_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757130_16306, duration(ns): 25514134 2025-07-20 11:45:32,527 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757130_16306, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-20 11:45:36,540 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757130_16306 replica FinalizedReplica, blk_1073757130_16306, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757130 for deletion 2025-07-20 11:45:36,541 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757130_16306 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757130 2025-07-20 11:46:32,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757131_16307 src: /192.168.158.9:43084 dest: /192.168.158.4:9866 2025-07-20 11:46:32,515 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43084, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1824204854_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757131_16307, duration(ns): 16279303 
2025-07-20 11:46:32,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757131_16307, type=LAST_IN_PIPELINE terminating 2025-07-20 11:46:39,541 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757131_16307 replica FinalizedReplica, blk_1073757131_16307, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757131 for deletion 2025-07-20 11:46:39,542 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757131_16307 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757131 2025-07-20 11:48:32,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757133_16309 src: /192.168.158.9:36332 dest: /192.168.158.4:9866 2025-07-20 11:48:32,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36332, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1451004312_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757133_16309, duration(ns): 16804591 2025-07-20 11:48:32,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757133_16309, type=LAST_IN_PIPELINE terminating 2025-07-20 11:48:36,544 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757133_16309 replica FinalizedReplica, blk_1073757133_16309, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757133 for deletion 2025-07-20 11:48:36,545 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757133_16309 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757133 2025-07-20 11:50:42,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757135_16311 src: /192.168.158.9:33710 dest: /192.168.158.4:9866 2025-07-20 11:50:42,524 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33710, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1502316491_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757135_16311, duration(ns): 17091778 2025-07-20 11:50:42,524 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757135_16311, type=LAST_IN_PIPELINE terminating 2025-07-20 11:50:45,548 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757135_16311 replica FinalizedReplica, blk_1073757135_16311, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757135 for deletion 2025-07-20 11:50:45,549 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757135_16311 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757135 2025-07-20 
11:51:42,524 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757136_16312 src: /192.168.158.1:37076 dest: /192.168.158.4:9866 2025-07-20 11:51:42,558 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37076, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2009121711_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757136_16312, duration(ns): 25222644 2025-07-20 11:51:42,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757136_16312, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-20 11:51:45,550 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757136_16312 replica FinalizedReplica, blk_1073757136_16312, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757136 for deletion 2025-07-20 11:51:45,551 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757136_16312 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757136 2025-07-20 11:52:42,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757137_16313 src: /192.168.158.6:57422 dest: /192.168.158.4:9866 2025-07-20 11:52:42,530 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57422, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1257325768_236, offset: 
0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757137_16313, duration(ns): 21865920 2025-07-20 11:52:42,530 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757137_16313, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-20 11:52:45,555 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757137_16313 replica FinalizedReplica, blk_1073757137_16313, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757137 for deletion 2025-07-20 11:52:45,556 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757137_16313 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757137 2025-07-20 11:57:47,523 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757142_16318 src: /192.168.158.9:46670 dest: /192.168.158.4:9866 2025-07-20 11:57:47,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46670, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-797184138_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757142_16318, duration(ns): 20522951 2025-07-20 11:57:47,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757142_16318, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-20 11:57:51,565 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757142_16318 replica FinalizedReplica, blk_1073757142_16318, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757142 for deletion 2025-07-20 11:57:51,566 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757142_16318 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757142 2025-07-20 12:01:47,535 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757146_16322 src: /192.168.158.9:51534 dest: /192.168.158.4:9866 2025-07-20 12:01:47,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51534, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2128603368_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757146_16322, duration(ns): 17152059 2025-07-20 12:01:47,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757146_16322, type=LAST_IN_PIPELINE terminating 2025-07-20 12:01:51,570 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757146_16322 replica FinalizedReplica, blk_1073757146_16322, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757146 for deletion 2025-07-20 12:01:51,571 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757146_16322 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757146
2025-07-20 12:02:47,544 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757147_16323 src: /192.168.158.8:42444 dest: /192.168.158.4:9866
2025-07-20 12:02:47,575 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42444, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_381465573_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757147_16323, duration(ns): 26034829
2025-07-20 12:02:47,576 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757147_16323, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 12:02:51,571 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757147_16323 replica FinalizedReplica, blk_1073757147_16323, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757147 for deletion
2025-07-20 12:02:51,573 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757147_16323 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757147
2025-07-20 12:03:47,527 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757148_16324 src: /192.168.158.1:50346 dest: /192.168.158.4:9866
2025-07-20 12:03:47,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50346, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2070042254_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757148_16324, duration(ns): 22767583
2025-07-20 12:03:47,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757148_16324, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-20 12:03:51,575 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757148_16324 replica FinalizedReplica, blk_1073757148_16324, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757148 for deletion
2025-07-20 12:03:51,576 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757148_16324 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757148
2025-07-20 12:06:47,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757151_16327 src: /192.168.158.1:49640 dest: /192.168.158.4:9866
2025-07-20 12:06:47,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49640, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_699066602_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757151_16327, duration(ns): 22819668
2025-07-20 12:06:47,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757151_16327, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-20 12:06:54,580 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757151_16327 replica FinalizedReplica, blk_1073757151_16327, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757151 for deletion
2025-07-20 12:06:54,581 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757151_16327 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757151
2025-07-20 12:08:47,531 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757153_16329 src: /192.168.158.1:43418 dest: /192.168.158.4:9866
2025-07-20 12:08:47,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43418, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1934263887_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757153_16329, duration(ns): 24921887
2025-07-20 12:08:47,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757153_16329, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-20 12:08:51,585 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757153_16329 replica FinalizedReplica, blk_1073757153_16329, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757153 for deletion
2025-07-20 12:08:51,586 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757153_16329 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757153
2025-07-20 12:09:47,536 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757154_16330 src: /192.168.158.1:50600 dest: /192.168.158.4:9866
2025-07-20 12:09:47,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50600, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_328905480_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757154_16330, duration(ns): 25794690
2025-07-20 12:09:47,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757154_16330, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-20 12:09:51,588 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757154_16330 replica FinalizedReplica, blk_1073757154_16330, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757154 for deletion
2025-07-20 12:09:51,589 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757154_16330 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757154
2025-07-20 12:12:52,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757157_16333 src: /192.168.158.7:35438 dest: /192.168.158.4:9866
2025-07-20 12:12:52,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35438, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1952954936_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757157_16333, duration(ns): 16425089
2025-07-20 12:12:52,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757157_16333, type=LAST_IN_PIPELINE terminating
2025-07-20 12:12:57,595 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757157_16333 replica FinalizedReplica, blk_1073757157_16333, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757157 for deletion
2025-07-20 12:12:57,596 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757157_16333 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757157
2025-07-20 12:13:52,546 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757158_16334 src: /192.168.158.1:47114 dest: /192.168.158.4:9866
2025-07-20 12:13:52,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47114, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1246332128_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757158_16334, duration(ns): 22372401
2025-07-20 12:13:52,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757158_16334, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-20 12:14:00,597 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757158_16334 replica FinalizedReplica, blk_1073757158_16334, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757158 for deletion
2025-07-20 12:14:00,599 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757158_16334 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757158
2025-07-20 12:14:52,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757159_16335 src: /192.168.158.8:43960 dest: /192.168.158.4:9866
2025-07-20 12:14:52,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43960, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1085633229_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757159_16335, duration(ns): 20318334
2025-07-20 12:14:52,569 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757159_16335, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-20 12:14:57,599 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757159_16335 replica FinalizedReplica, blk_1073757159_16335, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757159 for deletion
2025-07-20 12:14:57,601 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757159_16335 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757159
2025-07-20 12:15:52,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757160_16336 src: /192.168.158.1:39518 dest: /192.168.158.4:9866
2025-07-20 12:15:52,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39518, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1999968568_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757160_16336, duration(ns): 26554497
2025-07-20 12:15:52,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757160_16336, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-20 12:15:57,602 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757160_16336 replica FinalizedReplica, blk_1073757160_16336, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757160 for deletion
2025-07-20 12:15:57,603 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757160_16336 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757160
2025-07-20 12:17:57,551 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757162_16338 src: /192.168.158.5:47898 dest: /192.168.158.4:9866
2025-07-20 12:17:57,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1862695600_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757162_16338, duration(ns): 19695342
2025-07-20 12:17:57,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757162_16338, type=LAST_IN_PIPELINE terminating
2025-07-20 12:18:00,605 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757162_16338 replica FinalizedReplica, blk_1073757162_16338, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757162 for deletion
2025-07-20 12:18:00,606 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757162_16338 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757162
2025-07-20 12:19:02,554 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757163_16339 src: /192.168.158.9:43682 dest: /192.168.158.4:9866
2025-07-20 12:19:02,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43682, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-800884370_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757163_16339, duration(ns): 15985830
2025-07-20 12:19:02,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757163_16339, type=LAST_IN_PIPELINE terminating
2025-07-20 12:19:09,608 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757163_16339 replica FinalizedReplica, blk_1073757163_16339, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757163 for deletion
2025-07-20 12:19:09,609 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757163_16339 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757163
2025-07-20 12:21:02,551 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757165_16341 src: /192.168.158.5:48938 dest: /192.168.158.4:9866
2025-07-20 12:21:02,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48938, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1935184701_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757165_16341, duration(ns): 23274937
2025-07-20 12:21:02,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757165_16341, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 12:21:09,612 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757165_16341 replica FinalizedReplica, blk_1073757165_16341, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757165 for deletion
2025-07-20 12:21:09,613 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757165_16341 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757165
2025-07-20 12:25:07,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757169_16345 src: /192.168.158.8:38710 dest: /192.168.158.4:9866
2025-07-20 12:25:07,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38710, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-91366492_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757169_16345, duration(ns): 20948142
2025-07-20 12:25:07,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757169_16345, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-20 12:25:12,619 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757169_16345 replica FinalizedReplica, blk_1073757169_16345, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757169 for deletion
2025-07-20 12:25:12,620 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757169_16345 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757169
2025-07-20 12:26:07,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757170_16346 src: /192.168.158.7:43604 dest: /192.168.158.4:9866
2025-07-20 12:26:07,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43604, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-130122443_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757170_16346, duration(ns): 18951821
2025-07-20 12:26:07,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757170_16346, type=LAST_IN_PIPELINE terminating
2025-07-20 12:26:12,620 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757170_16346 replica FinalizedReplica, blk_1073757170_16346, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757170 for deletion
2025-07-20 12:26:12,622 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757170_16346 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757170
2025-07-20 12:27:12,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757171_16347 src: /192.168.158.8:53098 dest: /192.168.158.4:9866
2025-07-20 12:27:12,587 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53098, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2031552737_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757171_16347, duration(ns): 20145165
2025-07-20 12:27:12,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757171_16347, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-20 12:27:15,623 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757171_16347 replica FinalizedReplica, blk_1073757171_16347, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757171 for deletion
2025-07-20 12:27:15,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757171_16347 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757171
2025-07-20 12:28:12,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757172_16348 src: /192.168.158.8:55628 dest: /192.168.158.4:9866
2025-07-20 12:28:12,584 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55628, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_292871684_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757172_16348, duration(ns): 22482156
2025-07-20 12:28:12,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757172_16348, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 12:28:15,626 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757172_16348 replica FinalizedReplica, blk_1073757172_16348, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757172 for deletion
2025-07-20 12:28:15,627 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757172_16348 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757172
2025-07-20 12:29:12,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757173_16349 src: /192.168.158.8:50020 dest: /192.168.158.4:9866
2025-07-20 12:29:12,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50020, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1950491619_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757173_16349, duration(ns): 21463828
2025-07-20 12:29:12,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757173_16349, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-20 12:29:15,629 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757173_16349 replica FinalizedReplica, blk_1073757173_16349, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757173 for deletion
2025-07-20 12:29:15,630 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757173_16349 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757173
2025-07-20 12:30:12,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757174_16350 src: /192.168.158.5:46874 dest: /192.168.158.4:9866
2025-07-20 12:30:12,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46874, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1238920923_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757174_16350, duration(ns): 15267752
2025-07-20 12:30:12,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757174_16350, type=LAST_IN_PIPELINE terminating
2025-07-20 12:30:18,633 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757174_16350 replica FinalizedReplica, blk_1073757174_16350, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757174 for deletion
2025-07-20 12:30:18,634 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757174_16350 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757174
2025-07-20 12:33:17,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757177_16353 src: /192.168.158.6:52100 dest: /192.168.158.4:9866
2025-07-20 12:33:17,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52100, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1686998050_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757177_16353, duration(ns): 20675217
2025-07-20 12:33:17,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757177_16353, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 12:33:21,641 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757177_16353 replica FinalizedReplica, blk_1073757177_16353, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757177 for deletion
2025-07-20 12:33:21,642 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757177_16353 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757177
2025-07-20 12:37:27,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757181_16357 src: /192.168.158.1:44732 dest: /192.168.158.4:9866
2025-07-20 12:37:27,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44732, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-792974108_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757181_16357, duration(ns): 26844130
2025-07-20 12:37:27,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757181_16357, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-20 12:37:33,649 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757181_16357 replica FinalizedReplica, blk_1073757181_16357, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757181 for deletion
2025-07-20 12:37:33,650 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757181_16357 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757181
2025-07-20 12:38:27,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757182_16358 src: /192.168.158.6:47626 dest: /192.168.158.4:9866
2025-07-20 12:38:27,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47626, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_780184054_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757182_16358, duration(ns): 16147376
2025-07-20 12:38:27,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757182_16358, type=LAST_IN_PIPELINE terminating
2025-07-20 12:38:30,650 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757182_16358 replica FinalizedReplica, blk_1073757182_16358, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757182 for deletion
2025-07-20 12:38:30,651 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757182_16358 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757182
2025-07-20 12:39:27,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757183_16359 src: /192.168.158.1:43948 dest: /192.168.158.4:9866
2025-07-20 12:39:27,609 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43948, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1089687926_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757183_16359, duration(ns): 21963764
2025-07-20 12:39:27,609 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757183_16359, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-20 12:39:33,650 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757183_16359 replica FinalizedReplica, blk_1073757183_16359, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757183 for deletion
2025-07-20 12:39:33,651 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757183_16359 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir27/blk_1073757183
2025-07-20 12:41:27,586 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757185_16361 src: /192.168.158.1:53482 dest: /192.168.158.4:9866
2025-07-20 12:41:27,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53482, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_12660750_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757185_16361, duration(ns): 25824602
2025-07-20 12:41:27,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757185_16361, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-20 12:41:30,656 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757185_16361 replica FinalizedReplica, blk_1073757185_16361, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757185 for deletion
2025-07-20 12:41:30,657 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757185_16361 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757185
2025-07-20 12:43:32,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757187_16363 src: /192.168.158.5:43618 dest: /192.168.158.4:9866
2025-07-20 12:43:32,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:43618, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1301468777_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757187_16363, duration(ns): 17293128
2025-07-20 12:43:32,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757187_16363, type=LAST_IN_PIPELINE terminating
2025-07-20 12:43:39,660 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757187_16363 replica FinalizedReplica, blk_1073757187_16363, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757187 for deletion
2025-07-20 12:43:39,661 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757187_16363 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757187
2025-07-20 12:46:37,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757190_16366 src: /192.168.158.6:52446 dest: /192.168.158.4:9866
2025-07-20 12:46:37,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52446, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-986714400_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757190_16366, duration(ns): 19739577
2025-07-20 12:46:37,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757190_16366, type=LAST_IN_PIPELINE terminating
2025-07-20 12:46:42,666 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757190_16366 replica FinalizedReplica, blk_1073757190_16366, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757190 for deletion
2025-07-20 12:46:42,667 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757190_16366 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757190
2025-07-20 12:48:47,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757192_16368 src: /192.168.158.7:33282 dest: /192.168.158.4:9866
2025-07-20 12:48:47,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33282, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_441285893_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757192_16368, duration(ns): 17849964
2025-07-20 12:48:47,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757192_16368, type=LAST_IN_PIPELINE terminating
2025-07-20 12:48:51,674 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757192_16368 replica FinalizedReplica, blk_1073757192_16368, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757192 for deletion
2025-07-20 12:48:51,675 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757192_16368 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757192
2025-07-20 12:49:47,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757193_16369 src: /192.168.158.5:57646 dest: /192.168.158.4:9866
2025-07-20 12:49:47,611 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57646, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_329269269_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757193_16369, duration(ns): 17443124
2025-07-20 12:49:47,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757193_16369, type=LAST_IN_PIPELINE terminating
2025-07-20 12:49:54,677 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757193_16369 replica FinalizedReplica, blk_1073757193_16369, FINALIZED
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757193 for deletion 2025-07-20 12:49:54,678 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757193_16369 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757193 2025-07-20 12:50:52,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757194_16370 src: /192.168.158.6:60452 dest: /192.168.158.4:9866 2025-07-20 12:50:52,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60452, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-886223085_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757194_16370, duration(ns): 20453351 2025-07-20 12:50:52,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757194_16370, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-20 12:50:57,677 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757194_16370 replica FinalizedReplica, blk_1073757194_16370, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757194 for deletion 2025-07-20 12:50:57,678 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757194_16370 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757194 2025-07-20 12:51:52,594 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757195_16371 src: /192.168.158.7:41488 dest: /192.168.158.4:9866 2025-07-20 12:51:52,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41488, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1306703833_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757195_16371, duration(ns): 20848381 2025-07-20 12:51:52,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757195_16371, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-20 12:51:57,680 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757195_16371 replica FinalizedReplica, blk_1073757195_16371, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757195 for deletion 2025-07-20 12:51:57,681 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757195_16371 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757195 2025-07-20 12:52:52,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757196_16372 src: /192.168.158.1:39704 dest: /192.168.158.4:9866 2025-07-20 12:52:52,623 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:39704, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-664953188_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757196_16372, duration(ns): 25548884 2025-07-20 12:52:52,623 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757196_16372, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-20 12:52:57,680 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757196_16372 replica FinalizedReplica, blk_1073757196_16372, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757196 for deletion 2025-07-20 12:52:57,681 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757196_16372 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757196 2025-07-20 12:55:52,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757199_16375 src: /192.168.158.8:46026 dest: /192.168.158.4:9866 2025-07-20 12:55:52,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46026, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_438768654_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757199_16375, duration(ns): 21534876 2025-07-20 12:55:52,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073757199_16375, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-20 12:55:57,688 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757199_16375 replica FinalizedReplica, blk_1073757199_16375, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757199 for deletion 2025-07-20 12:55:57,689 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757199_16375 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757199 2025-07-20 12:56:52,603 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757200_16376 src: /192.168.158.1:53284 dest: /192.168.158.4:9866 2025-07-20 12:56:52,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53284, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1992865290_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757200_16376, duration(ns): 26382741 2025-07-20 12:56:52,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757200_16376, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-20 12:56:57,692 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757200_16376 replica FinalizedReplica, blk_1073757200_16376, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 
getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757200 for deletion 2025-07-20 12:56:57,693 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757200_16376 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757200 2025-07-20 12:57:52,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757201_16377 src: /192.168.158.1:54014 dest: /192.168.158.4:9866 2025-07-20 12:57:52,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54014, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-457680529_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757201_16377, duration(ns): 23615948 2025-07-20 12:57:52,655 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757201_16377, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-20 12:57:57,694 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757201_16377 replica FinalizedReplica, blk_1073757201_16377, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757201 for deletion 2025-07-20 12:57:57,695 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757201_16377 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757201 2025-07-20 12:58:52,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757202_16378 src: /192.168.158.5:38738 dest: /192.168.158.4:9866 2025-07-20 12:58:52,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38738, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-513067588_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757202_16378, duration(ns): 23706953 2025-07-20 12:58:52,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757202_16378, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-20 12:58:57,695 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757202_16378 replica FinalizedReplica, blk_1073757202_16378, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757202 for deletion 2025-07-20 12:58:57,698 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757202_16378 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757202 2025-07-20 13:04:52,624 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757208_16384 src: /192.168.158.8:58282 dest: /192.168.158.4:9866 2025-07-20 13:04:52,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.8:58282, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1550739944_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757208_16384, duration(ns): 17292958 2025-07-20 13:04:52,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757208_16384, type=LAST_IN_PIPELINE terminating 2025-07-20 13:04:57,712 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757208_16384 replica FinalizedReplica, blk_1073757208_16384, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757208 for deletion 2025-07-20 13:04:57,713 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757208_16384 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757208 2025-07-20 13:05:52,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757209_16385 src: /192.168.158.1:33528 dest: /192.168.158.4:9866 2025-07-20 13:05:52,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33528, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_496629601_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757209_16385, duration(ns): 24578118 2025-07-20 13:05:52,646 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757209_16385, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-20 13:05:57,717 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757209_16385 replica FinalizedReplica, blk_1073757209_16385, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757209 for deletion 2025-07-20 13:05:57,718 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757209_16385 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757209 2025-07-20 13:06:52,619 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757210_16386 src: /192.168.158.7:47718 dest: /192.168.158.4:9866 2025-07-20 13:06:52,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47718, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_12437767_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757210_16386, duration(ns): 22803791 2025-07-20 13:06:52,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757210_16386, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-20 13:06:57,717 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757210_16386 replica FinalizedReplica, blk_1073757210_16386, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757210 for deletion 2025-07-20 13:06:57,718 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757210_16386 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757210 2025-07-20 13:08:52,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757212_16388 src: /192.168.158.1:51566 dest: /192.168.158.4:9866 2025-07-20 13:08:52,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51566, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_76567129_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757212_16388, duration(ns): 24248059 2025-07-20 13:08:52,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757212_16388, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-20 13:09:00,720 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757212_16388 replica FinalizedReplica, blk_1073757212_16388, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757212 for deletion 2025-07-20 13:09:00,721 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757212_16388 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757212 2025-07-20 13:09:52,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757213_16389 src: /192.168.158.1:54884 dest: /192.168.158.4:9866 2025-07-20 13:09:52,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54884, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_908369711_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757213_16389, duration(ns): 23984189 2025-07-20 13:09:52,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757213_16389, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-20 13:10:00,724 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757213_16389 replica FinalizedReplica, blk_1073757213_16389, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757213 for deletion 2025-07-20 13:10:00,725 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757213_16389 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757213 2025-07-20 13:11:57,623 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757215_16391 src: /192.168.158.8:44052 dest: /192.168.158.4:9866 2025-07-20 13:11:57,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.8:44052, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1500565114_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757215_16391, duration(ns): 22769831 2025-07-20 13:11:57,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757215_16391, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-20 13:12:00,727 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757215_16391 replica FinalizedReplica, blk_1073757215_16391, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757215 for deletion 2025-07-20 13:12:00,728 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757215_16391 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757215 2025-07-20 13:13:57,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757217_16393 src: /192.168.158.1:42688 dest: /192.168.158.4:9866 2025-07-20 13:13:57,661 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42688, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_100988205_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757217_16393, duration(ns): 25520989 2025-07-20 13:13:57,661 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757217_16393, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-20 13:14:03,734 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757217_16393 replica FinalizedReplica, blk_1073757217_16393, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757217 for deletion 2025-07-20 13:14:03,735 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757217_16393 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757217 2025-07-20 13:18:02,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757221_16397 src: /192.168.158.6:52574 dest: /192.168.158.4:9866 2025-07-20 13:18:02,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52574, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1955317186_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757221_16397, duration(ns): 17256291 2025-07-20 13:18:02,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757221_16397, type=LAST_IN_PIPELINE terminating 2025-07-20 13:18:09,747 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757221_16397 replica FinalizedReplica, blk_1073757221_16397, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757221 for deletion 2025-07-20 13:18:09,748 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757221_16397 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757221 2025-07-20 13:19:02,658 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757222_16398 src: /192.168.158.8:46380 dest: /192.168.158.4:9866 2025-07-20 13:19:02,686 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46380, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1512143201_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757222_16398, duration(ns): 22748072 2025-07-20 13:19:02,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757222_16398, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-20 13:19:06,748 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757222_16398 replica FinalizedReplica, blk_1073757222_16398, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757222 for deletion 2025-07-20 13:19:06,749 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757222_16398 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757222
2025-07-20 13:20:07,632 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757223_16399 src: /192.168.158.9:47666 dest: /192.168.158.4:9866
2025-07-20 13:20:07,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47666, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1142631257_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757223_16399, duration(ns): 17287636
2025-07-20 13:20:07,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757223_16399, type=LAST_IN_PIPELINE terminating
2025-07-20 13:20:12,750 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757223_16399 replica FinalizedReplica, blk_1073757223_16399, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757223 for deletion
2025-07-20 13:20:12,752 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757223_16399 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757223
2025-07-20 13:21:12,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757224_16400 src: /192.168.158.1:56558 dest: /192.168.158.4:9866
2025-07-20 13:21:12,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56558, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-285120991_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757224_16400, duration(ns): 26284495
2025-07-20 13:21:12,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757224_16400, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-20 13:21:18,756 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757224_16400 replica FinalizedReplica, blk_1073757224_16400, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757224 for deletion
2025-07-20 13:21:18,757 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757224_16400 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757224
2025-07-20 13:23:12,641 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757226_16402 src: /192.168.158.1:35018 dest: /192.168.158.4:9866
2025-07-20 13:23:12,678 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35018, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2129503340_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757226_16402, duration(ns): 26337340
2025-07-20 13:23:12,678 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757226_16402, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-20 13:23:18,760 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757226_16402 replica FinalizedReplica, blk_1073757226_16402, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757226 for deletion
2025-07-20 13:23:18,762 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757226_16402 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757226
2025-07-20 13:26:17,647 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757229_16405 src: /192.168.158.9:54178 dest: /192.168.158.4:9866
2025-07-20 13:26:17,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54178, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-502071792_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757229_16405, duration(ns): 18044533
2025-07-20 13:26:17,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757229_16405, type=LAST_IN_PIPELINE terminating
2025-07-20 13:26:21,764 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757229_16405 replica FinalizedReplica, blk_1073757229_16405, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757229 for deletion
2025-07-20 13:26:21,765 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757229_16405 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757229
2025-07-20 13:27:22,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757230_16406 src: /192.168.158.1:52470 dest: /192.168.158.4:9866
2025-07-20 13:27:22,678 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52470, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_699780154_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757230_16406, duration(ns): 25842802
2025-07-20 13:27:22,678 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757230_16406, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-20 13:27:30,765 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757230_16406 replica FinalizedReplica, blk_1073757230_16406, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757230 for deletion
2025-07-20 13:27:30,766 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757230_16406 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757230
2025-07-20 13:30:27,660 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757233_16409 src: /192.168.158.5:58666 dest: /192.168.158.4:9866
2025-07-20 13:30:27,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58666, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1257119383_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757233_16409, duration(ns): 19136637
2025-07-20 13:30:27,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757233_16409, type=LAST_IN_PIPELINE terminating
2025-07-20 13:30:30,770 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757233_16409 replica FinalizedReplica, blk_1073757233_16409, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757233 for deletion
2025-07-20 13:30:30,771 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757233_16409 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757233
2025-07-20 13:31:27,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757234_16410 src: /192.168.158.9:53478 dest: /192.168.158.4:9866
2025-07-20 13:31:27,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53478, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1063052297_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757234_16410, duration(ns): 17476995
2025-07-20 13:31:27,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757234_16410, type=LAST_IN_PIPELINE terminating
2025-07-20 13:31:33,773 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757234_16410 replica FinalizedReplica, blk_1073757234_16410, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757234 for deletion
2025-07-20 13:31:33,774 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757234_16410 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757234
2025-07-20 13:36:27,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757239_16415 src: /192.168.158.5:55356 dest: /192.168.158.4:9866
2025-07-20 13:36:27,713 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55356, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1849476587_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757239_16415, duration(ns): 20842738
2025-07-20 13:36:27,713 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757239_16415, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-20 13:36:33,789 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757239_16415 replica FinalizedReplica, blk_1073757239_16415, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757239 for deletion
2025-07-20 13:36:33,790 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757239_16415 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757239
2025-07-20 13:38:27,665 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757241_16417 src: /192.168.158.6:52702 dest: /192.168.158.4:9866
2025-07-20 13:38:27,691 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52702, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1027676381_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757241_16417, duration(ns): 20346373
2025-07-20 13:38:27,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757241_16417, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-20 13:38:30,798 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757241_16417 replica FinalizedReplica, blk_1073757241_16417, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757241 for deletion
2025-07-20 13:38:30,799 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757241_16417 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757241
2025-07-20 13:45:37,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757248_16424 src: /192.168.158.1:38148 dest: /192.168.158.4:9866
2025-07-20 13:45:37,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38148, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_456771308_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757248_16424, duration(ns): 25986533
2025-07-20 13:45:37,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757248_16424, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-20 13:45:45,816 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757248_16424 replica FinalizedReplica, blk_1073757248_16424, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757248 for deletion
2025-07-20 13:45:45,817 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757248_16424 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757248
2025-07-20 13:48:47,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757251_16427 src: /192.168.158.7:49160 dest: /192.168.158.4:9866
2025-07-20 13:48:47,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49160, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2034265913_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757251_16427, duration(ns): 19434474
2025-07-20 13:48:47,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757251_16427, type=LAST_IN_PIPELINE terminating
2025-07-20 13:48:51,825 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757251_16427 replica FinalizedReplica, blk_1073757251_16427, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757251 for deletion
2025-07-20 13:48:51,826 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757251_16427 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757251
2025-07-20 13:51:47,760 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757254_16430 src: /192.168.158.5:53080 dest: /192.168.158.4:9866
2025-07-20 13:51:47,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53080, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1651502501_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757254_16430, duration(ns): 20183954
2025-07-20 13:51:47,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757254_16430, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-20 13:51:54,833 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757254_16430 replica FinalizedReplica, blk_1073757254_16430, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757254 for deletion
2025-07-20 13:51:54,834 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757254_16430 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757254
2025-07-20 13:53:52,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757256_16432 src: /192.168.158.5:39420 dest: /192.168.158.4:9866
2025-07-20 13:53:52,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39420, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-468561741_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757256_16432, duration(ns): 16828937
2025-07-20 13:53:52,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757256_16432, type=LAST_IN_PIPELINE terminating
2025-07-20 13:53:57,838 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757256_16432 replica FinalizedReplica, blk_1073757256_16432, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757256 for deletion
2025-07-20 13:53:57,839 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757256_16432 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757256
2025-07-20 13:55:52,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757258_16434 src: /192.168.158.1:50976 dest: /192.168.158.4:9866
2025-07-20 13:55:52,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50976, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1032364504_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757258_16434, duration(ns): 23934203
2025-07-20 13:55:52,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757258_16434, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-20 13:55:57,844 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757258_16434 replica FinalizedReplica, blk_1073757258_16434, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757258 for deletion
2025-07-20 13:55:57,845 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757258_16434 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757258
2025-07-20 13:57:52,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757260_16436 src: /192.168.158.9:52924 dest: /192.168.158.4:9866
2025-07-20 13:57:52,733 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52924, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_110168109_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757260_16436, duration(ns): 16910738
2025-07-20 13:57:52,733 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757260_16436, type=LAST_IN_PIPELINE terminating
2025-07-20 13:57:57,848 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757260_16436 replica FinalizedReplica, blk_1073757260_16436, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757260 for deletion
2025-07-20 13:57:57,849 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757260_16436 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757260
2025-07-20 13:59:52,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757262_16438 src: /192.168.158.1:33666 dest: /192.168.158.4:9866
2025-07-20 13:59:52,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33666, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_835789837_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757262_16438, duration(ns): 24645404
2025-07-20 13:59:52,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757262_16438, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-20 14:00:00,851 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757262_16438 replica FinalizedReplica, blk_1073757262_16438, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757262 for deletion
2025-07-20 14:00:00,852 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757262_16438 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757262
2025-07-20 14:00:52,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757263_16439 src: /192.168.158.6:41572 dest: /192.168.158.4:9866
2025-07-20 14:00:52,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41572, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-625621259_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757263_16439, duration(ns): 21686193
2025-07-20 14:00:52,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757263_16439, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-20 14:01:00,851 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757263_16439 replica FinalizedReplica, blk_1073757263_16439, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757263 for deletion
2025-07-20 14:01:00,853 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757263_16439 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757263
2025-07-20 14:02:57,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757265_16441 src: /192.168.158.1:47192 dest: /192.168.158.4:9866
2025-07-20 14:02:57,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47192, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1039893493_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757265_16441, duration(ns): 24434113
2025-07-20 14:02:57,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757265_16441, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-20 14:03:00,855 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757265_16441 replica FinalizedReplica, blk_1073757265_16441, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757265 for deletion
2025-07-20 14:03:00,856 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757265_16441 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757265
2025-07-20 14:03:57,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757266_16442 src: /192.168.158.8:42192 dest: /192.168.158.4:9866
2025-07-20 14:03:57,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42192, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1855170347_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757266_16442, duration(ns): 20730381
2025-07-20 14:03:57,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757266_16442, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 14:04:00,859 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757266_16442 replica FinalizedReplica, blk_1073757266_16442, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757266 for deletion
2025-07-20 14:04:00,860 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757266_16442 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757266
2025-07-20 14:06:57,719 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757269_16445 src: /192.168.158.1:53508 dest: /192.168.158.4:9866
2025-07-20 14:06:57,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53508, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-386824478_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757269_16445, duration(ns): 23801509
2025-07-20 14:06:57,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757269_16445, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-20 14:07:03,865 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757269_16445 replica FinalizedReplica, blk_1073757269_16445, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757269 for deletion
2025-07-20 14:07:03,866 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757269_16445 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757269
2025-07-20 14:07:57,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757270_16446 src: /192.168.158.8:41544 dest: /192.168.158.4:9866
2025-07-20 14:07:57,770 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41544, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_388213401_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757270_16446, duration(ns): 22840463
2025-07-20 14:07:57,770 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757270_16446, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-20 14:08:03,867 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757270_16446 replica FinalizedReplica, blk_1073757270_16446, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757270 for deletion
2025-07-20 14:08:03,869 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757270_16446 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757270
2025-07-20 14:09:57,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757272_16448 src: /192.168.158.1:58790 dest: /192.168.158.4:9866
2025-07-20 14:09:57,756 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58790, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1050444921_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757272_16448, duration(ns): 25507345
2025-07-20 14:09:57,756 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757272_16448, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-20 14:10:00,872 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757272_16448 replica FinalizedReplica, blk_1073757272_16448, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757272 for deletion
2025-07-20 14:10:00,874 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757272_16448 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757272
2025-07-20 14:10:57,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757273_16449 src: /192.168.158.8:43990 dest: /192.168.158.4:9866
2025-07-20 14:10:57,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43990, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1631986312_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757273_16449, duration(ns): 16944614
2025-07-20 14:10:57,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757273_16449, type=LAST_IN_PIPELINE terminating
2025-07-20 14:11:00,875 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757273_16449 replica FinalizedReplica, blk_1073757273_16449, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757273 for deletion
2025-07-20 14:11:00,876 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757273_16449 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757273
2025-07-20 14:11:57,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757274_16450 src: /192.168.158.8:56208 dest: /192.168.158.4:9866
2025-07-20 14:11:57,756 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56208, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_616364890_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757274_16450, duration(ns): 19653840
2025-07-20 14:11:57,756 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757274_16450, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-20 14:12:00,878 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757274_16450 replica FinalizedReplica, blk_1073757274_16450, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757274 for deletion
2025-07-20 14:12:00,879 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757274_16450 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757274
2025-07-20 14:13:57,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757276_16452 src: /192.168.158.9:56288 dest: /192.168.158.4:9866
2025-07-20 14:13:57,755 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56288, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1399020123_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757276_16452, duration(ns): 20182088
2025-07-20 14:13:57,755 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757276_16452, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-20 14:14:00,882 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757276_16452 replica FinalizedReplica, blk_1073757276_16452, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757276 for deletion
2025-07-20 14:14:00,883 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757276_16452 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757276
2025-07-20 14:15:57,733 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757278_16454 src: /192.168.158.1:41874 dest: /192.168.158.4:9866
2025-07-20 14:15:57,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41874, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1588707797_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757278_16454, duration(ns): 25752306 2025-07-20 14:15:57,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757278_16454, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-20 14:16:00,883 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757278_16454 replica FinalizedReplica, blk_1073757278_16454, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757278 for deletion 2025-07-20 14:16:00,884 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757278_16454 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757278 2025-07-20 14:17:57,736 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757280_16456 src: /192.168.158.1:32854 dest: /192.168.158.4:9866 2025-07-20 14:17:57,770 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:32854, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1044058600_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757280_16456, duration(ns): 25061477 2025-07-20 14:17:57,770 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757280_16456, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-20 14:18:00,887 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757280_16456 replica FinalizedReplica, blk_1073757280_16456, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757280 for deletion 2025-07-20 14:18:00,888 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757280_16456 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757280 2025-07-20 14:19:57,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757282_16458 src: /192.168.158.8:50176 dest: /192.168.158.4:9866 2025-07-20 14:19:57,772 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50176, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1277184897_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757282_16458, duration(ns): 17042444 2025-07-20 14:19:57,773 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757282_16458, type=LAST_IN_PIPELINE terminating 2025-07-20 14:20:00,892 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757282_16458 replica FinalizedReplica, blk_1073757282_16458, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757282 for deletion 2025-07-20 14:20:00,893 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757282_16458 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757282 2025-07-20 14:21:02,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757283_16459 src: /192.168.158.6:46192 dest: /192.168.158.4:9866 2025-07-20 14:21:02,778 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46192, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1036484390_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757283_16459, duration(ns): 20281417 2025-07-20 14:21:02,779 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757283_16459, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-20 14:21:06,895 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757283_16459 replica FinalizedReplica, blk_1073757283_16459, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757283 for deletion 2025-07-20 14:21:06,896 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757283_16459 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757283 2025-07-20 14:24:17,774 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757286_16462 src: 
/192.168.158.9:37632 dest: /192.168.158.4:9866 2025-07-20 14:24:17,795 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37632, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1189355705_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757286_16462, duration(ns): 18996686 2025-07-20 14:24:17,795 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757286_16462, type=LAST_IN_PIPELINE terminating 2025-07-20 14:24:21,904 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757286_16462 replica FinalizedReplica, blk_1073757286_16462, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757286 for deletion 2025-07-20 14:24:21,905 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757286_16462 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757286 2025-07-20 14:26:22,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757288_16464 src: /192.168.158.9:39968 dest: /192.168.158.4:9866 2025-07-20 14:26:22,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39968, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1611834997_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757288_16464, duration(ns): 18132915 2025-07-20 14:26:22,787 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757288_16464, type=LAST_IN_PIPELINE terminating 2025-07-20 14:26:27,909 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757288_16464 replica FinalizedReplica, blk_1073757288_16464, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757288 for deletion 2025-07-20 14:26:27,910 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757288_16464 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757288 2025-07-20 14:29:22,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757291_16467 src: /192.168.158.8:53586 dest: /192.168.158.4:9866 2025-07-20 14:29:22,793 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53586, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1259534301_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757291_16467, duration(ns): 21707416 2025-07-20 14:29:22,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757291_16467, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-20 14:29:30,920 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757291_16467 replica FinalizedReplica, blk_1073757291_16467, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 
getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757291 for deletion 2025-07-20 14:29:30,921 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757291_16467 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757291 2025-07-20 14:31:22,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757293_16469 src: /192.168.158.1:48416 dest: /192.168.158.4:9866 2025-07-20 14:31:22,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48416, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_14396468_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757293_16469, duration(ns): 25394831 2025-07-20 14:31:22,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757293_16469, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-20 14:31:30,923 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757293_16469 replica FinalizedReplica, blk_1073757293_16469, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757293 for deletion 2025-07-20 14:31:30,924 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757293_16469 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757293 2025-07-20 14:32:22,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757294_16470 src: /192.168.158.1:41446 dest: /192.168.158.4:9866 2025-07-20 14:32:22,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41446, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1421715495_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757294_16470, duration(ns): 26071500 2025-07-20 14:32:22,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757294_16470, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-20 14:32:27,924 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757294_16470 replica FinalizedReplica, blk_1073757294_16470, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757294 for deletion 2025-07-20 14:32:27,925 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757294_16470 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757294 2025-07-20 14:33:22,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757295_16471 src: /192.168.158.5:55954 dest: /192.168.158.4:9866 2025-07-20 14:33:22,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: 
src: /192.168.158.5:55954, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2063392176_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757295_16471, duration(ns): 20736945 2025-07-20 14:33:22,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757295_16471, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-20 14:33:27,926 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757295_16471 replica FinalizedReplica, blk_1073757295_16471, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757295 for deletion 2025-07-20 14:33:27,927 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757295_16471 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757295 2025-07-20 14:36:22,770 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757298_16474 src: /192.168.158.1:55790 dest: /192.168.158.4:9866 2025-07-20 14:36:22,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55790, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_144447336_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757298_16474, duration(ns): 24836590 2025-07-20 14:36:22,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757298_16474, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-20 14:36:24,934 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757298_16474 replica FinalizedReplica, blk_1073757298_16474, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757298 for deletion 2025-07-20 14:36:24,935 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757298_16474 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757298 2025-07-20 14:37:22,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757299_16475 src: /192.168.158.5:59392 dest: /192.168.158.4:9866 2025-07-20 14:37:22,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59392, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-882965024_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757299_16475, duration(ns): 22111967 2025-07-20 14:37:22,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757299_16475, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-20 14:37:27,937 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757299_16475 replica FinalizedReplica, blk_1073757299_16475, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757299 for deletion 2025-07-20 14:37:27,938 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757299_16475 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757299 2025-07-20 14:38:27,780 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757300_16476 src: /192.168.158.9:46886 dest: /192.168.158.4:9866 2025-07-20 14:38:27,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46886, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1133114474_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757300_16476, duration(ns): 15375861 2025-07-20 14:38:27,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757300_16476, type=LAST_IN_PIPELINE terminating 2025-07-20 14:38:33,939 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757300_16476 replica FinalizedReplica, blk_1073757300_16476, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757300 for deletion 2025-07-20 14:38:33,941 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757300_16476 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757300 2025-07-20 14:39:27,775 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757301_16477 src: /192.168.158.1:56134 dest: /192.168.158.4:9866 2025-07-20 14:39:27,807 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56134, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1052157605_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757301_16477, duration(ns): 23601514 2025-07-20 14:39:27,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757301_16477, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-20 14:39:30,941 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757301_16477 replica FinalizedReplica, blk_1073757301_16477, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757301 for deletion 2025-07-20 14:39:30,942 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757301_16477 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757301 2025-07-20 14:42:27,789 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757304_16480 src: /192.168.158.7:59290 dest: /192.168.158.4:9866 2025-07-20 14:42:27,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59290, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1562127767_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757304_16480, duration(ns): 17594272 2025-07-20 14:42:27,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757304_16480, type=LAST_IN_PIPELINE terminating 2025-07-20 14:42:33,948 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757304_16480 replica FinalizedReplica, blk_1073757304_16480, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757304 for deletion 2025-07-20 14:42:33,949 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757304_16480 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757304 2025-07-20 14:43:32,784 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757305_16481 src: /192.168.158.7:44312 dest: /192.168.158.4:9866 2025-07-20 14:43:32,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44312, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_35155653_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757305_16481, duration(ns): 21447569 2025-07-20 14:43:32,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757305_16481, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-20 14:43:36,950 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073757305_16481 replica FinalizedReplica, blk_1073757305_16481, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757305 for deletion 2025-07-20 14:43:36,951 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757305_16481 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757305 2025-07-20 14:44:32,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757306_16482 src: /192.168.158.8:43314 dest: /192.168.158.4:9866 2025-07-20 14:44:32,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43314, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-176648030_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757306_16482, duration(ns): 21577003 2025-07-20 14:44:32,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757306_16482, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-20 14:44:36,953 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757306_16482 replica FinalizedReplica, blk_1073757306_16482, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757306 for deletion 2025-07-20 14:44:36,954 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757306_16482 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757306 2025-07-20 14:45:32,790 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757307_16483 src: /192.168.158.1:49954 dest: /192.168.158.4:9866 2025-07-20 14:45:32,823 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49954, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-625565282_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757307_16483, duration(ns): 23631404 2025-07-20 14:45:32,823 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757307_16483, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-20 14:45:36,955 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757307_16483 replica FinalizedReplica, blk_1073757307_16483, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757307 for deletion 2025-07-20 14:45:36,956 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757307_16483 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757307 2025-07-20 14:47:37,796 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073757309_16485 src: /192.168.158.1:40334 dest: /192.168.158.4:9866 2025-07-20 14:47:37,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40334, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1542090643_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757309_16485, duration(ns): 22644417 2025-07-20 14:47:37,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757309_16485, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-20 14:47:39,960 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757309_16485 replica FinalizedReplica, blk_1073757309_16485, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757309 for deletion 2025-07-20 14:47:39,961 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757309_16485 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757309 2025-07-20 14:48:37,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757310_16486 src: /192.168.158.7:37336 dest: /192.168.158.4:9866 2025-07-20 14:48:37,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-183569520_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073757310_16486, duration(ns): 18367247 2025-07-20 14:48:37,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757310_16486, type=LAST_IN_PIPELINE terminating 2025-07-20 14:48:39,964 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757310_16486 replica FinalizedReplica, blk_1073757310_16486, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757310 for deletion 2025-07-20 14:48:39,965 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757310_16486 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757310 2025-07-20 14:49:37,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757311_16487 src: /192.168.158.9:48122 dest: /192.168.158.4:9866 2025-07-20 14:49:37,846 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48122, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2116890157_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757311_16487, duration(ns): 22793288 2025-07-20 14:49:37,846 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757311_16487, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-20 14:49:39,967 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757311_16487 replica 
FinalizedReplica, blk_1073757311_16487, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757311 for deletion 2025-07-20 14:49:39,968 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757311_16487 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757311 2025-07-20 14:50:37,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757312_16488 src: /192.168.158.7:48864 dest: /192.168.158.4:9866 2025-07-20 14:50:37,826 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48864, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2072469571_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757312_16488, duration(ns): 19194871 2025-07-20 14:50:37,826 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757312_16488, type=LAST_IN_PIPELINE terminating 2025-07-20 14:50:42,969 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757312_16488 replica FinalizedReplica, blk_1073757312_16488, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757312 for deletion 2025-07-20 14:50:42,970 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757312_16488 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757312 2025-07-20 14:51:37,795 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757313_16489 src: /192.168.158.1:54928 dest: /192.168.158.4:9866 2025-07-20 14:51:37,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54928, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1183223043_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757313_16489, duration(ns): 28210419 2025-07-20 14:51:37,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757313_16489, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-20 14:51:42,972 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757313_16489 replica FinalizedReplica, blk_1073757313_16489, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757313 for deletion 2025-07-20 14:51:42,973 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757313_16489 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757313 2025-07-20 14:54:37,801 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757316_16492 src: /192.168.158.9:39380 dest: /192.168.158.4:9866 2025-07-20 14:54:37,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: 
src: /192.168.158.9:39380, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_264155795_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757316_16492, duration(ns): 20742077 2025-07-20 14:54:37,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757316_16492, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-20 14:54:39,978 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757316_16492 replica FinalizedReplica, blk_1073757316_16492, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757316 for deletion 2025-07-20 14:54:39,979 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757316_16492 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757316 2025-07-20 14:56:37,803 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757318_16494 src: /192.168.158.5:51482 dest: /192.168.158.4:9866 2025-07-20 14:56:37,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51482, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1428008439_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757318_16494, duration(ns): 21848797 2025-07-20 14:56:37,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757318_16494, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-20 14:56:39,983 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757318_16494 replica FinalizedReplica, blk_1073757318_16494, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757318 for deletion 2025-07-20 14:56:39,984 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757318_16494 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757318 2025-07-20 14:57:37,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757319_16495 src: /192.168.158.6:40750 dest: /192.168.158.4:9866 2025-07-20 14:57:37,826 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40750, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_599859952_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757319_16495, duration(ns): 16333805 2025-07-20 14:57:37,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757319_16495, type=LAST_IN_PIPELINE terminating 2025-07-20 14:57:39,983 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757319_16495 replica FinalizedReplica, blk_1073757319_16495, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757319 for deletion 2025-07-20 14:57:39,984 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757319_16495 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757319 2025-07-20 14:58:42,805 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757320_16496 src: /192.168.158.1:49068 dest: /192.168.158.4:9866 2025-07-20 14:58:42,841 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49068, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1537244181_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757320_16496, duration(ns): 27761269 2025-07-20 14:58:42,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757320_16496, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-20 14:58:45,988 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757320_16496 replica FinalizedReplica, blk_1073757320_16496, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757320 for deletion 2025-07-20 14:58:45,989 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757320_16496 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757320 2025-07-20 15:02:47,846 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757324_16500 src: /192.168.158.6:37596 dest: /192.168.158.4:9866 2025-07-20 15:02:47,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37596, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1641301197_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757324_16500, duration(ns): 19895756 2025-07-20 15:02:47,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757324_16500, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-20 15:02:51,999 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757324_16500 replica FinalizedReplica, blk_1073757324_16500, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757324 for deletion 2025-07-20 15:02:52,000 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757324_16500 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757324 2025-07-20 15:03:47,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757325_16501 src: /192.168.158.5:48824 dest: /192.168.158.4:9866 2025-07-20 15:03:47,841 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.5:48824, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1591423405_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757325_16501, duration(ns): 16977153 2025-07-20 15:03:47,841 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757325_16501, type=LAST_IN_PIPELINE terminating 2025-07-20 15:03:52,000 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757325_16501 replica FinalizedReplica, blk_1073757325_16501, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757325 for deletion 2025-07-20 15:03:52,002 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757325_16501 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757325 2025-07-20 15:05:47,823 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757327_16503 src: /192.168.158.1:52362 dest: /192.168.158.4:9866 2025-07-20 15:05:47,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52362, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1312064326_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757327_16503, duration(ns): 29113402 2025-07-20 15:05:47,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757327_16503, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-20 15:05:55,004 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757327_16503 replica FinalizedReplica, blk_1073757327_16503, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757327 for deletion 2025-07-20 15:05:55,005 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757327_16503 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757327 2025-07-20 15:06:47,820 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757328_16504 src: /192.168.158.1:49868 dest: /192.168.158.4:9866 2025-07-20 15:06:47,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49868, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-441061160_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757328_16504, duration(ns): 24170025 2025-07-20 15:06:47,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757328_16504, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-20 15:06:52,005 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757328_16504 replica FinalizedReplica, blk_1073757328_16504, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757328 for deletion 2025-07-20 15:06:52,006 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757328_16504 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757328 2025-07-20 15:07:47,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757329_16505 src: /192.168.158.8:60500 dest: /192.168.158.4:9866 2025-07-20 15:07:47,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60500, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_132191234_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757329_16505, duration(ns): 23328110 2025-07-20 15:07:47,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757329_16505, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-20 15:07:55,008 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757329_16505 replica FinalizedReplica, blk_1073757329_16505, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757329 for deletion 2025-07-20 15:07:55,009 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757329_16505 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757329 
2025-07-20 15:09:47,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757331_16507 src: /192.168.158.7:44916 dest: /192.168.158.4:9866 2025-07-20 15:09:47,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44916, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-57174747_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757331_16507, duration(ns): 19985648 2025-07-20 15:09:47,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757331_16507, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-20 15:09:52,014 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757331_16507 replica FinalizedReplica, blk_1073757331_16507, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757331 for deletion 2025-07-20 15:09:52,015 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757331_16507 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757331 2025-07-20 15:11:47,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757333_16509 src: /192.168.158.1:46280 dest: /192.168.158.4:9866 2025-07-20 15:11:47,870 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46280, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1790385563_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757333_16509, duration(ns): 24606594 2025-07-20 15:11:47,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757333_16509, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-20 15:11:55,020 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757333_16509 replica FinalizedReplica, blk_1073757333_16509, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757333 for deletion 2025-07-20 15:11:55,021 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757333_16509 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757333 2025-07-20 15:13:52,838 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757335_16511 src: /192.168.158.5:55234 dest: /192.168.158.4:9866 2025-07-20 15:13:52,858 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55234, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1961160551_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757335_16511, duration(ns): 17218781 2025-07-20 15:13:52,858 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757335_16511, type=LAST_IN_PIPELINE terminating 2025-07-20 15:13:58,023 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757335_16511 replica FinalizedReplica, blk_1073757335_16511, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757335 for deletion 2025-07-20 15:13:58,024 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757335_16511 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757335 2025-07-20 15:14:57,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757336_16512 src: /192.168.158.5:59852 dest: /192.168.158.4:9866 2025-07-20 15:14:57,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59852, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1556240790_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757336_16512, duration(ns): 19788080 2025-07-20 15:14:57,870 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757336_16512, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-20 15:15:04,026 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757336_16512 replica FinalizedReplica, blk_1073757336_16512, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757336 for deletion 2025-07-20 
15:15:04,027 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757336_16512 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757336 2025-07-20 15:16:02,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757337_16513 src: /192.168.158.1:49008 dest: /192.168.158.4:9866 2025-07-20 15:16:02,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49008, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1803573624_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757337_16513, duration(ns): 28403823 2025-07-20 15:16:02,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757337_16513, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-20 15:16:07,028 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757337_16513 replica FinalizedReplica, blk_1073757337_16513, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757337 for deletion 2025-07-20 15:16:07,029 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757337_16513 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757337 2025-07-20 15:17:02,847 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073757338_16514 src: /192.168.158.5:56474 dest: /192.168.158.4:9866 2025-07-20 15:17:02,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56474, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1765908687_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757338_16514, duration(ns): 21201589 2025-07-20 15:17:02,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757338_16514, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-20 15:17:07,030 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757338_16514 replica FinalizedReplica, blk_1073757338_16514, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757338 for deletion 2025-07-20 15:17:07,032 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757338_16514 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757338 2025-07-20 15:18:02,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757339_16515 src: /192.168.158.8:38256 dest: /192.168.158.4:9866 2025-07-20 15:18:03,014 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38256, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1600873057_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073757339_16515, duration(ns): 21534392 2025-07-20 15:18:03,014 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757339_16515, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-20 15:18:10,033 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757339_16515 replica FinalizedReplica, blk_1073757339_16515, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757339 for deletion 2025-07-20 15:18:10,034 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757339_16515 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757339 2025-07-20 15:21:02,856 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757342_16518 src: /192.168.158.1:49422 dest: /192.168.158.4:9866 2025-07-20 15:21:02,890 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49422, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-449367017_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757342_16518, duration(ns): 25117381 2025-07-20 15:21:02,890 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757342_16518, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-20 15:21:07,040 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757342_16518 replica FinalizedReplica, blk_1073757342_16518, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757342 for deletion
2025-07-20 15:21:07,041 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757342_16518 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757342
2025-07-20 15:24:07,857 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757345_16521 src: /192.168.158.1:42268 dest: /192.168.158.4:9866
2025-07-20 15:24:07,894 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42268, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2105365291_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757345_16521, duration(ns): 26532844
2025-07-20 15:24:07,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757345_16521, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-20 15:24:10,050 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757345_16521 replica FinalizedReplica, blk_1073757345_16521, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757345 for deletion
2025-07-20 15:24:10,052 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757345_16521 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757345
2025-07-20 15:25:07,870 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757346_16522 src: /192.168.158.6:38382 dest: /192.168.158.4:9866
2025-07-20 15:25:07,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38382, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1576674179_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757346_16522, duration(ns): 22590125
2025-07-20 15:25:07,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757346_16522, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-20 15:25:10,054 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757346_16522 replica FinalizedReplica, blk_1073757346_16522, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757346 for deletion
2025-07-20 15:25:10,055 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757346_16522 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757346
2025-07-20 15:26:07,865 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757347_16523 src: /192.168.158.1:59970 dest: /192.168.158.4:9866
2025-07-20 15:26:07,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59970, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1821882096_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757347_16523, duration(ns): 25227691
2025-07-20 15:26:07,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757347_16523, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-20 15:26:10,055 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757347_16523 replica FinalizedReplica, blk_1073757347_16523, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757347 for deletion
2025-07-20 15:26:10,057 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757347_16523 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757347
2025-07-20 15:28:07,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757349_16525 src: /192.168.158.6:34156 dest: /192.168.158.4:9866
2025-07-20 15:28:07,887 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34156, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-719934296_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757349_16525, duration(ns): 16563713
2025-07-20 15:28:07,888 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757349_16525, type=LAST_IN_PIPELINE terminating
2025-07-20 15:28:13,057 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757349_16525 replica FinalizedReplica, blk_1073757349_16525, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757349 for deletion
2025-07-20 15:28:13,058 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757349_16525 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757349
2025-07-20 15:31:07,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757352_16528 src: /192.168.158.8:40860 dest: /192.168.158.4:9866
2025-07-20 15:31:07,906 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40860, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-237342526_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757352_16528, duration(ns): 18027870
2025-07-20 15:31:07,907 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757352_16528, type=LAST_IN_PIPELINE terminating
2025-07-20 15:31:10,060 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757352_16528 replica FinalizedReplica, blk_1073757352_16528, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757352 for deletion
2025-07-20 15:31:10,062 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757352_16528 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757352
2025-07-20 15:32:12,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757353_16529 src: /192.168.158.6:35710 dest: /192.168.158.4:9866
2025-07-20 15:32:12,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35710, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-244915434_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757353_16529, duration(ns): 20714292
2025-07-20 15:32:12,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757353_16529, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 15:32:16,063 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757353_16529 replica FinalizedReplica, blk_1073757353_16529, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757353 for deletion
2025-07-20 15:32:16,064 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757353_16529 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757353
2025-07-20 15:36:12,878 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757357_16533 src: /192.168.158.6:54838 dest: /192.168.158.4:9866
2025-07-20 15:36:12,897 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54838, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1507774940_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757357_16533, duration(ns): 17131054
2025-07-20 15:36:12,897 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757357_16533, type=LAST_IN_PIPELINE terminating
2025-07-20 15:36:19,071 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757357_16533 replica FinalizedReplica, blk_1073757357_16533, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757357 for deletion
2025-07-20 15:36:19,072 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757357_16533 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757357
2025-07-20 15:38:17,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757359_16535 src: /192.168.158.1:38526 dest: /192.168.158.4:9866
2025-07-20 15:38:17,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38526, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_750533803_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757359_16535, duration(ns): 23610395
2025-07-20 15:38:17,911 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757359_16535, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-20 15:38:25,074 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757359_16535 replica FinalizedReplica, blk_1073757359_16535, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757359 for deletion
2025-07-20 15:38:25,075 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757359_16535 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757359
2025-07-20 15:41:17,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757362_16538 src: /192.168.158.5:58042 dest: /192.168.158.4:9866
2025-07-20 15:41:17,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58042, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_386841746_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757362_16538, duration(ns): 17751508
2025-07-20 15:41:17,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757362_16538, type=LAST_IN_PIPELINE terminating
2025-07-20 15:41:22,083 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757362_16538 replica FinalizedReplica, blk_1073757362_16538, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757362 for deletion
2025-07-20 15:41:22,084 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757362_16538 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757362
2025-07-20 15:42:22,891 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757363_16539 src: /192.168.158.9:44196 dest: /192.168.158.4:9866
2025-07-20 15:42:22,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44196, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1259323576_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757363_16539, duration(ns): 21455888
2025-07-20 15:42:22,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757363_16539, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 15:42:28,086 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757363_16539 replica FinalizedReplica, blk_1073757363_16539, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757363 for deletion
2025-07-20 15:42:28,087 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757363_16539 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757363
2025-07-20 15:45:22,896 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757366_16542 src: /192.168.158.8:60742 dest: /192.168.158.4:9866
2025-07-20 15:45:22,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60742, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_861730318_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757366_16542, duration(ns): 23523408
2025-07-20 15:45:22,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757366_16542, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-20 15:45:25,092 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757366_16542 replica FinalizedReplica, blk_1073757366_16542, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757366 for deletion
2025-07-20 15:45:25,094 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757366_16542 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757366
2025-07-20 15:49:37,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757370_16546 src: /192.168.158.1:34642 dest: /192.168.158.4:9866
2025-07-20 15:49:37,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34642, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_358141934_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757370_16546, duration(ns): 28173689
2025-07-20 15:49:37,939 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757370_16546, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-20 15:49:40,101 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757370_16546 replica FinalizedReplica, blk_1073757370_16546, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757370 for deletion
2025-07-20 15:49:40,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757370_16546 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757370
2025-07-20 15:52:37,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757373_16549 src: /192.168.158.1:49288 dest: /192.168.158.4:9866
2025-07-20 15:52:37,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49288, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-3299974_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757373_16549, duration(ns): 23049111
2025-07-20 15:52:37,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757373_16549, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-20 15:52:40,105 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757373_16549 replica FinalizedReplica, blk_1073757373_16549, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757373 for deletion
2025-07-20 15:52:40,106 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757373_16549 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757373
2025-07-20 15:53:37,906 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757374_16550 src: /192.168.158.8:48390 dest: /192.168.158.4:9866
2025-07-20 15:53:37,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48390, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1688334308_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757374_16550, duration(ns): 17052466
2025-07-20 15:53:37,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757374_16550, type=LAST_IN_PIPELINE terminating
2025-07-20 15:53:40,108 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757374_16550 replica FinalizedReplica, blk_1073757374_16550, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757374 for deletion
2025-07-20 15:53:40,109 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757374_16550 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757374
2025-07-20 15:54:42,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757375_16551 src: /192.168.158.1:47194 dest: /192.168.158.4:9866
2025-07-20 15:54:42,942 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47194, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-648161476_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757375_16551, duration(ns): 23923582
2025-07-20 15:54:42,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757375_16551, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-20 15:54:46,109 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757375_16551 replica FinalizedReplica, blk_1073757375_16551, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757375 for deletion
2025-07-20 15:54:46,110 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757375_16551 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757375
2025-07-20 15:55:47,913 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757376_16552 src: /192.168.158.1:33772 dest: /192.168.158.4:9866
2025-07-20 15:55:47,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33772, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-985920542_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757376_16552, duration(ns): 22535899
2025-07-20 15:55:47,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757376_16552, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-20 15:55:52,112 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757376_16552 replica FinalizedReplica, blk_1073757376_16552, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757376 for deletion
2025-07-20 15:55:52,113 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757376_16552 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757376
2025-07-20 15:59:19,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f52, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-20 15:59:19,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-20 15:59:47,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757380_16556 src: /192.168.158.7:58110 dest: /192.168.158.4:9866
2025-07-20 15:59:47,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58110, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2122296498_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757380_16556, duration(ns): 18114454
2025-07-20 15:59:47,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757380_16556, type=LAST_IN_PIPELINE terminating
2025-07-20 15:59:52,116 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757380_16556 replica FinalizedReplica, blk_1073757380_16556, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757380 for deletion
2025-07-20 15:59:52,117 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757380_16556 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757380
2025-07-20 16:00:47,909 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757381_16557 src: /192.168.158.1:58294 dest: /192.168.158.4:9866
2025-07-20 16:00:47,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58294, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-724238038_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757381_16557, duration(ns): 25383952
2025-07-20 16:00:47,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757381_16557, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-20 16:00:52,121 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757381_16557 replica FinalizedReplica, blk_1073757381_16557, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757381 for deletion
2025-07-20 16:00:52,122 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757381_16557 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757381
2025-07-20 16:05:52,923 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757386_16562 src: /192.168.158.5:40810 dest: /192.168.158.4:9866
2025-07-20 16:05:52,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40810, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1727693281_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757386_16562, duration(ns): 19459309
2025-07-20 16:05:52,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757386_16562, type=LAST_IN_PIPELINE terminating
2025-07-20 16:05:55,126 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757386_16562 replica FinalizedReplica, blk_1073757386_16562, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757386 for deletion
2025-07-20 16:05:55,127 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757386_16562 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757386
2025-07-20 16:07:52,935 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757388_16564 src: /192.168.158.1:59924 dest: /192.168.158.4:9866
2025-07-20 16:07:52,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59924, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1554680293_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757388_16564, duration(ns): 23718434
2025-07-20 16:07:52,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757388_16564, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-20 16:07:58,133 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757388_16564 replica FinalizedReplica, blk_1073757388_16564, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757388 for deletion
2025-07-20 16:07:58,134 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757388_16564 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757388
2025-07-20 16:10:52,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757391_16567 src: /192.168.158.8:45070 dest: /192.168.158.4:9866
2025-07-20 16:10:52,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45070, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1603510310_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757391_16567, duration(ns): 16858015
2025-07-20 16:10:52,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757391_16567, type=LAST_IN_PIPELINE terminating
2025-07-20 16:10:58,139 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757391_16567 replica FinalizedReplica, blk_1073757391_16567, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757391 for deletion
2025-07-20 16:10:58,141 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757391_16567 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757391
2025-07-20 16:11:52,935 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757392_16568 src: /192.168.158.8:57118 dest: /192.168.158.4:9866
2025-07-20 16:11:52,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57118, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_132914149_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757392_16568, duration(ns): 17648679
2025-07-20 16:11:52,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757392_16568, type=LAST_IN_PIPELINE terminating
2025-07-20 16:11:55,141 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757392_16568 replica FinalizedReplica, blk_1073757392_16568, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757392 for deletion
2025-07-20 16:11:55,142 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757392_16568 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757392
2025-07-20 16:12:52,922 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757393_16569 src: /192.168.158.7:48650 dest: /192.168.158.4:9866
2025-07-20 16:12:52,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48650, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-130214970_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757393_16569, duration(ns): 20244289
2025-07-20 16:12:52,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757393_16569, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-20 16:12:58,144 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757393_16569 replica FinalizedReplica, blk_1073757393_16569, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757393 for deletion
2025-07-20 16:12:58,145 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757393_16569 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757393
2025-07-20 16:14:52,935 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757395_16571 src: /192.168.158.5:49360 dest: /192.168.158.4:9866
2025-07-20 16:14:52,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49360, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1343562253_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757395_16571, duration(ns): 17511327
2025-07-20 16:14:52,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757395_16571, type=LAST_IN_PIPELINE terminating
2025-07-20 16:14:58,151 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757395_16571 replica FinalizedReplica, blk_1073757395_16571, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757395 for deletion
2025-07-20 16:14:58,152 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757395_16571 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757395
2025-07-20 16:16:52,933 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757397_16573 src: /192.168.158.7:36422 dest: /192.168.158.4:9866
2025-07-20 16:16:52,960 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36422, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_567437660_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757397_16573, duration(ns): 21139454
2025-07-20 16:16:52,960 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757397_16573, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-20 16:16:55,157 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757397_16573 replica FinalizedReplica, blk_1073757397_16573, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757397 for deletion
2025-07-20 16:16:55,158 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757397_16573 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757397
2025-07-20 16:17:52,931 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757398_16574 src: /192.168.158.5:33834 dest: /192.168.158.4:9866
2025-07-20 16:17:52,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33834, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_222662925_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757398_16574, duration(ns): 19760226
2025-07-20 16:17:52,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757398_16574, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-20 16:17:58,159 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757398_16574 replica FinalizedReplica, blk_1073757398_16574, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757398 for deletion
2025-07-20 16:17:58,161 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757398_16574 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757398
2025-07-20 16:18:52,935 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757399_16575 src: /192.168.158.7:45422 dest: /192.168.158.4:9866 2025-07-20 16:18:52,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45422, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-573233045_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757399_16575, duration(ns): 19837621 2025-07-20 16:18:52,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757399_16575, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-20 16:18:55,162 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757399_16575 replica FinalizedReplica, blk_1073757399_16575, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757399 for deletion 2025-07-20 16:18:55,163 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757399_16575 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757399 2025-07-20 16:20:52,931 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757401_16577 src: /192.168.158.1:52084 dest: /192.168.158.4:9866 2025-07-20 16:20:52,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52084, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-753857558_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757401_16577, duration(ns): 26377505 2025-07-20 16:20:52,968 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757401_16577, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-20 16:20:55,167 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757401_16577 replica FinalizedReplica, blk_1073757401_16577, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757401 for deletion 2025-07-20 16:20:55,168 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757401_16577 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757401 2025-07-20 16:22:57,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757403_16579 src: /192.168.158.1:34002 dest: /192.168.158.4:9866 2025-07-20 16:22:57,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34002, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_987311530_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757403_16579, duration(ns): 24827409 2025-07-20 16:22:57,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757403_16579, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-20 16:23:01,170 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757403_16579 replica FinalizedReplica, blk_1073757403_16579, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757403 for deletion 2025-07-20 16:23:01,172 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757403_16579 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757403 2025-07-20 16:25:57,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757406_16582 src: /192.168.158.1:45418 dest: /192.168.158.4:9866 2025-07-20 16:25:57,979 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45418, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1698606658_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757406_16582, duration(ns): 24878052 2025-07-20 16:25:57,979 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757406_16582, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-20 16:26:01,176 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757406_16582 replica FinalizedReplica, blk_1073757406_16582, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757406 for 
deletion
2025-07-20 16:26:01,178 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757406_16582 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757406
2025-07-20 16:27:57,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757408_16584 src: /192.168.158.1:43774 dest: /192.168.158.4:9866
2025-07-20 16:27:57,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43774, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-779423266_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757408_16584, duration(ns): 31082099
2025-07-20 16:27:57,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757408_16584, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-20 16:28:01,181 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757408_16584 replica FinalizedReplica, blk_1073757408_16584, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757408 for deletion
2025-07-20 16:28:01,182 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757408_16584 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757408
2025-07-20 16:31:57,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757412_16588 src: /192.168.158.8:48274 dest: /192.168.158.4:9866
2025-07-20 16:31:57,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48274, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-159726573_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757412_16588, duration(ns): 19439192
2025-07-20 16:31:57,996 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757412_16588, type=LAST_IN_PIPELINE terminating
2025-07-20 16:32:01,190 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757412_16588 replica FinalizedReplica, blk_1073757412_16588, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757412 for deletion
2025-07-20 16:32:01,191 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757412_16588 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757412
2025-07-20 16:37:57,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757418_16594 src: /192.168.158.9:34122 dest: /192.168.158.4:9866
2025-07-20 16:37:57,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34122, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1854057255_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757418_16594, duration(ns): 18406787
2025-07-20 16:37:57,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757418_16594, type=LAST_IN_PIPELINE terminating
2025-07-20 16:38:04,195 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757418_16594 replica FinalizedReplica, blk_1073757418_16594, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757418 for deletion
2025-07-20 16:38:04,196 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757418_16594 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757418
2025-07-20 16:38:57,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757419_16595 src: /192.168.158.8:50872 dest: /192.168.158.4:9866
2025-07-20 16:38:58,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50872, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-594598374_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757419_16595, duration(ns): 23158738
2025-07-20 16:38:58,008 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757419_16595, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 16:39:01,198 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757419_16595 replica FinalizedReplica, blk_1073757419_16595, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757419 for deletion
2025-07-20 16:39:01,199 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757419_16595 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757419
2025-07-20 16:40:02,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757420_16596 src: /192.168.158.6:58710 dest: /192.168.158.4:9866
2025-07-20 16:40:03,004 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58710, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-248774567_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757420_16596, duration(ns): 20128627
2025-07-20 16:40:03,004 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757420_16596, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-20 16:40:07,198 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757420_16596 replica FinalizedReplica, blk_1073757420_16596, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757420 for deletion
2025-07-20 16:40:07,199 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757420_16596 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757420
2025-07-20 16:41:02,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757421_16597 src: /192.168.158.6:59702 dest: /192.168.158.4:9866
2025-07-20 16:41:03,005 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59702, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1023533176_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757421_16597, duration(ns): 16606833
2025-07-20 16:41:03,005 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757421_16597, type=LAST_IN_PIPELINE terminating
2025-07-20 16:41:07,201 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757421_16597 replica FinalizedReplica, blk_1073757421_16597, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757421 for deletion
2025-07-20 16:41:07,202 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757421_16597 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757421
2025-07-20 16:42:02,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757422_16598 src: /192.168.158.8:44012 dest: /192.168.158.4:9866
2025-07-20 16:42:03,010 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44012, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_125929282_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757422_16598, duration(ns): 21699453
2025-07-20 16:42:03,010 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757422_16598, type=LAST_IN_PIPELINE terminating
2025-07-20 16:42:07,201 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757422_16598 replica FinalizedReplica, blk_1073757422_16598, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757422 for deletion
2025-07-20 16:42:07,202 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757422_16598 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757422
2025-07-20 16:44:07,990 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757424_16600 src: /192.168.158.8:48876 dest: /192.168.158.4:9866
2025-07-20 16:44:08,009 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48876, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-526837553_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757424_16600, duration(ns): 16718848
2025-07-20 16:44:08,009 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757424_16600, type=LAST_IN_PIPELINE terminating
2025-07-20 16:44:13,207 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757424_16600 replica FinalizedReplica, blk_1073757424_16600, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757424 for deletion
2025-07-20 16:44:13,208 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757424_16600 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757424
2025-07-20 16:45:12,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757425_16601 src: /192.168.158.6:33000 dest: /192.168.158.4:9866
2025-07-20 16:45:13,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33000, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-328963053_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757425_16601, duration(ns): 16932207
2025-07-20 16:45:13,017 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757425_16601, type=LAST_IN_PIPELINE terminating
2025-07-20 16:45:16,210 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757425_16601 replica FinalizedReplica, blk_1073757425_16601, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757425 for deletion
2025-07-20 16:45:16,211 INFO
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757425_16601 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757425 2025-07-20 16:47:12,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757427_16603 src: /192.168.158.1:48484 dest: /192.168.158.4:9866 2025-07-20 16:47:13,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48484, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1898625884_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757427_16603, duration(ns): 26056479 2025-07-20 16:47:13,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757427_16603, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-20 16:47:16,214 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757427_16603 replica FinalizedReplica, blk_1073757427_16603, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757427 for deletion 2025-07-20 16:47:16,215 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757427_16603 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757427 2025-07-20 16:53:18,012 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073757433_16609 src: /192.168.158.6:41888 dest: /192.168.158.4:9866 2025-07-20 16:53:18,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41888, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1892053676_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757433_16609, duration(ns): 23099617 2025-07-20 16:53:18,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757433_16609, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-20 16:53:22,220 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757433_16609 replica FinalizedReplica, blk_1073757433_16609, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757433 for deletion 2025-07-20 16:53:22,221 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757433_16609 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757433 2025-07-20 16:54:18,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757434_16610 src: /192.168.158.1:60242 dest: /192.168.158.4:9866 2025-07-20 16:54:18,039 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60242, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-307548361_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073757434_16610, duration(ns): 27106904 2025-07-20 16:54:18,040 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757434_16610, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-20 16:54:22,224 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757434_16610 replica FinalizedReplica, blk_1073757434_16610, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757434 for deletion 2025-07-20 16:54:22,225 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757434_16610 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757434 2025-07-20 16:55:18,014 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757435_16611 src: /192.168.158.8:53724 dest: /192.168.158.4:9866 2025-07-20 16:55:18,040 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53724, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-887518895_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757435_16611, duration(ns): 20424487 2025-07-20 16:55:18,040 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757435_16611, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-20 16:55:25,225 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757435_16611 replica FinalizedReplica, blk_1073757435_16611, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757435 for deletion 2025-07-20 16:55:25,226 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757435_16611 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757435 2025-07-20 16:56:18,021 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757436_16612 src: /192.168.158.7:36220 dest: /192.168.158.4:9866 2025-07-20 16:56:18,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36220, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1897137421_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757436_16612, duration(ns): 17730447 2025-07-20 16:56:18,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757436_16612, type=LAST_IN_PIPELINE terminating 2025-07-20 16:56:22,228 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757436_16612 replica FinalizedReplica, blk_1073757436_16612, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757436 for deletion 2025-07-20 16:56:22,229 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757436_16612 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757436 2025-07-20 16:57:18,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757437_16613 src: /192.168.158.1:54080 dest: /192.168.158.4:9866 2025-07-20 16:57:18,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54080, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_781453086_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757437_16613, duration(ns): 20765698 2025-07-20 16:57:18,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757437_16613, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-20 16:57:25,231 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757437_16613 replica FinalizedReplica, blk_1073757437_16613, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757437 for deletion 2025-07-20 16:57:25,232 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757437_16613 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757437 2025-07-20 16:58:18,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073757438_16614 src: /192.168.158.7:54472 dest: /192.168.158.4:9866
2025-07-20 16:58:18,039 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_881012878_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757438_16614, duration(ns): 21882644
2025-07-20 16:58:18,039 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757438_16614, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-20 16:58:25,232 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757438_16614 replica FinalizedReplica, blk_1073757438_16614, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757438 for deletion
2025-07-20 16:58:25,234 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757438_16614 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757438
2025-07-20 16:59:23,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757439_16615 src: /192.168.158.1:34636 dest: /192.168.158.4:9866
2025-07-20 16:59:23,054 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34636, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-130971510_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid:
BP-1059995147-192.168.158.1-1752101929360:blk_1073757439_16615, duration(ns): 25452932
2025-07-20 16:59:23,054 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757439_16615, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-20 16:59:25,233 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757439_16615 replica FinalizedReplica, blk_1073757439_16615, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757439 for deletion
2025-07-20 16:59:25,235 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757439_16615 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir28/blk_1073757439
2025-07-20 17:05:43,014 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757445_16621 src: /192.168.158.5:46224 dest: /192.168.158.4:9866
2025-07-20 17:05:43,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46224, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1635127336_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757445_16621, duration(ns): 22520898
2025-07-20 17:05:43,043 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757445_16621, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 17:05:46,241 INFO
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757445_16621 replica FinalizedReplica, blk_1073757445_16621, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757445 for deletion
2025-07-20 17:05:46,242 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757445_16621 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757445
2025-07-20 17:08:43,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757448_16624 src: /192.168.158.6:54666 dest: /192.168.158.4:9866
2025-07-20 17:08:43,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54666, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_40270989_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757448_16624, duration(ns): 20412867
2025-07-20 17:08:43,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757448_16624, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-20 17:08:46,247 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757448_16624 replica FinalizedReplica, blk_1073757448_16624, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757448 for deletion
2025-07-20 17:08:46,248
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757448_16624 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757448
2025-07-20 17:10:48,034 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757450_16626 src: /192.168.158.8:57366 dest: /192.168.158.4:9866
2025-07-20 17:10:48,053 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57366, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-499984220_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757450_16626, duration(ns): 17599934
2025-07-20 17:10:48,054 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757450_16626, type=LAST_IN_PIPELINE terminating
2025-07-20 17:10:52,251 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757450_16626 replica FinalizedReplica, blk_1073757450_16626, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757450 for deletion
2025-07-20 17:10:52,252 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757450_16626 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757450
2025-07-20 17:12:48,045 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757452_16628 src: /192.168.158.5:34412 dest: /192.168.158.4:9866
2025-07-20 17:12:48,074 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34412, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1431324470_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757452_16628, duration(ns): 23847069
2025-07-20 17:12:48,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757452_16628, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 17:12:52,254 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757452_16628 replica FinalizedReplica, blk_1073757452_16628, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757452 for deletion
2025-07-20 17:12:52,255 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757452_16628 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757452
2025-07-20 17:13:53,027 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757453_16629 src: /192.168.158.1:42642 dest: /192.168.158.4:9866
2025-07-20 17:13:53,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42642, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1765123238_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757453_16629, duration(ns): 27132202
2025-07-20 17:13:53,063 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757453_16629, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-20 17:13:55,256 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757453_16629 replica FinalizedReplica, blk_1073757453_16629, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757453 for deletion
2025-07-20 17:13:55,257 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757453_16629 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757453
2025-07-20 17:15:53,035 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757455_16631 src: /192.168.158.5:60030 dest: /192.168.158.4:9866
2025-07-20 17:15:53,054 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60030, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1584063641_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757455_16631, duration(ns): 16961144
2025-07-20 17:15:53,055 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757455_16631, type=LAST_IN_PIPELINE terminating
2025-07-20 17:15:55,259 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757455_16631 replica FinalizedReplica, blk_1073757455_16631, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56
getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757455 for deletion
2025-07-20 17:15:55,260 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757455_16631 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757455
2025-07-20 17:19:53,043 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757459_16635 src: /192.168.158.9:37678 dest: /192.168.158.4:9866
2025-07-20 17:19:53,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37678, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1890654133_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757459_16635, duration(ns): 20835972
2025-07-20 17:19:53,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757459_16635, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 17:19:58,265 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757459_16635 replica FinalizedReplica, blk_1073757459_16635, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757459 for deletion
2025-07-20 17:19:58,266 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757459_16635 URI
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757459
2025-07-20 17:22:53,038 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757462_16638 src: /192.168.158.9:50306 dest: /192.168.158.4:9866
2025-07-20 17:22:53,065 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50306, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1571547732_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757462_16638, duration(ns): 22256797
2025-07-20 17:22:53,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757462_16638, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-20 17:22:55,269 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757462_16638 replica FinalizedReplica, blk_1073757462_16638, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757462 for deletion
2025-07-20 17:22:55,270 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757462_16638 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757462
2025-07-20 17:25:53,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757465_16641 src: /192.168.158.6:60294 dest: /192.168.158.4:9866
2025-07-20 17:25:53,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src:
/192.168.158.6:60294, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1825682353_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757465_16641, duration(ns): 16396374
2025-07-20 17:25:53,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757465_16641, type=LAST_IN_PIPELINE terminating
2025-07-20 17:25:58,278 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757465_16641 replica FinalizedReplica, blk_1073757465_16641, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757465 for deletion
2025-07-20 17:25:58,279 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757465_16641 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757465
2025-07-20 17:28:58,055 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757468_16644 src: /192.168.158.1:52920 dest: /192.168.158.4:9866
2025-07-20 17:28:58,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52920, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1681499540_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757468_16644, duration(ns): 27016371
2025-07-20 17:28:58,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757468_16644, type=HAS_DOWNSTREAM_IN_PIPELINE,
downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-20 17:29:04,284 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757468_16644 replica FinalizedReplica, blk_1073757468_16644, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757468 for deletion
2025-07-20 17:29:04,285 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757468_16644 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757468
2025-07-20 17:30:58,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757470_16646 src: /192.168.158.1:43642 dest: /192.168.158.4:9866
2025-07-20 17:30:58,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43642, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1105810281_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757470_16646, duration(ns): 23887130
2025-07-20 17:30:58,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757470_16646, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-20 17:31:01,285 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757470_16646 replica FinalizedReplica, blk_1073757470_16646, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() =
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757470 for deletion
2025-07-20 17:31:01,286 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757470_16646 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757470
2025-07-20 17:33:03,065 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757472_16648 src: /192.168.158.7:36672 dest: /192.168.158.4:9866
2025-07-20 17:33:03,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-661116824_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757472_16648, duration(ns): 17532896
2025-07-20 17:33:03,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757472_16648, type=LAST_IN_PIPELINE terminating
2025-07-20 17:33:04,288 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757472_16648 replica FinalizedReplica, blk_1073757472_16648, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757472 for deletion
2025-07-20 17:33:04,289 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757472_16648 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757472
2025-07-20 17:36:03,055 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757475_16651 src: /192.168.158.1:40376 dest: /192.168.158.4:9866
2025-07-20 17:36:03,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40376, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_331003591_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757475_16651, duration(ns): 24190197
2025-07-20 17:36:03,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757475_16651, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-20 17:36:04,295 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757475_16651 replica FinalizedReplica, blk_1073757475_16651, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757475 for deletion
2025-07-20 17:36:04,297 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757475_16651 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757475
2025-07-20 17:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-20 17:37:03,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757476_16652 src: /192.168.158.6:55076 dest:
/192.168.158.4:9866
2025-07-20 17:37:03,081 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55076, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_133277739_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757476_16652, duration(ns): 17676863
2025-07-20 17:37:03,081 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757476_16652, type=LAST_IN_PIPELINE terminating
2025-07-20 17:37:04,300 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757476_16652 replica FinalizedReplica, blk_1073757476_16652, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757476 for deletion
2025-07-20 17:37:04,301 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757476_16652 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757476
2025-07-20 17:39:03,059 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757478_16654 src: /192.168.158.1:56462 dest: /192.168.158.4:9866
2025-07-20 17:39:03,093 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56462, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1955341273_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757478_16654, duration(ns): 25955224
2025-07-20 17:39:03,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder:
BP-1059995147-192.168.158.1-1752101929360:blk_1073757478_16654, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-20 17:39:04,303 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757478_16654 replica FinalizedReplica, blk_1073757478_16654, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757478 for deletion
2025-07-20 17:39:04,304 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757478_16654 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757478
2025-07-20 17:40:03,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757479_16655 src: /192.168.158.5:41168 dest: /192.168.158.4:9866
2025-07-20 17:40:03,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41168, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1246099648_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757479_16655, duration(ns): 20454984
2025-07-20 17:40:03,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757479_16655, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-20 17:40:07,304 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757479_16655 replica FinalizedReplica, blk_1073757479_16655, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56
getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757479 for deletion
2025-07-20 17:40:07,306 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757479_16655 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757479
2025-07-20 17:41:03,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757480_16656 src: /192.168.158.9:42554 dest: /192.168.158.4:9866
2025-07-20 17:41:03,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42554, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-47306479_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757480_16656, duration(ns): 20421339
2025-07-20 17:41:03,095 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757480_16656, type=LAST_IN_PIPELINE terminating
2025-07-20 17:41:04,307 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757480_16656 replica FinalizedReplica, blk_1073757480_16656, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757480 for deletion
2025-07-20 17:41:04,308 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757480_16656 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757480
2025-07-20 17:42:03,076 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757481_16657 src: /192.168.158.7:54150 dest: /192.168.158.4:9866
2025-07-20 17:42:03,101 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54150, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1830524279_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757481_16657, duration(ns): 20525582
2025-07-20 17:42:03,101 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757481_16657, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-20 17:42:04,311 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757481_16657 replica FinalizedReplica, blk_1073757481_16657, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757481 for deletion
2025-07-20 17:42:04,312 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757481_16657 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757481
2025-07-20 17:44:13,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757483_16659 src: /192.168.158.7:53830 dest: /192.168.158.4:9866
2025-07-20 17:44:13,098 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53830, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_580225327_236, offset: 0, srvID:
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757483_16659, duration(ns): 20725415
2025-07-20 17:44:13,098 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757483_16659, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-20 17:44:16,315 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757483_16659 replica FinalizedReplica, blk_1073757483_16659, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757483 for deletion
2025-07-20 17:44:16,316 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757483_16659 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757483
2025-07-20 17:45:13,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757484_16660 src: /192.168.158.1:51466 dest: /192.168.158.4:9866
2025-07-20 17:45:13,104 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51466, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1094987322_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757484_16660, duration(ns): 23698831
2025-07-20 17:45:13,104 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757484_16660, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-20 17:45:16,318 INFO
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757484_16660 replica FinalizedReplica, blk_1073757484_16660, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757484 for deletion 2025-07-20 17:45:16,319 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757484_16660 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757484 2025-07-20 17:46:13,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757485_16661 src: /192.168.158.7:43090 dest: /192.168.158.4:9866 2025-07-20 17:46:13,098 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43090, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2051152744_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757485_16661, duration(ns): 20280660 2025-07-20 17:46:13,098 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757485_16661, type=LAST_IN_PIPELINE terminating 2025-07-20 17:46:16,318 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757485_16661 replica FinalizedReplica, blk_1073757485_16661, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757485 for deletion 2025-07-20 17:46:16,319 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757485_16661 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757485 2025-07-20 17:47:18,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757486_16662 src: /192.168.158.5:40262 dest: /192.168.158.4:9866 2025-07-20 17:47:18,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40262, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-139086691_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757486_16662, duration(ns): 17729852 2025-07-20 17:47:18,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757486_16662, type=LAST_IN_PIPELINE terminating 2025-07-20 17:47:19,321 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757486_16662 replica FinalizedReplica, blk_1073757486_16662, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757486 for deletion 2025-07-20 17:47:19,322 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757486_16662 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757486 2025-07-20 17:48:18,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757487_16663 src: /192.168.158.1:34698 dest: /192.168.158.4:9866 
2025-07-20 17:48:18,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34698, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-693295769_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757487_16663, duration(ns): 23369819 2025-07-20 17:48:18,108 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757487_16663, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-20 17:48:19,322 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757487_16663 replica FinalizedReplica, blk_1073757487_16663, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757487 for deletion 2025-07-20 17:48:19,324 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757487_16663 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757487 2025-07-20 17:54:23,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757493_16669 src: /192.168.158.6:38200 dest: /192.168.158.4:9866 2025-07-20 17:54:23,113 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38200, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1153299001_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757493_16669, duration(ns): 22574722 2025-07-20 17:54:23,113 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757493_16669, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-20 17:54:25,334 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757493_16669 replica FinalizedReplica, blk_1073757493_16669, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757493 for deletion 2025-07-20 17:54:25,335 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757493_16669 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757493 2025-07-20 17:56:23,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757495_16671 src: /192.168.158.9:41378 dest: /192.168.158.4:9866 2025-07-20 17:56:23,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41378, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_181985010_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757495_16671, duration(ns): 20460896 2025-07-20 17:56:23,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757495_16671, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-20 17:56:25,338 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757495_16671 replica FinalizedReplica, blk_1073757495_16671, FINALIZED getNumBytes() = 56 
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757495 for deletion 2025-07-20 17:56:25,339 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757495_16671 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757495 2025-07-20 17:57:28,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757496_16672 src: /192.168.158.1:33936 dest: /192.168.158.4:9866 2025-07-20 17:57:28,119 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33936, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1200358315_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757496_16672, duration(ns): 22609282 2025-07-20 17:57:28,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757496_16672, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-20 17:57:34,341 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757496_16672 replica FinalizedReplica, blk_1073757496_16672, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757496 for deletion 2025-07-20 17:57:34,342 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757496_16672 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757496 2025-07-20 18:02:38,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757501_16677 src: /192.168.158.1:57004 dest: /192.168.158.4:9866 2025-07-20 18:02:38,123 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57004, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2144949087_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757501_16677, duration(ns): 25599703 2025-07-20 18:02:38,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757501_16677, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-20 18:02:40,352 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757501_16677 replica FinalizedReplica, blk_1073757501_16677, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757501 for deletion 2025-07-20 18:02:40,353 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757501_16677 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757501 2025-07-20 18:04:43,096 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757503_16679 src: /192.168.158.8:50986 dest: /192.168.158.4:9866 2025-07-20 18:04:43,123 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: 
src: /192.168.158.8:50986, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2096438180_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757503_16679, duration(ns): 21452995 2025-07-20 18:04:43,123 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757503_16679, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-20 18:04:46,356 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757503_16679 replica FinalizedReplica, blk_1073757503_16679, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757503 for deletion 2025-07-20 18:04:46,357 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757503_16679 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757503 2025-07-20 18:05:43,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757504_16680 src: /192.168.158.1:58086 dest: /192.168.158.4:9866 2025-07-20 18:05:43,132 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58086, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1294301937_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757504_16680, duration(ns): 23824648 2025-07-20 18:05:43,132 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757504_16680, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-20 18:05:46,359 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757504_16680 replica FinalizedReplica, blk_1073757504_16680, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757504 for deletion 2025-07-20 18:05:46,360 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757504_16680 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757504 2025-07-20 18:09:48,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757508_16684 src: /192.168.158.5:33536 dest: /192.168.158.4:9866 2025-07-20 18:09:48,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33536, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1054484661_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757508_16684, duration(ns): 21402500 2025-07-20 18:09:48,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757508_16684, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-20 18:09:49,365 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757508_16684 replica FinalizedReplica, blk_1073757508_16684, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757508 for deletion 2025-07-20 18:09:49,367 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757508_16684 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757508 2025-07-20 18:10:48,123 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757509_16685 src: /192.168.158.5:34136 dest: /192.168.158.4:9866 2025-07-20 18:10:48,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-112346259_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757509_16685, duration(ns): 19490682 2025-07-20 18:10:48,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757509_16685, type=LAST_IN_PIPELINE terminating 2025-07-20 18:10:49,368 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757509_16685 replica FinalizedReplica, blk_1073757509_16685, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757509 for deletion 2025-07-20 18:10:49,369 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757509_16685 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757509 2025-07-20 18:12:53,121 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757511_16687 src: /192.168.158.7:58308 dest: /192.168.158.4:9866 2025-07-20 18:12:53,140 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58308, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1293878647_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757511_16687, duration(ns): 16910217 2025-07-20 18:12:53,140 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757511_16687, type=LAST_IN_PIPELINE terminating 2025-07-20 18:12:58,375 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757511_16687 replica FinalizedReplica, blk_1073757511_16687, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757511 for deletion 2025-07-20 18:12:58,376 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757511_16687 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757511 2025-07-20 18:14:53,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757513_16689 src: /192.168.158.5:55630 dest: /192.168.158.4:9866 2025-07-20 18:14:53,139 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55630, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_248273897_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073757513_16689, duration(ns): 16919962 2025-07-20 18:14:53,139 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757513_16689, type=LAST_IN_PIPELINE terminating 2025-07-20 18:14:55,378 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757513_16689 replica FinalizedReplica, blk_1073757513_16689, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757513 for deletion 2025-07-20 18:14:55,379 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757513_16689 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757513 2025-07-20 18:17:53,123 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757516_16692 src: /192.168.158.1:55826 dest: /192.168.158.4:9866 2025-07-20 18:17:53,159 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55826, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1216450752_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757516_16692, duration(ns): 25598133 2025-07-20 18:17:53,159 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757516_16692, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-20 18:17:55,384 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling 
blk_1073757516_16692 replica FinalizedReplica, blk_1073757516_16692, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757516 for deletion 2025-07-20 18:17:55,385 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757516_16692 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757516 2025-07-20 18:18:53,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757517_16693 src: /192.168.158.7:51632 dest: /192.168.158.4:9866 2025-07-20 18:18:53,157 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51632, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-217245442_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757517_16693, duration(ns): 21528120 2025-07-20 18:18:53,158 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757517_16693, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-20 18:18:58,387 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757517_16693 replica FinalizedReplica, blk_1073757517_16693, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757517 for deletion 2025-07-20 18:18:58,388 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757517_16693 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757517 2025-07-20 18:19:53,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757518_16694 src: /192.168.158.1:32834 dest: /192.168.158.4:9866 2025-07-20 18:19:53,166 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:32834, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1241362115_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757518_16694, duration(ns): 24146044 2025-07-20 18:19:53,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757518_16694, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-20 18:19:55,389 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757518_16694 replica FinalizedReplica, blk_1073757518_16694, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757518 for deletion 2025-07-20 18:19:55,391 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757518_16694 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757518 2025-07-20 18:20:58,169 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757519_16695 src: /192.168.158.8:43622 dest: /192.168.158.4:9866 2025-07-20 
18:20:58,188 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43622, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-415943913_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757519_16695, duration(ns): 17146249 2025-07-20 18:20:58,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757519_16695, type=LAST_IN_PIPELINE terminating 2025-07-20 18:21:01,392 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757519_16695 replica FinalizedReplica, blk_1073757519_16695, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757519 for deletion 2025-07-20 18:21:01,393 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757519_16695 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757519 2025-07-20 18:26:03,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757524_16700 src: /192.168.158.1:39230 dest: /192.168.158.4:9866 2025-07-20 18:26:03,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39230, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_629645473_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757524_16700, duration(ns): 23122537 2025-07-20 18:26:03,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073757524_16700, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-20 18:26:04,411 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757524_16700 replica FinalizedReplica, blk_1073757524_16700, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757524 for deletion 2025-07-20 18:26:04,412 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757524_16700 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757524 2025-07-20 18:27:03,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757525_16701 src: /192.168.158.1:34930 dest: /192.168.158.4:9866 2025-07-20 18:27:03,173 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34930, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-893231761_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757525_16701, duration(ns): 25334069 2025-07-20 18:27:03,173 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757525_16701, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-20 18:27:07,411 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757525_16701 replica FinalizedReplica, blk_1073757525_16701, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757525 for deletion
2025-07-20 18:27:07,412 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757525_16701 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757525
2025-07-20 18:29:03,153 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757527_16703 src: /192.168.158.8:34520 dest: /192.168.158.4:9866
2025-07-20 18:29:03,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34520, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_42307319_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757527_16703, duration(ns): 18868484
2025-07-20 18:29:03,175 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757527_16703, type=LAST_IN_PIPELINE terminating
2025-07-20 18:29:04,413 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757527_16703 replica FinalizedReplica, blk_1073757527_16703, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757527 for deletion
2025-07-20 18:29:04,414 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757527_16703 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757527
2025-07-20 18:30:03,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757528_16704 src: /192.168.158.7:57660 dest: /192.168.158.4:9866
2025-07-20 18:30:03,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57660, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-812077950_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757528_16704, duration(ns): 17120503
2025-07-20 18:30:03,175 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757528_16704, type=LAST_IN_PIPELINE terminating
2025-07-20 18:30:04,416 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757528_16704 replica FinalizedReplica, blk_1073757528_16704, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757528 for deletion
2025-07-20 18:30:04,416 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757528_16704 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757528
2025-07-20 18:31:03,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757529_16705 src: /192.168.158.1:37602 dest: /192.168.158.4:9866
2025-07-20 18:31:03,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37602, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-657043455_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757529_16705, duration(ns): 22929731
2025-07-20 18:31:03,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757529_16705, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-20 18:31:04,418 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757529_16705 replica FinalizedReplica, blk_1073757529_16705, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757529 for deletion
2025-07-20 18:31:04,419 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757529_16705 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757529
2025-07-20 18:32:03,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757530_16706 src: /192.168.158.1:41200 dest: /192.168.158.4:9866
2025-07-20 18:32:03,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41200, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-107122324_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757530_16706, duration(ns): 24789848
2025-07-20 18:32:03,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757530_16706, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-20 18:32:07,419 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757530_16706 replica FinalizedReplica, blk_1073757530_16706, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757530 for deletion
2025-07-20 18:32:07,420 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757530_16706 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757530
2025-07-20 18:35:03,149 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757533_16709 src: /192.168.158.1:34412 dest: /192.168.158.4:9866
2025-07-20 18:35:03,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34412, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1641486378_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757533_16709, duration(ns): 23008489
2025-07-20 18:35:03,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757533_16709, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-20 18:35:07,428 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757533_16709 replica FinalizedReplica, blk_1073757533_16709, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757533 for deletion
2025-07-20 18:35:07,429 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757533_16709 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757533
2025-07-20 18:36:03,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757534_16710 src: /192.168.158.6:51570 dest: /192.168.158.4:9866
2025-07-20 18:36:03,176 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51570, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-267898754_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757534_16710, duration(ns): 18654317
2025-07-20 18:36:03,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757534_16710, type=LAST_IN_PIPELINE terminating
2025-07-20 18:36:04,430 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757534_16710 replica FinalizedReplica, blk_1073757534_16710, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757534 for deletion
2025-07-20 18:36:04,431 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757534_16710 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757534
2025-07-20 18:37:03,153 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757535_16711 src: /192.168.158.9:60406 dest: /192.168.158.4:9866
2025-07-20 18:37:03,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60406, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_537788882_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757535_16711, duration(ns): 19609425
2025-07-20 18:37:03,175 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757535_16711, type=LAST_IN_PIPELINE terminating
2025-07-20 18:37:04,433 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757535_16711 replica FinalizedReplica, blk_1073757535_16711, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757535 for deletion
2025-07-20 18:37:04,434 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757535_16711 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757535
2025-07-20 18:40:03,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757538_16714 src: /192.168.158.1:34836 dest: /192.168.158.4:9866
2025-07-20 18:40:03,188 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34836, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1679294105_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757538_16714, duration(ns): 25324656
2025-07-20 18:40:03,188 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757538_16714, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-20 18:40:04,437 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757538_16714 replica FinalizedReplica, blk_1073757538_16714, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757538 for deletion
2025-07-20 18:40:04,438 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757538_16714 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757538
2025-07-20 18:41:03,163 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757539_16715 src: /192.168.158.7:47752 dest: /192.168.158.4:9866
2025-07-20 18:41:03,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47752, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2071616422_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757539_16715, duration(ns): 21790628
2025-07-20 18:41:03,191 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757539_16715, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-20 18:41:04,436 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757539_16715 replica FinalizedReplica, blk_1073757539_16715, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757539 for deletion
2025-07-20 18:41:04,437 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757539_16715 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757539
2025-07-20 18:42:03,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757540_16716 src: /192.168.158.7:32816 dest: /192.168.158.4:9866
2025-07-20 18:42:03,185 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:32816, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_254250802_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757540_16716, duration(ns): 16480469
2025-07-20 18:42:03,186 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757540_16716, type=LAST_IN_PIPELINE terminating
2025-07-20 18:42:07,439 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757540_16716 replica FinalizedReplica, blk_1073757540_16716, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757540 for deletion
2025-07-20 18:42:07,440 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757540_16716 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757540
2025-07-20 18:44:03,168 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757542_16718 src: /192.168.158.5:58838 dest: /192.168.158.4:9866
2025-07-20 18:44:03,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58838, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-8570799_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757542_16718, duration(ns): 18303233
2025-07-20 18:44:03,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757542_16718, type=LAST_IN_PIPELINE terminating
2025-07-20 18:44:04,443 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757542_16718 replica FinalizedReplica, blk_1073757542_16718, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757542 for deletion
2025-07-20 18:44:04,444 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757542_16718 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757542
2025-07-20 18:46:03,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757544_16720 src: /192.168.158.1:46120 dest: /192.168.158.4:9866
2025-07-20 18:46:03,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46120, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-655063446_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757544_16720, duration(ns): 23452266
2025-07-20 18:46:03,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757544_16720, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-20 18:46:04,446 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757544_16720 replica FinalizedReplica, blk_1073757544_16720, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757544 for deletion
2025-07-20 18:46:04,447 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757544_16720 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757544
2025-07-20 18:48:08,176 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757546_16722 src: /192.168.158.8:56898 dest: /192.168.158.4:9866
2025-07-20 18:48:08,194 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1334845343_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757546_16722, duration(ns): 16674821
2025-07-20 18:48:08,195 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757546_16722, type=LAST_IN_PIPELINE terminating
2025-07-20 18:48:10,449 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757546_16722 replica FinalizedReplica, blk_1073757546_16722, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757546 for deletion
2025-07-20 18:48:10,450 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757546_16722 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757546
2025-07-20 18:49:08,165 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757547_16723 src: /192.168.158.8:34538 dest: /192.168.158.4:9866
2025-07-20 18:49:08,191 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34538, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-243443671_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757547_16723, duration(ns): 20284076
2025-07-20 18:49:08,191 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757547_16723, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-20 18:49:10,450 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757547_16723 replica FinalizedReplica, blk_1073757547_16723, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757547 for deletion
2025-07-20 18:49:10,451 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757547_16723 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757547
2025-07-20 18:50:08,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757548_16724 src: /192.168.158.6:55932 dest: /192.168.158.4:9866
2025-07-20 18:50:08,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55932, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1935813396_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757548_16724, duration(ns): 21950896
2025-07-20 18:50:08,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757548_16724, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 18:50:10,452 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757548_16724 replica FinalizedReplica, blk_1073757548_16724, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757548 for deletion
2025-07-20 18:50:10,454 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757548_16724 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757548
2025-07-20 18:53:13,170 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757551_16727 src: /192.168.158.6:53964 dest: /192.168.158.4:9866
2025-07-20 18:53:13,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53964, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-717737955_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757551_16727, duration(ns): 21064351
2025-07-20 18:53:13,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757551_16727, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-20 18:53:16,458 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757551_16727 replica FinalizedReplica, blk_1073757551_16727, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757551 for deletion
2025-07-20 18:53:16,459 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757551_16727 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757551
2025-07-20 18:54:13,169 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757552_16728 src: /192.168.158.1:35178 dest: /192.168.158.4:9866
2025-07-20 18:54:13,205 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35178, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-114119726_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757552_16728, duration(ns): 27287556
2025-07-20 18:54:13,206 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757552_16728, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-20 18:54:16,460 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757552_16728 replica FinalizedReplica, blk_1073757552_16728, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757552 for deletion
2025-07-20 18:54:16,462 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757552_16728 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757552
2025-07-20 18:56:13,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757554_16730 src: /192.168.158.1:41602 dest: /192.168.158.4:9866
2025-07-20 18:56:13,205 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41602, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-483269868_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757554_16730, duration(ns): 25354109
2025-07-20 18:56:13,206 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757554_16730, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-20 18:56:16,463 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757554_16730 replica FinalizedReplica, blk_1073757554_16730, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757554 for deletion
2025-07-20 18:56:16,465 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757554_16730 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757554
2025-07-20 18:58:13,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757556_16732 src: /192.168.158.8:47642 dest: /192.168.158.4:9866
2025-07-20 18:58:13,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47642, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1413512154_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757556_16732, duration(ns): 17069279
2025-07-20 18:58:13,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757556_16732, type=LAST_IN_PIPELINE terminating
2025-07-20 18:58:16,466 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757556_16732 replica FinalizedReplica, blk_1073757556_16732, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757556 for deletion
2025-07-20 18:58:16,468 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757556_16732 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757556
2025-07-20 19:02:28,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757560_16736 src: /192.168.158.5:48994 dest: /192.168.158.4:9866
2025-07-20 19:02:28,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48994, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_372158753_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757560_16736, duration(ns): 19377659
2025-07-20 19:02:28,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757560_16736, type=LAST_IN_PIPELINE terminating
2025-07-20 19:02:34,477 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757560_16736 replica FinalizedReplica, blk_1073757560_16736, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757560 for deletion
2025-07-20 19:02:34,478 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757560_16736 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757560
2025-07-20 19:05:28,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757563_16739 src: /192.168.158.5:60758 dest: /192.168.158.4:9866
2025-07-20 19:05:28,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60758, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1436858881_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757563_16739, duration(ns): 21384992
2025-07-20 19:05:28,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757563_16739, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-20 19:05:31,485 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757563_16739 replica FinalizedReplica, blk_1073757563_16739, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757563 for deletion
2025-07-20 19:05:31,486 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757563_16739 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757563
2025-07-20 19:09:38,183 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757567_16743 src: /192.168.158.1:49030 dest: /192.168.158.4:9866
2025-07-20 19:09:38,215 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49030, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-34146212_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757567_16743, duration(ns): 23105687
2025-07-20 19:09:38,215 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757567_16743, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-20 19:09:40,494 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757567_16743 replica FinalizedReplica, blk_1073757567_16743, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757567 for deletion
2025-07-20 19:09:40,495 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757567_16743 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757567
2025-07-20 19:10:43,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757568_16744 src: /192.168.158.1:47646 dest: /192.168.158.4:9866
2025-07-20 19:10:43,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47646, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1116954962_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757568_16744, duration(ns): 25208639
2025-07-20 19:10:43,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757568_16744, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-20 19:10:43,495 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757568_16744 replica FinalizedReplica, blk_1073757568_16744, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757568 for deletion
2025-07-20 19:10:43,496 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757568_16744 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757568
2025-07-20 19:11:43,186 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757569_16745 src: /192.168.158.7:38686 dest: /192.168.158.4:9866
2025-07-20 19:11:43,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38686, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-784160248_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757569_16745, duration(ns): 21688462
2025-07-20 19:11:43,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757569_16745, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-20 19:11:46,496 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757569_16745 replica FinalizedReplica, blk_1073757569_16745, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757569 for deletion
2025-07-20 19:11:46,497 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757569_16745 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757569
2025-07-20 19:14:43,191 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757572_16748 src: /192.168.158.1:42594 dest: /192.168.158.4:9866
2025-07-20 19:14:43,225 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42594, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1811405192_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757572_16748, duration(ns): 23206411
2025-07-20 19:14:43,225 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757572_16748, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-20 19:14:43,502 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757572_16748 replica FinalizedReplica, blk_1073757572_16748, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757572 for deletion
2025-07-20 19:14:43,503 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757572_16748 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757572
2025-07-20 19:17:48,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757575_16751 src: /192.168.158.1:46884 dest: /192.168.158.4:9866
2025-07-20 19:17:48,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46884, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_114921313_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757575_16751, duration(ns): 29994674
2025-07-20 19:17:48,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757575_16751, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-20 19:17:49,512 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757575_16751 replica FinalizedReplica, blk_1073757575_16751, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757575 for deletion
2025-07-20 19:17:49,513 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757575_16751 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757575
2025-07-20 19:18:48,191 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757576_16752 src: /192.168.158.1:49808 dest: /192.168.158.4:9866
2025-07-20 19:18:48,224 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49808, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_969661981_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073757576_16752, duration(ns): 23895092 2025-07-20 19:18:48,224 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757576_16752, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-20 19:18:52,513 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757576_16752 replica FinalizedReplica, blk_1073757576_16752, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757576 for deletion 2025-07-20 19:18:52,514 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757576_16752 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757576 2025-07-20 19:21:48,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757579_16755 src: /192.168.158.5:56456 dest: /192.168.158.4:9866 2025-07-20 19:21:48,230 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56456, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-913826936_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757579_16755, duration(ns): 16962344 2025-07-20 19:21:48,230 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757579_16755, type=LAST_IN_PIPELINE terminating 2025-07-20 19:21:52,522 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling 
blk_1073757579_16755 replica FinalizedReplica, blk_1073757579_16755, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757579 for deletion 2025-07-20 19:21:52,523 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757579_16755 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757579 2025-07-20 19:23:48,223 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757581_16757 src: /192.168.158.9:39416 dest: /192.168.158.4:9866 2025-07-20 19:23:48,242 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39416, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1150299063_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757581_16757, duration(ns): 17109845 2025-07-20 19:23:48,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757581_16757, type=LAST_IN_PIPELINE terminating 2025-07-20 19:23:49,524 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757581_16757 replica FinalizedReplica, blk_1073757581_16757, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757581 for deletion 2025-07-20 19:23:49,525 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 
blk_1073757581_16757 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757581 2025-07-20 19:26:58,232 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757584_16760 src: /192.168.158.7:43472 dest: /192.168.158.4:9866 2025-07-20 19:26:58,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-808440292_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757584_16760, duration(ns): 16438409 2025-07-20 19:26:58,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757584_16760, type=LAST_IN_PIPELINE terminating 2025-07-20 19:26:58,527 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757584_16760 replica FinalizedReplica, blk_1073757584_16760, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757584 for deletion 2025-07-20 19:26:58,528 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757584_16760 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757584 2025-07-20 19:27:58,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757585_16761 src: /192.168.158.6:59998 dest: /192.168.158.4:9866 2025-07-20 19:27:58,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59998, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_456526721_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757585_16761, duration(ns): 16680586 2025-07-20 19:27:58,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757585_16761, type=LAST_IN_PIPELINE terminating 2025-07-20 19:28:01,529 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757585_16761 replica FinalizedReplica, blk_1073757585_16761, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757585 for deletion 2025-07-20 19:28:01,530 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757585_16761 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757585 2025-07-20 19:28:58,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757586_16762 src: /192.168.158.9:58658 dest: /192.168.158.4:9866 2025-07-20 19:28:58,268 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58658, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_461340047_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757586_16762, duration(ns): 17191043 2025-07-20 19:28:58,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757586_16762, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 
2025-07-20 19:28:58,533 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757586_16762 replica FinalizedReplica, blk_1073757586_16762, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757586 for deletion 2025-07-20 19:28:58,534 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757586_16762 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757586 2025-07-20 19:29:58,323 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757587_16763 src: /192.168.158.6:50888 dest: /192.168.158.4:9866 2025-07-20 19:29:58,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50888, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1754679536_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757587_16763, duration(ns): 16840084 2025-07-20 19:29:58,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757587_16763, type=LAST_IN_PIPELINE terminating 2025-07-20 19:29:58,534 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757587_16763 replica FinalizedReplica, blk_1073757587_16763, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757587 for deletion 2025-07-20 19:29:58,536 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757587_16763 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757587 2025-07-20 19:32:03,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757589_16765 src: /192.168.158.1:55740 dest: /192.168.158.4:9866 2025-07-20 19:32:03,279 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55740, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1477353318_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757589_16765, duration(ns): 18482196 2025-07-20 19:32:03,280 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757589_16765, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-20 19:32:04,536 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757589_16765 replica FinalizedReplica, blk_1073757589_16765, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757589 for deletion 2025-07-20 19:32:04,537 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757589_16765 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757589 2025-07-20 19:34:03,265 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073757591_16767 src: /192.168.158.6:56314 dest: /192.168.158.4:9866 2025-07-20 19:34:03,286 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56314, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_660629018_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757591_16767, duration(ns): 17220619 2025-07-20 19:34:03,286 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757591_16767, type=LAST_IN_PIPELINE terminating 2025-07-20 19:34:04,539 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757591_16767 replica FinalizedReplica, blk_1073757591_16767, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757591 for deletion 2025-07-20 19:34:04,540 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757591_16767 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757591 2025-07-20 19:35:03,274 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757592_16768 src: /192.168.158.6:52674 dest: /192.168.158.4:9866 2025-07-20 19:35:03,296 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52674, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1423586674_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757592_16768, duration(ns): 19730710 
2025-07-20 19:35:03,296 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757592_16768, type=LAST_IN_PIPELINE terminating 2025-07-20 19:35:04,542 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757592_16768 replica FinalizedReplica, blk_1073757592_16768, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757592 for deletion 2025-07-20 19:35:04,543 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757592_16768 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757592 2025-07-20 19:37:03,232 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757594_16770 src: /192.168.158.6:52982 dest: /192.168.158.4:9866 2025-07-20 19:37:03,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52982, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_987829930_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757594_16770, duration(ns): 20270717 2025-07-20 19:37:03,259 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757594_16770, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-20 19:37:04,545 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757594_16770 replica FinalizedReplica, blk_1073757594_16770, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757594 for deletion 2025-07-20 19:37:04,547 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757594_16770 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757594 2025-07-20 19:38:03,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757595_16771 src: /192.168.158.8:44164 dest: /192.168.158.4:9866 2025-07-20 19:38:03,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44164, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-186132740_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757595_16771, duration(ns): 22481008 2025-07-20 19:38:03,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757595_16771, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-20 19:38:07,548 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757595_16771 replica FinalizedReplica, blk_1073757595_16771, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757595 for deletion 2025-07-20 19:38:07,549 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757595_16771 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757595 2025-07-20 19:40:03,244 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757597_16773 src: /192.168.158.8:55312 dest: /192.168.158.4:9866 2025-07-20 19:40:03,274 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55312, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1167515044_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757597_16773, duration(ns): 24473950 2025-07-20 19:40:03,274 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757597_16773, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-20 19:40:04,553 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757597_16773 replica FinalizedReplica, blk_1073757597_16773, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757597 for deletion 2025-07-20 19:40:04,554 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757597_16773 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757597 2025-07-20 19:44:08,253 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757601_16777 src: /192.168.158.7:37396 dest: /192.168.158.4:9866 2025-07-20 19:44:08,279 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.7:37396, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1650361154_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757601_16777, duration(ns): 20723739 2025-07-20 19:44:08,279 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757601_16777, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-20 19:44:13,560 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757601_16777 replica FinalizedReplica, blk_1073757601_16777, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757601 for deletion 2025-07-20 19:44:13,561 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757601_16777 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757601 2025-07-20 19:45:08,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757602_16778 src: /192.168.158.1:35968 dest: /192.168.158.4:9866 2025-07-20 19:45:08,284 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35968, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1546823964_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757602_16778, duration(ns): 24722193 2025-07-20 19:45:08,284 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757602_16778, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-20 19:45:10,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757602_16778 replica FinalizedReplica, blk_1073757602_16778, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757602 for deletion 2025-07-20 19:45:10,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757602_16778 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757602 2025-07-20 19:46:08,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757603_16779 src: /192.168.158.8:48550 dest: /192.168.158.4:9866 2025-07-20 19:46:08,279 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48550, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1011690316_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757603_16779, duration(ns): 16770142 2025-07-20 19:46:08,280 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757603_16779, type=LAST_IN_PIPELINE terminating 2025-07-20 19:46:10,564 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757603_16779 replica FinalizedReplica, blk_1073757603_16779, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757603 for deletion 2025-07-20 19:46:10,565 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757603_16779 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757603 2025-07-20 19:49:13,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757606_16782 src: /192.168.158.7:42246 dest: /192.168.158.4:9866 2025-07-20 19:49:13,284 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42246, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1997772422_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757606_16782, duration(ns): 17732665 2025-07-20 19:49:13,284 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757606_16782, type=LAST_IN_PIPELINE terminating 2025-07-20 19:49:13,572 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757606_16782 replica FinalizedReplica, blk_1073757606_16782, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757606 for deletion 2025-07-20 19:49:13,573 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757606_16782 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757606 2025-07-20 19:50:13,262 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757607_16783 src: /192.168.158.9:49926 dest: /192.168.158.4:9866
2025-07-20 19:50:13,283 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49926, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1842766917_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757607_16783, duration(ns): 19359048
2025-07-20 19:50:13,284 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757607_16783, type=LAST_IN_PIPELINE terminating
2025-07-20 19:50:16,574 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757607_16783 replica FinalizedReplica, blk_1073757607_16783, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d4/dfs/dn
  getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757607 for deletion
2025-07-20 19:50:16,575 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757607_16783 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757607
2025-07-20 19:52:13,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757609_16785 src: /192.168.158.7:54382 dest: /192.168.158.4:9866
2025-07-20 19:52:13,289 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54382, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1233607825_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757609_16785, duration(ns): 19600203
2025-07-20 19:52:13,289 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757609_16785, type=LAST_IN_PIPELINE terminating
2025-07-20 19:52:13,578 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757609_16785 replica FinalizedReplica, blk_1073757609_16785, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d1/dfs/dn
  getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757609 for deletion
2025-07-20 19:52:13,579 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757609_16785 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757609
2025-07-20 19:53:13,263 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757610_16786 src: /192.168.158.1:43624 dest: /192.168.158.4:9866
2025-07-20 19:53:13,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43624, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1162295676_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757610_16786, duration(ns): 30541508
2025-07-20 19:53:13,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757610_16786, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-20 19:53:16,578 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757610_16786 replica FinalizedReplica, blk_1073757610_16786, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d2/dfs/dn
  getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757610 for deletion
2025-07-20 19:53:16,580 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757610_16786 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757610
2025-07-20 19:54:13,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757611_16787 src: /192.168.158.6:57716 dest: /192.168.158.4:9866
2025-07-20 19:54:13,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57716, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1132267366_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757611_16787, duration(ns): 20415145
2025-07-20 19:54:13,295 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757611_16787, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-20 19:54:13,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757611_16787 replica FinalizedReplica, blk_1073757611_16787, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d3/dfs/dn
  getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757611 for deletion
2025-07-20 19:54:13,584 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757611_16787 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757611
2025-07-20 19:56:13,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757613_16789 src: /192.168.158.1:41576 dest: /192.168.158.4:9866
2025-07-20 19:56:13,301 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41576, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1400722796_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757613_16789, duration(ns): 26400269
2025-07-20 19:56:13,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757613_16789, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-20 19:56:13,587 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757613_16789 replica FinalizedReplica, blk_1073757613_16789, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d4/dfs/dn
  getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757613 for deletion
2025-07-20 19:56:13,589 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757613_16789 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757613
2025-07-20 19:58:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757615_16791 src: /192.168.158.1:46472 dest: /192.168.158.4:9866
2025-07-20 19:58:13,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1691609036_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757615_16791, duration(ns): 24321676
2025-07-20 19:58:13,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757615_16791, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-20 19:58:13,589 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757615_16791 replica FinalizedReplica, blk_1073757615_16791, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d1/dfs/dn
  getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757615 for deletion
2025-07-20 19:58:13,591 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757615_16791 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757615
2025-07-20 20:01:13,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757618_16794 src: /192.168.158.6:52204 dest: /192.168.158.4:9866
2025-07-20 20:01:13,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52204, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1094103158_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757618_16794, duration(ns): 20183063
2025-07-20 20:01:13,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757618_16794, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-20 20:01:16,595 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757618_16794 replica FinalizedReplica, blk_1073757618_16794, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d2/dfs/dn
  getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757618 for deletion
2025-07-20 20:01:16,596 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757618_16794 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757618
2025-07-20 20:04:13,274 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757621_16797 src: /192.168.158.1:54860 dest: /192.168.158.4:9866
2025-07-20 20:04:13,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54860, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-330929244_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757621_16797, duration(ns): 24553998
2025-07-20 20:04:13,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757621_16797, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-20 20:04:19,605 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757621_16797 replica FinalizedReplica, blk_1073757621_16797, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d3/dfs/dn
  getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757621 for deletion
2025-07-20 20:04:19,606 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757621_16797 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757621
2025-07-20 20:05:18,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757622_16798 src: /192.168.158.8:38446 dest: /192.168.158.4:9866
2025-07-20 20:05:18,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38446, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2089376661_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757622_16798, duration(ns): 15928447
2025-07-20 20:05:18,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757622_16798, type=LAST_IN_PIPELINE terminating
2025-07-20 20:05:22,607 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757622_16798 replica FinalizedReplica, blk_1073757622_16798, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d4/dfs/dn
  getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757622 for deletion
2025-07-20 20:05:22,608 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757622_16798 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757622
2025-07-20 20:08:18,292 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757625_16801 src: /192.168.158.5:41772 dest: /192.168.158.4:9866
2025-07-20 20:08:18,312 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41772, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-554357641_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757625_16801, duration(ns): 17216467
2025-07-20 20:08:18,312 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757625_16801, type=LAST_IN_PIPELINE terminating
2025-07-20 20:08:22,609 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757625_16801 replica FinalizedReplica, blk_1073757625_16801, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d1/dfs/dn
  getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757625 for deletion
2025-07-20 20:08:22,610 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757625_16801 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757625
2025-07-20 20:10:18,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757627_16803 src: /192.168.158.5:50308 dest: /192.168.158.4:9866
2025-07-20 20:10:18,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50308, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1393089208_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757627_16803, duration(ns): 16251740
2025-07-20 20:10:18,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757627_16803, type=LAST_IN_PIPELINE terminating
2025-07-20 20:10:22,613 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757627_16803 replica FinalizedReplica, blk_1073757627_16803, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d2/dfs/dn
  getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757627 for deletion
2025-07-20 20:10:22,614 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757627_16803 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757627
2025-07-20 20:12:18,300 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757629_16805 src: /192.168.158.5:40992 dest: /192.168.158.4:9866
2025-07-20 20:12:18,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40992, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1404367514_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757629_16805, duration(ns): 17287011
2025-07-20 20:12:18,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757629_16805, type=LAST_IN_PIPELINE terminating
2025-07-20 20:12:22,615 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757629_16805 replica FinalizedReplica, blk_1073757629_16805, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d3/dfs/dn
  getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757629 for deletion
2025-07-20 20:12:22,616 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757629_16805 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757629
2025-07-20 20:13:18,293 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757630_16806 src: /192.168.158.1:54266 dest: /192.168.158.4:9866
2025-07-20 20:13:18,330 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54266, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1686309524_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757630_16806, duration(ns): 28387765
2025-07-20 20:13:18,331 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757630_16806, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-20 20:13:22,617 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757630_16806 replica FinalizedReplica, blk_1073757630_16806, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d4/dfs/dn
  getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757630 for deletion
2025-07-20 20:13:22,618 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757630_16806 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757630
2025-07-20 20:16:23,297 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757633_16809 src: /192.168.158.1:58442 dest: /192.168.158.4:9866
2025-07-20 20:16:23,330 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58442, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_518921756_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757633_16809, duration(ns): 24838300
2025-07-20 20:16:23,330 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757633_16809, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-20 20:16:28,624 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757633_16809 replica FinalizedReplica, blk_1073757633_16809, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d1/dfs/dn
  getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757633 for deletion
2025-07-20 20:16:28,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757633_16809 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757633
2025-07-20 20:17:23,300 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757634_16810 src: /192.168.158.5:59526 dest: /192.168.158.4:9866
2025-07-20 20:17:23,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59526, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_536074601_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757634_16810, duration(ns): 21754725
2025-07-20 20:17:23,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757634_16810, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 20:17:28,629 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757634_16810 replica FinalizedReplica, blk_1073757634_16810, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d2/dfs/dn
  getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757634 for deletion
2025-07-20 20:17:28,630 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757634_16810 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757634
2025-07-20 20:24:23,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757641_16817 src: /192.168.158.1:38718 dest: /192.168.158.4:9866
2025-07-20 20:24:23,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38718, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_461525312_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757641_16817, duration(ns): 24119142
2025-07-20 20:24:23,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757641_16817, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-20 20:24:31,645 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757641_16817 replica FinalizedReplica, blk_1073757641_16817, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d3/dfs/dn
  getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757641 for deletion
2025-07-20 20:24:31,646 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757641_16817 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757641
2025-07-20 20:25:23,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757642_16818 src: /192.168.158.1:39744 dest: /192.168.158.4:9866
2025-07-20 20:25:23,350 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39744, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1869294783_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757642_16818, duration(ns): 25218502
2025-07-20 20:25:23,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757642_16818, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-20 20:25:31,648 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757642_16818 replica FinalizedReplica, blk_1073757642_16818, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d4/dfs/dn
  getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757642 for deletion
2025-07-20 20:25:31,649 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757642_16818 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757642
2025-07-20 20:27:23,325 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757644_16820 src: /192.168.158.7:39628 dest: /192.168.158.4:9866
2025-07-20 20:27:23,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39628, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1007469743_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757644_16820, duration(ns): 17583898
2025-07-20 20:27:23,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757644_16820, type=LAST_IN_PIPELINE terminating
2025-07-20 20:27:28,652 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757644_16820 replica FinalizedReplica, blk_1073757644_16820, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d1/dfs/dn
  getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757644 for deletion
2025-07-20 20:27:28,654 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757644_16820 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757644
2025-07-20 20:28:23,328 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757645_16821 src: /192.168.158.8:50738 dest: /192.168.158.4:9866
2025-07-20 20:28:23,348 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50738, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1106195106_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757645_16821, duration(ns): 17178305
2025-07-20 20:28:23,348 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757645_16821, type=LAST_IN_PIPELINE terminating
2025-07-20 20:28:28,655 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757645_16821 replica FinalizedReplica, blk_1073757645_16821, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d2/dfs/dn
  getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757645 for deletion
2025-07-20 20:28:28,656 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757645_16821 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757645
2025-07-20 20:29:23,326 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757646_16822 src: /192.168.158.1:36604 dest: /192.168.158.4:9866
2025-07-20 20:29:23,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36604, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-973818146_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757646_16822, duration(ns): 22892442
2025-07-20 20:29:23,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757646_16822, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-20 20:29:28,658 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757646_16822 replica FinalizedReplica, blk_1073757646_16822, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d3/dfs/dn
  getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757646 for deletion
2025-07-20 20:29:28,659 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757646_16822 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757646
2025-07-20 20:31:23,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757648_16824 src: /192.168.158.1:59146 dest: /192.168.158.4:9866
2025-07-20 20:31:23,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59146, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1879992392_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757648_16824, duration(ns): 27627437
2025-07-20 20:31:23,371 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757648_16824, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-20 20:31:28,663 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757648_16824 replica FinalizedReplica, blk_1073757648_16824, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d4/dfs/dn
  getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757648 for deletion
2025-07-20 20:31:28,665 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757648_16824 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757648
2025-07-20 20:32:23,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757649_16825 src: /192.168.158.6:50752 dest: /192.168.158.4:9866
2025-07-20 20:32:23,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50752, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1670445809_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757649_16825, duration(ns): 16485473
2025-07-20 20:32:23,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757649_16825, type=LAST_IN_PIPELINE terminating
2025-07-20 20:32:28,668 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757649_16825 replica FinalizedReplica, blk_1073757649_16825, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d1/dfs/dn
  getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757649 for deletion
2025-07-20 20:32:28,669 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757649_16825 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757649
2025-07-20 20:34:28,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757651_16827 src: /192.168.158.7:35706 dest: /192.168.158.4:9866
2025-07-20 20:34:28,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35706, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1801484994_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757651_16827, duration(ns): 17662720
2025-07-20 20:34:28,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757651_16827, type=LAST_IN_PIPELINE terminating
2025-07-20 20:34:31,671 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757651_16827 replica FinalizedReplica, blk_1073757651_16827, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d2/dfs/dn
  getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757651 for deletion
2025-07-20 20:34:31,672 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757651_16827 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757651
2025-07-20 20:35:28,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757652_16828 src: /192.168.158.1:46798 dest: /192.168.158.4:9866
2025-07-20 20:35:28,367 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46798, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1550466758_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757652_16828, duration(ns): 24214905
2025-07-20 20:35:28,367 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757652_16828, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-20 20:35:34,674 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757652_16828 replica FinalizedReplica, blk_1073757652_16828, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d3/dfs/dn
  getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757652 for deletion
2025-07-20 20:35:34,675 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757652_16828 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757652
2025-07-20 20:38:33,341 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757655_16831 src: /192.168.158.9:46216 dest: /192.168.158.4:9866
2025-07-20 20:38:33,367 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46216, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1508765175_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757655_16831, duration(ns): 20420191
2025-07-20 20:38:33,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757655_16831, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-20 20:38:37,678 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757655_16831 replica FinalizedReplica, blk_1073757655_16831, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d4/dfs/dn
  getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757655 for deletion
2025-07-20 20:38:37,679 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757655_16831 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757655
2025-07-20 20:41:38,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757658_16834 src: /192.168.158.5:49370 dest: /192.168.158.4:9866
2025-07-20 20:41:38,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49370, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-49131294_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757658_16834, duration(ns): 21089764
2025-07-20 20:41:38,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757658_16834,
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-20 20:41:43,682 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757658_16834 replica FinalizedReplica, blk_1073757658_16834, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757658 for deletion 2025-07-20 20:41:43,683 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757658_16834 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757658 2025-07-20 20:42:38,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757659_16835 src: /192.168.158.7:34106 dest: /192.168.158.4:9866 2025-07-20 20:42:38,371 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34106, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1244690190_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757659_16835, duration(ns): 17808234 2025-07-20 20:42:38,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757659_16835, type=LAST_IN_PIPELINE terminating 2025-07-20 20:42:43,687 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757659_16835 replica FinalizedReplica, blk_1073757659_16835, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757659 for deletion 2025-07-20 20:42:43,687 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757659_16835 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757659 2025-07-20 20:43:43,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757660_16836 src: /192.168.158.7:43940 dest: /192.168.158.4:9866 2025-07-20 20:43:43,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43940, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1682172550_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757660_16836, duration(ns): 18322947 2025-07-20 20:43:43,387 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757660_16836, type=LAST_IN_PIPELINE terminating 2025-07-20 20:43:49,690 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757660_16836 replica FinalizedReplica, blk_1073757660_16836, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757660 for deletion 2025-07-20 20:43:49,691 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757660_16836 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757660 2025-07-20 20:44:43,355 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757661_16837 src: /192.168.158.5:37612 dest: /192.168.158.4:9866 2025-07-20 20:44:43,380 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37612, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1817576665_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757661_16837, duration(ns): 19887420 2025-07-20 20:44:43,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757661_16837, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-20 20:44:46,692 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757661_16837 replica FinalizedReplica, blk_1073757661_16837, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757661 for deletion 2025-07-20 20:44:46,693 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757661_16837 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757661 2025-07-20 20:46:43,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757663_16839 src: /192.168.158.7:36112 dest: /192.168.158.4:9866 2025-07-20 20:46:43,392 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36112, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_310187207_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757663_16839, duration(ns): 21485634 2025-07-20 20:46:43,392 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757663_16839, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-20 20:46:46,693 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757663_16839 replica FinalizedReplica, blk_1073757663_16839, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757663 for deletion 2025-07-20 20:46:46,694 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757663_16839 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757663 2025-07-20 20:48:48,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757665_16841 src: /192.168.158.6:33408 dest: /192.168.158.4:9866 2025-07-20 20:48:48,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33408, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1124756123_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757665_16841, duration(ns): 18337745 2025-07-20 20:48:48,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757665_16841, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-20 20:48:52,696 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757665_16841 replica FinalizedReplica, blk_1073757665_16841, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757665 for deletion 2025-07-20 20:48:52,697 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757665_16841 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757665 2025-07-20 20:49:48,373 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757666_16842 src: /192.168.158.7:58178 dest: /192.168.158.4:9866 2025-07-20 20:49:48,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58178, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1476688566_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757666_16842, duration(ns): 23532880 2025-07-20 20:49:48,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757666_16842, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-20 20:49:55,698 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757666_16842 replica FinalizedReplica, blk_1073757666_16842, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757666 for deletion 2025-07-20 
20:49:55,699 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757666_16842 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757666 2025-07-20 20:53:48,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757670_16846 src: /192.168.158.1:57542 dest: /192.168.158.4:9866 2025-07-20 20:53:48,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57542, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2000191075_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757670_16846, duration(ns): 26806339 2025-07-20 20:53:48,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757670_16846, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-20 20:53:52,708 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757670_16846 replica FinalizedReplica, blk_1073757670_16846, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757670 for deletion 2025-07-20 20:53:52,710 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757670_16846 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757670 2025-07-20 20:54:48,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073757671_16847 src: /192.168.158.7:56478 dest: /192.168.158.4:9866 2025-07-20 20:54:48,404 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56478, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1865193123_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757671_16847, duration(ns): 18220392 2025-07-20 20:54:48,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757671_16847, type=LAST_IN_PIPELINE terminating 2025-07-20 20:54:52,712 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757671_16847 replica FinalizedReplica, blk_1073757671_16847, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757671 for deletion 2025-07-20 20:54:52,713 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757671_16847 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757671 2025-07-20 20:59:03,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757675_16851 src: /192.168.158.1:45534 dest: /192.168.158.4:9866 2025-07-20 20:59:03,428 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45534, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-646061892_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757675_16851, duration(ns): 25312465 
2025-07-20 20:59:03,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757675_16851, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-20 20:59:07,718 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757675_16851 replica FinalizedReplica, blk_1073757675_16851, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757675 for deletion 2025-07-20 20:59:07,719 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757675_16851 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757675 2025-07-20 21:02:03,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757678_16854 src: /192.168.158.1:39954 dest: /192.168.158.4:9866 2025-07-20 21:02:03,448 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39954, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_456570702_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757678_16854, duration(ns): 24672030 2025-07-20 21:02:03,448 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757678_16854, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-20 21:02:07,727 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757678_16854 replica 
FinalizedReplica, blk_1073757678_16854, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757678 for deletion 2025-07-20 21:02:07,728 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757678_16854 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757678 2025-07-20 21:03:03,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757679_16855 src: /192.168.158.1:42146 dest: /192.168.158.4:9866 2025-07-20 21:03:03,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42146, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1450244055_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757679_16855, duration(ns): 23421718 2025-07-20 21:03:03,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757679_16855, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-20 21:03:10,729 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757679_16855 replica FinalizedReplica, blk_1073757679_16855, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757679 for deletion 2025-07-20 21:03:10,730 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073757679_16855 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757679 2025-07-20 21:04:03,425 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757680_16856 src: /192.168.158.7:41792 dest: /192.168.158.4:9866 2025-07-20 21:04:03,446 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41792, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_201678128_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757680_16856, duration(ns): 18234425 2025-07-20 21:04:03,446 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757680_16856, type=LAST_IN_PIPELINE terminating 2025-07-20 21:04:10,730 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757680_16856 replica FinalizedReplica, blk_1073757680_16856, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757680 for deletion 2025-07-20 21:04:10,732 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757680_16856 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757680 2025-07-20 21:05:08,417 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757681_16857 src: /192.168.158.6:47898 dest: /192.168.158.4:9866 2025-07-20 21:05:08,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.6:47898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2062999536_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757681_16857, duration(ns): 21921026 2025-07-20 21:05:08,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757681_16857, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-20 21:05:13,732 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757681_16857 replica FinalizedReplica, blk_1073757681_16857, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757681 for deletion 2025-07-20 21:05:13,733 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757681_16857 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757681 2025-07-20 21:06:08,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757682_16858 src: /192.168.158.1:37098 dest: /192.168.158.4:9866 2025-07-20 21:06:08,447 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37098, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1106391733_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757682_16858, duration(ns): 25286203 2025-07-20 21:06:08,448 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757682_16858, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-20 21:06:13,733 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757682_16858 replica FinalizedReplica, blk_1073757682_16858, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757682 for deletion 2025-07-20 21:06:13,734 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757682_16858 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757682 2025-07-20 21:09:08,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757685_16861 src: /192.168.158.8:54048 dest: /192.168.158.4:9866 2025-07-20 21:09:08,430 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54048, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-139867548_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757685_16861, duration(ns): 21761323 2025-07-20 21:09:08,431 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757685_16861, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-20 21:09:13,735 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757685_16861 replica FinalizedReplica, blk_1073757685_16861, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757685 for deletion 2025-07-20 21:09:13,737 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757685_16861 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757685 2025-07-20 21:11:08,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757687_16863 src: /192.168.158.9:55132 dest: /192.168.158.4:9866 2025-07-20 21:11:08,421 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55132, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_165504682_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757687_16863, duration(ns): 16755601 2025-07-20 21:11:08,422 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757687_16863, type=LAST_IN_PIPELINE terminating 2025-07-20 21:11:16,742 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757687_16863 replica FinalizedReplica, blk_1073757687_16863, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757687 for deletion 2025-07-20 21:11:16,743 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757687_16863 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757687 2025-07-20 21:13:08,399 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757689_16865 src: /192.168.158.1:53884 dest: /192.168.158.4:9866 2025-07-20 21:13:08,436 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53884, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-198591797_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757689_16865, duration(ns): 27392500 2025-07-20 21:13:08,437 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757689_16865, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-20 21:13:13,745 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757689_16865 replica FinalizedReplica, blk_1073757689_16865, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757689 for deletion 2025-07-20 21:13:13,746 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757689_16865 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757689 2025-07-20 21:14:08,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757690_16866 src: /192.168.158.9:46026 dest: /192.168.158.4:9866 2025-07-20 21:14:08,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46026, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1427388105_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757690_16866, duration(ns): 20297775 2025-07-20 21:14:08,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757690_16866, type=LAST_IN_PIPELINE terminating 2025-07-20 21:14:16,746 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757690_16866 replica FinalizedReplica, blk_1073757690_16866, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757690 for deletion 2025-07-20 21:14:16,747 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757690_16866 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757690 2025-07-20 21:15:08,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757691_16867 src: /192.168.158.6:56454 dest: /192.168.158.4:9866 2025-07-20 21:15:08,440 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56454, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-505284782_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757691_16867, duration(ns): 19831605 2025-07-20 21:15:08,440 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757691_16867, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-20 21:15:16,746 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073757691_16867 replica FinalizedReplica, blk_1073757691_16867, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757691 for deletion 2025-07-20 21:15:16,747 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757691_16867 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757691 2025-07-20 21:17:13,436 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757693_16869 src: /192.168.158.9:53910 dest: /192.168.158.4:9866 2025-07-20 21:17:13,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53910, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_152202125_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757693_16869, duration(ns): 20633469 2025-07-20 21:17:13,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757693_16869, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-20 21:17:19,749 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757693_16869 replica FinalizedReplica, blk_1073757693_16869, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757693 for deletion 2025-07-20 21:17:19,750 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757693_16869 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir29/blk_1073757693 2025-07-20 21:20:13,430 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757696_16872 src: /192.168.158.1:59012 dest: /192.168.158.4:9866 2025-07-20 21:20:13,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59012, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_907172994_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757696_16872, duration(ns): 23923712 2025-07-20 21:20:13,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757696_16872, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-20 21:20:16,756 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757696_16872 replica FinalizedReplica, blk_1073757696_16872, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757696 for deletion 2025-07-20 21:20:16,757 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757696_16872 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757696 2025-07-20 21:27:13,417 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073757703_16879 src: /192.168.158.7:33322 dest: /192.168.158.4:9866 2025-07-20 21:27:13,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33322, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1523478589_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757703_16879, duration(ns): 22436255 2025-07-20 21:27:13,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757703_16879, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-20 21:27:16,769 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757703_16879 replica FinalizedReplica, blk_1073757703_16879, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757703 for deletion 2025-07-20 21:27:16,770 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757703_16879 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757703 2025-07-20 21:29:18,423 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757705_16881 src: /192.168.158.8:52650 dest: /192.168.158.4:9866 2025-07-20 21:29:18,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52650, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-27207556_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073757705_16881, duration(ns): 19313892 2025-07-20 21:29:18,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757705_16881, type=LAST_IN_PIPELINE terminating 2025-07-20 21:29:22,773 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757705_16881 replica FinalizedReplica, blk_1073757705_16881, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757705 for deletion 2025-07-20 21:29:22,774 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757705_16881 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757705 2025-07-20 21:31:18,423 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757707_16883 src: /192.168.158.1:54558 dest: /192.168.158.4:9866 2025-07-20 21:31:18,455 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54558, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-978656095_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757707_16883, duration(ns): 23516317 2025-07-20 21:31:18,455 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757707_16883, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-20 21:31:22,777 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling 
blk_1073757707_16883 replica FinalizedReplica, blk_1073757707_16883, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757707 for deletion 2025-07-20 21:31:22,778 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757707_16883 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757707 2025-07-20 21:32:23,440 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757708_16884 src: /192.168.158.8:48354 dest: /192.168.158.4:9866 2025-07-20 21:32:23,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48354, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1680044446_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757708_16884, duration(ns): 21450763 2025-07-20 21:32:23,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757708_16884, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-20 21:32:31,778 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757708_16884 replica FinalizedReplica, blk_1073757708_16884, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757708 for deletion 2025-07-20 21:32:31,779 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757708_16884 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757708 2025-07-20 21:34:23,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757710_16886 src: /192.168.158.1:46134 dest: /192.168.158.4:9866 2025-07-20 21:34:23,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46134, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1053735428_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757710_16886, duration(ns): 30185720 2025-07-20 21:34:23,484 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757710_16886, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-20 21:34:28,781 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757710_16886 replica FinalizedReplica, blk_1073757710_16886, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757710 for deletion 2025-07-20 21:34:28,782 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757710_16886 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757710 2025-07-20 21:37:28,450 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757713_16889 src: /192.168.158.1:59328 dest: /192.168.158.4:9866 2025-07-20 
21:37:28,487 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59328, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1446296262_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757713_16889, duration(ns): 27279629 2025-07-20 21:37:28,487 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757713_16889, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-20 21:37:34,789 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757713_16889 replica FinalizedReplica, blk_1073757713_16889, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757713 for deletion 2025-07-20 21:37:34,790 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757713_16889 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757713 2025-07-20 21:38:28,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757714_16890 src: /192.168.158.1:37556 dest: /192.168.158.4:9866 2025-07-20 21:38:28,478 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37556, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1729959327_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757714_16890, duration(ns): 24436673 2025-07-20 21:38:28,478 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757714_16890, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-20 21:38:31,793 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757714_16890 replica FinalizedReplica, blk_1073757714_16890, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757714 for deletion 2025-07-20 21:38:31,794 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757714_16890 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757714 2025-07-20 21:40:33,447 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757716_16892 src: /192.168.158.7:59258 dest: /192.168.158.4:9866 2025-07-20 21:40:33,468 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59258, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_773397018_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757716_16892, duration(ns): 18047264 2025-07-20 21:40:33,469 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757716_16892, type=LAST_IN_PIPELINE terminating 2025-07-20 21:40:37,798 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757716_16892 replica FinalizedReplica, blk_1073757716_16892, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757716 for deletion 2025-07-20 21:40:37,800 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757716_16892 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757716 2025-07-20 21:41:33,488 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757717_16893 src: /192.168.158.1:36028 dest: /192.168.158.4:9866 2025-07-20 21:41:33,523 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36028, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1050231692_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757717_16893, duration(ns): 25844470 2025-07-20 21:41:33,523 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757717_16893, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-20 21:41:37,800 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757717_16893 replica FinalizedReplica, blk_1073757717_16893, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757717 for deletion 2025-07-20 21:41:37,802 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757717_16893 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757717 2025-07-20 21:43:33,475 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757719_16895 src: /192.168.158.6:58484 dest: /192.168.158.4:9866 2025-07-20 21:43:33,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58484, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1184065616_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757719_16895, duration(ns): 21931766 2025-07-20 21:43:33,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757719_16895, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-20 21:43:40,802 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757719_16895 replica FinalizedReplica, blk_1073757719_16895, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757719 for deletion 2025-07-20 21:43:40,804 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757719_16895 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757719 2025-07-20 21:50:43,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757726_16902 src: /192.168.158.1:38042 dest: /192.168.158.4:9866 2025-07-20 21:50:43,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:38042, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-147414988_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757726_16902, duration(ns): 24957170 2025-07-20 21:50:43,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757726_16902, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-20 21:50:46,819 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757726_16902 replica FinalizedReplica, blk_1073757726_16902, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757726 for deletion 2025-07-20 21:50:46,820 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757726_16902 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757726 2025-07-20 21:51:43,461 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757727_16903 src: /192.168.158.1:56424 dest: /192.168.158.4:9866 2025-07-20 21:51:43,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56424, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1418464816_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757727_16903, duration(ns): 25172958 2025-07-20 21:51:43,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073757727_16903, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-20 21:51:46,823 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757727_16903 replica FinalizedReplica, blk_1073757727_16903, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757727 for deletion 2025-07-20 21:51:46,824 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757727_16903 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757727 2025-07-20 21:56:53,496 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757732_16908 src: /192.168.158.8:52430 dest: /192.168.158.4:9866 2025-07-20 21:56:53,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52430, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_436511726_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757732_16908, duration(ns): 19885157 2025-07-20 21:56:53,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757732_16908, type=LAST_IN_PIPELINE terminating 2025-07-20 21:56:58,831 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757732_16908 replica FinalizedReplica, blk_1073757732_16908, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757732 for deletion 2025-07-20 21:56:58,832 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757732_16908 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757732 2025-07-20 21:57:53,519 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757733_16909 src: /192.168.158.8:53718 dest: /192.168.158.4:9866 2025-07-20 21:57:53,540 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53718, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1775771604_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757733_16909, duration(ns): 18709232 2025-07-20 21:57:53,540 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757733_16909, type=LAST_IN_PIPELINE terminating 2025-07-20 21:57:58,832 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757733_16909 replica FinalizedReplica, blk_1073757733_16909, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757733 for deletion 2025-07-20 21:57:58,833 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757733_16909 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757733 2025-07-20 21:58:53,489 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757734_16910 src: /192.168.158.6:58460 dest: /192.168.158.4:9866 2025-07-20 21:58:53,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58460, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1827484939_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757734_16910, duration(ns): 21627820 2025-07-20 21:58:53,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757734_16910, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-20 21:59:01,837 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757734_16910 replica FinalizedReplica, blk_1073757734_16910, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757734 for deletion 2025-07-20 21:59:01,838 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757734_16910 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757734 2025-07-20 21:59:19,841 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f53, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5. 
2025-07-20 21:59:19,841 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360 2025-07-20 21:59:53,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757735_16911 src: /192.168.158.9:39578 dest: /192.168.158.4:9866 2025-07-20 21:59:53,527 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39578, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_319041152_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757735_16911, duration(ns): 21376278 2025-07-20 21:59:53,528 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757735_16911, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-20 21:59:58,838 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757735_16911 replica FinalizedReplica, blk_1073757735_16911, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757735 for deletion 2025-07-20 21:59:58,840 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757735_16911 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757735 2025-07-20 22:00:53,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757736_16912 src: /192.168.158.8:50756 dest: /192.168.158.4:9866 2025-07-20 22:00:53,524 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50756, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1057637535_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757736_16912, duration(ns): 16670572 2025-07-20 22:00:53,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757736_16912, type=LAST_IN_PIPELINE terminating 2025-07-20 22:00:58,839 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757736_16912 replica FinalizedReplica, blk_1073757736_16912, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757736 for deletion 2025-07-20 22:00:58,841 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757736_16912 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757736 2025-07-20 22:01:53,501 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757737_16913 src: /192.168.158.1:46410 dest: /192.168.158.4:9866 2025-07-20 22:01:53,535 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46410, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1740108325_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757737_16913, duration(ns): 24498232 2025-07-20 22:01:53,535 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073757737_16913, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-20 22:01:58,841 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757737_16913 replica FinalizedReplica, blk_1073757737_16913, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757737 for deletion
2025-07-20 22:01:58,843 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757737_16913 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757737
2025-07-20 22:03:58,509 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757739_16915 src: /192.168.158.1:32850 dest: /192.168.158.4:9866
2025-07-20 22:03:58,544 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:32850, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-442563755_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757739_16915, duration(ns): 26359983
2025-07-20 22:03:58,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757739_16915, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-20 22:04:01,842 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757739_16915 replica FinalizedReplica, blk_1073757739_16915, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757739 for deletion
2025-07-20 22:04:01,844 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757739_16915 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757739
2025-07-20 22:04:58,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757740_16916 src: /192.168.158.9:60122 dest: /192.168.158.4:9866
2025-07-20 22:04:58,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60122, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1271813192_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757740_16916, duration(ns): 17878016
2025-07-20 22:04:58,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757740_16916, type=LAST_IN_PIPELINE terminating
2025-07-20 22:05:01,843 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757740_16916 replica FinalizedReplica, blk_1073757740_16916, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757740 for deletion
2025-07-20 22:05:01,844 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757740_16916 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757740
2025-07-20 22:05:58,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757741_16917 src: /192.168.158.1:59286 dest: /192.168.158.4:9866
2025-07-20 22:05:58,551 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59286, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_794112323_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757741_16917, duration(ns): 26449274
2025-07-20 22:05:58,552 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757741_16917, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-20 22:06:01,847 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757741_16917 replica FinalizedReplica, blk_1073757741_16917, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757741 for deletion
2025-07-20 22:06:01,848 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757741_16917 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757741
2025-07-20 22:06:58,526 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757742_16918 src: /192.168.158.8:41464 dest: /192.168.158.4:9866
2025-07-20 22:06:58,553 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41464, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-818457584_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757742_16918, duration(ns): 21017109
2025-07-20 22:06:58,553 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757742_16918, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 22:07:04,850 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757742_16918 replica FinalizedReplica, blk_1073757742_16918, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757742 for deletion
2025-07-20 22:07:04,851 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757742_16918 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757742
2025-07-20 22:07:58,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757743_16919 src: /192.168.158.1:37074 dest: /192.168.158.4:9866
2025-07-20 22:07:58,576 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37074, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-552003518_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757743_16919, duration(ns): 29048428
2025-07-20 22:07:58,576 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757743_16919, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-20 22:08:01,851 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757743_16919 replica FinalizedReplica, blk_1073757743_16919, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757743 for deletion
2025-07-20 22:08:01,852 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757743_16919 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757743
2025-07-20 22:08:58,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757744_16920 src: /192.168.158.7:44170 dest: /192.168.158.4:9866
2025-07-20 22:08:58,539 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44170, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1554028941_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757744_16920, duration(ns): 18942570
2025-07-20 22:08:58,540 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757744_16920, type=LAST_IN_PIPELINE terminating
2025-07-20 22:09:01,855 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757744_16920 replica FinalizedReplica, blk_1073757744_16920, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757744 for deletion
2025-07-20 22:09:01,856 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757744_16920 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757744
2025-07-20 22:09:58,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757745_16921 src: /192.168.158.1:36226 dest: /192.168.158.4:9866
2025-07-20 22:09:58,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36226, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-535385682_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757745_16921, duration(ns): 25471259
2025-07-20 22:09:58,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757745_16921, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-20 22:10:01,857 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757745_16921 replica FinalizedReplica, blk_1073757745_16921, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757745 for deletion
2025-07-20 22:10:01,859 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757745_16921 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757745
2025-07-20 22:13:03,529 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757748_16924 src: /192.168.158.9:33932 dest: /192.168.158.4:9866
2025-07-20 22:13:03,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33932, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1880449019_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757748_16924, duration(ns): 21725365
2025-07-20 22:13:03,557 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757748_16924, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-20 22:13:10,868 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757748_16924 replica FinalizedReplica, blk_1073757748_16924, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757748 for deletion
2025-07-20 22:13:10,869 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757748_16924 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757748
2025-07-20 22:15:03,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757750_16926 src: /192.168.158.1:58606 dest: /192.168.158.4:9866
2025-07-20 22:15:03,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58606, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1663888000_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757750_16926, duration(ns): 26349198
2025-07-20 22:15:03,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757750_16926, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-20 22:15:10,875 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757750_16926 replica FinalizedReplica, blk_1073757750_16926, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757750 for deletion
2025-07-20 22:15:10,876 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757750_16926 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757750
2025-07-20 22:17:08,548 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757752_16928 src: /192.168.158.8:49416 dest: /192.168.158.4:9866
2025-07-20 22:17:08,574 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49416, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1923878431_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757752_16928, duration(ns): 20973227
2025-07-20 22:17:08,575 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757752_16928, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-20 22:17:13,879 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757752_16928 replica FinalizedReplica, blk_1073757752_16928, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757752 for deletion
2025-07-20 22:17:13,880 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757752_16928 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757752
2025-07-20 22:18:08,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757753_16929 src: /192.168.158.7:55232 dest: /192.168.158.4:9866
2025-07-20 22:18:08,535 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55232, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-583494026_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757753_16929, duration(ns): 16961344
2025-07-20 22:18:08,535 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757753_16929, type=LAST_IN_PIPELINE terminating
2025-07-20 22:18:13,882 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757753_16929 replica FinalizedReplica, blk_1073757753_16929, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757753 for deletion
2025-07-20 22:18:13,883 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757753_16929 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757753
2025-07-20 22:19:08,524 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757754_16930 src: /192.168.158.8:56292 dest: /192.168.158.4:9866
2025-07-20 22:19:08,544 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56292, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1810574195_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757754_16930, duration(ns): 17116916
2025-07-20 22:19:08,544 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757754_16930, type=LAST_IN_PIPELINE terminating
2025-07-20 22:19:13,882 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757754_16930 replica FinalizedReplica, blk_1073757754_16930, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757754 for deletion
2025-07-20 22:19:13,884 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757754_16930 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757754
2025-07-20 22:24:18,547 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757759_16935 src: /192.168.158.1:45316 dest: /192.168.158.4:9866
2025-07-20 22:24:18,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45316, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1572168636_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757759_16935, duration(ns): 23134504
2025-07-20 22:24:18,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757759_16935, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-20 22:24:25,892 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757759_16935 replica FinalizedReplica, blk_1073757759_16935, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757759 for deletion
2025-07-20 22:24:25,893 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757759_16935 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757759
2025-07-20 22:25:18,544 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757760_16936 src: /192.168.158.1:51462 dest: /192.168.158.4:9866
2025-07-20 22:25:18,577 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51462, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-274391688_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757760_16936, duration(ns): 23591151
2025-07-20 22:25:18,577 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757760_16936, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-20 22:25:25,892 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757760_16936 replica FinalizedReplica, blk_1073757760_16936, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757760 for deletion
2025-07-20 22:25:25,893 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757760_16936 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757760
2025-07-20 22:27:18,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757762_16938 src: /192.168.158.6:38198 dest: /192.168.158.4:9866
2025-07-20 22:27:18,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38198, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1933542690_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757762_16938, duration(ns): 21735555
2025-07-20 22:27:18,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757762_16938, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-20 22:27:25,895 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757762_16938 replica FinalizedReplica, blk_1073757762_16938, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757762 for deletion
2025-07-20 22:27:25,896 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757762_16938 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757762
2025-07-20 22:28:23,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757763_16939 src: /192.168.158.6:58542 dest: /192.168.158.4:9866
2025-07-20 22:28:23,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58542, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1753968661_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757763_16939, duration(ns): 22594930
2025-07-20 22:28:23,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757763_16939, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 22:28:28,898 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757763_16939 replica FinalizedReplica, blk_1073757763_16939, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757763 for deletion
2025-07-20 22:28:28,899 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757763_16939 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757763
2025-07-20 22:30:23,536 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757765_16941 src: /192.168.158.6:44158 dest: /192.168.158.4:9866
2025-07-20 22:30:23,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44158, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2136914363_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757765_16941, duration(ns): 17119981
2025-07-20 22:30:23,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757765_16941, type=LAST_IN_PIPELINE terminating
2025-07-20 22:30:25,902 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757765_16941 replica FinalizedReplica, blk_1073757765_16941, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757765 for deletion
2025-07-20 22:30:25,903 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757765_16941 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757765
2025-07-20 22:32:28,545 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757767_16943 src: /192.168.158.7:44452 dest: /192.168.158.4:9866
2025-07-20 22:32:28,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44452, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_986833505_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757767_16943, duration(ns): 21707946
2025-07-20 22:32:28,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757767_16943, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-20 22:32:31,909 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757767_16943 replica FinalizedReplica, blk_1073757767_16943, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757767 for deletion
2025-07-20 22:32:31,910 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757767_16943 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757767
2025-07-20 22:33:28,539 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757768_16944 src: /192.168.158.5:45026 dest: /192.168.158.4:9866
2025-07-20 22:33:28,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45026, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1350523537_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757768_16944, duration(ns): 21075616
2025-07-20 22:33:28,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757768_16944, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 22:33:31,910 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757768_16944 replica FinalizedReplica, blk_1073757768_16944, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757768 for deletion
2025-07-20 22:33:31,913 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757768_16944 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757768
2025-07-20 22:34:33,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757769_16945 src: /192.168.158.7:42960 dest: /192.168.158.4:9866
2025-07-20 22:34:33,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42960, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1568146853_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757769_16945, duration(ns): 22166492
2025-07-20 22:34:33,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757769_16945, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-20 22:34:40,912 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757769_16945 replica FinalizedReplica, blk_1073757769_16945, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757769 for deletion
2025-07-20 22:34:40,913 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757769_16945 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757769
2025-07-20 22:35:33,552 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757770_16946 src: /192.168.158.5:48990 dest: /192.168.158.4:9866
2025-07-20 22:35:33,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48990, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_55759306_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757770_16946, duration(ns): 19584452
2025-07-20 22:35:33,574 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757770_16946, type=LAST_IN_PIPELINE terminating
2025-07-20 22:35:37,914 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757770_16946 replica FinalizedReplica, blk_1073757770_16946, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757770 for deletion
2025-07-20 22:35:37,915 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757770_16946 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757770
2025-07-20 22:36:33,539 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757771_16947 src: /192.168.158.1:49952 dest: /192.168.158.4:9866
2025-07-20 22:36:33,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49952, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_454670399_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757771_16947, duration(ns): 24775643
2025-07-20 22:36:33,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757771_16947, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-20 22:36:40,914 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757771_16947 replica FinalizedReplica, blk_1073757771_16947, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757771 for deletion
2025-07-20 22:36:40,915 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757771_16947 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757771
2025-07-20 22:38:38,569 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757773_16949 src: /192.168.158.6:54732 dest: /192.168.158.4:9866
2025-07-20 22:38:38,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54732, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1565474658_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757773_16949, duration(ns): 19850079
2025-07-20 22:38:38,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757773_16949, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-20 22:38:40,917 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757773_16949 replica FinalizedReplica, blk_1073757773_16949, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757773 for deletion
2025-07-20 22:38:40,918 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757773_16949 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757773
2025-07-20 22:39:38,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757774_16950 src: /192.168.158.6:35026 dest: /192.168.158.4:9866
2025-07-20 22:39:38,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35026, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_539548432_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757774_16950, duration(ns): 16063876
2025-07-20 22:39:38,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757774_16950, type=LAST_IN_PIPELINE terminating
2025-07-20 22:39:40,919 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757774_16950 replica FinalizedReplica, blk_1073757774_16950, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757774 for deletion
2025-07-20 22:39:40,921 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757774_16950 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757774
2025-07-20 22:41:43,552 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757776_16952 src: /192.168.158.7:50710 dest: /192.168.158.4:9866
2025-07-20 22:41:43,570 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50710, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_13445287_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757776_16952, duration(ns): 16650309
2025-07-20 22:41:43,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757776_16952, type=LAST_IN_PIPELINE terminating
2025-07-20 22:41:46,919 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757776_16952 replica FinalizedReplica, blk_1073757776_16952, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757776 for deletion
2025-07-20 22:41:46,920 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757776_16952 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757776
2025-07-20 22:44:48,551 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757779_16955 src: /192.168.158.1:52570 dest: /192.168.158.4:9866
2025-07-20 22:44:48,586 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52570, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_741489090_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757779_16955, duration(ns): 26235277
2025-07-20 22:44:48,586 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757779_16955, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-20 22:44:52,927 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757779_16955 replica FinalizedReplica, blk_1073757779_16955, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757779 for deletion
2025-07-20 22:44:52,929 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757779_16955 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757779
2025-07-20 22:45:48,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving
BP-1059995147-192.168.158.1-1752101929360:blk_1073757780_16956 src: /192.168.158.8:46896 dest: /192.168.158.4:9866 2025-07-20 22:45:48,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46896, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1178907492_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757780_16956, duration(ns): 23082956 2025-07-20 22:45:48,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757780_16956, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-20 22:45:52,930 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757780_16956 replica FinalizedReplica, blk_1073757780_16956, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757780 for deletion 2025-07-20 22:45:52,931 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757780_16956 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757780 2025-07-20 22:46:48,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757781_16957 src: /192.168.158.8:36036 dest: /192.168.158.4:9866 2025-07-20 22:46:48,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36036, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1535285172_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073757781_16957, duration(ns): 23879975 2025-07-20 22:46:48,591 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757781_16957, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-20 22:46:52,933 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757781_16957 replica FinalizedReplica, blk_1073757781_16957, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757781 for deletion 2025-07-20 22:46:52,934 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757781_16957 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757781 2025-07-20 22:48:53,591 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757783_16959 src: /192.168.158.5:55534 dest: /192.168.158.4:9866 2025-07-20 22:48:53,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55534, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1473586625_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757783_16959, duration(ns): 21422847 2025-07-20 22:48:53,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757783_16959, type=LAST_IN_PIPELINE terminating 2025-07-20 22:48:55,938 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757783_16959 replica 
FinalizedReplica, blk_1073757783_16959, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757783 for deletion 2025-07-20 22:48:55,940 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757783_16959 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757783 2025-07-20 22:53:58,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757788_16964 src: /192.168.158.6:35330 dest: /192.168.158.4:9866 2025-07-20 22:53:58,611 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35330, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1230751162_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757788_16964, duration(ns): 15736671 2025-07-20 22:53:58,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757788_16964, type=LAST_IN_PIPELINE terminating 2025-07-20 22:54:04,953 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757788_16964 replica FinalizedReplica, blk_1073757788_16964, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757788 for deletion 2025-07-20 22:54:04,954 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757788_16964 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757788 2025-07-20 22:54:58,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757789_16965 src: /192.168.158.5:42516 dest: /192.168.158.4:9866 2025-07-20 22:54:58,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42516, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_331271689_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757789_16965, duration(ns): 17293549 2025-07-20 22:54:58,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757789_16965, type=LAST_IN_PIPELINE terminating 2025-07-20 22:55:04,956 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757789_16965 replica FinalizedReplica, blk_1073757789_16965, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757789 for deletion 2025-07-20 22:55:04,957 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757789_16965 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757789 2025-07-20 22:57:58,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757792_16968 src: /192.168.158.1:43664 dest: /192.168.158.4:9866 2025-07-20 22:57:58,620 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43664, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-129099603_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757792_16968, duration(ns): 26016639 2025-07-20 22:57:58,620 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757792_16968, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-20 22:58:04,961 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757792_16968 replica FinalizedReplica, blk_1073757792_16968, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757792 for deletion 2025-07-20 22:58:04,962 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757792_16968 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757792 2025-07-20 22:58:58,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757793_16969 src: /192.168.158.6:56508 dest: /192.168.158.4:9866 2025-07-20 22:58:58,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56508, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-896575211_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757793_16969, duration(ns): 22698105 2025-07-20 22:58:58,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757793_16969, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=1:[192.168.158.7:9866] terminating 2025-07-20 22:59:04,965 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757793_16969 replica FinalizedReplica, blk_1073757793_16969, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757793 for deletion 2025-07-20 22:59:04,966 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757793_16969 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757793 2025-07-20 23:01:58,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757796_16972 src: /192.168.158.8:50170 dest: /192.168.158.4:9866 2025-07-20 23:01:58,619 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50170, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1170626989_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757796_16972, duration(ns): 18179081 2025-07-20 23:01:58,619 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757796_16972, type=LAST_IN_PIPELINE terminating 2025-07-20 23:02:01,970 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757796_16972 replica FinalizedReplica, blk_1073757796_16972, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757796 for 
deletion 2025-07-20 23:02:01,971 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757796_16972 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757796 2025-07-20 23:03:03,570 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757797_16973 src: /192.168.158.5:55864 dest: /192.168.158.4:9866 2025-07-20 23:03:03,591 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55864, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1874888694_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757797_16973, duration(ns): 18983053 2025-07-20 23:03:03,591 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757797_16973, type=LAST_IN_PIPELINE terminating 2025-07-20 23:03:10,975 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757797_16973 replica FinalizedReplica, blk_1073757797_16973, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757797 for deletion 2025-07-20 23:03:10,976 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757797_16973 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757797 2025-07-20 23:04:03,603 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757798_16974 src: 
/192.168.158.1:56754 dest: /192.168.158.4:9866 2025-07-20 23:04:03,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56754, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1863678367_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757798_16974, duration(ns): 23837895 2025-07-20 23:04:03,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757798_16974, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-20 23:04:07,977 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757798_16974 replica FinalizedReplica, blk_1073757798_16974, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757798 for deletion 2025-07-20 23:04:07,978 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757798_16974 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757798 2025-07-20 23:05:03,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757799_16975 src: /192.168.158.1:54506 dest: /192.168.158.4:9866 2025-07-20 23:05:03,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54506, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1745340612_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757799_16975, duration(ns): 24248150 
2025-07-20 23:05:03,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757799_16975, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-20 23:05:10,977 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757799_16975 replica FinalizedReplica, blk_1073757799_16975, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757799 for deletion 2025-07-20 23:05:10,979 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757799_16975 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757799 2025-07-20 23:06:03,628 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757800_16976 src: /192.168.158.8:57384 dest: /192.168.158.4:9866 2025-07-20 23:06:03,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57384, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2062157611_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757800_16976, duration(ns): 17946196 2025-07-20 23:06:03,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757800_16976, type=LAST_IN_PIPELINE terminating 2025-07-20 23:06:07,979 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757800_16976 replica FinalizedReplica, blk_1073757800_16976, FINALIZED getNumBytes() = 56 
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757800 for deletion 2025-07-20 23:06:07,980 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757800_16976 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757800 2025-07-20 23:08:03,604 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757802_16978 src: /192.168.158.1:34194 dest: /192.168.158.4:9866 2025-07-20 23:08:03,641 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34194, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-634495121_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757802_16978, duration(ns): 27264915 2025-07-20 23:08:03,641 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757802_16978, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-20 23:08:07,982 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757802_16978 replica FinalizedReplica, blk_1073757802_16978, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757802 for deletion 2025-07-20 23:08:07,983 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757802_16978 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757802 2025-07-20 23:09:03,623 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757803_16979 src: /192.168.158.9:41516 dest: /192.168.158.4:9866 2025-07-20 23:09:03,642 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41516, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-251007978_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757803_16979, duration(ns): 16926442 2025-07-20 23:09:03,642 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757803_16979, type=LAST_IN_PIPELINE terminating 2025-07-20 23:09:07,984 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757803_16979 replica FinalizedReplica, blk_1073757803_16979, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757803 for deletion 2025-07-20 23:09:07,985 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757803_16979 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757803 2025-07-20 23:10:03,609 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757804_16980 src: /192.168.158.1:34342 dest: /192.168.158.4:9866 2025-07-20 23:10:03,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34342, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2095225896_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757804_16980, duration(ns): 25806839 2025-07-20 23:10:03,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757804_16980, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-20 23:10:10,986 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757804_16980 replica FinalizedReplica, blk_1073757804_16980, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757804 for deletion 2025-07-20 23:10:10,987 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757804_16980 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757804 2025-07-20 23:11:03,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757805_16981 src: /192.168.158.9:43368 dest: /192.168.158.4:9866 2025-07-20 23:11:03,604 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43368, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-850412373_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757805_16981, duration(ns): 19670561 2025-07-20 23:11:03,604 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757805_16981, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=1:[192.168.158.6:9866] terminating 2025-07-20 23:11:07,987 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757805_16981 replica FinalizedReplica, blk_1073757805_16981, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757805 for deletion 2025-07-20 23:11:07,988 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757805_16981 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757805 2025-07-20 23:12:03,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757806_16982 src: /192.168.158.7:34768 dest: /192.168.158.4:9866 2025-07-20 23:12:03,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34768, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_75658981_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757806_16982, duration(ns): 20936382 2025-07-20 23:12:03,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757806_16982, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-20 23:12:10,990 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757806_16982 replica FinalizedReplica, blk_1073757806_16982, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757806 for deletion 2025-07-20 23:12:10,991 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757806_16982 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757806 2025-07-20 23:15:08,657 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757809_16985 src: /192.168.158.7:33074 dest: /192.168.158.4:9866 2025-07-20 23:15:08,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33074, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_980352450_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757809_16985, duration(ns): 20540746 2025-07-20 23:15:08,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757809_16985, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-20 23:15:10,998 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757809_16985 replica FinalizedReplica, blk_1073757809_16985, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757809 for deletion 2025-07-20 23:15:10,999 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757809_16985 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757809 
2025-07-20 23:16:08,619 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757810_16986 src: /192.168.158.8:59276 dest: /192.168.158.4:9866
2025-07-20 23:16:08,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59276, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1475822446_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757810_16986, duration(ns): 26989238
2025-07-20 23:16:08,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757810_16986, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-20 23:16:14,002 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757810_16986 replica FinalizedReplica, blk_1073757810_16986, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757810 for deletion
2025-07-20 23:16:14,003 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757810_16986 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757810
2025-07-20 23:19:08,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757813_16989 src: /192.168.158.7:39144 dest: /192.168.158.4:9866
2025-07-20 23:19:08,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39144, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1663068632_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757813_16989, duration(ns): 17994281
2025-07-20 23:19:08,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757813_16989, type=LAST_IN_PIPELINE terminating
2025-07-20 23:19:14,010 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757813_16989 replica FinalizedReplica, blk_1073757813_16989, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757813 for deletion
2025-07-20 23:19:14,011 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757813_16989 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757813
2025-07-20 23:20:08,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757814_16990 src: /192.168.158.9:55718 dest: /192.168.158.4:9866
2025-07-20 23:20:08,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55718, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2132043780_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757814_16990, duration(ns): 21605627
2025-07-20 23:20:08,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757814_16990, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-20 23:20:14,013 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757814_16990 replica FinalizedReplica, blk_1073757814_16990, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757814 for deletion
2025-07-20 23:20:14,014 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757814_16990 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757814
2025-07-20 23:21:13,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757815_16991 src: /192.168.158.9:51394 dest: /192.168.158.4:9866
2025-07-20 23:21:13,623 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51394, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1646053383_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757815_16991, duration(ns): 15753043
2025-07-20 23:21:13,623 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757815_16991, type=LAST_IN_PIPELINE terminating
2025-07-20 23:21:20,016 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757815_16991 replica FinalizedReplica, blk_1073757815_16991, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757815 for deletion
2025-07-20 23:21:20,017 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757815_16991 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757815
2025-07-20 23:23:13,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757817_16993 src: /192.168.158.1:60400 dest: /192.168.158.4:9866
2025-07-20 23:23:13,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60400, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_217425547_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757817_16993, duration(ns): 26374081
2025-07-20 23:23:13,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757817_16993, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-20 23:23:17,020 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757817_16993 replica FinalizedReplica, blk_1073757817_16993, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757817 for deletion
2025-07-20 23:23:17,021 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757817_16993 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757817
2025-07-20 23:24:18,617 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757818_16994 src: /192.168.158.1:48040 dest: /192.168.158.4:9866
2025-07-20 23:24:18,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48040, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-792177790_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757818_16994, duration(ns): 24731116
2025-07-20 23:24:18,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757818_16994, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-20 23:24:26,021 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757818_16994 replica FinalizedReplica, blk_1073757818_16994, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757818 for deletion
2025-07-20 23:24:26,022 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757818_16994 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757818
2025-07-20 23:29:18,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757823_16999 src: /192.168.158.8:54874 dest: /192.168.158.4:9866
2025-07-20 23:29:18,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54874, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2108935451_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757823_16999, duration(ns): 16742315
2025-07-20 23:29:18,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757823_16999, type=LAST_IN_PIPELINE terminating
2025-07-20 23:29:26,030 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757823_16999 replica FinalizedReplica, blk_1073757823_16999, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757823 for deletion
2025-07-20 23:29:26,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757823_16999 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757823
2025-07-20 23:32:18,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757826_17002 src: /192.168.158.5:53544 dest: /192.168.158.4:9866
2025-07-20 23:32:18,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53544, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-168906533_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757826_17002, duration(ns): 16806235
2025-07-20 23:32:18,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757826_17002, type=LAST_IN_PIPELINE terminating
2025-07-20 23:32:23,037 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757826_17002 replica FinalizedReplica, blk_1073757826_17002, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757826 for deletion
2025-07-20 23:32:23,038 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757826_17002 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757826
2025-07-20 23:33:18,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757827_17003 src: /192.168.158.8:42788 dest: /192.168.158.4:9866
2025-07-20 23:33:18,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42788, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-25804726_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757827_17003, duration(ns): 20182139
2025-07-20 23:33:18,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757827_17003, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-20 23:33:23,040 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757827_17003 replica FinalizedReplica, blk_1073757827_17003, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757827 for deletion
2025-07-20 23:33:23,041 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757827_17003 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757827
2025-07-20 23:35:23,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757829_17005 src: /192.168.158.1:60510 dest: /192.168.158.4:9866
2025-07-20 23:35:23,675 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60510, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-256895978_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757829_17005, duration(ns): 23025402
2025-07-20 23:35:23,675 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757829_17005, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-20 23:35:26,045 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757829_17005 replica FinalizedReplica, blk_1073757829_17005, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757829 for deletion
2025-07-20 23:35:26,047 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757829_17005 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757829
2025-07-20 23:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-20 23:36:24,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757830_17006 src: /192.168.158.1:60524 dest: /192.168.158.4:9866
2025-07-20 23:36:24,454 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60524, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1282415566_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757830_17006, duration(ns): 26805311
2025-07-20 23:36:24,454 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757830_17006, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-20 23:36:32,046 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757830_17006 replica FinalizedReplica, blk_1073757830_17006, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757830 for deletion
2025-07-20 23:36:32,047 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757830_17006 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757830
2025-07-20 23:37:23,656 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757831_17007 src: /192.168.158.5:45090 dest: /192.168.158.4:9866
2025-07-20 23:37:23,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45090, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2103768283_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757831_17007, duration(ns): 19586378
2025-07-20 23:37:23,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757831_17007, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-20 23:37:29,048 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757831_17007 replica FinalizedReplica, blk_1073757831_17007, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757831 for deletion
2025-07-20 23:37:29,049 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757831_17007 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757831
2025-07-20 23:39:23,661 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757833_17009 src: /192.168.158.6:60472 dest: /192.168.158.4:9866
2025-07-20 23:39:23,686 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1986166799_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757833_17009, duration(ns): 19282974
2025-07-20 23:39:23,686 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757833_17009, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-20 23:39:26,052 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757833_17009 replica FinalizedReplica, blk_1073757833_17009, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757833 for deletion
2025-07-20 23:39:26,053 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757833_17009 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757833
2025-07-20 23:40:23,655 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757834_17010 src: /192.168.158.1:48692 dest: /192.168.158.4:9866
2025-07-20 23:40:23,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48692, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_142178530_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757834_17010, duration(ns): 28900633
2025-07-20 23:40:23,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757834_17010, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-20 23:40:29,054 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757834_17010 replica FinalizedReplica, blk_1073757834_17010, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757834 for deletion
2025-07-20 23:40:29,055 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757834_17010 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757834
2025-07-20 23:44:28,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757838_17014 src: /192.168.158.7:38882 dest: /192.168.158.4:9866
2025-07-20 23:44:28,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38882, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1148925187_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757838_17014, duration(ns): 21496322
2025-07-20 23:44:28,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757838_17014, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-20 23:44:32,064 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757838_17014 replica FinalizedReplica, blk_1073757838_17014, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757838 for deletion
2025-07-20 23:44:32,065 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757838_17014 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757838
2025-07-20 23:46:28,676 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757840_17016 src: /192.168.158.1:42426 dest: /192.168.158.4:9866
2025-07-20 23:46:28,713 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42426, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1320664079_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757840_17016, duration(ns): 26010951
2025-07-20 23:46:28,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757840_17016, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-20 23:46:32,068 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757840_17016 replica FinalizedReplica, blk_1073757840_17016, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757840 for deletion
2025-07-20 23:46:32,070 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757840_17016 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757840
2025-07-20 23:49:33,654 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757843_17019 src: /192.168.158.1:60282 dest: /192.168.158.4:9866
2025-07-20 23:49:33,688 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60282, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1914577298_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757843_17019, duration(ns): 24721420
2025-07-20 23:49:33,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757843_17019, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-20 23:49:38,073 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757843_17019 replica FinalizedReplica, blk_1073757843_17019, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757843 for deletion
2025-07-20 23:49:38,074 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757843_17019 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757843
2025-07-20 23:53:38,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757847_17023 src: /192.168.158.9:56804 dest: /192.168.158.4:9866
2025-07-20 23:53:38,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56804, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_612231875_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757847_17023, duration(ns): 20347167
2025-07-20 23:53:38,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757847_17023, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-20 23:53:44,085 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757847_17023 replica FinalizedReplica, blk_1073757847_17023, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757847 for deletion
2025-07-20 23:53:44,086 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757847_17023 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757847
2025-07-20 23:54:38,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757848_17024 src: /192.168.158.8:37966 dest: /192.168.158.4:9866
2025-07-20 23:54:38,672 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37966, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2119330675_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757848_17024, duration(ns): 19511804
2025-07-20 23:54:38,672 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757848_17024, type=LAST_IN_PIPELINE terminating
2025-07-20 23:54:41,086 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757848_17024 replica FinalizedReplica, blk_1073757848_17024, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757848 for deletion
2025-07-20 23:54:41,087 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757848_17024 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757848
2025-07-20 23:55:38,655 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757849_17025 src: /192.168.158.7:36020 dest: /192.168.158.4:9866
2025-07-20 23:55:38,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36020, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-301302934_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757849_17025, duration(ns): 16577221
2025-07-20 23:55:38,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757849_17025, type=LAST_IN_PIPELINE terminating
2025-07-20 23:55:41,087 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757849_17025 replica FinalizedReplica, blk_1073757849_17025, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757849 for deletion
2025-07-20 23:55:41,089 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757849_17025 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757849
2025-07-20 23:56:43,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757850_17026 src: /192.168.158.8:37150 dest: /192.168.158.4:9866
2025-07-20 23:56:43,671 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37150, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1512715247_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757850_17026, duration(ns): 19185888
2025-07-20 23:56:43,671 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757850_17026, type=LAST_IN_PIPELINE terminating
2025-07-20 23:56:47,088 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757850_17026 replica FinalizedReplica, blk_1073757850_17026, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757850 for deletion
2025-07-20 23:56:47,090 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757850_17026 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757850
2025-07-20 23:58:48,713 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757852_17028 src: /192.168.158.7:55418 dest: /192.168.158.4:9866
2025-07-20 23:58:48,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55418, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_478618369_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757852_17028, duration(ns): 17674453
2025-07-20 23:58:48,733 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757852_17028, type=LAST_IN_PIPELINE terminating
2025-07-20 23:58:56,095 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757852_17028 replica FinalizedReplica, blk_1073757852_17028, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757852 for deletion
2025-07-20 23:58:56,096 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757852_17028 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757852
2025-07-20 23:59:48,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757853_17029 src: /192.168.158.7:51134 dest: /192.168.158.4:9866
2025-07-20 23:59:48,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51134, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_289868286_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757853_17029, duration(ns): 18460386
2025-07-20 23:59:48,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757853_17029, type=LAST_IN_PIPELINE terminating
2025-07-20 23:59:53,095 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757853_17029 replica FinalizedReplica, blk_1073757853_17029, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757853 for deletion
2025-07-20 23:59:53,096 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757853_17029 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757853
2025-07-21 00:02:48,676 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757856_17032 src: /192.168.158.8:44470 dest: /192.168.158.4:9866
2025-07-21 00:02:48,704 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44470, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-834768100_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757856_17032, duration(ns): 22503544
2025-07-21 00:02:48,704 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757856_17032, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-21 00:02:56,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757856_17032 replica FinalizedReplica, blk_1073757856_17032, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757856 for deletion
2025-07-21 00:02:56,103 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757856_17032 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757856
2025-07-21 00:03:53,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757857_17033 src: /192.168.158.5:42914 dest: /192.168.158.4:9866
2025-07-21 00:03:53,710 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42914, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_417920294_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757857_17033, duration(ns): 19395358
2025-07-21 00:03:53,711 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757857_17033, type=LAST_IN_PIPELINE terminating
2025-07-21 00:03:59,104 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757857_17033 replica FinalizedReplica, blk_1073757857_17033, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757857 for deletion
2025-07-21 00:03:59,105 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757857_17033 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757857
2025-07-21 00:04:53,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757858_17034 src: /192.168.158.6:49930 dest: /192.168.158.4:9866
2025-07-21 00:04:53,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49930, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2041475708_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757858_17034, duration(ns): 17383432
2025-07-21 00:04:53,703 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757858_17034, type=LAST_IN_PIPELINE terminating
2025-07-21 
00:04:59,108 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757858_17034 replica FinalizedReplica, blk_1073757858_17034, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757858 for deletion 2025-07-21 00:04:59,110 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757858_17034 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757858 2025-07-21 00:05:58,690 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757859_17035 src: /192.168.158.6:46372 dest: /192.168.158.4:9866 2025-07-21 00:05:58,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46372, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1810738903_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757859_17035, duration(ns): 21873309 2025-07-21 00:05:58,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757859_17035, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-21 00:06:02,110 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757859_17035 replica FinalizedReplica, blk_1073757859_17035, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757859 for deletion 
2025-07-21 00:06:02,111 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757859_17035 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757859 2025-07-21 00:07:58,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757861_17037 src: /192.168.158.9:34076 dest: /192.168.158.4:9866 2025-07-21 00:07:58,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34076, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_258151005_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757861_17037, duration(ns): 20159706 2025-07-21 00:07:58,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757861_17037, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-21 00:08:02,116 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757861_17037 replica FinalizedReplica, blk_1073757861_17037, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757861 for deletion 2025-07-21 00:08:02,117 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757861_17037 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757861 2025-07-21 00:09:03,673 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073757862_17038 src: /192.168.158.6:49978 dest: /192.168.158.4:9866 2025-07-21 00:09:03,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49978, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1694637173_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757862_17038, duration(ns): 22669390 2025-07-21 00:09:03,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757862_17038, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-21 00:09:08,119 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757862_17038 replica FinalizedReplica, blk_1073757862_17038, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757862 for deletion 2025-07-21 00:09:08,120 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757862_17038 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757862 2025-07-21 00:10:08,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757863_17039 src: /192.168.158.7:51858 dest: /192.168.158.4:9866 2025-07-21 00:10:08,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51858, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1953742169_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073757863_17039, duration(ns): 20525623 2025-07-21 00:10:08,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757863_17039, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-21 00:10:14,120 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757863_17039 replica FinalizedReplica, blk_1073757863_17039, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757863 for deletion 2025-07-21 00:10:14,121 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757863_17039 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757863 2025-07-21 00:14:18,681 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757867_17043 src: /192.168.158.1:58402 dest: /192.168.158.4:9866 2025-07-21 00:14:18,716 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58402, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_985274699_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757867_17043, duration(ns): 26523714 2025-07-21 00:14:18,718 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757867_17043, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-21 00:14:26,127 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757867_17043 replica FinalizedReplica, blk_1073757867_17043, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757867 for deletion 2025-07-21 00:14:26,128 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757867_17043 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757867 2025-07-21 00:17:23,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757870_17046 src: /192.168.158.8:53976 dest: /192.168.158.4:9866 2025-07-21 00:17:23,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53976, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1831795229_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757870_17046, duration(ns): 19808436 2025-07-21 00:17:23,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757870_17046, type=LAST_IN_PIPELINE terminating 2025-07-21 00:17:26,134 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757870_17046 replica FinalizedReplica, blk_1073757870_17046, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757870 for deletion 2025-07-21 00:17:26,135 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757870_17046 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757870 2025-07-21 00:21:23,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757874_17050 src: /192.168.158.5:53398 dest: /192.168.158.4:9866 2025-07-21 00:21:23,710 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53398, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1414659806_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757874_17050, duration(ns): 16156205 2025-07-21 00:21:23,710 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757874_17050, type=LAST_IN_PIPELINE terminating 2025-07-21 00:21:29,138 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757874_17050 replica FinalizedReplica, blk_1073757874_17050, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757874 for deletion 2025-07-21 00:21:29,139 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757874_17050 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757874 2025-07-21 00:22:23,690 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757875_17051 src: /192.168.158.9:49716 dest: /192.168.158.4:9866 
2025-07-21 00:22:23,718 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49716, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1755413173_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757875_17051, duration(ns): 22773629 2025-07-21 00:22:23,719 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757875_17051, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-21 00:22:26,137 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757875_17051 replica FinalizedReplica, blk_1073757875_17051, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757875 for deletion 2025-07-21 00:22:26,139 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757875_17051 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757875 2025-07-21 00:24:23,688 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757877_17053 src: /192.168.158.1:44044 dest: /192.168.158.4:9866 2025-07-21 00:24:23,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44044, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2104502963_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757877_17053, duration(ns): 26898239 2025-07-21 00:24:23,725 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757877_17053, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-21 00:24:29,140 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757877_17053 replica FinalizedReplica, blk_1073757877_17053, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757877 for deletion 2025-07-21 00:24:29,141 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757877_17053 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757877 2025-07-21 00:25:23,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757878_17054 src: /192.168.158.8:41944 dest: /192.168.158.4:9866 2025-07-21 00:25:23,716 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41944, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-473493559_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757878_17054, duration(ns): 18035402 2025-07-21 00:25:23,716 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757878_17054, type=LAST_IN_PIPELINE terminating 2025-07-21 00:25:26,141 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757878_17054 replica FinalizedReplica, blk_1073757878_17054, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757878 for deletion 2025-07-21 00:25:26,142 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757878_17054 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757878 2025-07-21 00:27:23,700 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757880_17056 src: /192.168.158.9:60040 dest: /192.168.158.4:9866 2025-07-21 00:27:23,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60040, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1011370400_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757880_17056, duration(ns): 20838508 2025-07-21 00:27:23,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757880_17056, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-21 00:27:26,144 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757880_17056 replica FinalizedReplica, blk_1073757880_17056, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757880 for deletion 2025-07-21 00:27:26,145 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757880_17056 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757880 2025-07-21 00:28:23,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757881_17057 src: /192.168.158.9:36654 dest: /192.168.158.4:9866 2025-07-21 00:28:23,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36654, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1419921284_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757881_17057, duration(ns): 16787419 2025-07-21 00:28:23,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757881_17057, type=LAST_IN_PIPELINE terminating 2025-07-21 00:28:26,149 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757881_17057 replica FinalizedReplica, blk_1073757881_17057, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757881 for deletion 2025-07-21 00:28:26,150 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757881_17057 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757881 2025-07-21 00:32:23,705 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757885_17061 src: /192.168.158.1:60796 dest: /192.168.158.4:9866 2025-07-21 00:32:23,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60796, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_756734355_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757885_17061, duration(ns): 26499395 2025-07-21 00:32:23,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757885_17061, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-21 00:32:29,153 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757885_17061 replica FinalizedReplica, blk_1073757885_17061, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757885 for deletion 2025-07-21 00:32:29,154 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757885_17061 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757885 2025-07-21 00:34:28,716 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757887_17063 src: /192.168.158.1:56754 dest: /192.168.158.4:9866 2025-07-21 00:34:28,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56754, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1517910389_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757887_17063, duration(ns): 23551484 2025-07-21 00:34:28,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757887_17063, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-21 00:34:32,160 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757887_17063 replica FinalizedReplica, blk_1073757887_17063, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757887 for deletion 2025-07-21 00:34:32,161 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757887_17063 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757887 2025-07-21 00:35:28,728 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757888_17064 src: /192.168.158.1:39888 dest: /192.168.158.4:9866 2025-07-21 00:35:28,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39888, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1779973773_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757888_17064, duration(ns): 25339078 2025-07-21 00:35:28,763 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757888_17064, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-21 00:35:32,163 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757888_17064 replica FinalizedReplica, blk_1073757888_17064, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757888 for deletion 2025-07-21 00:35:32,164 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757888_17064 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757888 2025-07-21 00:36:28,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757889_17065 src: /192.168.158.7:41822 dest: /192.168.158.4:9866 2025-07-21 00:36:28,749 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41822, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-195903409_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757889_17065, duration(ns): 22022410 2025-07-21 00:36:28,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757889_17065, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-21 00:36:32,164 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757889_17065 replica FinalizedReplica, blk_1073757889_17065, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757889 for deletion 2025-07-21 00:36:32,166 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757889_17065 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757889
2025-07-21 00:37:28,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757890_17066 src: /192.168.158.6:33238 dest: /192.168.158.4:9866
2025-07-21 00:37:28,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33238, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_829791881_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757890_17066, duration(ns): 23009669
2025-07-21 00:37:28,759 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757890_17066, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-21 00:37:35,165 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757890_17066 replica FinalizedReplica, blk_1073757890_17066, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757890 for deletion
2025-07-21 00:37:35,166 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757890_17066 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757890
2025-07-21 00:39:28,728 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757892_17068 src: /192.168.158.5:40000 dest: /192.168.158.4:9866
2025-07-21 00:39:28,756 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40000, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1174118786_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757892_17068, duration(ns): 22965073
2025-07-21 00:39:28,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757892_17068, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-21 00:39:35,169 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757892_17068 replica FinalizedReplica, blk_1073757892_17068, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757892 for deletion
2025-07-21 00:39:35,171 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757892_17068 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757892
2025-07-21 00:40:28,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757893_17069 src: /192.168.158.7:59128 dest: /192.168.158.4:9866
2025-07-21 00:40:28,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59128, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1713093053_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757893_17069, duration(ns): 17730511
2025-07-21 00:40:28,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757893_17069, type=LAST_IN_PIPELINE terminating
2025-07-21 00:40:35,171 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757893_17069 replica FinalizedReplica, blk_1073757893_17069, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757893 for deletion
2025-07-21 00:40:35,172 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757893_17069 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757893
2025-07-21 00:42:28,734 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757895_17071 src: /192.168.158.8:56370 dest: /192.168.158.4:9866
2025-07-21 00:42:28,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56370, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_989978854_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757895_17071, duration(ns): 16480224
2025-07-21 00:42:28,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757895_17071, type=LAST_IN_PIPELINE terminating
2025-07-21 00:42:35,173 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757895_17071 replica FinalizedReplica, blk_1073757895_17071, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757895 for deletion
2025-07-21 00:42:35,174 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757895_17071 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757895
2025-07-21 00:44:38,739 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757897_17073 src: /192.168.158.7:59792 dest: /192.168.158.4:9866
2025-07-21 00:44:38,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59792, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_110119243_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757897_17073, duration(ns): 16936359
2025-07-21 00:44:38,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757897_17073, type=LAST_IN_PIPELINE terminating
2025-07-21 00:44:41,178 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757897_17073 replica FinalizedReplica, blk_1073757897_17073, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757897 for deletion
2025-07-21 00:44:41,179 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757897_17073 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757897
2025-07-21 00:46:48,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757899_17075 src: /192.168.158.7:49468 dest: /192.168.158.4:9866
2025-07-21 00:46:48,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49468, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1019051331_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757899_17075, duration(ns): 16633577
2025-07-21 00:46:48,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757899_17075, type=LAST_IN_PIPELINE terminating
2025-07-21 00:46:56,181 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757899_17075 replica FinalizedReplica, blk_1073757899_17075, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757899 for deletion
2025-07-21 00:46:56,182 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757899_17075 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757899
2025-07-21 00:47:48,746 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757900_17076 src: /192.168.158.1:38742 dest: /192.168.158.4:9866
2025-07-21 00:47:48,780 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38742, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_145072845_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757900_17076, duration(ns): 24677284
2025-07-21 00:47:48,780 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757900_17076, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-21 00:47:56,184 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757900_17076 replica FinalizedReplica, blk_1073757900_17076, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757900 for deletion
2025-07-21 00:47:56,185 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757900_17076 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757900
2025-07-21 00:48:48,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757901_17077 src: /192.168.158.6:41628 dest: /192.168.158.4:9866
2025-07-21 00:48:48,759 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41628, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-797363811_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757901_17077, duration(ns): 17045286
2025-07-21 00:48:48,759 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757901_17077, type=LAST_IN_PIPELINE terminating
2025-07-21 00:48:53,185 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757901_17077 replica FinalizedReplica, blk_1073757901_17077, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757901 for deletion
2025-07-21 00:48:53,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757901_17077 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757901
2025-07-21 00:49:48,769 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757902_17078 src: /192.168.158.8:60990 dest: /192.168.158.4:9866
2025-07-21 00:49:48,790 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60990, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-270082894_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757902_17078, duration(ns): 18717139
2025-07-21 00:49:48,790 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757902_17078, type=LAST_IN_PIPELINE terminating
2025-07-21 00:49:56,187 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757902_17078 replica FinalizedReplica, blk_1073757902_17078, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757902 for deletion
2025-07-21 00:49:56,188 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757902_17078 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757902
2025-07-21 00:53:48,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757906_17082 src: /192.168.158.6:57526 dest: /192.168.158.4:9866
2025-07-21 00:53:48,772 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57526, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2072959433_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757906_17082, duration(ns): 18557455
2025-07-21 00:53:48,772 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757906_17082, type=LAST_IN_PIPELINE terminating
2025-07-21 00:53:53,197 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757906_17082 replica FinalizedReplica, blk_1073757906_17082, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757906 for deletion
2025-07-21 00:53:53,198 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757906_17082 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757906
2025-07-21 00:54:53,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757907_17083 src: /192.168.158.5:47494 dest: /192.168.158.4:9866
2025-07-21 00:54:53,781 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47494, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1275779209_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757907_17083, duration(ns): 21699844
2025-07-21 00:54:53,781 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757907_17083, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 00:54:56,198 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757907_17083 replica FinalizedReplica, blk_1073757907_17083, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757907 for deletion
2025-07-21 00:54:56,199 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757907_17083 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757907
2025-07-21 00:59:03,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757911_17087 src: /192.168.158.5:54824 dest: /192.168.158.4:9866
2025-07-21 00:59:03,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54824, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1905242985_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757911_17087, duration(ns): 16925039
2025-07-21 00:59:03,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757911_17087, type=LAST_IN_PIPELINE terminating
2025-07-21 00:59:08,207 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757911_17087 replica FinalizedReplica, blk_1073757911_17087, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757911 for deletion
2025-07-21 00:59:08,208 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757911_17087 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757911
2025-07-21 01:01:13,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757913_17089 src: /192.168.158.8:50768 dest: /192.168.158.4:9866
2025-07-21 01:01:13,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50768, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2141931239_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757913_17089, duration(ns): 21210678
2025-07-21 01:01:13,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757913_17089, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 01:01:20,208 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757913_17089 replica FinalizedReplica, blk_1073757913_17089, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757913 for deletion
2025-07-21 01:01:20,209 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757913_17089 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757913
2025-07-21 01:02:18,744 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757914_17090 src: /192.168.158.1:58344 dest: /192.168.158.4:9866
2025-07-21 01:02:18,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58344, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-839067701_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757914_17090, duration(ns): 23445868
2025-07-21 01:02:18,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757914_17090, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-21 01:02:26,209 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757914_17090 replica FinalizedReplica, blk_1073757914_17090, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757914 for deletion
2025-07-21 01:02:26,210 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757914_17090 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757914
2025-07-21 01:04:18,755 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757916_17092 src: /192.168.158.9:45518 dest: /192.168.158.4:9866
2025-07-21 01:04:18,780 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45518, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1762251875_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757916_17092, duration(ns): 20018640
2025-07-21 01:04:18,781 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757916_17092, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 01:04:23,212 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757916_17092 replica FinalizedReplica, blk_1073757916_17092, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757916 for deletion
2025-07-21 01:04:23,213 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757916_17092 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757916
2025-07-21 01:07:28,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757919_17095 src: /192.168.158.8:52562 dest: /192.168.158.4:9866
2025-07-21 01:07:28,784 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52562, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-681011360_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757919_17095, duration(ns): 15091298
2025-07-21 01:07:28,784 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757919_17095, type=LAST_IN_PIPELINE terminating
2025-07-21 01:07:32,216 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757919_17095 replica FinalizedReplica, blk_1073757919_17095, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757919 for deletion
2025-07-21 01:07:32,217 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757919_17095 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757919
2025-07-21 01:10:33,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757922_17098 src: /192.168.158.9:39174 dest: /192.168.158.4:9866
2025-07-21 01:10:33,824 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39174, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1464887691_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757922_17098, duration(ns): 20915647
2025-07-21 01:10:33,824 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757922_17098, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 01:10:38,223 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757922_17098 replica FinalizedReplica, blk_1073757922_17098, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757922 for deletion
2025-07-21 01:10:38,225 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757922_17098 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757922
2025-07-21 01:11:33,765 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757923_17099 src: /192.168.158.9:57472 dest: /192.168.158.4:9866
2025-07-21 01:11:33,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1792938514_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757923_17099, duration(ns): 22935585
2025-07-21 01:11:33,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757923_17099, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-21 01:11:38,224 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757923_17099 replica FinalizedReplica, blk_1073757923_17099, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757923 for deletion
2025-07-21 01:11:38,225 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757923_17099 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757923
2025-07-21 01:18:43,770 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757930_17106 src: /192.168.158.5:47128 dest: /192.168.158.4:9866
2025-07-21 01:18:43,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47128, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1079900772_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757930_17106, duration(ns): 21840932
2025-07-21 01:18:43,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757930_17106, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-21 01:18:50,233 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757930_17106 replica FinalizedReplica, blk_1073757930_17106, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757930 for deletion
2025-07-21 01:18:50,234 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757930_17106 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757930
2025-07-21 01:19:43,772 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757931_17107 src: /192.168.158.9:60160 dest: /192.168.158.4:9866
2025-07-21 01:19:43,793 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60160, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1796896311_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757931_17107, duration(ns): 19030601
2025-07-21 01:19:43,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757931_17107, type=LAST_IN_PIPELINE terminating
2025-07-21 01:19:47,233 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757931_17107 replica FinalizedReplica, blk_1073757931_17107, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757931 for deletion
2025-07-21 01:19:47,234 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757931_17107 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757931
2025-07-21 01:25:48,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757937_17113 src: /192.168.158.1:45146 dest: /192.168.158.4:9866
2025-07-21 01:25:48,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45146, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_466584943_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757937_17113, duration(ns): 26182607
2025-07-21 01:25:48,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757937_17113, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-21 01:25:56,250 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757937_17113 replica FinalizedReplica, blk_1073757937_17113, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757937 for deletion
2025-07-21 01:25:56,251 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757937_17113 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757937
2025-07-21 01:26:53,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757938_17114 src: /192.168.158.1:53504 dest: /192.168.158.4:9866
2025-07-21 01:26:53,812 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53504, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1605584219_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757938_17114, duration(ns): 25950299
2025-07-21 01:26:53,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757938_17114, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-21 01:26:56,251 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757938_17114 replica FinalizedReplica, blk_1073757938_17114, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757938 for deletion
2025-07-21 01:26:56,253 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757938_17114 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757938
2025-07-21 01:31:58,784 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757943_17119 src: /192.168.158.1:56338 dest: /192.168.158.4:9866
2025-07-21 01:31:58,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56338, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1023475650_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757943_17119, duration(ns): 24906619
2025-07-21 01:31:58,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757943_17119, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-21 01:32:05,262 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757943_17119 replica FinalizedReplica, blk_1073757943_17119, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757943 for deletion
2025-07-21 01:32:05,263 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757943_17119 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757943
2025-07-21 01:32:58,807 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757944_17120 src: /192.168.158.5:40296 dest: /192.168.158.4:9866
2025-07-21 01:32:58,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40296, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-683987684_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757944_17120, duration(ns): 19846910
2025-07-21 01:32:58,833 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757944_17120, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-21 01:33:02,265 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757944_17120 replica FinalizedReplica, blk_1073757944_17120, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757944 for deletion
2025-07-21 01:33:02,266 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757944_17120 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757944
2025-07-21 01:33:58,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757945_17121 src: /192.168.158.8:38288 dest: /192.168.158.4:9866
2025-07-21 01:33:58,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38288, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_421958073_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757945_17121, duration(ns): 19278756
2025-07-21 01:33:58,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757945_17121, type=LAST_IN_PIPELINE terminating
2025-07-21 01:34:02,266 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757945_17121 replica FinalizedReplica, blk_1073757945_17121, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757945 for deletion
2025-07-21 01:34:02,268 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757945_17121 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757945
2025-07-21 01:34:58,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757946_17122 src: /192.168.158.1:33940 dest: /192.168.158.4:9866
2025-07-21 01:34:58,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33940, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-283757607_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757946_17122, duration(ns): 24496268
2025-07-21 01:34:58,823 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757946_17122, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-21 01:35:05,267 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757946_17122 replica FinalizedReplica, blk_1073757946_17122, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757946 for deletion
2025-07-21 01:35:05,268 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757946_17122 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757946
2025-07-21 01:37:03,801 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757948_17124 src: /192.168.158.8:51620 dest: /192.168.158.4:9866
2025-07-21 01:37:03,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51620, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1981754237_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757948_17124, duration(ns): 20844482
2025-07-21 01:37:03,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757948_17124, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-21 01:37:08,270 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757948_17124 replica FinalizedReplica, blk_1073757948_17124, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757948 for deletion 2025-07-21 01:37:08,271 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757948_17124 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757948 2025-07-21 01:39:03,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757950_17126 src: /192.168.158.1:54386 dest: /192.168.158.4:9866 2025-07-21 01:39:03,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54386, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-137153396_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757950_17126, duration(ns): 24679605 2025-07-21 01:39:03,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757950_17126, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-21 01:39:08,276 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757950_17126 replica FinalizedReplica, blk_1073757950_17126, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757950 for deletion 2025-07-21 01:39:08,277 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757950_17126 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757950 2025-07-21 01:40:08,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757951_17127 src: /192.168.158.9:41882 dest: /192.168.158.4:9866 2025-07-21 01:40:08,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41882, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-636393849_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757951_17127, duration(ns): 20207874 2025-07-21 01:40:08,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757951_17127, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-21 01:40:14,277 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757951_17127 replica FinalizedReplica, blk_1073757951_17127, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757951 for deletion 2025-07-21 01:40:14,278 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757951_17127 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir30/blk_1073757951 2025-07-21 01:41:08,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757952_17128 src: /192.168.158.8:59900 dest: /192.168.158.4:9866 2025-07-21 01:41:08,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.8:59900, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-668254982_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757952_17128, duration(ns): 18339084 2025-07-21 01:41:08,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757952_17128, type=LAST_IN_PIPELINE terminating 2025-07-21 01:41:11,277 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757952_17128 replica FinalizedReplica, blk_1073757952_17128, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757952 for deletion 2025-07-21 01:41:11,278 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757952_17128 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757952 2025-07-21 01:43:08,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757954_17130 src: /192.168.158.1:52658 dest: /192.168.158.4:9866 2025-07-21 01:43:08,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52658, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1823696296_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757954_17130, duration(ns): 22831502 2025-07-21 01:43:08,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757954_17130, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-21 01:43:14,280 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757954_17130 replica FinalizedReplica, blk_1073757954_17130, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757954 for deletion 2025-07-21 01:43:14,281 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757954_17130 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757954 2025-07-21 01:48:18,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757959_17135 src: /192.168.158.6:55380 dest: /192.168.158.4:9866 2025-07-21 01:48:18,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55380, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1221483449_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757959_17135, duration(ns): 23710622 2025-07-21 01:48:18,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757959_17135, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-21 01:48:26,286 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757959_17135 replica FinalizedReplica, blk_1073757959_17135, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757959 for deletion 2025-07-21 01:48:26,288 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757959_17135 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757959 2025-07-21 01:51:23,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757962_17138 src: /192.168.158.8:34778 dest: /192.168.158.4:9866 2025-07-21 01:51:23,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34778, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_175107495_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757962_17138, duration(ns): 22016982 2025-07-21 01:51:23,843 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757962_17138, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-21 01:51:26,294 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757962_17138 replica FinalizedReplica, blk_1073757962_17138, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757962 for deletion 2025-07-21 01:51:26,295 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757962_17138 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757962 
2025-07-21 01:52:28,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757963_17139 src: /192.168.158.1:56082 dest: /192.168.158.4:9866
2025-07-21 01:52:28,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56082, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_538303014_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757963_17139, duration(ns): 24995912
2025-07-21 01:52:28,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757963_17139, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-21 01:52:32,297 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757963_17139 replica FinalizedReplica, blk_1073757963_17139, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757963 for deletion
2025-07-21 01:52:32,298 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757963_17139 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757963
2025-07-21 01:54:28,814 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757965_17141 src: /192.168.158.6:57478 dest: /192.168.158.4:9866
2025-07-21 01:54:28,841 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57478, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1742726970_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757965_17141, duration(ns): 21166749
2025-07-21 01:54:28,841 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757965_17141, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 01:54:32,301 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757965_17141 replica FinalizedReplica, blk_1073757965_17141, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757965 for deletion
2025-07-21 01:54:32,302 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757965_17141 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757965
2025-07-21 01:55:28,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757966_17142 src: /192.168.158.1:56266 dest: /192.168.158.4:9866
2025-07-21 01:55:28,841 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56266, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1714767341_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757966_17142, duration(ns): 23112899
2025-07-21 01:55:28,841 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757966_17142, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-21 01:55:32,302 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757966_17142 replica FinalizedReplica, blk_1073757966_17142, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757966 for deletion
2025-07-21 01:55:32,303 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757966_17142 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757966
2025-07-21 01:59:33,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757970_17146 src: /192.168.158.1:53980 dest: /192.168.158.4:9866
2025-07-21 01:59:33,847 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53980, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1658694594_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757970_17146, duration(ns): 29078341
2025-07-21 01:59:33,847 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757970_17146, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-21 01:59:35,312 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757970_17146 replica FinalizedReplica, blk_1073757970_17146, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757970 for deletion
2025-07-21 01:59:35,313 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757970_17146 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757970
2025-07-21 02:02:38,820 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757973_17149 src: /192.168.158.1:32908 dest: /192.168.158.4:9866
2025-07-21 02:02:38,856 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:32908, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2019852043_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757973_17149, duration(ns): 26771910
2025-07-21 02:02:38,856 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757973_17149, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-21 02:02:41,315 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757973_17149 replica FinalizedReplica, blk_1073757973_17149, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757973 for deletion
2025-07-21 02:02:41,316 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757973_17149 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757973
2025-07-21 02:06:38,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757977_17153 src: /192.168.158.1:53746 dest: /192.168.158.4:9866
2025-07-21 02:06:38,852 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53746, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_52313226_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757977_17153, duration(ns): 24322425
2025-07-21 02:06:38,852 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757977_17153, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-21 02:06:41,325 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757977_17153 replica FinalizedReplica, blk_1073757977_17153, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757977 for deletion
2025-07-21 02:06:41,327 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757977_17153 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757977
2025-07-21 02:10:43,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757981_17157 src: /192.168.158.6:57582 dest: /192.168.158.4:9866
2025-07-21 02:10:43,856 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57582, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2036912785_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757981_17157, duration(ns): 19763805
2025-07-21 02:10:43,857 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757981_17157, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-21 02:10:47,336 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757981_17157 replica FinalizedReplica, blk_1073757981_17157, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757981 for deletion
2025-07-21 02:10:47,337 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757981_17157 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757981
2025-07-21 02:12:43,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757983_17159 src: /192.168.158.6:43698 dest: /192.168.158.4:9866
2025-07-21 02:12:43,863 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43698, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_464319910_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757983_17159, duration(ns): 16153612
2025-07-21 02:12:43,863 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757983_17159, type=LAST_IN_PIPELINE terminating
2025-07-21 02:12:50,343 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757983_17159 replica FinalizedReplica, blk_1073757983_17159, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757983 for deletion
2025-07-21 02:12:50,344 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757983_17159 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757983
2025-07-21 02:15:43,846 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757986_17162 src: /192.168.158.6:34704 dest: /192.168.158.4:9866
2025-07-21 02:15:43,867 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34704, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_726559866_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757986_17162, duration(ns): 18189406
2025-07-21 02:15:43,867 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757986_17162, type=LAST_IN_PIPELINE terminating
2025-07-21 02:15:47,351 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757986_17162 replica FinalizedReplica, blk_1073757986_17162, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757986 for deletion
2025-07-21 02:15:47,352 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757986_17162 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757986
2025-07-21 02:17:48,845 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757988_17164 src: /192.168.158.7:51468 dest: /192.168.158.4:9866
2025-07-21 02:17:48,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51468, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-957481141_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757988_17164, duration(ns): 23483908
2025-07-21 02:17:48,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757988_17164, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 02:17:53,354 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757988_17164 replica FinalizedReplica, blk_1073757988_17164, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757988 for deletion
2025-07-21 02:17:53,356 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757988_17164 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757988
2025-07-21 02:20:48,846 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757991_17167 src: /192.168.158.7:51058 dest: /192.168.158.4:9866
2025-07-21 02:20:48,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51058, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1984190720_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757991_17167, duration(ns): 19845111
2025-07-21 02:20:48,871 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757991_17167, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-21 02:20:53,368 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757991_17167 replica FinalizedReplica, blk_1073757991_17167, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757991 for deletion
2025-07-21 02:20:53,369 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757991_17167 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757991
2025-07-21 02:23:53,851 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757994_17170 src: /192.168.158.1:50514 dest: /192.168.158.4:9866
2025-07-21 02:23:53,885 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50514, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2106563691_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757994_17170, duration(ns): 24323201
2025-07-21 02:23:53,885 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757994_17170, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-21 02:23:59,375 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757994_17170 replica FinalizedReplica, blk_1073757994_17170, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757994 for deletion
2025-07-21 02:23:59,376 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757994_17170 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757994
2025-07-21 02:24:53,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757995_17171 src: /192.168.158.8:34864 dest: /192.168.158.4:9866
2025-07-21 02:24:53,879 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34864, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1546715419_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757995_17171, duration(ns): 20374015
2025-07-21 02:24:53,879 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757995_17171, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 02:24:56,379 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757995_17171 replica FinalizedReplica, blk_1073757995_17171, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757995 for deletion
2025-07-21 02:24:56,380 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757995_17171 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757995
2025-07-21 02:26:53,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073757997_17173 src: /192.168.158.5:36930 dest: /192.168.158.4:9866
2025-07-21 02:26:53,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36930, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-59418114_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073757997_17173, duration(ns): 21291558
2025-07-21 02:26:53,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073757997_17173, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 02:26:59,383 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073757997_17173 replica FinalizedReplica, blk_1073757997_17173, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757997 for deletion
2025-07-21 02:26:59,384
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073757997_17173 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073757997 2025-07-21 02:31:53,878 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758002_17178 src: /192.168.158.6:43884 dest: /192.168.158.4:9866 2025-07-21 02:31:53,897 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43884, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_339583282_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758002_17178, duration(ns): 17169546 2025-07-21 02:31:53,897 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758002_17178, type=LAST_IN_PIPELINE terminating 2025-07-21 02:31:56,398 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758002_17178 replica FinalizedReplica, blk_1073758002_17178, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758002 for deletion 2025-07-21 02:31:56,399 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758002_17178 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758002 2025-07-21 02:34:53,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758005_17181 src: /192.168.158.1:58476 dest: /192.168.158.4:9866 
2025-07-21 02:34:53,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58476, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_782164503_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758005_17181, duration(ns): 25489355 2025-07-21 02:34:53,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758005_17181, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-21 02:34:56,406 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758005_17181 replica FinalizedReplica, blk_1073758005_17181, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758005 for deletion 2025-07-21 02:34:56,408 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758005_17181 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758005 2025-07-21 02:35:53,874 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758006_17182 src: /192.168.158.5:48546 dest: /192.168.158.4:9866 2025-07-21 02:35:53,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48546, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2111009521_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758006_17182, duration(ns): 19911570 2025-07-21 02:35:53,900 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758006_17182, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-21 02:35:56,408 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758006_17182 replica FinalizedReplica, blk_1073758006_17182, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758006 for deletion 2025-07-21 02:35:56,410 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758006_17182 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758006 2025-07-21 02:36:53,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758007_17183 src: /192.168.158.1:45766 dest: /192.168.158.4:9866 2025-07-21 02:36:53,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45766, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1369488693_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758007_17183, duration(ns): 23723358 2025-07-21 02:36:53,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758007_17183, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-21 02:36:56,413 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758007_17183 replica FinalizedReplica, blk_1073758007_17183, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758007 for deletion 2025-07-21 02:36:56,415 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758007_17183 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758007 2025-07-21 02:37:53,883 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758008_17184 src: /192.168.158.6:58088 dest: /192.168.158.4:9866 2025-07-21 02:37:53,911 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58088, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1026614376_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758008_17184, duration(ns): 22006062 2025-07-21 02:37:53,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758008_17184, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-21 02:37:56,416 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758008_17184 replica FinalizedReplica, blk_1073758008_17184, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758008 for deletion 2025-07-21 02:37:56,417 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758008_17184 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758008 2025-07-21 02:38:53,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758009_17185 src: /192.168.158.8:46172 dest: /192.168.158.4:9866 2025-07-21 02:38:53,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46172, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2125468308_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758009_17185, duration(ns): 21218512 2025-07-21 02:38:53,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758009_17185, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-21 02:38:56,421 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758009_17185 replica FinalizedReplica, blk_1073758009_17185, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758009 for deletion 2025-07-21 02:38:56,422 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758009_17185 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758009 2025-07-21 02:40:53,879 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758011_17187 src: /192.168.158.5:51938 dest: /192.168.158.4:9866 2025-07-21 02:40:53,906 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.5:51938, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2126008770_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758011_17187, duration(ns): 20981723 2025-07-21 02:40:53,906 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758011_17187, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-21 02:40:59,425 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758011_17187 replica FinalizedReplica, blk_1073758011_17187, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758011 for deletion 2025-07-21 02:40:59,426 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758011_17187 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758011 2025-07-21 02:42:58,882 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758013_17189 src: /192.168.158.9:33572 dest: /192.168.158.4:9866 2025-07-21 02:42:58,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33572, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1613769349_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758013_17189, duration(ns): 18139461 2025-07-21 02:42:58,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758013_17189, 
type=LAST_IN_PIPELINE terminating 2025-07-21 02:43:02,431 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758013_17189 replica FinalizedReplica, blk_1073758013_17189, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758013 for deletion 2025-07-21 02:43:02,432 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758013_17189 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758013 2025-07-21 02:44:58,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758015_17191 src: /192.168.158.6:49934 dest: /192.168.158.4:9866 2025-07-21 02:44:58,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49934, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1921873672_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758015_17191, duration(ns): 20596010 2025-07-21 02:44:58,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758015_17191, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-21 02:45:02,436 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758015_17191 replica FinalizedReplica, blk_1073758015_17191, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758015 for deletion 2025-07-21 02:45:02,437 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758015_17191 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758015 2025-07-21 02:47:03,887 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758017_17193 src: /192.168.158.8:35974 dest: /192.168.158.4:9866 2025-07-21 02:47:03,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35974, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1484882481_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758017_17193, duration(ns): 24244710 2025-07-21 02:47:03,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758017_17193, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-21 02:47:05,438 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758017_17193 replica FinalizedReplica, blk_1073758017_17193, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758017 for deletion 2025-07-21 02:47:05,439 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758017_17193 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758017 2025-07-21 02:48:03,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758018_17194 src: /192.168.158.5:58544 dest: /192.168.158.4:9866 2025-07-21 02:48:03,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58544, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_179342073_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758018_17194, duration(ns): 18021836 2025-07-21 02:48:03,913 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758018_17194, type=LAST_IN_PIPELINE terminating 2025-07-21 02:48:05,441 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758018_17194 replica FinalizedReplica, blk_1073758018_17194, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758018 for deletion 2025-07-21 02:48:05,442 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758018_17194 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758018 2025-07-21 02:49:03,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758019_17195 src: /192.168.158.7:55386 dest: /192.168.158.4:9866 2025-07-21 02:49:03,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55386, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1099040923_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758019_17195, duration(ns): 16873659 2025-07-21 02:49:03,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758019_17195, type=LAST_IN_PIPELINE terminating 2025-07-21 02:49:05,441 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758019_17195 replica FinalizedReplica, blk_1073758019_17195, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758019 for deletion 2025-07-21 02:49:05,442 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758019_17195 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758019 2025-07-21 02:52:13,897 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758022_17198 src: /192.168.158.5:46474 dest: /192.168.158.4:9866 2025-07-21 02:52:13,923 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46474, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-749063365_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758022_17198, duration(ns): 20851463 2025-07-21 02:52:13,923 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758022_17198, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-21 02:52:17,447 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758022_17198 replica FinalizedReplica, blk_1073758022_17198, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758022 for deletion 2025-07-21 02:52:17,448 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758022_17198 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758022 2025-07-21 02:53:13,894 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758023_17199 src: /192.168.158.8:57208 dest: /192.168.158.4:9866 2025-07-21 02:53:13,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57208, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_389226_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758023_17199, duration(ns): 18753318 2025-07-21 02:53:13,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758023_17199, type=LAST_IN_PIPELINE terminating 2025-07-21 02:53:20,449 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758023_17199 replica FinalizedReplica, blk_1073758023_17199, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758023 for deletion 2025-07-21 02:53:20,450 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758023_17199 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758023 2025-07-21 02:56:23,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758026_17202 src: /192.168.158.1:37544 dest: /192.168.158.4:9866 2025-07-21 02:56:23,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37544, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2124111862_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758026_17202, duration(ns): 27565639 2025-07-21 02:56:23,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758026_17202, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-21 02:56:29,459 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758026_17202 replica FinalizedReplica, blk_1073758026_17202, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758026 for deletion 2025-07-21 02:56:29,460 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758026_17202 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758026 2025-07-21 02:57:23,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073758027_17203 src: /192.168.158.1:38074 dest: /192.168.158.4:9866 2025-07-21 02:57:23,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38074, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1568636298_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758027_17203, duration(ns): 24084334 2025-07-21 02:57:23,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758027_17203, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-21 02:57:26,462 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758027_17203 replica FinalizedReplica, blk_1073758027_17203, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758027 for deletion 2025-07-21 02:57:26,463 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758027_17203 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758027 2025-07-21 02:58:23,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758028_17204 src: /192.168.158.1:58306 dest: /192.168.158.4:9866 2025-07-21 02:58:23,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58306, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-911225533_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073758028_17204, duration(ns): 25422463 2025-07-21 02:58:23,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758028_17204, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-21 02:58:29,464 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758028_17204 replica FinalizedReplica, blk_1073758028_17204, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758028 for deletion 2025-07-21 02:58:29,465 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758028_17204 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758028 2025-07-21 02:59:23,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758029_17205 src: /192.168.158.6:37534 dest: /192.168.158.4:9866 2025-07-21 02:59:23,928 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37534, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-981252544_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758029_17205, duration(ns): 22242599 2025-07-21 02:59:23,928 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758029_17205, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-21 02:59:29,467 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758029_17205 replica FinalizedReplica, blk_1073758029_17205, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758029 for deletion
2025-07-21 02:59:29,468 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758029_17205 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758029
2025-07-21 03:01:23,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758031_17207 src: /192.168.158.1:38612 dest: /192.168.158.4:9866
2025-07-21 03:01:23,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38612, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_25192408_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758031_17207, duration(ns): 23985045
2025-07-21 03:01:23,939 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758031_17207, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-21 03:01:26,471 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758031_17207 replica FinalizedReplica, blk_1073758031_17207, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758031 for deletion
2025-07-21 03:01:26,473 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758031_17207 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758031
2025-07-21 03:03:28,904 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758033_17209 src: /192.168.158.8:37344 dest: /192.168.158.4:9866
2025-07-21 03:03:28,923 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37344, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-489955845_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758033_17209, duration(ns): 17268820
2025-07-21 03:03:28,924 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758033_17209, type=LAST_IN_PIPELINE terminating
2025-07-21 03:03:32,477 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758033_17209 replica FinalizedReplica, blk_1073758033_17209, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758033 for deletion
2025-07-21 03:03:32,478 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758033_17209 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758033
2025-07-21 03:04:28,906 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758034_17210 src: /192.168.158.7:44582 dest: /192.168.158.4:9866
2025-07-21 03:04:28,924 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44582, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-17075732_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758034_17210, duration(ns): 16326249
2025-07-21 03:04:28,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758034_17210, type=LAST_IN_PIPELINE terminating
2025-07-21 03:04:35,480 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758034_17210 replica FinalizedReplica, blk_1073758034_17210, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758034 for deletion
2025-07-21 03:04:35,481 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758034_17210 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758034
2025-07-21 03:05:28,905 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758035_17211 src: /192.168.158.1:41382 dest: /192.168.158.4:9866
2025-07-21 03:05:28,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41382, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_68061209_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758035_17211, duration(ns): 22819858
2025-07-21 03:05:28,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758035_17211, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-21 03:05:32,482 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758035_17211 replica FinalizedReplica, blk_1073758035_17211, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758035 for deletion
2025-07-21 03:05:32,483 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758035_17211 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758035
2025-07-21 03:06:28,910 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758036_17212 src: /192.168.158.9:43456 dest: /192.168.158.4:9866
2025-07-21 03:06:28,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43456, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_494408039_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758036_17212, duration(ns): 21340785
2025-07-21 03:06:28,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758036_17212, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-21 03:06:32,485 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758036_17212 replica FinalizedReplica, blk_1073758036_17212, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758036 for deletion
2025-07-21 03:06:32,487 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758036_17212 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758036
2025-07-21 03:07:33,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758037_17213 src: /192.168.158.5:57672 dest: /192.168.158.4:9866
2025-07-21 03:07:33,939 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-412425627_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758037_17213, duration(ns): 21037087
2025-07-21 03:07:33,939 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758037_17213, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-21 03:07:38,487 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758037_17213 replica FinalizedReplica, blk_1073758037_17213, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758037 for deletion
2025-07-21 03:07:38,488 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758037_17213 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758037
2025-07-21 03:08:33,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758038_17214 src: /192.168.158.6:34526 dest: /192.168.158.4:9866
2025-07-21 03:08:33,939 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34526, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_921840064_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758038_17214, duration(ns): 20139299
2025-07-21 03:08:33,940 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758038_17214, type=LAST_IN_PIPELINE terminating
2025-07-21 03:08:35,489 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758038_17214 replica FinalizedReplica, blk_1073758038_17214, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758038 for deletion
2025-07-21 03:08:35,490 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758038_17214 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758038
2025-07-21 03:11:33,914 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758041_17217 src: /192.168.158.8:48562 dest: /192.168.158.4:9866
2025-07-21 03:11:33,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48562, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-942149145_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758041_17217, duration(ns): 21045051
2025-07-21 03:11:33,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758041_17217, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 03:11:35,495 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758041_17217 replica FinalizedReplica, blk_1073758041_17217, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758041 for deletion
2025-07-21 03:11:35,496 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758041_17217 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758041
2025-07-21 03:16:38,923 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758046_17222 src: /192.168.158.9:41214 dest: /192.168.158.4:9866
2025-07-21 03:16:38,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41214, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1687276437_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758046_17222, duration(ns): 19934631
2025-07-21 03:16:38,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758046_17222, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 03:16:41,507 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758046_17222 replica FinalizedReplica, blk_1073758046_17222, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758046 for deletion
2025-07-21 03:16:41,508 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758046_17222 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758046
2025-07-21 03:19:43,932 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758049_17225 src: /192.168.158.7:38558 dest: /192.168.158.4:9866
2025-07-21 03:19:43,952 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38558, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_707042857_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758049_17225, duration(ns): 17810328
2025-07-21 03:19:43,952 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758049_17225, type=LAST_IN_PIPELINE terminating
2025-07-21 03:19:47,515 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758049_17225 replica FinalizedReplica, blk_1073758049_17225, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758049 for deletion
2025-07-21 03:19:47,516 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758049_17225 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758049
2025-07-21 03:22:48,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758052_17228 src: /192.168.158.7:39964 dest: /192.168.158.4:9866
2025-07-21 03:22:48,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39964, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_348529450_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758052_17228, duration(ns): 20894146
2025-07-21 03:22:48,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758052_17228, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-21 03:22:53,526 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758052_17228 replica FinalizedReplica, blk_1073758052_17228, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758052 for deletion
2025-07-21 03:22:53,527 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758052_17228 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758052
2025-07-21 03:24:53,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758054_17230 src: /192.168.158.7:44692 dest: /192.168.158.4:9866
2025-07-21 03:24:53,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44692, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-392873048_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758054_17230, duration(ns): 22915936
2025-07-21 03:24:53,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758054_17230, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-21 03:24:59,528 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758054_17230 replica FinalizedReplica, blk_1073758054_17230, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758054 for deletion
2025-07-21 03:24:59,530 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758054_17230 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758054
2025-07-21 03:25:58,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758055_17231 src: /192.168.158.7:51936 dest: /192.168.158.4:9866
2025-07-21 03:25:58,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:51936, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-578930830_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758055_17231, duration(ns): 20858964
2025-07-21 03:25:58,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758055_17231, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-21 03:26:02,529 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758055_17231 replica FinalizedReplica, blk_1073758055_17231, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758055 for deletion
2025-07-21 03:26:02,530 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758055_17231 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758055
2025-07-21 03:26:58,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758056_17232 src: /192.168.158.8:56862 dest: /192.168.158.4:9866
2025-07-21 03:26:58,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56862, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-75164870_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758056_17232, duration(ns): 20388676
2025-07-21 03:26:58,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758056_17232, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 03:27:02,533 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758056_17232 replica FinalizedReplica, blk_1073758056_17232, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758056 for deletion
2025-07-21 03:27:02,534 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758056_17232 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758056
2025-07-21 03:27:58,936 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758057_17233 src: /192.168.158.1:59270 dest: /192.168.158.4:9866
2025-07-21 03:27:58,969 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59270, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_490870848_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758057_17233, duration(ns): 24780317
2025-07-21 03:27:58,970 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758057_17233, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-21 03:28:02,532 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758057_17233 replica FinalizedReplica, blk_1073758057_17233, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758057 for deletion
2025-07-21 03:28:02,534 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758057_17233 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758057
2025-07-21 03:29:03,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758058_17234 src: /192.168.158.8:44734 dest: /192.168.158.4:9866
2025-07-21 03:29:03,969 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44734, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1516411780_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758058_17234, duration(ns): 23273009
2025-07-21 03:29:03,969 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758058_17234, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 03:29:08,535 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758058_17234 replica FinalizedReplica, blk_1073758058_17234, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758058 for deletion
2025-07-21 03:29:08,536 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758058_17234 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758058
2025-07-21 03:32:08,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758061_17237 src: /192.168.158.9:47746 dest: /192.168.158.4:9866
2025-07-21 03:32:08,970 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47746, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-740421561_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758061_17237, duration(ns): 20249783
2025-07-21 03:32:08,970 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758061_17237, type=LAST_IN_PIPELINE terminating
2025-07-21 03:32:11,540 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758061_17237 replica FinalizedReplica, blk_1073758061_17237, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758061 for deletion
2025-07-21 03:32:11,541 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758061_17237 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758061
2025-07-21 03:33:13,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758062_17238 src: /192.168.158.1:33086 dest: /192.168.158.4:9866
2025-07-21 03:33:13,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33086, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1318026899_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758062_17238, duration(ns): 25677923
2025-07-21 03:33:13,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758062_17238, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-21 03:33:20,542 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758062_17238 replica FinalizedReplica, blk_1073758062_17238, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758062 for deletion
2025-07-21 03:33:20,543 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758062_17238 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758062
2025-07-21 03:35:18,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758064_17240 src: /192.168.158.1:42932 dest: /192.168.158.4:9866
2025-07-21 03:35:18,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42932, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1541731635_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758064_17240, duration(ns): 25700664
2025-07-21 03:35:18,981 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758064_17240, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-21 03:35:20,547 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758064_17240 replica FinalizedReplica, blk_1073758064_17240, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758064 for deletion
2025-07-21 03:35:20,548 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758064_17240 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758064
2025-07-21 03:36:18,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758065_17241 src: /192.168.158.1:46690 dest: /192.168.158.4:9866
2025-07-21 03:36:18,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46690, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1975336684_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758065_17241, duration(ns): 23644106
2025-07-21 03:36:18,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758065_17241, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-21 03:36:23,549 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758065_17241 replica FinalizedReplica, blk_1073758065_17241, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758065 for deletion
2025-07-21 03:36:23,550 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758065_17241 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758065
2025-07-21 03:37:18,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758066_17242 src: /192.168.158.9:59178 dest: /192.168.158.4:9866
2025-07-21 03:37:18,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59178, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1428720791_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758066_17242, duration(ns): 19733709
2025-07-21 03:37:18,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758066_17242, type=LAST_IN_PIPELINE terminating
2025-07-21 03:37:20,550 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758066_17242 replica FinalizedReplica, blk_1073758066_17242, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758066 for deletion
2025-07-21 03:37:20,551 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758066_17242 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758066
2025-07-21 03:38:18,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758067_17243 src: /192.168.158.1:42742 dest: /192.168.158.4:9866
2025-07-21 03:38:19,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42742, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1082946618_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758067_17243, duration(ns): 24201714
2025-07-21 03:38:19,012 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758067_17243, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-21 03:38:20,553 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758067_17243 replica FinalizedReplica, blk_1073758067_17243, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758067 for deletion
2025-07-21 03:38:20,554 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758067_17243 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758067
2025-07-21 03:40:18,960 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758069_17245 src: /192.168.158.7:58012 dest: /192.168.158.4:9866
2025-07-21 03:40:18,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58012, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-622598313_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758069_17245, duration(ns): 23330207
2025-07-21 03:40:18,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758069_17245, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 03:40:20,558 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758069_17245 replica FinalizedReplica, blk_1073758069_17245, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758069 for deletion
2025-07-21 03:40:20,559 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758069_17245 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758069
2025-07-21 03:41:18,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758070_17246 src: /192.168.158.8:39106 dest: /192.168.158.4:9866
2025-07-21 03:41:18,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39106, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_481157164_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758070_17246, duration(ns): 17966386
2025-07-21 03:41:18,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758070_17246, type=LAST_IN_PIPELINE terminating
2025-07-21 03:41:23,561 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758070_17246 replica FinalizedReplica, blk_1073758070_17246, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758070 for deletion
2025-07-21 03:41:23,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758070_17246 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758070
2025-07-21 03:43:18,966 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758072_17248 src: /192.168.158.6:57288 dest: /192.168.158.4:9866
2025-07-21 03:43:18,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57288, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1916180877_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758072_17248, duration(ns): 18849153
2025-07-21 03:43:18,988 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758072_17248, type=LAST_IN_PIPELINE terminating
2025-07-21 03:43:23,564 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758072_17248 replica FinalizedReplica, blk_1073758072_17248, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758072 for deletion
2025-07-21 03:43:23,566 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758072_17248 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758072
2025-07-21 03:44:18,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758073_17249 src: /192.168.158.9:32948 dest: /192.168.158.4:9866
2025-07-21 03:44:18,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:32948, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1059166151_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758073_17249, duration(ns): 17076529
2025-07-21 03:44:18,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758073_17249, type=LAST_IN_PIPELINE terminating
2025-07-21 03:44:23,566 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758073_17249 replica FinalizedReplica, blk_1073758073_17249, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758073 for deletion
2025-07-21 03:44:23,568 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758073_17249 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758073
2025-07-21 03:45:23,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758074_17250 src: /192.168.158.5:38642 dest: /192.168.158.4:9866
2025-07-21 03:45:23,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38642, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1582205413_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758074_17250, duration(ns): 20499603
2025-07-21 03:45:23,986 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758074_17250, type=LAST_IN_PIPELINE terminating 2025-07-21 03:45:29,568 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758074_17250 replica FinalizedReplica, blk_1073758074_17250, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758074 for deletion 2025-07-21 03:45:29,569 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758074_17250 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758074 2025-07-21 03:47:23,966 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758076_17252 src: /192.168.158.8:59710 dest: /192.168.158.4:9866 2025-07-21 03:47:23,993 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59710, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1657348152_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758076_17252, duration(ns): 21548254 2025-07-21 03:47:23,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758076_17252, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-21 03:47:29,570 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758076_17252 replica FinalizedReplica, blk_1073758076_17252, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 
getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758076 for deletion 2025-07-21 03:47:29,571 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758076_17252 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758076 2025-07-21 03:48:23,966 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758077_17253 src: /192.168.158.7:38190 dest: /192.168.158.4:9866 2025-07-21 03:48:23,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38190, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_548964486_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758077_17253, duration(ns): 22211353 2025-07-21 03:48:23,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758077_17253, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-21 03:48:26,571 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758077_17253 replica FinalizedReplica, blk_1073758077_17253, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758077 for deletion 2025-07-21 03:48:26,573 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758077_17253 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758077 2025-07-21 03:50:28,966 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758079_17255 src: /192.168.158.5:34196 dest: /192.168.158.4:9866 2025-07-21 03:50:28,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34196, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_182193565_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758079_17255, duration(ns): 16097768 2025-07-21 03:50:28,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758079_17255, type=LAST_IN_PIPELINE terminating 2025-07-21 03:50:32,575 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758079_17255 replica FinalizedReplica, blk_1073758079_17255, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758079 for deletion 2025-07-21 03:50:32,576 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758079_17255 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758079 2025-07-21 03:54:28,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758083_17259 src: /192.168.158.6:56632 dest: /192.168.158.4:9866 2025-07-21 03:54:28,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56632, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1466462346_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758083_17259, duration(ns): 16468418 2025-07-21 03:54:28,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758083_17259, type=LAST_IN_PIPELINE terminating 2025-07-21 03:54:32,584 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758083_17259 replica FinalizedReplica, blk_1073758083_17259, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758083 for deletion 2025-07-21 03:54:32,586 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758083_17259 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758083 2025-07-21 03:55:33,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758084_17260 src: /192.168.158.7:55868 dest: /192.168.158.4:9866 2025-07-21 03:55:33,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55868, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_324484602_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758084_17260, duration(ns): 17889762 2025-07-21 03:55:33,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758084_17260, type=LAST_IN_PIPELINE terminating 2025-07-21 03:55:35,587 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758084_17260 replica FinalizedReplica, blk_1073758084_17260, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758084 for deletion 2025-07-21 03:55:35,588 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758084_17260 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758084 2025-07-21 03:57:33,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758086_17262 src: /192.168.158.7:45384 dest: /192.168.158.4:9866 2025-07-21 03:57:34,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45384, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-8205035_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758086_17262, duration(ns): 19547834 2025-07-21 03:57:34,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758086_17262, type=LAST_IN_PIPELINE terminating 2025-07-21 03:57:38,588 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758086_17262 replica FinalizedReplica, blk_1073758086_17262, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758086 for deletion 2025-07-21 03:57:38,589 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758086_17262 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758086 2025-07-21 03:58:33,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758087_17263 src: /192.168.158.1:58146 dest: /192.168.158.4:9866 2025-07-21 03:58:34,014 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58146, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2062881123_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758087_17263, duration(ns): 24832115 2025-07-21 03:58:34,014 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758087_17263, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-21 03:58:38,591 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758087_17263 replica FinalizedReplica, blk_1073758087_17263, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758087 for deletion 2025-07-21 03:58:38,592 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758087_17263 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758087 2025-07-21 03:59:17,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f54, containing 
4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5. 2025-07-21 03:59:17,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360 2025-07-21 04:03:33,993 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758092_17268 src: /192.168.158.8:47378 dest: /192.168.158.4:9866 2025-07-21 04:03:34,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47378, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1261594015_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758092_17268, duration(ns): 19313196 2025-07-21 04:03:34,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758092_17268, type=LAST_IN_PIPELINE terminating 2025-07-21 04:03:38,602 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758092_17268 replica FinalizedReplica, blk_1073758092_17268, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758092 for deletion 2025-07-21 04:03:38,603 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758092_17268 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758092 2025-07-21 04:06:33,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073758095_17271 src: /192.168.158.5:46016 dest: /192.168.158.4:9866 2025-07-21 04:06:34,022 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46016, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1348468986_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758095_17271, duration(ns): 23829545 2025-07-21 04:06:34,022 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758095_17271, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-21 04:06:35,610 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758095_17271 replica FinalizedReplica, blk_1073758095_17271, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758095 for deletion 2025-07-21 04:06:35,612 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758095_17271 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758095 2025-07-21 04:10:33,999 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758099_17275 src: /192.168.158.1:55120 dest: /192.168.158.4:9866 2025-07-21 04:10:34,034 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55120, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1602385462_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073758099_17275, duration(ns): 25649243 2025-07-21 04:10:34,034 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758099_17275, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-21 04:10:35,623 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758099_17275 replica FinalizedReplica, blk_1073758099_17275, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758099 for deletion 2025-07-21 04:10:35,624 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758099_17275 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758099 2025-07-21 04:11:34,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758100_17276 src: /192.168.158.7:42680 dest: /192.168.158.4:9866 2025-07-21 04:11:34,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42680, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_924006499_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758100_17276, duration(ns): 19954352 2025-07-21 04:11:34,032 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758100_17276, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-21 04:11:38,627 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758100_17276 replica FinalizedReplica, blk_1073758100_17276, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758100 for deletion 2025-07-21 04:11:38,628 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758100_17276 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758100 2025-07-21 04:12:34,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758101_17277 src: /192.168.158.6:42958 dest: /192.168.158.4:9866 2025-07-21 04:12:34,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42958, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1108293986_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758101_17277, duration(ns): 18231383 2025-07-21 04:12:34,032 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758101_17277, type=LAST_IN_PIPELINE terminating 2025-07-21 04:12:35,629 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758101_17277 replica FinalizedReplica, blk_1073758101_17277, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758101 for deletion 2025-07-21 04:12:35,630 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758101_17277 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758101 2025-07-21 04:14:44,005 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758103_17279 src: /192.168.158.1:51996 dest: /192.168.158.4:9866 2025-07-21 04:14:44,038 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51996, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-481337045_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758103_17279, duration(ns): 23980201 2025-07-21 04:14:44,038 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758103_17279, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-21 04:14:50,634 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758103_17279 replica FinalizedReplica, blk_1073758103_17279, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758103 for deletion 2025-07-21 04:14:50,635 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758103_17279 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758103 2025-07-21 04:15:44,005 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073758104_17280 src: /192.168.158.1:42046 dest: /192.168.158.4:9866 2025-07-21 04:15:44,038 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42046, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1837657989_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758104_17280, duration(ns): 24109719 2025-07-21 04:15:44,038 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758104_17280, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-21 04:15:50,637 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758104_17280 replica FinalizedReplica, blk_1073758104_17280, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758104 for deletion 2025-07-21 04:15:50,638 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758104_17280 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758104 2025-07-21 04:17:44,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758106_17282 src: /192.168.158.6:60030 dest: /192.168.158.4:9866 2025-07-21 04:17:44,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60030, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-927694335_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073758106_17282, duration(ns): 17548585 2025-07-21 04:17:44,043 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758106_17282, type=LAST_IN_PIPELINE terminating 2025-07-21 04:17:47,642 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758106_17282 replica FinalizedReplica, blk_1073758106_17282, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758106 for deletion 2025-07-21 04:17:47,643 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758106_17282 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758106 2025-07-21 04:19:44,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758108_17284 src: /192.168.158.9:37196 dest: /192.168.158.4:9866 2025-07-21 04:19:44,039 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37196, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2112950585_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758108_17284, duration(ns): 16834617 2025-07-21 04:19:44,040 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758108_17284, type=LAST_IN_PIPELINE terminating 2025-07-21 04:19:47,644 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758108_17284 replica FinalizedReplica, blk_1073758108_17284, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758108 for deletion
2025-07-21 04:19:47,645 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758108_17284 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758108
2025-07-21 04:20:49,019 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758109_17285 src: /192.168.158.7:40598 dest: /192.168.158.4:9866
2025-07-21 04:20:49,038 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40598, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1282801656_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758109_17285, duration(ns): 16731689
2025-07-21 04:20:49,038 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758109_17285, type=LAST_IN_PIPELINE terminating
2025-07-21 04:20:50,649 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758109_17285 replica FinalizedReplica, blk_1073758109_17285, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758109 for deletion
2025-07-21 04:20:50,650 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758109_17285 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758109
2025-07-21 04:21:49,017 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758110_17286 src: /192.168.158.7:43322 dest: /192.168.158.4:9866
2025-07-21 04:21:49,045 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43322, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_519448858_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758110_17286, duration(ns): 22907358
2025-07-21 04:21:49,045 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758110_17286, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 04:21:50,650 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758110_17286 replica FinalizedReplica, blk_1073758110_17286, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758110 for deletion
2025-07-21 04:21:50,652 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758110_17286 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758110
2025-07-21 04:22:49,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758111_17287 src: /192.168.158.6:58556 dest: /192.168.158.4:9866
2025-07-21 04:22:49,044 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58556, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1645798743_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758111_17287, duration(ns): 17611624
2025-07-21 04:22:49,044 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758111_17287, type=LAST_IN_PIPELINE terminating
2025-07-21 04:22:50,655 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758111_17287 replica FinalizedReplica, blk_1073758111_17287, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758111 for deletion
2025-07-21 04:22:50,656 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758111_17287 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758111
2025-07-21 04:23:49,021 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758112_17288 src: /192.168.158.1:53702 dest: /192.168.158.4:9866
2025-07-21 04:23:49,057 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53702, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-631889560_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758112_17288, duration(ns): 26897803
2025-07-21 04:23:49,057 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758112_17288, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-21 04:23:53,657 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758112_17288 replica FinalizedReplica, blk_1073758112_17288, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758112 for deletion
2025-07-21 04:23:53,658 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758112_17288 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758112
2025-07-21 04:24:49,022 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758113_17289 src: /192.168.158.8:52572 dest: /192.168.158.4:9866
2025-07-21 04:24:49,048 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52572, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_147960975_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758113_17289, duration(ns): 21142944
2025-07-21 04:24:49,048 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758113_17289, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 04:24:53,658 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758113_17289 replica FinalizedReplica, blk_1073758113_17289, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758113 for deletion
2025-07-21 04:24:53,660 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758113_17289 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758113
2025-07-21 04:26:54,021 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758115_17291 src: /192.168.158.1:39124 dest: /192.168.158.4:9866
2025-07-21 04:26:54,054 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39124, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1434676035_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758115_17291, duration(ns): 23630726
2025-07-21 04:26:54,054 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758115_17291, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-21 04:26:56,663 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758115_17291 replica FinalizedReplica, blk_1073758115_17291, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758115 for deletion
2025-07-21 04:26:56,664 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758115_17291 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758115
2025-07-21 04:27:59,022 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758116_17292 src: /192.168.158.8:40534 dest: /192.168.158.4:9866
2025-07-21 04:27:59,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40534, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-11649334_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758116_17292, duration(ns): 20748501
2025-07-21 04:27:59,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758116_17292, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 04:28:02,665 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758116_17292 replica FinalizedReplica, blk_1073758116_17292, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758116 for deletion
2025-07-21 04:28:02,666 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758116_17292 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758116
2025-07-21 04:29:59,022 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758118_17294 src: /192.168.158.1:45880 dest: /192.168.158.4:9866
2025-07-21 04:29:59,055 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45880, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1076921668_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758118_17294, duration(ns): 25060204
2025-07-21 04:29:59,056 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758118_17294, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-21 04:30:02,672 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758118_17294 replica FinalizedReplica, blk_1073758118_17294, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758118 for deletion
2025-07-21 04:30:02,673 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758118_17294 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758118
2025-07-21 04:30:59,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758119_17295 src: /192.168.158.7:50506 dest: /192.168.158.4:9866
2025-07-21 04:30:59,051 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50506, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_682017768_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758119_17295, duration(ns): 18202197
2025-07-21 04:30:59,051 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758119_17295, type=LAST_IN_PIPELINE terminating
2025-07-21 04:31:02,673 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758119_17295 replica FinalizedReplica, blk_1073758119_17295, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758119 for deletion
2025-07-21 04:31:02,675 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758119_17295 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758119
2025-07-21 04:33:04,029 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758121_17297 src: /192.168.158.9:40544 dest: /192.168.158.4:9866
2025-07-21 04:33:04,054 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40544, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-931953255_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758121_17297, duration(ns): 20026535
2025-07-21 04:33:04,055 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758121_17297, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 04:33:05,677 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758121_17297 replica FinalizedReplica, blk_1073758121_17297, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758121 for deletion
2025-07-21 04:33:05,678 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758121_17297 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758121
2025-07-21 04:36:04,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758124_17300 src: /192.168.158.6:54422 dest: /192.168.158.4:9866
2025-07-21 04:36:04,056 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54422, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2058414019_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758124_17300, duration(ns): 17437113
2025-07-21 04:36:04,056 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758124_17300, type=LAST_IN_PIPELINE terminating
2025-07-21 04:36:05,684 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758124_17300 replica FinalizedReplica, blk_1073758124_17300, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758124 for deletion
2025-07-21 04:36:05,685 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758124_17300 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758124
2025-07-21 04:40:09,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758128_17304 src: /192.168.158.8:36638 dest: /192.168.158.4:9866
2025-07-21 04:40:09,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36638, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_544696667_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758128_17304, duration(ns): 15739501
2025-07-21 04:40:09,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758128_17304, type=LAST_IN_PIPELINE terminating
2025-07-21 04:40:11,694 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758128_17304 replica FinalizedReplica, blk_1073758128_17304, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758128 for deletion
2025-07-21 04:40:11,696 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758128_17304 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758128
2025-07-21 04:41:09,040 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758129_17305 src: /192.168.158.8:60498 dest: /192.168.158.4:9866
2025-07-21 04:41:09,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60498, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-168983325_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758129_17305, duration(ns): 23538610
2025-07-21 04:41:09,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758129_17305, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 04:41:11,694 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758129_17305 replica FinalizedReplica, blk_1073758129_17305, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758129 for deletion
2025-07-21 04:41:11,696 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758129_17305 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758129
2025-07-21 04:42:09,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758130_17306 src: /192.168.158.8:52142 dest: /192.168.158.4:9866
2025-07-21 04:42:09,073 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52142, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1493082583_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758130_17306, duration(ns): 24354765
2025-07-21 04:42:09,074 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758130_17306, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-21 04:42:11,695 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758130_17306 replica FinalizedReplica, blk_1073758130_17306, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758130 for deletion
2025-07-21 04:42:11,697 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758130_17306 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758130
2025-07-21 04:44:09,048 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758132_17308 src: /192.168.158.6:58818 dest: /192.168.158.4:9866
2025-07-21 04:44:09,073 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58818, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_241690849_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758132_17308, duration(ns): 20867413
2025-07-21 04:44:09,074 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758132_17308, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-21 04:44:11,701 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758132_17308 replica FinalizedReplica, blk_1073758132_17308, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758132 for deletion
2025-07-21 04:44:11,702 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758132_17308 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758132
2025-07-21 04:45:09,048 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758133_17309 src: /192.168.158.8:40308 dest: /192.168.158.4:9866
2025-07-21 04:45:09,076 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40308, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1238206089_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758133_17309, duration(ns): 23157530
2025-07-21 04:45:09,076 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758133_17309, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-21 04:45:14,704 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758133_17309 replica FinalizedReplica, blk_1073758133_17309, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758133 for deletion
2025-07-21 04:45:14,705 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758133_17309 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758133
2025-07-21 04:52:14,059 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758140_17316 src: /192.168.158.8:33776 dest: /192.168.158.4:9866
2025-07-21 04:52:14,087 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33776, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1469586232_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758140_17316, duration(ns): 21748847
2025-07-21 04:52:14,087 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758140_17316, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-21 04:52:17,723 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758140_17316 replica FinalizedReplica, blk_1073758140_17316, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758140 for deletion
2025-07-21 04:52:17,725 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758140_17316 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758140
2025-07-21 04:54:24,058 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758142_17318 src: /192.168.158.6:59584 dest: /192.168.158.4:9866
2025-07-21 04:54:24,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59584, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1802143869_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758142_17318, duration(ns): 20913648
2025-07-21 04:54:24,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758142_17318, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-21 04:54:26,743 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758142_17318 replica FinalizedReplica, blk_1073758142_17318, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758142 for deletion
2025-07-21 04:54:26,744 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758142_17318 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758142
2025-07-21 04:55:24,065 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758143_17319 src: /192.168.158.6:45542 dest: /192.168.158.4:9866
2025-07-21 04:55:24,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45542, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1366604911_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758143_17319, duration(ns): 17100271
2025-07-21 04:55:24,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758143_17319, type=LAST_IN_PIPELINE terminating
2025-07-21 04:55:26,730 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758143_17319 replica FinalizedReplica, blk_1073758143_17319, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758143 for deletion
2025-07-21 04:55:26,731 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758143_17319 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758143
2025-07-21 04:56:24,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758144_17320 src: /192.168.158.8:53444 dest: /192.168.158.4:9866
2025-07-21 04:56:24,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53444, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_917629526_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758144_17320, duration(ns): 21024739
2025-07-21 04:56:24,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758144_17320, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-21 04:56:26,733 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758144_17320 replica FinalizedReplica, blk_1073758144_17320, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758144 for deletion
2025-07-21 04:56:26,734 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758144_17320 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758144
2025-07-21 05:00:24,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758148_17324 src: /192.168.158.1:39732 dest: /192.168.158.4:9866
2025-07-21 05:00:24,102 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39732, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_487603881_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758148_17324, duration(ns): 25630936
2025-07-21 05:00:24,102 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758148_17324, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-21 05:00:26,739 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758148_17324 replica FinalizedReplica, blk_1073758148_17324, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758148 for deletion
2025-07-21 05:00:26,740 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758148_17324 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758148
2025-07-21 05:03:29,076 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758151_17327 src: /192.168.158.8:43488 dest: /192.168.158.4:9866
2025-07-21 05:03:29,095 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43488, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1934110510_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758151_17327, duration(ns): 17246129
2025-07-21 05:03:29,096 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758151_17327, type=LAST_IN_PIPELINE terminating
2025-07-21 05:03:35,744 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758151_17327 replica FinalizedReplica, blk_1073758151_17327, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758151 for deletion
2025-07-21 05:03:35,745 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758151_17327 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758151
2025-07-21 05:04:29,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758152_17328 src: /192.168.158.7:43456 dest: /192.168.158.4:9866
2025-07-21 05:04:29,104 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43456, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_330695579_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758152_17328, duration(ns): 21325460
2025-07-21 05:04:29,104 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758152_17328, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-21 05:04:35,744 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758152_17328 replica FinalizedReplica, blk_1073758152_17328, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758152 for deletion
2025-07-21 05:04:35,745 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758152_17328 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758152
2025-07-21 05:05:29,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758153_17329 src: /192.168.158.9:49398 dest: /192.168.158.4:9866
2025-07-21 05:05:29,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49398, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_269851751_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758153_17329, duration(ns): 19706402
2025-07-21 05:05:29,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758153_17329, type=LAST_IN_PIPELINE terminating
2025-07-21 05:05:32,748 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758153_17329 replica FinalizedReplica, blk_1073758153_17329, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758153 for deletion
2025-07-21 05:05:32,749 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758153_17329 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758153
2025-07-21 05:06:34,082 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758154_17330 src: /192.168.158.6:43468 dest: /192.168.158.4:9866
2025-07-21 05:06:34,100 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43468, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_216361540_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758154_17330, duration(ns): 16112388
2025-07-21 05:06:34,100 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758154_17330, type=LAST_IN_PIPELINE terminating
2025-07-21 05:06:35,749 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758154_17330 replica FinalizedReplica, blk_1073758154_17330, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758154 for deletion
2025-07-21 05:06:35,750 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758154_17330 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758154
2025-07-21 05:08:34,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758156_17332 src: /192.168.158.5:52920 dest: /192.168.158.4:9866
2025-07-21 05:08:34,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52920, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-572111915_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758156_17332, duration(ns): 16461766
2025-07-21 05:08:34,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758156_17332, type=LAST_IN_PIPELINE terminating
2025-07-21 05:08:35,753 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758156_17332 replica FinalizedReplica, blk_1073758156_17332, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758156 for deletion
2025-07-21 05:08:35,754 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758156_17332 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758156
2025-07-21 05:09:39,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758157_17333 src: /192.168.158.9:54392 dest: /192.168.158.4:9866
2025-07-21 05:09:39,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54392, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1626757017_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758157_17333, duration(ns): 20246970
2025-07-21 05:09:39,106 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758157_17333, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-21 05:09:41,756 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758157_17333 replica FinalizedReplica, blk_1073758157_17333, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758157 for deletion 2025-07-21 05:09:41,757 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758157_17333 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758157 2025-07-21 05:10:39,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758158_17334 src: /192.168.158.9:51878 dest: /192.168.158.4:9866 2025-07-21 05:10:39,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51878, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1121187168_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758158_17334, duration(ns): 16835962 2025-07-21 05:10:39,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758158_17334, type=LAST_IN_PIPELINE terminating 2025-07-21 05:10:41,758 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758158_17334 replica FinalizedReplica, blk_1073758158_17334, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 
getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758158 for deletion 2025-07-21 05:10:41,760 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758158_17334 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758158 2025-07-21 05:11:39,086 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758159_17335 src: /192.168.158.6:52608 dest: /192.168.158.4:9866 2025-07-21 05:11:39,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52608, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-310324336_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758159_17335, duration(ns): 22968202 2025-07-21 05:11:39,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758159_17335, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-21 05:11:41,758 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758159_17335 replica FinalizedReplica, blk_1073758159_17335, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758159 for deletion 2025-07-21 05:11:41,759 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758159_17335 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758159 2025-07-21 05:12:39,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758160_17336 src: /192.168.158.7:58234 dest: /192.168.158.4:9866 2025-07-21 05:12:39,114 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58234, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1658291493_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758160_17336, duration(ns): 18266077 2025-07-21 05:12:39,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758160_17336, type=LAST_IN_PIPELINE terminating 2025-07-21 05:12:41,759 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758160_17336 replica FinalizedReplica, blk_1073758160_17336, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758160 for deletion 2025-07-21 05:12:41,760 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758160_17336 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758160 2025-07-21 05:13:39,087 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758161_17337 src: /192.168.158.6:42932 dest: /192.168.158.4:9866 2025-07-21 05:13:39,113 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42932, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2124929040_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758161_17337, duration(ns): 20475460 2025-07-21 05:13:39,113 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758161_17337, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-21 05:13:41,763 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758161_17337 replica FinalizedReplica, blk_1073758161_17337, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758161 for deletion 2025-07-21 05:13:41,764 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758161_17337 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758161 2025-07-21 05:17:44,096 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758165_17341 src: /192.168.158.6:46966 dest: /192.168.158.4:9866 2025-07-21 05:17:44,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46966, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_5220442_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758165_17341, duration(ns): 17721939 2025-07-21 05:17:44,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758165_17341, type=LAST_IN_PIPELINE terminating 2025-07-21 05:17:44,771 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758165_17341 replica FinalizedReplica, blk_1073758165_17341, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758165 for deletion 2025-07-21 05:17:44,772 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758165_17341 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758165 2025-07-21 05:19:44,102 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758167_17343 src: /192.168.158.8:58558 dest: /192.168.158.4:9866 2025-07-21 05:19:44,121 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58558, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-282121230_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758167_17343, duration(ns): 16677635 2025-07-21 05:19:44,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758167_17343, type=LAST_IN_PIPELINE terminating 2025-07-21 05:19:44,776 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758167_17343 replica FinalizedReplica, blk_1073758167_17343, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758167 for deletion 2025-07-21 05:19:44,777 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758167_17343 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758167 2025-07-21 05:20:44,098 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758168_17344 src: /192.168.158.7:42344 dest: /192.168.158.4:9866 2025-07-21 05:20:44,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42344, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1973517946_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758168_17344, duration(ns): 23855720 2025-07-21 05:20:44,128 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758168_17344, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-21 05:20:44,778 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758168_17344 replica FinalizedReplica, blk_1073758168_17344, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758168 for deletion 2025-07-21 05:20:44,779 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758168_17344 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758168 2025-07-21 05:21:49,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758169_17345 src: 
/192.168.158.8:55790 dest: /192.168.158.4:9866 2025-07-21 05:21:49,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55790, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-812105625_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758169_17345, duration(ns): 21336187 2025-07-21 05:21:49,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758169_17345, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-21 05:21:53,780 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758169_17345 replica FinalizedReplica, blk_1073758169_17345, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758169 for deletion 2025-07-21 05:21:53,782 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758169_17345 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758169 2025-07-21 05:22:49,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758170_17346 src: /192.168.158.5:42002 dest: /192.168.158.4:9866 2025-07-21 05:22:49,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42002, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1946344882_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758170_17346, duration(ns): 18787226 2025-07-21 
05:22:49,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758170_17346, type=LAST_IN_PIPELINE terminating 2025-07-21 05:22:53,781 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758170_17346 replica FinalizedReplica, blk_1073758170_17346, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758170 for deletion 2025-07-21 05:22:53,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758170_17346 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758170 2025-07-21 05:24:49,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758172_17348 src: /192.168.158.8:41418 dest: /192.168.158.4:9866 2025-07-21 05:24:49,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41418, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-142852593_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758172_17348, duration(ns): 19144867 2025-07-21 05:24:49,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758172_17348, type=LAST_IN_PIPELINE terminating 2025-07-21 05:24:53,785 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758172_17348 replica FinalizedReplica, blk_1073758172_17348, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn 
getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758172 for deletion 2025-07-21 05:24:53,786 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758172_17348 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758172 2025-07-21 05:27:49,110 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758175_17351 src: /192.168.158.9:48708 dest: /192.168.158.4:9866 2025-07-21 05:27:49,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48708, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1170505217_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758175_17351, duration(ns): 18071730 2025-07-21 05:27:49,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758175_17351, type=LAST_IN_PIPELINE terminating 2025-07-21 05:27:50,789 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758175_17351 replica FinalizedReplica, blk_1073758175_17351, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758175 for deletion 2025-07-21 05:27:50,790 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758175_17351 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758175 2025-07-21 05:28:49,104 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758176_17352 src: /192.168.158.1:53298 dest: /192.168.158.4:9866 2025-07-21 05:28:49,136 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53298, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1729799966_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758176_17352, duration(ns): 23195757 2025-07-21 05:28:49,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758176_17352, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-21 05:28:53,790 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758176_17352 replica FinalizedReplica, blk_1073758176_17352, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758176 for deletion 2025-07-21 05:28:53,791 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758176_17352 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758176 2025-07-21 05:29:49,112 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758177_17353 src: /192.168.158.7:59714 dest: /192.168.158.4:9866 2025-07-21 05:29:49,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59714, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2137327981_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758177_17353, duration(ns): 20357775 2025-07-21 05:29:49,139 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758177_17353, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-21 05:29:53,792 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758177_17353 replica FinalizedReplica, blk_1073758177_17353, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758177 for deletion 2025-07-21 05:29:53,794 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758177_17353 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758177 2025-07-21 05:32:49,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758180_17356 src: /192.168.158.9:35732 dest: /192.168.158.4:9866 2025-07-21 05:32:49,141 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35732, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_926266613_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758180_17356, duration(ns): 17124637 2025-07-21 05:32:49,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758180_17356, type=LAST_IN_PIPELINE terminating 2025-07-21 05:32:53,801 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073758180_17356 replica FinalizedReplica, blk_1073758180_17356, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758180 for deletion 2025-07-21 05:32:53,802 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758180_17356 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758180 2025-07-21 05:33:49,114 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758181_17357 src: /192.168.158.1:46454 dest: /192.168.158.4:9866 2025-07-21 05:33:49,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46454, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1095410714_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758181_17357, duration(ns): 31150070 2025-07-21 05:33:49,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758181_17357, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-21 05:33:50,804 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758181_17357 replica FinalizedReplica, blk_1073758181_17357, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758181 for deletion 2025-07-21 05:33:50,806 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758181_17357 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758181 2025-07-21 05:34:49,132 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758182_17358 src: /192.168.158.6:38154 dest: /192.168.158.4:9866 2025-07-21 05:34:49,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38154, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-381465685_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758182_17358, duration(ns): 17577259 2025-07-21 05:34:49,152 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758182_17358, type=LAST_IN_PIPELINE terminating 2025-07-21 05:34:50,808 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758182_17358 replica FinalizedReplica, blk_1073758182_17358, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758182 for deletion 2025-07-21 05:34:50,809 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758182_17358 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758182 2025-07-21 05:35:49,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758183_17359 src: /192.168.158.8:50222 dest: /192.168.158.4:9866 
2025-07-21 05:35:49,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50222, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1693820044_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758183_17359, duration(ns): 23816622
2025-07-21 05:35:49,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758183_17359, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-21 05:35:50,811 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758183_17359 replica FinalizedReplica, blk_1073758183_17359, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758183 for deletion
2025-07-21 05:35:50,813 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758183_17359 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758183
2025-07-21 05:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-21 05:38:49,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758186_17362 src: /192.168.158.6:32872 dest: /192.168.158.4:9866
2025-07-21 05:38:49,153 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:32872, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1467061805_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758186_17362, duration(ns): 18441791
2025-07-21 05:38:49,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758186_17362, type=LAST_IN_PIPELINE terminating
2025-07-21 05:38:50,817 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758186_17362 replica FinalizedReplica, blk_1073758186_17362, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758186 for deletion
2025-07-21 05:38:50,818 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758186_17362 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758186
2025-07-21 05:39:54,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758187_17363 src: /192.168.158.1:44940 dest: /192.168.158.4:9866
2025-07-21 05:39:54,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44940, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_618353154_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758187_17363, duration(ns): 24367624
2025-07-21 05:39:54,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758187_17363, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-21 05:39:56,819 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758187_17363 replica FinalizedReplica, blk_1073758187_17363, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758187 for deletion
2025-07-21 05:39:56,820 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758187_17363 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758187
2025-07-21 05:43:59,134 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758191_17367 src: /192.168.158.8:36532 dest: /192.168.158.4:9866
2025-07-21 05:43:59,153 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36532, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-425573253_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758191_17367, duration(ns): 16697591
2025-07-21 05:43:59,153 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758191_17367, type=LAST_IN_PIPELINE terminating
2025-07-21 05:44:02,829 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758191_17367 replica FinalizedReplica, blk_1073758191_17367, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758191 for deletion
2025-07-21 05:44:02,830 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758191_17367 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758191
2025-07-21 05:44:59,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758192_17368 src: /192.168.158.1:51820 dest: /192.168.158.4:9866
2025-07-21 05:44:59,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51820, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1304167708_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758192_17368, duration(ns): 24857841
2025-07-21 05:44:59,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758192_17368, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-21 05:45:02,830 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758192_17368 replica FinalizedReplica, blk_1073758192_17368, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758192 for deletion
2025-07-21 05:45:02,831 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758192_17368 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758192
2025-07-21 05:48:59,143 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758196_17372 src: /192.168.158.8:38112 dest: /192.168.158.4:9866
2025-07-21 05:48:59,165 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38112, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2118078743_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758196_17372, duration(ns): 19685975
2025-07-21 05:48:59,165 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758196_17372, type=LAST_IN_PIPELINE terminating
2025-07-21 05:49:02,837 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758196_17372 replica FinalizedReplica, blk_1073758196_17372, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758196 for deletion
2025-07-21 05:49:02,838 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758196_17372 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758196
2025-07-21 05:52:04,140 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758199_17375 src: /192.168.158.6:56286 dest: /192.168.158.4:9866
2025-07-21 05:52:04,166 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56286, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_383192078_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758199_17375, duration(ns): 20928390
2025-07-21 05:52:04,166 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758199_17375, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-21 05:52:08,846 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758199_17375 replica FinalizedReplica, blk_1073758199_17375, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758199 for deletion
2025-07-21 05:52:08,847 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758199_17375 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758199
2025-07-21 05:53:09,141 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758200_17376 src: /192.168.158.1:52762 dest: /192.168.158.4:9866
2025-07-21 05:53:09,175 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52762, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_443421122_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758200_17376, duration(ns): 24274788
2025-07-21 05:53:09,175 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758200_17376, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-21 05:53:11,847 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758200_17376 replica FinalizedReplica, blk_1073758200_17376, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758200 for deletion
2025-07-21 05:53:11,848 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758200_17376 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758200
2025-07-21 05:54:14,145 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758201_17377 src: /192.168.158.1:49376 dest: /192.168.158.4:9866
2025-07-21 05:54:14,179 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49376, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1146483298_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758201_17377, duration(ns): 24539260
2025-07-21 05:54:14,179 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758201_17377, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-21 05:54:14,849 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758201_17377 replica FinalizedReplica, blk_1073758201_17377, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758201 for deletion
2025-07-21 05:54:14,850 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758201_17377 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758201
2025-07-21 05:57:19,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758204_17380 src: /192.168.158.7:46244 dest: /192.168.158.4:9866
2025-07-21 05:57:19,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46244, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1917451056_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758204_17380, duration(ns): 22258149
2025-07-21 05:57:19,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758204_17380, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-21 05:57:23,857 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758204_17380 replica FinalizedReplica, blk_1073758204_17380, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758204 for deletion
2025-07-21 05:57:23,858 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758204_17380 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758204
2025-07-21 05:58:19,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758205_17381 src: /192.168.158.9:41556 dest: /192.168.158.4:9866
2025-07-21 05:58:19,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41556, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-387746034_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758205_17381, duration(ns): 24506611
2025-07-21 05:58:19,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758205_17381, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-21 05:58:23,859 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758205_17381 replica FinalizedReplica, blk_1073758205_17381, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758205 for deletion
2025-07-21 05:58:23,861 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758205_17381 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir31/blk_1073758205
2025-07-21 06:02:24,148 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758209_17385 src: /192.168.158.1:34694 dest: /192.168.158.4:9866
2025-07-21 06:02:24,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34694, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1413564069_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758209_17385, duration(ns): 25147919
2025-07-21 06:02:24,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758209_17385, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-21 06:02:26,867 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758209_17385 replica FinalizedReplica, blk_1073758209_17385, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758209 for deletion
2025-07-21 06:02:26,868 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758209_17385 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758209
2025-07-21 06:03:24,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758210_17386 src: /192.168.158.7:54270 dest: /192.168.158.4:9866
2025-07-21 06:03:24,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54270, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1176397197_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758210_17386, duration(ns): 19740802
2025-07-21 06:03:24,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758210_17386, type=LAST_IN_PIPELINE terminating
2025-07-21 06:03:26,870 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758210_17386 replica FinalizedReplica, blk_1073758210_17386, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758210 for deletion
2025-07-21 06:03:26,871 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758210_17386 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758210
2025-07-21 06:06:29,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758213_17389 src: /192.168.158.6:44504 dest: /192.168.158.4:9866
2025-07-21 06:06:29,179 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44504, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_665713780_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758213_17389, duration(ns): 16648803
2025-07-21 06:06:29,179 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758213_17389, type=LAST_IN_PIPELINE terminating
2025-07-21 06:06:29,873 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758213_17389 replica FinalizedReplica, blk_1073758213_17389, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758213 for deletion
2025-07-21 06:06:29,874 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758213_17389 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758213
2025-07-21 06:08:29,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758215_17391 src: /192.168.158.1:39460 dest: /192.168.158.4:9866
2025-07-21 06:08:29,186 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39460, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-698146350_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758215_17391, duration(ns): 23065281
2025-07-21 06:08:29,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758215_17391, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-21 06:08:32,877 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758215_17391 replica FinalizedReplica, blk_1073758215_17391, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758215 for deletion
2025-07-21 06:08:32,878 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758215_17391 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758215
2025-07-21 06:11:34,185 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758218_17394 src: /192.168.158.7:52624 dest: /192.168.158.4:9866
2025-07-21 06:11:34,204 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52624, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1325479863_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758218_17394, duration(ns): 17471117
2025-07-21 06:11:34,205 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758218_17394, type=LAST_IN_PIPELINE terminating
2025-07-21 06:11:35,882 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758218_17394 replica FinalizedReplica, blk_1073758218_17394, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758218 for deletion
2025-07-21 06:11:35,883 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758218_17394 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758218
2025-07-21 06:12:34,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758219_17395 src: /192.168.158.6:58668 dest: /192.168.158.4:9866
2025-07-21 06:12:34,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58668, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1600194259_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758219_17395, duration(ns): 17962045
2025-07-21 06:12:34,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758219_17395, type=LAST_IN_PIPELINE terminating
2025-07-21 06:12:38,886 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758219_17395 replica FinalizedReplica, blk_1073758219_17395, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758219 for deletion
2025-07-21 06:12:38,887 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758219_17395 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758219
2025-07-21 06:13:34,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758220_17396 src: /192.168.158.6:54272 dest: /192.168.158.4:9866
2025-07-21 06:13:34,202 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54272, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_661393619_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758220_17396, duration(ns): 18566441
2025-07-21 06:13:34,203 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758220_17396, type=LAST_IN_PIPELINE terminating
2025-07-21 06:13:35,886 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758220_17396 replica FinalizedReplica, blk_1073758220_17396, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758220 for deletion
2025-07-21 06:13:35,887 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758220_17396 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758220
2025-07-21 06:14:34,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758221_17397 src: /192.168.158.1:44272 dest: /192.168.158.4:9866
2025-07-21 06:14:34,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44272, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1974095368_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758221_17397, duration(ns): 28945188
2025-07-21 06:14:34,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758221_17397, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-21 06:14:35,888 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758221_17397 replica FinalizedReplica, blk_1073758221_17397, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758221 for deletion
2025-07-21 06:14:35,889 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758221_17397 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758221
2025-07-21 06:15:34,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758222_17398 src: /192.168.158.5:52630 dest: /192.168.158.4:9866
2025-07-21 06:15:34,217 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52630, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-535438203_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758222_17398, duration(ns): 18022609
2025-07-21 06:15:34,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758222_17398, type=LAST_IN_PIPELINE terminating
2025-07-21 06:15:35,892 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758222_17398 replica FinalizedReplica, blk_1073758222_17398, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758222 for deletion
2025-07-21 06:15:35,893 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758222_17398 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758222
2025-07-21 06:16:34,195 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758223_17399 src: /192.168.158.9:33686 dest: /192.168.158.4:9866
2025-07-21 06:16:34,224 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33686, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-435775117_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758223_17399, duration(ns): 23483721
2025-07-21 06:16:34,224 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758223_17399, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-21 06:16:38,893 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758223_17399 replica FinalizedReplica, blk_1073758223_17399, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758223 for deletion
2025-07-21 06:16:38,894 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758223_17399 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758223
2025-07-21 06:17:34,199 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758224_17400 src: /192.168.158.8:49386 dest: /192.168.158.4:9866
2025-07-21 06:17:34,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49386, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-470170474_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758224_17400, duration(ns): 18747346
2025-07-21 06:17:34,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758224_17400, type=LAST_IN_PIPELINE terminating
2025-07-21 06:17:38,894 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758224_17400 replica FinalizedReplica, blk_1073758224_17400, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758224 for deletion
2025-07-21 06:17:38,896 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758224_17400 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758224
2025-07-21 06:18:34,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758225_17401 src: /192.168.158.9:32926 dest: /192.168.158.4:9866
2025-07-21 06:18:34,221 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:32926, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-660093077_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758225_17401, duration(ns): 21952072
2025-07-21 06:18:34,221 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758225_17401, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-21 06:18:35,895 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758225_17401 replica FinalizedReplica, blk_1073758225_17401, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758225 for deletion
2025-07-21 06:18:35,896 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758225_17401 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758225
2025-07-21 06:20:39,192 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758227_17403 src: /192.168.158.7:53174 dest: /192.168.158.4:9866
2025-07-21 06:20:39,212 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53174, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1337026306_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758227_17403, duration(ns): 17896211
2025-07-21 06:20:39,212 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758227_17403, type=LAST_IN_PIPELINE terminating
2025-07-21 06:20:44,901 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758227_17403 replica FinalizedReplica, blk_1073758227_17403, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758227 for deletion
2025-07-21 06:20:44,902 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758227_17403 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758227
2025-07-21 06:21:39,197 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758228_17404 src: /192.168.158.6:36414 dest: /192.168.158.4:9866
2025-07-21 06:21:39,225 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36414, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-766166731_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758228_17404, duration(ns): 22134398
2025-07-21 06:21:39,225 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758228_17404, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-21 06:21:41,902 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758228_17404 replica FinalizedReplica, blk_1073758228_17404, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758228 for deletion 2025-07-21 06:21:41,903 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758228_17404 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758228 2025-07-21 06:22:44,213 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758229_17405 src: /192.168.158.9:41092 dest: /192.168.158.4:9866 2025-07-21 06:22:44,235 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41092, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1701489464_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758229_17405, duration(ns): 19688640 2025-07-21 06:22:44,235 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758229_17405, type=LAST_IN_PIPELINE terminating 2025-07-21 06:22:47,905 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758229_17405 replica FinalizedReplica, blk_1073758229_17405, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758229 for deletion 2025-07-21 06:22:47,906 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758229_17405 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758229 2025-07-21 06:23:44,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758230_17406 src: /192.168.158.1:47526 dest: /192.168.158.4:9866 2025-07-21 06:23:44,230 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47526, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-23375225_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758230_17406, duration(ns): 27509631 2025-07-21 06:23:44,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758230_17406, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-21 06:23:44,906 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758230_17406 replica FinalizedReplica, blk_1073758230_17406, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758230 for deletion 2025-07-21 06:23:44,908 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758230_17406 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758230 2025-07-21 06:24:44,199 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758231_17407 src: /192.168.158.6:40590 dest: /192.168.158.4:9866 2025-07-21 06:24:44,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40590, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1604299062_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758231_17407, duration(ns): 16425483 2025-07-21 06:24:44,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758231_17407, type=LAST_IN_PIPELINE terminating 2025-07-21 06:24:44,909 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758231_17407 replica FinalizedReplica, blk_1073758231_17407, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758231 for deletion 2025-07-21 06:24:44,910 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758231_17407 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758231 2025-07-21 06:27:49,205 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758234_17410 src: /192.168.158.6:48222 dest: /192.168.158.4:9866 2025-07-21 06:27:49,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48222, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1377890870_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758234_17410, duration(ns): 22666154 2025-07-21 06:27:49,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758234_17410, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-21 06:27:50,916 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758234_17410 replica FinalizedReplica, blk_1073758234_17410, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758234 for deletion 2025-07-21 06:27:50,917 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758234_17410 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758234 2025-07-21 06:29:49,188 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758236_17412 src: /192.168.158.5:44720 dest: /192.168.158.4:9866 2025-07-21 06:29:49,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44720, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-704522422_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758236_17412, duration(ns): 22546103 2025-07-21 06:29:49,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758236_17412, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] 
terminating 2025-07-21 06:29:53,921 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758236_17412 replica FinalizedReplica, blk_1073758236_17412, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758236 for deletion 2025-07-21 06:29:53,922 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758236_17412 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758236 2025-07-21 06:32:54,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758239_17415 src: /192.168.158.1:51248 dest: /192.168.158.4:9866 2025-07-21 06:32:54,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51248, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_133765129_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758239_17415, duration(ns): 27122511 2025-07-21 06:32:54,244 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758239_17415, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-21 06:32:56,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758239_17415 replica FinalizedReplica, blk_1073758239_17415, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758239 for deletion 2025-07-21 06:32:56,929 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758239_17415 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758239 2025-07-21 06:35:54,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758242_17418 src: /192.168.158.7:32784 dest: /192.168.158.4:9866 2025-07-21 06:35:54,224 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:32784, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_824100627_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758242_17418, duration(ns): 19911597 2025-07-21 06:35:54,224 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758242_17418, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-21 06:35:56,937 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758242_17418 replica FinalizedReplica, blk_1073758242_17418, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758242 for deletion 2025-07-21 06:35:56,938 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758242_17418 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758242 
2025-07-21 06:36:54,225 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758243_17419 src: /192.168.158.1:44860 dest: /192.168.158.4:9866 2025-07-21 06:36:54,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44860, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1172396056_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758243_17419, duration(ns): 24045435 2025-07-21 06:36:54,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758243_17419, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-21 06:36:56,940 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758243_17419 replica FinalizedReplica, blk_1073758243_17419, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758243 for deletion 2025-07-21 06:36:56,941 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758243_17419 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758243 2025-07-21 06:37:54,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758244_17420 src: /192.168.158.6:54062 dest: /192.168.158.4:9866 2025-07-21 06:37:54,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54062, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1228260333_236, 
offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758244_17420, duration(ns): 17011884 2025-07-21 06:37:54,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758244_17420, type=LAST_IN_PIPELINE terminating 2025-07-21 06:37:56,940 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758244_17420 replica FinalizedReplica, blk_1073758244_17420, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758244 for deletion 2025-07-21 06:37:56,941 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758244_17420 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758244 2025-07-21 06:41:04,196 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758247_17423 src: /192.168.158.1:46096 dest: /192.168.158.4:9866 2025-07-21 06:41:04,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46096, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1206440370_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758247_17423, duration(ns): 23640904 2025-07-21 06:41:04,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758247_17423, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-21 06:41:08,947 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758247_17423 replica FinalizedReplica, blk_1073758247_17423, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758247 for deletion 2025-07-21 06:41:08,948 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758247_17423 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758247 2025-07-21 06:45:04,212 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758251_17427 src: /192.168.158.6:36128 dest: /192.168.158.4:9866 2025-07-21 06:45:04,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36128, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2024953997_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758251_17427, duration(ns): 17311061 2025-07-21 06:45:04,232 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758251_17427, type=LAST_IN_PIPELINE terminating 2025-07-21 06:45:05,955 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758251_17427 replica FinalizedReplica, blk_1073758251_17427, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758251 for deletion 2025-07-21 06:45:05,956 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758251_17427 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758251 2025-07-21 06:48:04,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758254_17430 src: /192.168.158.5:55246 dest: /192.168.158.4:9866 2025-07-21 06:48:04,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55246, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1909731636_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758254_17430, duration(ns): 19145023 2025-07-21 06:48:04,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758254_17430, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-21 06:48:08,963 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758254_17430 replica FinalizedReplica, blk_1073758254_17430, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758254 for deletion 2025-07-21 06:48:08,964 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758254_17430 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758254 2025-07-21 06:49:04,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758255_17431 src: 
/192.168.158.6:48256 dest: /192.168.158.4:9866 2025-07-21 06:49:04,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48256, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-351544345_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758255_17431, duration(ns): 17529342 2025-07-21 06:49:04,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758255_17431, type=LAST_IN_PIPELINE terminating 2025-07-21 06:49:08,965 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758255_17431 replica FinalizedReplica, blk_1073758255_17431, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758255 for deletion 2025-07-21 06:49:08,966 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758255_17431 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758255 2025-07-21 06:53:09,215 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758259_17435 src: /192.168.158.1:59472 dest: /192.168.158.4:9866 2025-07-21 06:53:09,248 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_68684183_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758259_17435, duration(ns): 24568171 2025-07-21 06:53:09,248 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758259_17435, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-21 06:53:11,972 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758259_17435 replica FinalizedReplica, blk_1073758259_17435, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758259 for deletion 2025-07-21 06:53:11,973 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758259_17435 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758259 2025-07-21 06:56:19,221 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758262_17438 src: /192.168.158.8:55990 dest: /192.168.158.4:9866 2025-07-21 06:56:19,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55990, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1639975771_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758262_17438, duration(ns): 23724558 2025-07-21 06:56:19,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758262_17438, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-21 06:56:20,977 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758262_17438 replica FinalizedReplica, blk_1073758262_17438, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758262 for deletion 2025-07-21 06:56:20,978 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758262_17438 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758262 2025-07-21 07:00:24,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758266_17442 src: /192.168.158.1:49562 dest: /192.168.158.4:9866 2025-07-21 07:00:24,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49562, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1855923100_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758266_17442, duration(ns): 24594868 2025-07-21 07:00:24,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758266_17442, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-21 07:00:26,984 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758266_17442 replica FinalizedReplica, blk_1073758266_17442, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758266 for deletion 2025-07-21 07:00:26,985 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 
blk_1073758266_17442 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758266 2025-07-21 07:02:29,242 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758268_17444 src: /192.168.158.9:46302 dest: /192.168.158.4:9866 2025-07-21 07:02:29,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46302, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1801405267_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758268_17444, duration(ns): 16038182 2025-07-21 07:02:29,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758268_17444, type=LAST_IN_PIPELINE terminating 2025-07-21 07:02:29,988 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758268_17444 replica FinalizedReplica, blk_1073758268_17444, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758268 for deletion 2025-07-21 07:02:29,989 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758268_17444 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758268 2025-07-21 07:04:34,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758270_17446 src: /192.168.158.6:58260 dest: /192.168.158.4:9866 2025-07-21 07:04:34,268 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58260, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-485336904_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758270_17446, duration(ns): 24041117 2025-07-21 07:04:34,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758270_17446, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-21 07:04:35,990 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758270_17446 replica FinalizedReplica, blk_1073758270_17446, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758270 for deletion 2025-07-21 07:04:35,992 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758270_17446 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758270 2025-07-21 07:05:34,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758271_17447 src: /192.168.158.8:40048 dest: /192.168.158.4:9866 2025-07-21 07:05:34,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40048, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-472357161_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758271_17447, duration(ns): 21007819 2025-07-21 07:05:34,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758271_17447, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 07:05:35,993 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758271_17447 replica FinalizedReplica, blk_1073758271_17447, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758271 for deletion
2025-07-21 07:05:35,994 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758271_17447 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758271
2025-07-21 07:08:39,242 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758274_17450 src: /192.168.158.8:50040 dest: /192.168.158.4:9866
2025-07-21 07:08:39,261 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50040, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1631081095_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758274_17450, duration(ns): 16762706
2025-07-21 07:08:39,261 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758274_17450, type=LAST_IN_PIPELINE terminating
2025-07-21 07:08:44,997 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758274_17450 replica FinalizedReplica, blk_1073758274_17450, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758274 for deletion
2025-07-21 07:08:44,998 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758274_17450 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758274
2025-07-21 07:09:39,244 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758275_17451 src: /192.168.158.5:37444 dest: /192.168.158.4:9866
2025-07-21 07:09:39,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37444, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1943626193_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758275_17451, duration(ns): 17481794
2025-07-21 07:09:39,265 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758275_17451, type=LAST_IN_PIPELINE terminating
2025-07-21 07:09:41,997 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758275_17451 replica FinalizedReplica, blk_1073758275_17451, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758275 for deletion
2025-07-21 07:09:41,999 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758275_17451 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758275
2025-07-21 07:10:44,246 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758276_17452 src: /192.168.158.8:54072 dest: /192.168.158.4:9866
2025-07-21 07:10:44,265 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54072, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_345965125_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758276_17452, duration(ns): 16764238
2025-07-21 07:10:44,265 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758276_17452, type=LAST_IN_PIPELINE terminating
2025-07-21 07:10:45,000 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758276_17452 replica FinalizedReplica, blk_1073758276_17452, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758276 for deletion
2025-07-21 07:10:45,001 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758276_17452 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758276
2025-07-21 07:12:44,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758278_17454 src: /192.168.158.7:60628 dest: /192.168.158.4:9866
2025-07-21 07:12:44,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60628, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_899374710_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758278_17454, duration(ns): 20070315
2025-07-21 07:12:44,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758278_17454, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-21 07:12:45,004 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758278_17454 replica FinalizedReplica, blk_1073758278_17454, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758278 for deletion
2025-07-21 07:12:45,006 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758278_17454 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758278
2025-07-21 07:15:49,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758281_17457 src: /192.168.158.8:56032 dest: /192.168.158.4:9866
2025-07-21 07:15:49,279 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56032, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_251956687_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758281_17457, duration(ns): 22510976
2025-07-21 07:15:49,280 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758281_17457, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 07:15:51,014 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758281_17457 replica FinalizedReplica, blk_1073758281_17457, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758281 for deletion
2025-07-21 07:15:51,016 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758281_17457 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758281
2025-07-21 07:16:49,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758282_17458 src: /192.168.158.7:57532 dest: /192.168.158.4:9866
2025-07-21 07:16:49,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57532, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-640379114_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758282_17458, duration(ns): 17443170
2025-07-21 07:16:49,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758282_17458, type=LAST_IN_PIPELINE terminating
2025-07-21 07:16:51,020 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758282_17458 replica FinalizedReplica, blk_1073758282_17458, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758282 for deletion
2025-07-21 07:16:51,021 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758282_17458 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758282
2025-07-21 07:17:49,249 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758283_17459 src: /192.168.158.1:53656 dest: /192.168.158.4:9866
2025-07-21 07:17:49,284 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53656, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_126647101_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758283_17459, duration(ns): 26572158
2025-07-21 07:17:49,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758283_17459, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-21 07:17:51,019 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758283_17459 replica FinalizedReplica, blk_1073758283_17459, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758283 for deletion
2025-07-21 07:17:51,021 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758283_17459 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758283
2025-07-21 07:20:59,265 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758286_17462 src: /192.168.158.8:52368 dest: /192.168.158.4:9866
2025-07-21 07:20:59,284 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52368, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_780003940_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758286_17462, duration(ns): 16552017
2025-07-21 07:20:59,284 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758286_17462, type=LAST_IN_PIPELINE terminating
2025-07-21 07:21:00,027 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758286_17462 replica FinalizedReplica, blk_1073758286_17462, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758286 for deletion
2025-07-21 07:21:00,028 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758286_17462 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758286
2025-07-21 07:22:04,254 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758287_17463 src: /192.168.158.1:41048 dest: /192.168.158.4:9866
2025-07-21 07:22:04,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41048, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-293979835_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758287_17463, duration(ns): 22883584
2025-07-21 07:22:04,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758287_17463, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-21 07:22:06,030 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758287_17463 replica FinalizedReplica, blk_1073758287_17463, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758287 for deletion
2025-07-21 07:22:06,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758287_17463 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758287
2025-07-21 07:23:09,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758288_17464 src: /192.168.158.6:33538 dest: /192.168.158.4:9866
2025-07-21 07:23:09,283 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33538, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-798985299_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758288_17464, duration(ns): 19838417
2025-07-21 07:23:09,284 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758288_17464, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-21 07:23:12,032 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758288_17464 replica FinalizedReplica, blk_1073758288_17464, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758288 for deletion
2025-07-21 07:23:12,033 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758288_17464 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758288
2025-07-21 07:24:09,263 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758289_17465 src: /192.168.158.7:41582 dest: /192.168.158.4:9866
2025-07-21 07:24:09,290 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41582, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-38993028_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758289_17465, duration(ns): 22283556
2025-07-21 07:24:09,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758289_17465, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-21 07:24:12,036 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758289_17465 replica FinalizedReplica, blk_1073758289_17465, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758289 for deletion
2025-07-21 07:24:12,037 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758289_17465 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758289
2025-07-21 07:26:14,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758291_17467 src: /192.168.158.1:58076 dest: /192.168.158.4:9866
2025-07-21 07:26:14,295 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58076, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1915744754_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758291_17467, duration(ns): 24186037
2025-07-21 07:26:14,295 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758291_17467, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-21 07:26:15,039 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758291_17467 replica FinalizedReplica, blk_1073758291_17467, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758291 for deletion
2025-07-21 07:26:15,040 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758291_17467 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758291
2025-07-21 07:31:14,294 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758296_17472 src: /192.168.158.7:50816 dest: /192.168.158.4:9866
2025-07-21 07:31:14,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50816, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_600563407_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758296_17472, duration(ns): 17393239
2025-07-21 07:31:14,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758296_17472, type=LAST_IN_PIPELINE terminating
2025-07-21 07:31:15,056 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758296_17472 replica FinalizedReplica, blk_1073758296_17472, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758296 for deletion
2025-07-21 07:31:15,058 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758296_17472 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758296
2025-07-21 07:33:19,288 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758298_17474 src: /192.168.158.1:54066 dest: /192.168.158.4:9866
2025-07-21 07:33:19,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54066, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-228020932_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758298_17474, duration(ns): 25206336
2025-07-21 07:33:19,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758298_17474, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-21 07:33:21,063 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758298_17474 replica FinalizedReplica, blk_1073758298_17474, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758298 for deletion
2025-07-21 07:33:21,065 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758298_17474 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758298
2025-07-21 07:34:19,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758299_17475 src: /192.168.158.9:42946 dest: /192.168.158.4:9866
2025-07-21 07:34:19,319 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42946, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_578132763_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758299_17475, duration(ns): 22936873
2025-07-21 07:34:19,320 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758299_17475, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-21 07:34:21,066 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758299_17475 replica FinalizedReplica, blk_1073758299_17475, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758299 for deletion
2025-07-21 07:34:21,067 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758299_17475 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758299
2025-07-21 07:41:34,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758306_17482 src: /192.168.158.1:58536 dest: /192.168.158.4:9866
2025-07-21 07:41:34,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58536, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-299564920_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758306_17482, duration(ns): 25138835
2025-07-21 07:41:34,337 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758306_17482, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-21 07:41:36,078 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758306_17482 replica FinalizedReplica, blk_1073758306_17482, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758306 for deletion
2025-07-21 07:41:36,079 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758306_17482 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758306
2025-07-21 07:46:49,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758311_17487 src: /192.168.158.9:50238 dest: /192.168.158.4:9866
2025-07-21 07:46:49,331 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50238, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-313476548_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758311_17487, duration(ns): 22561168
2025-07-21 07:46:49,332 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758311_17487, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-21 07:46:51,093 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758311_17487 replica FinalizedReplica, blk_1073758311_17487, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758311 for deletion
2025-07-21 07:46:51,094 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758311_17487 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758311
2025-07-21 07:47:49,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758312_17488 src: /192.168.158.6:55592 dest: /192.168.158.4:9866
2025-07-21 07:47:49,338 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55592, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_69691398_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758312_17488, duration(ns): 21213740
2025-07-21 07:47:49,338 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758312_17488, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 07:47:51,094 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758312_17488 replica FinalizedReplica, blk_1073758312_17488, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758312 for deletion
2025-07-21 07:47:51,097 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758312_17488 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758312
2025-07-21 07:48:49,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758313_17489 src: /192.168.158.9:35404 dest: /192.168.158.4:9866
2025-07-21 07:48:49,328 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35404, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1521495315_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758313_17489, duration(ns): 15433697
2025-07-21 07:48:49,328 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758313_17489, type=LAST_IN_PIPELINE terminating
2025-07-21 07:48:54,095 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758313_17489 replica FinalizedReplica, blk_1073758313_17489, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758313 for deletion
2025-07-21 07:48:54,096 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758313_17489 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758313
2025-07-21 07:51:59,328 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758316_17492 src: /192.168.158.1:46382 dest: /192.168.158.4:9866
2025-07-21 07:51:59,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46382, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_75560048_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758316_17492, duration(ns): 28237031
2025-07-21 07:51:59,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758316_17492, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-21 07:52:00,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758316_17492 replica FinalizedReplica, blk_1073758316_17492, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758316 for deletion
2025-07-21 07:52:00,104 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758316_17492 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758316
2025-07-21 07:52:59,367 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758317_17493 src: /192.168.158.1:49200 dest: /192.168.158.4:9866
2025-07-21 07:52:59,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49200, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1408594465_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758317_17493, duration(ns): 27183698
2025-07-21 07:52:59,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758317_17493, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-21 07:53:03,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758317_17493 replica FinalizedReplica, blk_1073758317_17493, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758317 for deletion
2025-07-21 07:53:03,104 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758317_17493 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758317
2025-07-21 07:53:59,335 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758318_17494 src: /192.168.158.1:55046 dest: /192.168.158.4:9866
2025-07-21 07:53:59,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55046, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1939396595_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758318_17494, duration(ns): 24002451
2025-07-21 07:53:59,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758318_17494, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-21 07:54:00,106 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758318_17494 replica FinalizedReplica, blk_1073758318_17494, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758318 for deletion
2025-07-21 07:54:00,107 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758318_17494 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758318
2025-07-21 07:54:59,340 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758319_17495 src: /192.168.158.9:60740 dest: /192.168.158.4:9866
2025-07-21 07:54:59,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60740, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2033668835_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758319_17495, duration(ns): 17535243
2025-07-21 07:54:59,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758319_17495, type=LAST_IN_PIPELINE terminating
2025-07-21 07:55:03,107 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758319_17495 replica FinalizedReplica, blk_1073758319_17495, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758319 for deletion
2025-07-21 07:55:03,108 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758319_17495 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758319
2025-07-21 07:55:59,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758320_17496 src: /192.168.158.5:59622 dest: /192.168.158.4:9866
2025-07-21 07:55:59,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59622, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1736345245_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758320_17496, duration(ns): 20293759
2025-07-21 07:55:59,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758320_17496, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 07:56:03,112 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758320_17496 replica FinalizedReplica, blk_1073758320_17496, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758320 for deletion
2025-07-21 07:56:03,113 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758320_17496 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758320
2025-07-21 07:58:04,318 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758322_17498 src: /192.168.158.1:59908 dest: /192.168.158.4:9866
2025-07-21 07:58:04,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59908, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1548256998_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758322_17498, duration(ns): 24685225
2025-07-21 07:58:04,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758322_17498, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-21 07:58:06,114 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758322_17498 replica FinalizedReplica, blk_1073758322_17498, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758322 for deletion
2025-07-21 07:58:06,115 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758322_17498 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758322
2025-07-21 07:59:09,321 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758323_17499 src: /192.168.158.8:57652 dest: /192.168.158.4:9866
2025-07-21 07:59:09,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57652, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1279914429_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758323_17499, duration(ns): 17126080
2025-07-21 07:59:09,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758323_17499, type=LAST_IN_PIPELINE terminating
2025-07-21 07:59:15,117 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758323_17499 replica FinalizedReplica, blk_1073758323_17499, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758323 for deletion
2025-07-21 07:59:15,118 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758323_17499 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758323
2025-07-21 08:00:09,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758324_17500 src: /192.168.158.7:46148 dest: /192.168.158.4:9866
2025-07-21 08:00:09,347 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46148, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_246214788_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758324_17500, duration(ns): 18135754 2025-07-21 08:00:09,348 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758324_17500, type=LAST_IN_PIPELINE terminating 2025-07-21 08:00:15,120 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758324_17500 replica FinalizedReplica, blk_1073758324_17500, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758324 for deletion 2025-07-21 08:00:15,121 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758324_17500 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758324 2025-07-21 08:02:14,325 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758326_17502 src: /192.168.158.1:59310 dest: /192.168.158.4:9866 2025-07-21 08:02:14,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59310, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_344437864_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758326_17502, duration(ns): 25111468 2025-07-21 08:02:14,361 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758326_17502, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-21 
08:02:18,122 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758326_17502 replica FinalizedReplica, blk_1073758326_17502, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758326 for deletion 2025-07-21 08:02:18,124 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758326_17502 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758326 2025-07-21 08:03:14,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758327_17503 src: /192.168.158.1:44196 dest: /192.168.158.4:9866 2025-07-21 08:03:14,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44196, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_23076663_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758327_17503, duration(ns): 24022230 2025-07-21 08:03:14,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758327_17503, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-21 08:03:15,124 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758327_17503 replica FinalizedReplica, blk_1073758327_17503, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758327 for 
deletion 2025-07-21 08:03:15,126 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758327_17503 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758327 2025-07-21 08:04:14,323 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758328_17504 src: /192.168.158.1:48394 dest: /192.168.158.4:9866 2025-07-21 08:04:14,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48394, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-906590074_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758328_17504, duration(ns): 23074582 2025-07-21 08:04:14,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758328_17504, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-21 08:04:15,127 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758328_17504 replica FinalizedReplica, blk_1073758328_17504, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758328 for deletion 2025-07-21 08:04:15,128 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758328_17504 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758328 2025-07-21 08:05:19,325 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073758329_17505 src: /192.168.158.8:37578 dest: /192.168.158.4:9866 2025-07-21 08:05:19,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37578, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_721890285_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758329_17505, duration(ns): 20957477 2025-07-21 08:05:19,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758329_17505, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-21 08:05:21,130 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758329_17505 replica FinalizedReplica, blk_1073758329_17505, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758329 for deletion 2025-07-21 08:05:21,132 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758329_17505 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758329 2025-07-21 08:09:24,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758333_17509 src: /192.168.158.8:57304 dest: /192.168.158.4:9866 2025-07-21 08:09:24,357 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57304, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-461608671_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073758333_17509, duration(ns): 20976949 2025-07-21 08:09:24,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758333_17509, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-21 08:09:27,138 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758333_17509 replica FinalizedReplica, blk_1073758333_17509, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758333 for deletion 2025-07-21 08:09:27,139 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758333_17509 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758333 2025-07-21 08:10:24,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758334_17510 src: /192.168.158.9:59078 dest: /192.168.158.4:9866 2025-07-21 08:10:24,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59078, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1650784157_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758334_17510, duration(ns): 16980432 2025-07-21 08:10:24,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758334_17510, type=LAST_IN_PIPELINE terminating 2025-07-21 08:10:27,142 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758334_17510 replica 
FinalizedReplica, blk_1073758334_17510, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758334 for deletion 2025-07-21 08:10:27,143 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758334_17510 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758334 2025-07-21 08:13:24,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758337_17513 src: /192.168.158.8:41518 dest: /192.168.158.4:9866 2025-07-21 08:13:24,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41518, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_101695102_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758337_17513, duration(ns): 18864173 2025-07-21 08:13:24,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758337_17513, type=LAST_IN_PIPELINE terminating 2025-07-21 08:13:30,147 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758337_17513 replica FinalizedReplica, blk_1073758337_17513, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758337 for deletion 2025-07-21 08:13:30,148 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758337_17513 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758337 2025-07-21 08:14:24,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758338_17514 src: /192.168.158.6:41902 dest: /192.168.158.4:9866 2025-07-21 08:14:24,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41902, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1603505066_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758338_17514, duration(ns): 17774945 2025-07-21 08:14:24,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758338_17514, type=LAST_IN_PIPELINE terminating 2025-07-21 08:14:27,150 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758338_17514 replica FinalizedReplica, blk_1073758338_17514, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758338 for deletion 2025-07-21 08:14:27,151 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758338_17514 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758338 2025-07-21 08:16:24,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758340_17516 src: /192.168.158.6:52736 dest: /192.168.158.4:9866 2025-07-21 08:16:24,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52736, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1026235488_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758340_17516, duration(ns): 16902083 2025-07-21 08:16:24,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758340_17516, type=LAST_IN_PIPELINE terminating 2025-07-21 08:16:27,156 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758340_17516 replica FinalizedReplica, blk_1073758340_17516, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758340 for deletion 2025-07-21 08:16:27,157 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758340_17516 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758340 2025-07-21 08:17:24,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758341_17517 src: /192.168.158.7:46702 dest: /192.168.158.4:9866 2025-07-21 08:17:24,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46702, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_740723103_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758341_17517, duration(ns): 21057926 2025-07-21 08:17:24,373 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758341_17517, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-21 08:17:27,158 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758341_17517 replica FinalizedReplica, blk_1073758341_17517, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758341 for deletion 2025-07-21 08:17:27,159 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758341_17517 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758341 2025-07-21 08:19:24,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758343_17519 src: /192.168.158.1:59162 dest: /192.168.158.4:9866 2025-07-21 08:19:24,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59162, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_121138667_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758343_17519, duration(ns): 23775131 2025-07-21 08:19:24,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758343_17519, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-21 08:19:30,162 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758343_17519 replica FinalizedReplica, blk_1073758343_17519, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758343 for deletion 
2025-07-21 08:19:30,163 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758343_17519 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758343 2025-07-21 08:25:29,362 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758349_17525 src: /192.168.158.7:39278 dest: /192.168.158.4:9866 2025-07-21 08:25:29,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39278, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1830947576_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758349_17525, duration(ns): 16613011 2025-07-21 08:25:29,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758349_17525, type=LAST_IN_PIPELINE terminating 2025-07-21 08:25:30,180 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758349_17525 replica FinalizedReplica, blk_1073758349_17525, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758349 for deletion 2025-07-21 08:25:30,181 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758349_17525 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758349 2025-07-21 08:27:29,357 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758351_17527 src: /192.168.158.1:33026 
dest: /192.168.158.4:9866 2025-07-21 08:27:29,391 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33026, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-590913782_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758351_17527, duration(ns): 24703807 2025-07-21 08:27:29,392 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758351_17527, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-21 08:27:30,185 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758351_17527 replica FinalizedReplica, blk_1073758351_17527, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758351 for deletion 2025-07-21 08:27:30,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758351_17527 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758351 2025-07-21 08:28:29,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758352_17528 src: /192.168.158.5:55300 dest: /192.168.158.4:9866 2025-07-21 08:28:29,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55300, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-13200239_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758352_17528, duration(ns): 21782037 2025-07-21 08:28:29,393 
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758352_17528, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-21 08:28:30,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758352_17528 replica FinalizedReplica, blk_1073758352_17528, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758352 for deletion 2025-07-21 08:28:30,188 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758352_17528 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758352 2025-07-21 08:31:34,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758355_17531 src: /192.168.158.8:34622 dest: /192.168.158.4:9866 2025-07-21 08:31:34,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34622, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1040140158_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758355_17531, duration(ns): 22719756 2025-07-21 08:31:34,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758355_17531, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-21 08:31:36,192 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758355_17531 replica FinalizedReplica, blk_1073758355_17531, FINALIZED getNumBytes() = 56 
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758355 for deletion 2025-07-21 08:31:36,193 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758355_17531 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758355 2025-07-21 08:32:34,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758356_17532 src: /192.168.158.1:50206 dest: /192.168.158.4:9866 2025-07-21 08:32:34,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50206, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_250676485_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758356_17532, duration(ns): 24956329 2025-07-21 08:32:34,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758356_17532, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-21 08:32:36,195 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758356_17532 replica FinalizedReplica, blk_1073758356_17532, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758356 for deletion 2025-07-21 08:32:36,196 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758356_17532 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758356
2025-07-21 08:33:34,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758357_17533 src: /192.168.158.1:52596 dest: /192.168.158.4:9866
2025-07-21 08:33:34,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52596, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2135825135_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758357_17533, duration(ns): 24447007
2025-07-21 08:33:34,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758357_17533, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-21 08:33:39,198 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758357_17533 replica FinalizedReplica, blk_1073758357_17533, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758357 for deletion
2025-07-21 08:33:39,199 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758357_17533 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758357
2025-07-21 08:35:44,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758359_17535 src: /192.168.158.9:53138 dest: /192.168.158.4:9866
2025-07-21 08:35:44,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53138, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1785637941_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758359_17535, duration(ns): 16569128
2025-07-21 08:35:44,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758359_17535, type=LAST_IN_PIPELINE terminating
2025-07-21 08:35:45,204 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758359_17535 replica FinalizedReplica, blk_1073758359_17535, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758359 for deletion
2025-07-21 08:35:45,205 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758359_17535 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758359
2025-07-21 08:37:49,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758361_17537 src: /192.168.158.5:34432 dest: /192.168.158.4:9866
2025-07-21 08:37:49,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34432, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1414434773_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758361_17537, duration(ns): 17500388
2025-07-21 08:37:49,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758361_17537, type=LAST_IN_PIPELINE terminating
2025-07-21 08:37:54,212 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758361_17537 replica FinalizedReplica, blk_1073758361_17537, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758361 for deletion
2025-07-21 08:37:54,213 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758361_17537 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758361
2025-07-21 08:38:54,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758362_17538 src: /192.168.158.1:39388 dest: /192.168.158.4:9866
2025-07-21 08:38:54,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39388, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_981830754_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758362_17538, duration(ns): 26100985
2025-07-21 08:38:54,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758362_17538, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-21 08:39:00,214 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758362_17538 replica FinalizedReplica, blk_1073758362_17538, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758362 for deletion
2025-07-21 08:39:00,215 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758362_17538 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758362
2025-07-21 08:40:59,389 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758364_17540 src: /192.168.158.8:59182 dest: /192.168.158.4:9866
2025-07-21 08:40:59,417 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:59182, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_969535254_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758364_17540, duration(ns): 22065556
2025-07-21 08:40:59,417 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758364_17540, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 08:41:03,218 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758364_17540 replica FinalizedReplica, blk_1073758364_17540, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758364 for deletion
2025-07-21 08:41:03,219 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758364_17540 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758364
2025-07-21 08:43:59,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758367_17543 src: /192.168.158.6:48178 dest: /192.168.158.4:9866
2025-07-21 08:43:59,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48178, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_595686700_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758367_17543, duration(ns): 21149171
2025-07-21 08:43:59,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758367_17543, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-21 08:44:00,225 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758367_17543 replica FinalizedReplica, blk_1073758367_17543, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758367 for deletion
2025-07-21 08:44:00,226 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758367_17543 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758367
2025-07-21 08:45:59,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758369_17545 src: /192.168.158.9:52378 dest: /192.168.158.4:9866
2025-07-21 08:45:59,420 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52378, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1464763223_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758369_17545, duration(ns): 20002897
2025-07-21 08:45:59,420 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758369_17545, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 08:46:00,226 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758369_17545 replica FinalizedReplica, blk_1073758369_17545, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758369 for deletion
2025-07-21 08:46:00,228 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758369_17545 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758369
2025-07-21 08:53:09,390 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758376_17552 src: /192.168.158.1:56988 dest: /192.168.158.4:9866
2025-07-21 08:53:09,425 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56988, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-233878418_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758376_17552, duration(ns): 25324471
2025-07-21 08:53:09,425 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758376_17552, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-21 08:53:15,243 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758376_17552 replica FinalizedReplica, blk_1073758376_17552, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758376 for deletion
2025-07-21 08:53:15,244 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758376_17552 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758376
2025-07-21 08:55:09,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758378_17554 src: /192.168.158.5:55484 dest: /192.168.158.4:9866
2025-07-21 08:55:09,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55484, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1350503453_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758378_17554, duration(ns): 21825302
2025-07-21 08:55:09,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758378_17554, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-21 08:55:12,248 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758378_17554 replica FinalizedReplica, blk_1073758378_17554, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758378 for deletion
2025-07-21 08:55:12,249 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758378_17554 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758378
2025-07-21 08:57:14,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758380_17556 src: /192.168.158.1:43174 dest: /192.168.158.4:9866
2025-07-21 08:57:14,434 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43174, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1991085130_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758380_17556, duration(ns): 25014863
2025-07-21 08:57:14,434 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758380_17556, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-21 08:57:15,253 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758380_17556 replica FinalizedReplica, blk_1073758380_17556, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758380 for deletion
2025-07-21 08:57:15,254 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758380_17556 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758380
2025-07-21 08:58:14,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758381_17557 src: /192.168.158.1:35860 dest: /192.168.158.4:9866
2025-07-21 08:58:14,435 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35860, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1319021563_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758381_17557, duration(ns): 26824145
2025-07-21 08:58:14,436 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758381_17557, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-21 08:58:15,259 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758381_17557 replica FinalizedReplica, blk_1073758381_17557, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758381 for deletion
2025-07-21 08:58:15,260 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758381_17557 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758381
2025-07-21 09:01:19,419 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758384_17560 src: /192.168.158.5:52890 dest: /192.168.158.4:9866
2025-07-21 09:01:19,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52890, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1091880222_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758384_17560, duration(ns): 17416462
2025-07-21 09:01:19,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758384_17560, type=LAST_IN_PIPELINE terminating
2025-07-21 09:01:21,261 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758384_17560 replica FinalizedReplica, blk_1073758384_17560, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758384 for deletion
2025-07-21 09:01:21,262 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758384_17560 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758384
2025-07-21 09:03:29,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758386_17562 src: /192.168.158.6:46738 dest: /192.168.158.4:9866
2025-07-21 09:03:29,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46738, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-681869887_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758386_17562, duration(ns): 16741032
2025-07-21 09:03:29,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758386_17562, type=LAST_IN_PIPELINE terminating
2025-07-21 09:03:30,268 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758386_17562 replica FinalizedReplica, blk_1073758386_17562, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758386 for deletion
2025-07-21 09:03:30,269 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758386_17562 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758386
2025-07-21 09:05:39,446 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758388_17564 src: /192.168.158.6:54992 dest: /192.168.158.4:9866
2025-07-21 09:05:39,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54992, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-460702908_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758388_17564, duration(ns): 22032778
2025-07-21 09:05:39,474 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758388_17564, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 09:05:42,273 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758388_17564 replica FinalizedReplica, blk_1073758388_17564, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758388 for deletion
2025-07-21 09:05:42,274 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758388_17564 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758388
2025-07-21 09:06:44,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758389_17565 src: /192.168.158.9:36360 dest: /192.168.158.4:9866
2025-07-21 09:06:44,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36360, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_851937897_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758389_17565, duration(ns): 20862269
2025-07-21 09:06:44,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758389_17565, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-21 09:06:48,275 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758389_17565 replica FinalizedReplica, blk_1073758389_17565, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758389 for deletion
2025-07-21 09:06:48,276 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758389_17565 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758389
2025-07-21 09:08:44,453 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758391_17567 src: /192.168.158.1:42030 dest: /192.168.158.4:9866
2025-07-21 09:08:44,488 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42030, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1584285459_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758391_17567, duration(ns): 26242017
2025-07-21 09:08:44,488 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758391_17567, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-21 09:08:45,277 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758391_17567 replica FinalizedReplica, blk_1073758391_17567, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758391 for deletion
2025-07-21 09:08:45,278 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758391_17567 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758391
2025-07-21 09:11:44,463 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758394_17570 src: /192.168.158.8:46566 dest: /192.168.158.4:9866
2025-07-21 09:11:44,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46566, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_157422761_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758394_17570, duration(ns): 17925401
2025-07-21 09:11:44,484 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758394_17570, type=LAST_IN_PIPELINE terminating
2025-07-21 09:11:45,283 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758394_17570 replica FinalizedReplica, blk_1073758394_17570, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758394 for deletion
2025-07-21 09:11:45,284 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758394_17570 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758394
2025-07-21 09:12:44,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758395_17571 src: /192.168.158.7:40726 dest: /192.168.158.4:9866
2025-07-21 09:12:44,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40726, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1428629603_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758395_17571, duration(ns): 17362340
2025-07-21 09:12:44,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758395_17571, type=LAST_IN_PIPELINE terminating
2025-07-21 09:12:48,283 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758395_17571 replica FinalizedReplica, blk_1073758395_17571, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758395 for deletion
2025-07-21 09:12:48,284 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758395_17571 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758395
2025-07-21 09:13:49,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758396_17572 src: /192.168.158.1:38840 dest: /192.168.158.4:9866
2025-07-21 09:13:49,481 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38840, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1232481629_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758396_17572, duration(ns): 26332027
2025-07-21 09:13:49,481 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758396_17572, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-21 09:13:51,285 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758396_17572 replica FinalizedReplica, blk_1073758396_17572, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758396 for deletion
2025-07-21 09:13:51,286 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758396_17572 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758396
2025-07-21 09:14:49,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758397_17573 src: /192.168.158.1:34652 dest: /192.168.158.4:9866
2025-07-21 09:14:49,471 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34652, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1011003596_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758397_17573, duration(ns): 23790763
2025-07-21 09:14:49,472 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758397_17573, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-21 09:14:51,287 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758397_17573 replica FinalizedReplica, blk_1073758397_17573, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758397 for deletion
2025-07-21 09:14:51,288 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758397_17573 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758397
2025-07-21 09:16:49,456 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758399_17575 src: /192.168.158.1:52040 dest: /192.168.158.4:9866
2025-07-21 09:16:49,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52040, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_568736062_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758399_17575, duration(ns): 27976947
2025-07-21 09:16:49,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758399_17575, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-21 09:16:54,290 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758399_17575 replica FinalizedReplica, blk_1073758399_17575, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758399 for deletion
2025-07-21 09:16:54,291 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758399_17575 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758399
2025-07-21 09:20:49,472 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758403_17579 src: /192.168.158.7:35538 dest: /192.168.158.4:9866
2025-07-21 09:20:49,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35538, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-805822376_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758403_17579, duration(ns): 18488685
2025-07-21 09:20:49,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758403_17579, type=LAST_IN_PIPELINE terminating
2025-07-21 09:20:51,296 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758403_17579 replica FinalizedReplica, blk_1073758403_17579, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758403 for deletion
2025-07-21 09:20:51,297 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758403_17579 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758403
2025-07-21 09:23:54,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758406_17582 src: /192.168.158.6:50708 dest: /192.168.158.4:9866
2025-07-21 09:23:54,490 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50708, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2117400640_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758406_17582, duration(ns): 17329374
2025-07-21 09:23:54,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758406_17582, type=LAST_IN_PIPELINE terminating
2025-07-21 09:23:57,305 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758406_17582 replica FinalizedReplica, blk_1073758406_17582, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758406 for deletion
2025-07-21 09:23:57,306 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758406_17582 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758406
2025-07-21 09:24:54,471 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758407_17583 src: /192.168.158.7:37802 dest: /192.168.158.4:9866
2025-07-21 09:24:54,492 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:37802, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-468805291_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758407_17583, duration(ns): 18555797
2025-07-21 09:24:54,492 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758407_17583, type=LAST_IN_PIPELINE terminating
2025-07-21 09:24:57,309 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758407_17583 replica FinalizedReplica, blk_1073758407_17583, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758407 for deletion
2025-07-21 09:24:57,310 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758407_17583 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758407
2025-07-21 09:26:54,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758409_17585 src: /192.168.158.1:59304 dest: /192.168.158.4:9866
2025-07-21 09:26:54,515 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59304, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1527509894_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758409_17585, duration(ns): 24240712
2025-07-21 09:26:54,515 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758409_17585, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-21 09:26:57,314 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758409_17585 replica FinalizedReplica, blk_1073758409_17585, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758409 for deletion
2025-07-21 09:26:57,316 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758409_17585 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758409
2025-07-21 09:27:54,475 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758410_17586 src: /192.168.158.9:38956 dest: /192.168.158.4:9866
2025-07-21 09:27:54,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38956, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1085388403_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758410_17586, duration(ns): 21717121
2025-07-21 09:27:54,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758410_17586, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 09:27:57,316 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758410_17586 replica FinalizedReplica, blk_1073758410_17586, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758410 for deletion
2025-07-21 09:27:57,318 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758410_17586 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758410
2025-07-21 09:30:54,536 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758413_17589 src: /192.168.158.6:60828 dest: /192.168.158.4:9866
2025-07-21 09:30:54,554 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60828, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_820680863_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758413_17589, duration(ns): 16743529
2025-07-21 09:30:54,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758413_17589, type=LAST_IN_PIPELINE terminating
2025-07-21 09:30:57,321 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758413_17589 replica FinalizedReplica, blk_1073758413_17589, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758413 for deletion 2025-07-21 09:30:57,322 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758413_17589 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758413 2025-07-21 09:31:54,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758414_17590 src: /192.168.158.6:38262 dest: /192.168.158.4:9866 2025-07-21 09:31:54,511 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38262, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2018832515_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758414_17590, duration(ns): 20595468 2025-07-21 09:31:54,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758414_17590, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-21 09:31:57,321 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758414_17590 replica FinalizedReplica, blk_1073758414_17590, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758414 for deletion 2025-07-21 09:31:57,325 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758414_17590 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758414 
2025-07-21 09:37:04,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758419_17595 src: /192.168.158.1:35208 dest: /192.168.158.4:9866 2025-07-21 09:37:04,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35208, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1155590241_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758419_17595, duration(ns): 24595518 2025-07-21 09:37:04,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758419_17595, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-21 09:37:09,332 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758419_17595 replica FinalizedReplica, blk_1073758419_17595, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758419 for deletion 2025-07-21 09:37:09,333 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758419_17595 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758419 2025-07-21 09:40:09,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758422_17598 src: /192.168.158.9:43704 dest: /192.168.158.4:9866 2025-07-21 09:40:09,527 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43704, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_105434976_236, 
offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758422_17598, duration(ns): 23853616 2025-07-21 09:40:09,527 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758422_17598, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-21 09:40:15,341 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758422_17598 replica FinalizedReplica, blk_1073758422_17598, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758422 for deletion 2025-07-21 09:40:15,342 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758422_17598 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758422 2025-07-21 09:41:09,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758423_17599 src: /192.168.158.9:40212 dest: /192.168.158.4:9866 2025-07-21 09:41:09,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40212, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-624105578_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758423_17599, duration(ns): 21734391 2025-07-21 09:41:09,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758423_17599, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-21 09:41:15,341 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758423_17599 replica FinalizedReplica, blk_1073758423_17599, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758423 for deletion 2025-07-21 09:41:15,342 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758423_17599 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758423 2025-07-21 09:44:09,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758426_17602 src: /192.168.158.8:54962 dest: /192.168.158.4:9866 2025-07-21 09:44:09,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54962, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_218512709_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758426_17602, duration(ns): 16631641 2025-07-21 09:44:09,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758426_17602, type=LAST_IN_PIPELINE terminating 2025-07-21 09:44:15,345 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758426_17602 replica FinalizedReplica, blk_1073758426_17602, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758426 for deletion 2025-07-21 09:44:15,346 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758426_17602 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758426 2025-07-21 09:45:09,478 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758427_17603 src: /192.168.158.1:58580 dest: /192.168.158.4:9866 2025-07-21 09:45:09,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58580, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1140976033_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758427_17603, duration(ns): 25731409 2025-07-21 09:45:09,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758427_17603, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-21 09:45:15,346 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758427_17603 replica FinalizedReplica, blk_1073758427_17603, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758427 for deletion 2025-07-21 09:45:15,347 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758427_17603 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758427 2025-07-21 09:47:09,478 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073758429_17605 src: /192.168.158.1:57270 dest: /192.168.158.4:9866 2025-07-21 09:47:09,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57270, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1624337500_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758429_17605, duration(ns): 26386943 2025-07-21 09:47:09,514 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758429_17605, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-21 09:47:15,355 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758429_17605 replica FinalizedReplica, blk_1073758429_17605, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758429 for deletion 2025-07-21 09:47:15,356 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758429_17605 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758429 2025-07-21 09:48:09,487 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758430_17606 src: /192.168.158.5:54454 dest: /192.168.158.4:9866 2025-07-21 09:48:09,508 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54454, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1598530072_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073758430_17606, duration(ns): 19265394 2025-07-21 09:48:09,509 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758430_17606, type=LAST_IN_PIPELINE terminating 2025-07-21 09:48:18,356 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758430_17606 replica FinalizedReplica, blk_1073758430_17606, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758430 for deletion 2025-07-21 09:48:18,358 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758430_17606 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758430 2025-07-21 09:49:14,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758431_17607 src: /192.168.158.6:35972 dest: /192.168.158.4:9866 2025-07-21 09:49:14,511 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:35972, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1465091539_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758431_17607, duration(ns): 21587219 2025-07-21 09:49:14,511 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758431_17607, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-21 09:49:18,359 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758431_17607 replica 
FinalizedReplica, blk_1073758431_17607, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758431 for deletion 2025-07-21 09:49:18,360 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758431_17607 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758431 2025-07-21 09:50:14,492 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758432_17608 src: /192.168.158.1:48136 dest: /192.168.158.4:9866 2025-07-21 09:50:14,526 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1869276893_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758432_17608, duration(ns): 25460569 2025-07-21 09:50:14,526 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758432_17608, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-21 09:50:21,359 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758432_17608 replica FinalizedReplica, blk_1073758432_17608, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758432 for deletion 2025-07-21 09:50:21,360 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073758432_17608 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758432 2025-07-21 09:51:14,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758433_17609 src: /192.168.158.9:36930 dest: /192.168.158.4:9866 2025-07-21 09:51:14,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36930, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-213037950_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758433_17609, duration(ns): 16533953 2025-07-21 09:51:14,513 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758433_17609, type=LAST_IN_PIPELINE terminating 2025-07-21 09:51:21,360 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758433_17609 replica FinalizedReplica, blk_1073758433_17609, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758433 for deletion 2025-07-21 09:51:21,362 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758433_17609 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758433 2025-07-21 09:58:14,506 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758440_17616 src: /192.168.158.8:60050 dest: /192.168.158.4:9866 2025-07-21 09:58:14,527 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.8:60050, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2066187611_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758440_17616, duration(ns): 18814927 2025-07-21 09:58:14,528 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758440_17616, type=LAST_IN_PIPELINE terminating 2025-07-21 09:58:18,373 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758440_17616 replica FinalizedReplica, blk_1073758440_17616, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758440 for deletion 2025-07-21 09:58:18,375 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758440_17616 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758440 2025-07-21 09:59:18,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x12cfd9a757d31f55, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 4 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5. 
2025-07-21 09:59:18,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360 2025-07-21 10:00:14,511 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758442_17618 src: /192.168.158.5:42052 dest: /192.168.158.4:9866 2025-07-21 10:00:14,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42052, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-282235625_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758442_17618, duration(ns): 17945917 2025-07-21 10:00:14,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758442_17618, type=LAST_IN_PIPELINE terminating 2025-07-21 10:00:18,377 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758442_17618 replica FinalizedReplica, blk_1073758442_17618, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758442 for deletion 2025-07-21 10:00:18,378 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758442_17618 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758442 2025-07-21 10:09:19,527 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758451_17627 src: /192.168.158.1:38314 dest: /192.168.158.4:9866 2025-07-21 10:09:19,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38314, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1507794563_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758451_17627, duration(ns): 31784680 2025-07-21 10:09:19,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758451_17627, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-21 10:09:24,401 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758451_17627 replica FinalizedReplica, blk_1073758451_17627, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758451 for deletion 2025-07-21 10:09:24,402 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758451_17627 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758451 2025-07-21 10:10:19,527 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758452_17628 src: /192.168.158.1:50736 dest: /192.168.158.4:9866 2025-07-21 10:10:19,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50736, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_151592697_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758452_17628, duration(ns): 25572669 2025-07-21 10:10:19,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758452_17628, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-21 10:10:27,403 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758452_17628 replica FinalizedReplica, blk_1073758452_17628, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758452 for deletion 2025-07-21 10:10:27,405 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758452_17628 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758452 2025-07-21 10:11:19,523 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758453_17629 src: /192.168.158.7:52954 dest: /192.168.158.4:9866 2025-07-21 10:11:19,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52954, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-700819648_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758453_17629, duration(ns): 24945642 2025-07-21 10:11:19,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758453_17629, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-21 10:11:27,405 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758453_17629 replica FinalizedReplica, blk_1073758453_17629, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758453 for deletion 2025-07-21 10:11:27,407 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758453_17629 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758453 2025-07-21 10:16:19,526 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758458_17634 src: /192.168.158.5:50362 dest: /192.168.158.4:9866 2025-07-21 10:16:19,553 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50362, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1102800077_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758458_17634, duration(ns): 21506343 2025-07-21 10:16:19,553 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758458_17634, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-21 10:16:27,414 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758458_17634 replica FinalizedReplica, blk_1073758458_17634, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758458 for deletion 2025-07-21 10:16:27,415 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758458_17634 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758458 
2025-07-21 10:17:24,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758459_17635 src: /192.168.158.1:39584 dest: /192.168.158.4:9866
2025-07-21 10:17:24,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39584, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1965327633_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758459_17635, duration(ns): 23088746
2025-07-21 10:17:24,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758459_17635, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-21 10:17:33,418 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758459_17635 replica FinalizedReplica, blk_1073758459_17635, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758459 for deletion
2025-07-21 10:17:33,420 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758459_17635 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758459
2025-07-21 10:18:24,536 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758460_17636 src: /192.168.158.8:49290 dest: /192.168.158.4:9866
2025-07-21 10:18:24,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49290, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-192430486_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758460_17636, duration(ns): 17543464
2025-07-21 10:18:24,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758460_17636, type=LAST_IN_PIPELINE terminating
2025-07-21 10:18:27,420 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758460_17636 replica FinalizedReplica, blk_1073758460_17636, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758460 for deletion
2025-07-21 10:18:27,421 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758460_17636 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758460
2025-07-21 10:19:24,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758461_17637 src: /192.168.158.8:51370 dest: /192.168.158.4:9866
2025-07-21 10:19:24,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51370, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1034117920_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758461_17637, duration(ns): 21613079
2025-07-21 10:19:24,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758461_17637, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-21 10:19:27,421 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758461_17637 replica FinalizedReplica, blk_1073758461_17637, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758461 for deletion
2025-07-21 10:19:27,423 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758461_17637 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758461
2025-07-21 10:21:29,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758463_17639 src: /192.168.158.7:35936 dest: /192.168.158.4:9866
2025-07-21 10:21:29,574 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35936, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_743623538_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758463_17639, duration(ns): 16674154
2025-07-21 10:21:29,574 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758463_17639, type=LAST_IN_PIPELINE terminating
2025-07-21 10:21:36,424 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758463_17639 replica FinalizedReplica, blk_1073758463_17639, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758463 for deletion
2025-07-21 10:21:36,425 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758463_17639 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir0/blk_1073758463
2025-07-21 10:22:29,547 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758464_17640 src: /192.168.158.1:60042 dest: /192.168.158.4:9866
2025-07-21 10:22:29,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60042, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1642318547_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758464_17640, duration(ns): 23172320
2025-07-21 10:22:29,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758464_17640, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-21 10:22:36,426 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758464_17640 replica FinalizedReplica, blk_1073758464_17640, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758464 for deletion
2025-07-21 10:22:36,428 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758464_17640 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758464
2025-07-21 10:24:34,552 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758466_17642 src: /192.168.158.7:42738 dest: /192.168.158.4:9866
2025-07-21 10:24:34,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42738, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1949205106_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758466_17642, duration(ns): 20684957
2025-07-21 10:24:34,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758466_17642, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-21 10:24:39,432 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758466_17642 replica FinalizedReplica, blk_1073758466_17642, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758466 for deletion
2025-07-21 10:24:39,433 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758466_17642 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758466
2025-07-21 10:25:34,546 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758467_17643 src: /192.168.158.1:43368 dest: /192.168.158.4:9866
2025-07-21 10:25:34,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43368, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_199163907_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758467_17643, duration(ns): 25530800
2025-07-21 10:25:34,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758467_17643, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-21 10:25:39,432 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758467_17643 replica FinalizedReplica, blk_1073758467_17643, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758467 for deletion
2025-07-21 10:25:39,433 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758467_17643 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758467
2025-07-21 10:28:39,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758470_17646 src: /192.168.158.7:59540 dest: /192.168.158.4:9866
2025-07-21 10:28:39,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59540, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-656489927_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758470_17646, duration(ns): 18827778
2025-07-21 10:28:39,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758470_17646, type=LAST_IN_PIPELINE terminating
2025-07-21 10:28:45,436 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758470_17646 replica FinalizedReplica, blk_1073758470_17646, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758470 for deletion
2025-07-21 10:28:45,437 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758470_17646 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758470
2025-07-21 10:29:39,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758471_17647 src: /192.168.158.5:53328 dest: /192.168.158.4:9866
2025-07-21 10:29:39,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53328, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-99756493_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758471_17647, duration(ns): 21935922
2025-07-21 10:29:39,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758471_17647, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 10:29:42,438 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758471_17647 replica FinalizedReplica, blk_1073758471_17647, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758471 for deletion
2025-07-21 10:29:42,440 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758471_17647 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758471
2025-07-21 10:30:39,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758472_17648 src: /192.168.158.1:56542 dest: /192.168.158.4:9866
2025-07-21 10:30:39,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56542, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1994635569_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758472_17648, duration(ns): 25999377
2025-07-21 10:30:39,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758472_17648, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-21 10:30:45,439 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758472_17648 replica FinalizedReplica, blk_1073758472_17648, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758472 for deletion
2025-07-21 10:30:45,440 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758472_17648 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758472
2025-07-21 10:33:39,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758475_17651 src: /192.168.158.6:59188 dest: /192.168.158.4:9866
2025-07-21 10:33:39,586 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59188, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1856846750_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758475_17651, duration(ns): 17203682
2025-07-21 10:33:39,586 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758475_17651, type=LAST_IN_PIPELINE terminating
2025-07-21 10:33:42,447 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758475_17651 replica FinalizedReplica, blk_1073758475_17651, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758475 for deletion
2025-07-21 10:33:42,448 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758475_17651 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758475
2025-07-21 10:34:44,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758476_17652 src: /192.168.158.1:55252 dest: /192.168.158.4:9866
2025-07-21 10:34:44,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55252, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1259084083_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758476_17652, duration(ns): 22920283
2025-07-21 10:34:44,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758476_17652, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-21 10:34:48,449 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758476_17652 replica FinalizedReplica, blk_1073758476_17652, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758476 for deletion
2025-07-21 10:34:48,450 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758476_17652 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758476
2025-07-21 10:35:44,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758477_17653 src: /192.168.158.7:45970 dest: /192.168.158.4:9866
2025-07-21 10:35:44,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:45970, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-566653719_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758477_17653, duration(ns): 19882409
2025-07-21 10:35:44,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758477_17653, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-21 10:35:48,452 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758477_17653 replica FinalizedReplica, blk_1073758477_17653, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758477 for deletion
2025-07-21 10:35:48,453 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758477_17653 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758477
2025-07-21 10:41:54,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758483_17659 src: /192.168.158.9:57790 dest: /192.168.158.4:9866
2025-07-21 10:41:54,628 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57790, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-559849701_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758483_17659, duration(ns): 20522795
2025-07-21 10:41:54,628 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758483_17659, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-21 10:41:57,466 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758483_17659 replica FinalizedReplica, blk_1073758483_17659, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758483 for deletion
2025-07-21 10:41:57,467 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758483_17659 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758483
2025-07-21 10:47:59,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758489_17665 src: /192.168.158.9:36602 dest: /192.168.158.4:9866
2025-07-21 10:47:59,611 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36602, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1619889373_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758489_17665, duration(ns): 22905668
2025-07-21 10:47:59,611 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758489_17665, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-21 10:48:03,475 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758489_17665 replica FinalizedReplica, blk_1073758489_17665, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758489 for deletion
2025-07-21 10:48:03,476 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758489_17665 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758489
2025-07-21 10:48:59,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758490_17666 src: /192.168.158.1:32842 dest: /192.168.158.4:9866
2025-07-21 10:48:59,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:32842, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1228865048_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758490_17666, duration(ns): 25766775
2025-07-21 10:48:59,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758490_17666, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-21 10:49:06,478 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758490_17666 replica FinalizedReplica, blk_1073758490_17666, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758490 for deletion
2025-07-21 10:49:06,479 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758490_17666 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758490
2025-07-21 10:51:59,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758493_17669 src: /192.168.158.9:36870 dest: /192.168.158.4:9866
2025-07-21 10:51:59,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36870, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-834199884_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758493_17669, duration(ns): 17608494
2025-07-21 10:51:59,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758493_17669, type=LAST_IN_PIPELINE terminating
2025-07-21 10:52:06,484 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758493_17669 replica FinalizedReplica, blk_1073758493_17669, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758493 for deletion
2025-07-21 10:52:06,485 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758493_17669 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758493
2025-07-21 10:53:59,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758495_17671 src: /192.168.158.1:58262 dest: /192.168.158.4:9866
2025-07-21 10:53:59,632 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58262, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1155631886_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758495_17671, duration(ns): 25244260
2025-07-21 10:53:59,632 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758495_17671, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-21 10:54:06,488 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758495_17671 replica FinalizedReplica, blk_1073758495_17671, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758495 for deletion
2025-07-21 10:54:06,489 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758495_17671 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758495
2025-07-21 10:54:59,603 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758496_17672 src: /192.168.158.9:42260 dest: /192.168.158.4:9866
2025-07-21 10:54:59,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42260, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_430224856_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758496_17672, duration(ns): 21258495
2025-07-21 10:54:59,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758496_17672, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-21 10:55:03,490 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758496_17672 replica FinalizedReplica, blk_1073758496_17672, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758496 for deletion
2025-07-21 10:55:03,491 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758496_17672 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758496
2025-07-21 10:55:59,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758497_17673 src: /192.168.158.8:49304 dest: /192.168.158.4:9866
2025-07-21 10:55:59,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49304, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_679793498_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758497_17673, duration(ns): 18047014
2025-07-21 10:55:59,623 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758497_17673, type=LAST_IN_PIPELINE terminating
2025-07-21 10:56:03,491 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758497_17673 replica FinalizedReplica, blk_1073758497_17673, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758497 for deletion
2025-07-21 10:56:03,493 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758497_17673 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758497
2025-07-21 11:01:04,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758502_17678 src: /192.168.158.7:35944 dest: /192.168.158.4:9866
2025-07-21 11:01:04,620 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35944, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-498756639_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758502_17678, duration(ns): 19001529
2025-07-21 11:01:04,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758502_17678, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-21 11:01:09,500 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758502_17678 replica FinalizedReplica, blk_1073758502_17678, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758502 for deletion
2025-07-21 11:01:09,501 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758502_17678 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758502
2025-07-21 11:02:04,610 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758503_17679 src: /192.168.158.8:51672 dest: /192.168.158.4:9866
2025-07-21 11:02:04,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_205859889_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758503_17679, duration(ns): 16783495
2025-07-21 11:02:04,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758503_17679, type=LAST_IN_PIPELINE terminating
2025-07-21 11:02:09,503 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758503_17679 replica FinalizedReplica, blk_1073758503_17679, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758503 for deletion
2025-07-21 11:02:09,504 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758503_17679 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758503
2025-07-21 11:07:04,610 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758508_17684 src: /192.168.158.1:35804 dest: /192.168.158.4:9866
2025-07-21 11:07:04,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35804, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2115766451_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758508_17684, duration(ns): 25025844
2025-07-21 11:07:04,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758508_17684, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-21 11:07:12,511 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758508_17684 replica FinalizedReplica, blk_1073758508_17684, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758508 for deletion
2025-07-21 11:07:12,512 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758508_17684 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758508
2025-07-21 11:09:04,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758510_17686 src: /192.168.158.9:34656 dest: /192.168.158.4:9866
2025-07-21 11:09:04,646 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34656, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-385939242_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758510_17686, duration(ns): 18734768
2025-07-21 11:09:04,646 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758510_17686, type=LAST_IN_PIPELINE terminating
2025-07-21 11:09:12,513 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758510_17686 replica FinalizedReplica, blk_1073758510_17686, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758510 for deletion
2025-07-21 11:09:12,514 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758510_17686 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758510
2025-07-21 11:12:09,603 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758513_17689 src: /192.168.158.6:54772 dest: /192.168.158.4:9866
2025-07-21 11:12:09,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54772, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1374815618_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758513_17689, duration(ns): 20384092
2025-07-21 11:12:09,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758513_17689, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 11:12:12,520 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758513_17689 replica FinalizedReplica, blk_1073758513_17689, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758513 for deletion
2025-07-21 11:12:12,521 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758513_17689 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758513
2025-07-21 11:15:09,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758516_17692 src: /192.168.158.6:56456 dest: /192.168.158.4:9866
2025-07-21 11:15:09,641 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56456, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1867074591_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758516_17692, duration(ns): 23793592
2025-07-21 11:15:09,642 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758516_17692,
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-21 11:15:15,528 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758516_17692 replica FinalizedReplica, blk_1073758516_17692, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758516 for deletion 2025-07-21 11:15:15,529 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758516_17692 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758516 2025-07-21 11:17:19,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758518_17694 src: /192.168.158.1:50450 dest: /192.168.158.4:9866 2025-07-21 11:17:19,646 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50450, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_553047086_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758518_17694, duration(ns): 28991817 2025-07-21 11:17:19,646 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758518_17694, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-21 11:17:24,531 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758518_17694 replica FinalizedReplica, blk_1073758518_17694, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758518 for deletion 2025-07-21 11:17:24,532 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758518_17694 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758518 2025-07-21 11:20:29,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758521_17697 src: /192.168.158.5:51250 dest: /192.168.158.4:9866 2025-07-21 11:20:29,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51250, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-438348799_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758521_17697, duration(ns): 20203645 2025-07-21 11:20:29,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758521_17697, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-21 11:20:36,534 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758521_17697 replica FinalizedReplica, blk_1073758521_17697, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758521 for deletion 2025-07-21 11:20:36,535 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758521_17697 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758521 
2025-07-21 11:23:39,641 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758524_17700 src: /192.168.158.8:47388 dest: /192.168.158.4:9866 2025-07-21 11:23:39,661 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:47388, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1952815034_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758524_17700, duration(ns): 17595748 2025-07-21 11:23:39,661 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758524_17700, type=LAST_IN_PIPELINE terminating 2025-07-21 11:23:42,538 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758524_17700 replica FinalizedReplica, blk_1073758524_17700, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758524 for deletion 2025-07-21 11:23:42,539 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758524_17700 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758524 2025-07-21 11:25:44,624 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758526_17702 src: /192.168.158.7:42676 dest: /192.168.158.4:9866 2025-07-21 11:25:44,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42676, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_952254219_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073758526_17702, duration(ns): 18385396 2025-07-21 11:25:44,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758526_17702, type=LAST_IN_PIPELINE terminating 2025-07-21 11:25:51,537 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758526_17702 replica FinalizedReplica, blk_1073758526_17702, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758526 for deletion 2025-07-21 11:25:51,538 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758526_17702 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758526 2025-07-21 11:26:44,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758527_17703 src: /192.168.158.1:35472 dest: /192.168.158.4:9866 2025-07-21 11:26:44,655 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-470329690_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758527_17703, duration(ns): 23448751 2025-07-21 11:26:44,655 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758527_17703, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-21 11:26:48,540 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling 
blk_1073758527_17703 replica FinalizedReplica, blk_1073758527_17703, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758527 for deletion 2025-07-21 11:26:48,541 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758527_17703 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758527 2025-07-21 11:27:44,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758528_17704 src: /192.168.158.6:58228 dest: /192.168.158.4:9866 2025-07-21 11:27:44,655 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58228, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1768279404_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758528_17704, duration(ns): 16477070 2025-07-21 11:27:44,655 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758528_17704, type=LAST_IN_PIPELINE terminating 2025-07-21 11:27:48,544 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758528_17704 replica FinalizedReplica, blk_1073758528_17704, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758528 for deletion 2025-07-21 11:27:48,545 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 
blk_1073758528_17704 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758528 2025-07-21 11:28:49,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758529_17705 src: /192.168.158.5:56004 dest: /192.168.158.4:9866 2025-07-21 11:28:49,647 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56004, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1899300346_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758529_17705, duration(ns): 16364571 2025-07-21 11:28:49,647 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758529_17705, type=LAST_IN_PIPELINE terminating 2025-07-21 11:28:57,544 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758529_17705 replica FinalizedReplica, blk_1073758529_17705, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758529 for deletion 2025-07-21 11:28:57,545 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758529_17705 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758529 2025-07-21 11:31:49,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758532_17708 src: /192.168.158.1:34454 dest: /192.168.158.4:9866 2025-07-21 11:31:49,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34454, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1623887961_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758532_17708, duration(ns): 23851010 2025-07-21 11:31:49,665 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758532_17708, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-21 11:31:57,550 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758532_17708 replica FinalizedReplica, blk_1073758532_17708, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758532 for deletion 2025-07-21 11:31:57,552 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758532_17708 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758532 2025-07-21 11:32:49,647 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758533_17709 src: /192.168.158.7:40824 dest: /192.168.158.4:9866 2025-07-21 11:32:49,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40824, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1616348220_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758533_17709, duration(ns): 17794789 2025-07-21 11:32:49,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758533_17709, 
type=LAST_IN_PIPELINE terminating 2025-07-21 11:32:54,554 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758533_17709 replica FinalizedReplica, blk_1073758533_17709, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758533 for deletion 2025-07-21 11:32:54,555 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758533_17709 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758533 2025-07-21 11:34:49,628 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758535_17711 src: /192.168.158.1:47010 dest: /192.168.158.4:9866 2025-07-21 11:34:49,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47010, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-686631916_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758535_17711, duration(ns): 25348071 2025-07-21 11:34:49,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758535_17711, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-21 11:34:54,560 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758535_17711 replica FinalizedReplica, blk_1073758535_17711, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758535 for deletion 2025-07-21 11:34:54,561 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758535_17711 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758535 2025-07-21 11:35:49,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758536_17712 src: /192.168.158.1:53456 dest: /192.168.158.4:9866 2025-07-21 11:35:49,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53456, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1690522108_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758536_17712, duration(ns): 25214496 2025-07-21 11:35:49,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758536_17712, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-21 11:35:54,560 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758536_17712 replica FinalizedReplica, blk_1073758536_17712, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758536 for deletion 2025-07-21 11:35:54,561 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758536_17712 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758536 2025-07-21 11:36:13,269 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0 2025-07-21 11:37:54,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758538_17714 src: /192.168.158.8:48118 dest: /192.168.158.4:9866 2025-07-21 11:37:54,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48118, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_364740139_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758538_17714, duration(ns): 16974642 2025-07-21 11:37:54,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758538_17714, type=LAST_IN_PIPELINE terminating 2025-07-21 11:37:57,564 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758538_17714 replica FinalizedReplica, blk_1073758538_17714, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758538 for deletion 2025-07-21 11:37:57,565 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758538_17714 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758538 2025-07-21 11:38:54,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073758539_17715 src: /192.168.158.8:46586 dest: /192.168.158.4:9866 2025-07-21 11:38:54,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46586, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1552556574_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758539_17715, duration(ns): 23311526 2025-07-21 11:38:54,665 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758539_17715, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-21 11:38:57,567 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758539_17715 replica FinalizedReplica, blk_1073758539_17715, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758539 for deletion 2025-07-21 11:38:57,568 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758539_17715 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758539 2025-07-21 11:39:54,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758540_17716 src: /192.168.158.1:42684 dest: /192.168.158.4:9866 2025-07-21 11:39:54,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42684, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1087160565_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073758540_17716, duration(ns): 27133478 2025-07-21 11:39:54,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758540_17716, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-21 11:39:57,570 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758540_17716 replica FinalizedReplica, blk_1073758540_17716, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758540 for deletion 2025-07-21 11:39:57,571 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758540_17716 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758540 2025-07-21 11:40:54,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758541_17717 src: /192.168.158.1:59258 dest: /192.168.158.4:9866 2025-07-21 11:40:54,671 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59258, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1086383858_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758541_17717, duration(ns): 25616456 2025-07-21 11:40:54,671 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758541_17717, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-21 11:40:57,573 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758541_17717 replica FinalizedReplica, blk_1073758541_17717, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758541 for deletion 2025-07-21 11:40:57,574 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758541_17717 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758541 2025-07-21 11:42:54,647 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758543_17719 src: /192.168.158.8:58642 dest: /192.168.158.4:9866 2025-07-21 11:42:54,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58642, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-537786423_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758543_17719, duration(ns): 21002431 2025-07-21 11:42:54,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758543_17719, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-21 11:42:57,574 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758543_17719 replica FinalizedReplica, blk_1073758543_17719, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758543 for deletion 2025-07-21 11:42:57,576 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758543_17719 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758543
2025-07-21 11:43:54,641 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758544_17720 src: /192.168.158.8:57806 dest: /192.168.158.4:9866
2025-07-21 11:43:54,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57806, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-521580053_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758544_17720, duration(ns): 20607496
2025-07-21 11:43:54,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758544_17720, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 11:43:57,577 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758544_17720 replica FinalizedReplica, blk_1073758544_17720, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758544 for deletion
2025-07-21 11:43:57,579 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758544_17720 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758544
2025-07-21 11:46:04,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758546_17722 src: /192.168.158.5:35000 dest: /192.168.158.4:9866
2025-07-21 11:46:04,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35000, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1993444654_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758546_17722, duration(ns): 15890811
2025-07-21 11:46:04,667 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758546_17722, type=LAST_IN_PIPELINE terminating
2025-07-21 11:46:12,584 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758546_17722 replica FinalizedReplica, blk_1073758546_17722, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758546 for deletion
2025-07-21 11:46:12,585 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758546_17722 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758546
2025-07-21 11:48:14,642 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758548_17724 src: /192.168.158.1:44906 dest: /192.168.158.4:9866
2025-07-21 11:48:14,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44906, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-412222456_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758548_17724, duration(ns): 23601917
2025-07-21 11:48:14,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758548_17724, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-21 11:48:18,590 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758548_17724 replica FinalizedReplica, blk_1073758548_17724, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758548 for deletion
2025-07-21 11:48:18,591 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758548_17724 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758548
2025-07-21 11:50:14,647 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758550_17726 src: /192.168.158.1:43050 dest: /192.168.158.4:9866
2025-07-21 11:50:14,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43050, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_91473577_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758550_17726, duration(ns): 26637235
2025-07-21 11:50:14,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758550_17726, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-21 11:50:18,595 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758550_17726 replica FinalizedReplica, blk_1073758550_17726, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758550 for deletion
2025-07-21 11:50:18,596 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758550_17726 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758550
2025-07-21 11:53:14,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758553_17729 src: /192.168.158.1:40702 dest: /192.168.158.4:9866
2025-07-21 11:53:14,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40702, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2099111894_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758553_17729, duration(ns): 25525794
2025-07-21 11:53:14,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758553_17729, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-21 11:53:21,604 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758553_17729 replica FinalizedReplica, blk_1073758553_17729, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758553 for deletion
2025-07-21 11:53:21,605 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758553_17729 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758553
2025-07-21 11:54:19,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758554_17730 src: /192.168.158.5:53048 dest: /192.168.158.4:9866
2025-07-21 11:54:19,691 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53048, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-521222520_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758554_17730, duration(ns): 19627621
2025-07-21 11:54:19,691 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758554_17730, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-21 11:54:24,605 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758554_17730 replica FinalizedReplica, blk_1073758554_17730, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758554 for deletion
2025-07-21 11:54:24,607 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758554_17730 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758554
2025-07-21 11:55:19,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758555_17731 src: /192.168.158.7:38048 dest: /192.168.158.4:9866
2025-07-21 11:55:19,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38048, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2053880168_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758555_17731, duration(ns): 16320625
2025-07-21 11:55:19,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758555_17731, type=LAST_IN_PIPELINE terminating
2025-07-21 11:55:27,610 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758555_17731 replica FinalizedReplica, blk_1073758555_17731, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758555 for deletion
2025-07-21 11:55:27,611 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758555_17731 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758555
2025-07-21 11:58:19,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758558_17734 src: /192.168.158.6:34796 dest: /192.168.158.4:9866
2025-07-21 11:58:19,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34796, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1735895910_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758558_17734, duration(ns): 17526044
2025-07-21 11:58:19,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758558_17734, type=LAST_IN_PIPELINE terminating
2025-07-21 11:58:24,616 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758558_17734 replica FinalizedReplica, blk_1073758558_17734, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758558 for deletion
2025-07-21 11:58:24,617 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758558_17734 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758558
2025-07-21 11:59:19,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758559_17735 src: /192.168.158.1:37808 dest: /192.168.158.4:9866
2025-07-21 11:59:19,720 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37808, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_860441712_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758559_17735, duration(ns): 27697401
2025-07-21 11:59:19,720 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758559_17735, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-21 11:59:24,617 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758559_17735 replica FinalizedReplica, blk_1073758559_17735, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758559 for deletion
2025-07-21 11:59:24,619 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758559_17735 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758559
2025-07-21 12:02:24,711 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758562_17738 src: /192.168.158.7:43236 dest: /192.168.158.4:9866
2025-07-21 12:02:24,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43236, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1427480456_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758562_17738, duration(ns): 19860346
2025-07-21 12:02:24,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758562_17738, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-21 12:02:27,619 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758562_17738 replica FinalizedReplica, blk_1073758562_17738, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758562 for deletion
2025-07-21 12:02:27,621 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758562_17738 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758562
2025-07-21 12:03:24,711 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758563_17739 src: /192.168.158.1:35158 dest: /192.168.158.4:9866
2025-07-21 12:03:24,746 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35158, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1959538389_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758563_17739, duration(ns): 25448926
2025-07-21 12:03:24,746 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758563_17739, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-21 12:03:27,620 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758563_17739 replica FinalizedReplica, blk_1073758563_17739, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758563 for deletion
2025-07-21 12:03:27,621 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758563_17739 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758563
2025-07-21 12:08:24,716 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758568_17744 src: /192.168.158.9:42888 dest: /192.168.158.4:9866
2025-07-21 12:08:24,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42888, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-186700886_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758568_17744, duration(ns): 17407777
2025-07-21 12:08:24,736 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758568_17744, type=LAST_IN_PIPELINE terminating
2025-07-21 12:08:27,630 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758568_17744 replica FinalizedReplica, blk_1073758568_17744, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758568 for deletion
2025-07-21 12:08:27,632 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758568_17744 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758568
2025-07-21 12:09:24,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758569_17745 src: /192.168.158.6:34030 dest: /192.168.158.4:9866
2025-07-21 12:09:24,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34030, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1756264845_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758569_17745, duration(ns): 21073046
2025-07-21 12:09:24,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758569_17745, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-21 12:09:27,631 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758569_17745 replica FinalizedReplica, blk_1073758569_17745, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758569 for deletion
2025-07-21 12:09:27,635 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758569_17745 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758569
2025-07-21 12:10:24,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758570_17746 src: /192.168.158.5:48210 dest: /192.168.158.4:9866
2025-07-21 12:10:24,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48210, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1325071210_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758570_17746, duration(ns): 23677123
2025-07-21 12:10:24,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758570_17746, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-21 12:10:27,633 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758570_17746 replica FinalizedReplica, blk_1073758570_17746, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758570 for deletion
2025-07-21 12:10:27,634 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758570_17746 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758570
2025-07-21 12:13:24,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758573_17749 src: /192.168.158.5:40160 dest: /192.168.158.4:9866
2025-07-21 12:13:24,728 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40160, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1573132789_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758573_17749, duration(ns): 16723893
2025-07-21 12:13:24,728 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758573_17749, type=LAST_IN_PIPELINE terminating
2025-07-21 12:13:27,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758573_17749 replica FinalizedReplica, blk_1073758573_17749, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758573 for deletion
2025-07-21 12:13:27,637 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758573_17749 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758573
2025-07-21 12:15:29,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758575_17751 src: /192.168.158.8:60960 dest: /192.168.158.4:9866
2025-07-21 12:15:29,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60960, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1342000954_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758575_17751, duration(ns): 15658040
2025-07-21 12:15:29,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758575_17751, type=LAST_IN_PIPELINE terminating
2025-07-21 12:15:33,641 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758575_17751 replica FinalizedReplica, blk_1073758575_17751, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758575 for deletion
2025-07-21 12:15:33,642 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758575_17751 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758575
2025-07-21 12:16:29,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758576_17752 src: /192.168.158.6:46086 dest: /192.168.158.4:9866
2025-07-21 12:16:29,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46086, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1955570807_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758576_17752, duration(ns): 23358164
2025-07-21 12:16:29,772 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758576_17752, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 12:16:33,645 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758576_17752 replica FinalizedReplica, blk_1073758576_17752, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758576 for deletion
2025-07-21 12:16:33,646 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758576_17752 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758576
2025-07-21 12:17:34,713 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758577_17753 src: /192.168.158.8:37530 dest: /192.168.158.4:9866
2025-07-21 12:17:34,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37530, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_489784608_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758577_17753, duration(ns): 16799176
2025-07-21 12:17:34,733 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758577_17753, type=LAST_IN_PIPELINE terminating
2025-07-21 12:17:39,647 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758577_17753 replica FinalizedReplica, blk_1073758577_17753, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758577 for deletion
2025-07-21 12:17:39,648 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758577_17753 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758577
2025-07-21 12:20:34,790 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758580_17756 src: /192.168.158.9:54380 dest: /192.168.158.4:9866
2025-07-21 12:20:34,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54380, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-590879121_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758580_17756, duration(ns): 20522496
2025-07-21 12:20:34,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758580_17756, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 12:20:39,653 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758580_17756 replica FinalizedReplica, blk_1073758580_17756, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758580 for deletion
2025-07-21 12:20:39,654 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758580_17756 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758580
2025-07-21 12:24:34,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758584_17760 src: /192.168.158.5:39322 dest: /192.168.158.4:9866
2025-07-21 12:24:34,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39322, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_479790655_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758584_17760, duration(ns): 43143637
2025-07-21 12:24:34,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758584_17760, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 12:24:39,663 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758584_17760 replica FinalizedReplica, blk_1073758584_17760, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758584 for deletion
2025-07-21 12:24:39,664 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758584_17760 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758584
2025-07-21 12:28:34,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758588_17764 src: /192.168.158.7:54122 dest: /192.168.158.4:9866
2025-07-21 12:28:34,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54122, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-455907934_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758588_17764, duration(ns): 19865911
2025-07-21 12:28:34,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758588_17764, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 12:28:39,668 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758588_17764 replica FinalizedReplica, blk_1073758588_17764, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758588 for deletion
2025-07-21 12:28:39,669 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758588_17764 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758588
2025-07-21 12:30:34,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758590_17766 src: /192.168.158.7:46278 dest: /192.168.158.4:9866
2025-07-21 12:30:34,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46278, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1319430748_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758590_17766, duration(ns): 16888556
2025-07-21 12:30:34,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758590_17766, type=LAST_IN_PIPELINE terminating
2025-07-21 12:30:39,674 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758590_17766 replica FinalizedReplica, blk_1073758590_17766, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758590 for deletion
2025-07-21 12:30:39,675 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758590_17766 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758590
2025-07-21 12:31:34,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758591_17767 src: /192.168.158.6:55164 dest: /192.168.158.4:9866
2025-07-21 12:31:34,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55164, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_782995433_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758591_17767, duration(ns): 16327798
2025-07-21 12:31:34,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758591_17767, type=LAST_IN_PIPELINE terminating
2025-07-21 12:31:39,677 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758591_17767 replica FinalizedReplica, blk_1073758591_17767, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758591 for deletion
2025-07-21 12:31:39,678 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758591_17767 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758591
2025-07-21 12:33:34,742 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758593_17769 src: /192.168.158.7:59296 dest: /192.168.158.4:9866
2025-07-21 12:33:34,801 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59296, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1908899063_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758593_17769, duration(ns): 53260375
2025-07-21 12:33:34,801 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758593_17769, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 12:33:39,679 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758593_17769 replica FinalizedReplica, blk_1073758593_17769, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758593 for deletion
2025-07-21 12:33:39,680 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758593_17769 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758593
2025-07-21 12:36:34,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758596_17772 src: /192.168.158.8:48842 dest: /192.168.158.4:9866
2025-07-21 12:36:34,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48842, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_301093288_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758596_17772, duration(ns): 20615151
2025-07-21 12:36:34,777 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758596_17772, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-21 12:36:42,685 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758596_17772 replica FinalizedReplica, blk_1073758596_17772, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758596 for deletion
2025-07-21 12:36:42,686 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758596_17772 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758596
2025-07-21 12:40:34,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758600_17776 src: /192.168.158.7:44952 dest: /192.168.158.4:9866
2025-07-21 12:40:34,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44952, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1710726585_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758600_17776, duration(ns): 16404489
2025-07-21 12:40:34,823 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758600_17776, type=LAST_IN_PIPELINE terminating
2025-07-21 12:40:42,694 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758600_17776 replica FinalizedReplica, blk_1073758600_17776, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758600 for deletion
2025-07-21 12:40:42,695 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758600_17776 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758600
2025-07-21 12:42:34,806 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758602_17778 src: /192.168.158.6:37166 dest: /192.168.158.4:9866
2025-07-21 12:42:34,831 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37166, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-466590398_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758602_17778, duration(ns): 19995378
2025-07-21 12:42:34,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758602_17778, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 12:42:39,700 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758602_17778 replica FinalizedReplica, blk_1073758602_17778, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() =
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758602 for deletion 2025-07-21 12:42:39,701 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758602_17778 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758602 2025-07-21 12:43:34,764 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758603_17779 src: /192.168.158.9:57816 dest: /192.168.158.4:9866 2025-07-21 12:43:34,834 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:57816, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-970349578_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758603_17779, duration(ns): 65280391 2025-07-21 12:43:34,835 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758603_17779, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-21 12:43:39,700 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758603_17779 replica FinalizedReplica, blk_1073758603_17779, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758603 for deletion 2025-07-21 12:43:39,702 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758603_17779 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758603 
2025-07-21 12:46:34,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758606_17782 src: /192.168.158.6:53716 dest: /192.168.158.4:9866 2025-07-21 12:46:34,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53716, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-957227615_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758606_17782, duration(ns): 15983745 2025-07-21 12:46:34,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758606_17782, type=LAST_IN_PIPELINE terminating 2025-07-21 12:46:39,707 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758606_17782 replica FinalizedReplica, blk_1073758606_17782, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758606 for deletion 2025-07-21 12:46:39,709 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758606_17782 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758606 2025-07-21 12:47:34,765 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758607_17783 src: /192.168.158.1:46130 dest: /192.168.158.4:9866 2025-07-21 12:47:34,800 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46130, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1158266803_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073758607_17783, duration(ns): 25462055 2025-07-21 12:47:34,801 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758607_17783, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-21 12:47:39,706 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758607_17783 replica FinalizedReplica, blk_1073758607_17783, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758607 for deletion 2025-07-21 12:47:39,708 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758607_17783 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758607 2025-07-21 12:48:34,810 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758608_17784 src: /192.168.158.7:43680 dest: /192.168.158.4:9866 2025-07-21 12:48:34,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43680, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1041356609_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758608_17784, duration(ns): 16678718 2025-07-21 12:48:34,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758608_17784, type=LAST_IN_PIPELINE terminating 2025-07-21 12:48:39,709 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling 
blk_1073758608_17784 replica FinalizedReplica, blk_1073758608_17784, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758608 for deletion 2025-07-21 12:48:39,710 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758608_17784 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758608 2025-07-21 12:51:34,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758611_17787 src: /192.168.158.1:40732 dest: /192.168.158.4:9866 2025-07-21 12:51:34,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40732, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_949323350_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758611_17787, duration(ns): 25764468 2025-07-21 12:51:34,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758611_17787, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-21 12:51:39,713 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758611_17787 replica FinalizedReplica, blk_1073758611_17787, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758611 for deletion 2025-07-21 12:51:39,714 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758611_17787 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758611 2025-07-21 12:52:34,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758612_17788 src: /192.168.158.6:50364 dest: /192.168.158.4:9866 2025-07-21 12:52:34,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:50364, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1236409427_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758612_17788, duration(ns): 17250266 2025-07-21 12:52:34,844 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758612_17788, type=LAST_IN_PIPELINE terminating 2025-07-21 12:52:39,714 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758612_17788 replica FinalizedReplica, blk_1073758612_17788, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758612 for deletion 2025-07-21 12:52:39,715 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758612_17788 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758612 2025-07-21 12:53:34,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758613_17789 src: /192.168.158.1:57802 dest: /192.168.158.4:9866 
2025-07-21 12:53:34,848 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57802, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_497346745_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758613_17789, duration(ns): 24091017 2025-07-21 12:53:34,849 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758613_17789, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-21 12:53:39,717 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758613_17789 replica FinalizedReplica, blk_1073758613_17789, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758613 for deletion 2025-07-21 12:53:39,718 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758613_17789 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758613 2025-07-21 12:54:34,857 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758614_17790 src: /192.168.158.1:49834 dest: /192.168.158.4:9866 2025-07-21 12:54:34,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49834, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2014283683_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758614_17790, duration(ns): 23764078 2025-07-21 12:54:34,892 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758614_17790, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-21 12:54:39,720 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758614_17790 replica FinalizedReplica, blk_1073758614_17790, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758614 for deletion 2025-07-21 12:54:39,721 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758614_17790 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758614 2025-07-21 12:55:39,787 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758615_17791 src: /192.168.158.8:50802 dest: /192.168.158.4:9866 2025-07-21 12:55:39,806 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50802, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1929601338_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758615_17791, duration(ns): 17353631 2025-07-21 12:55:39,807 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758615_17791, type=LAST_IN_PIPELINE terminating 2025-07-21 12:55:45,719 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758615_17791 replica FinalizedReplica, blk_1073758615_17791, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758615 for deletion 2025-07-21 12:55:45,721 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758615_17791 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758615 2025-07-21 12:58:39,809 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758618_17794 src: /192.168.158.7:57224 dest: /192.168.158.4:9866 2025-07-21 12:58:39,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57224, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2033216292_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758618_17794, duration(ns): 23128566 2025-07-21 12:58:39,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758618_17794, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-21 12:58:45,730 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758618_17794 replica FinalizedReplica, blk_1073758618_17794, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758618 for deletion 2025-07-21 12:58:45,731 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758618_17794 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758618 2025-07-21 12:59:39,797 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758619_17795 src: /192.168.158.9:42082 dest: /192.168.158.4:9866 2025-07-21 12:59:39,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42082, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2104752030_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758619_17795, duration(ns): 17627811 2025-07-21 12:59:39,817 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758619_17795, type=LAST_IN_PIPELINE terminating 2025-07-21 12:59:45,729 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758619_17795 replica FinalizedReplica, blk_1073758619_17795, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758619 for deletion 2025-07-21 12:59:45,731 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758619_17795 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758619 2025-07-21 13:01:39,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758621_17797 src: /192.168.158.1:39712 dest: /192.168.158.4:9866 2025-07-21 13:01:39,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39712, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1528493279_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758621_17797, duration(ns): 24356605 2025-07-21 13:01:39,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758621_17797, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-21 13:01:42,733 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758621_17797 replica FinalizedReplica, blk_1073758621_17797, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758621 for deletion 2025-07-21 13:01:42,734 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758621_17797 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758621 2025-07-21 13:03:44,793 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758623_17799 src: /192.168.158.1:45622 dest: /192.168.158.4:9866 2025-07-21 13:03:44,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45622, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1208886677_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758623_17799, duration(ns): 24528850 2025-07-21 13:03:44,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758623_17799, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-21 13:03:48,740 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758623_17799 replica FinalizedReplica, blk_1073758623_17799, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758623 for deletion 2025-07-21 13:03:48,741 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758623_17799 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758623 2025-07-21 13:05:44,802 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758625_17801 src: /192.168.158.8:54092 dest: /192.168.158.4:9866 2025-07-21 13:05:44,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54092, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2031500235_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758625_17801, duration(ns): 19827054 2025-07-21 13:05:44,828 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758625_17801, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-21 13:05:51,746 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758625_17801 replica FinalizedReplica, blk_1073758625_17801, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758625 for deletion 2025-07-21 13:05:51,748 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758625_17801 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758625 2025-07-21 13:07:44,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758627_17803 src: /192.168.158.8:52958 dest: /192.168.158.4:9866 2025-07-21 13:07:44,838 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52958, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_10102246_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758627_17803, duration(ns): 22025942 2025-07-21 13:07:44,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758627_17803, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-21 13:07:51,751 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758627_17803 replica FinalizedReplica, blk_1073758627_17803, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758627 for deletion 2025-07-21 13:07:51,753 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758627_17803 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758627 
2025-07-21 13:09:44,815 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758629_17805 src: /192.168.158.9:53194 dest: /192.168.158.4:9866 2025-07-21 13:09:44,835 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53194, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-680788222_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758629_17805, duration(ns): 18492295 2025-07-21 13:09:44,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758629_17805, type=LAST_IN_PIPELINE terminating 2025-07-21 13:09:48,757 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758629_17805 replica FinalizedReplica, blk_1073758629_17805, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758629 for deletion 2025-07-21 13:09:48,758 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758629_17805 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758629 2025-07-21 13:11:44,820 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758631_17807 src: /192.168.158.7:52304 dest: /192.168.158.4:9866 2025-07-21 13:11:44,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52304, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_845649184_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073758631_17807, duration(ns): 17085527
2025-07-21 13:11:44,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758631_17807, type=LAST_IN_PIPELINE terminating
2025-07-21 13:11:48,759 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758631_17807 replica FinalizedReplica, blk_1073758631_17807, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758631 for deletion
2025-07-21 13:11:48,760 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758631_17807 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758631
2025-07-21 13:13:44,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758633_17809 src: /192.168.158.1:53744 dest: /192.168.158.4:9866
2025-07-21 13:13:44,846 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53744, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1820063299_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758633_17809, duration(ns): 23325379
2025-07-21 13:13:44,846 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758633_17809, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-21 13:13:48,761 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758633_17809 replica FinalizedReplica, blk_1073758633_17809, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758633 for deletion
2025-07-21 13:13:48,762 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758633_17809 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758633
2025-07-21 13:14:44,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758634_17810 src: /192.168.158.6:47606 dest: /192.168.158.4:9866
2025-07-21 13:14:44,841 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47606, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_772739140_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758634_17810, duration(ns): 16813330
2025-07-21 13:14:44,842 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758634_17810, type=LAST_IN_PIPELINE terminating
2025-07-21 13:14:51,766 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758634_17810 replica FinalizedReplica, blk_1073758634_17810, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758634 for deletion
2025-07-21 13:14:51,767 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758634_17810 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758634
2025-07-21 13:20:44,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758640_17816 src: /192.168.158.8:38936 dest: /192.168.158.4:9866
2025-07-21 13:20:44,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:38936, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1659634766_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758640_17816, duration(ns): 20176078
2025-07-21 13:20:44,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758640_17816, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-21 13:20:51,777 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758640_17816 replica FinalizedReplica, blk_1073758640_17816, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758640 for deletion
2025-07-21 13:20:51,778 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758640_17816 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758640
2025-07-21 13:21:44,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758641_17817 src: /192.168.158.1:47672 dest: /192.168.158.4:9866
2025-07-21 13:21:44,859 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2087534822_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758641_17817, duration(ns): 25565067
2025-07-21 13:21:44,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758641_17817, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-21 13:21:48,780 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758641_17817 replica FinalizedReplica, blk_1073758641_17817, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758641 for deletion
2025-07-21 13:21:48,781 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758641_17817 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758641
2025-07-21 13:22:44,835 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758642_17818 src: /192.168.158.1:53868 dest: /192.168.158.4:9866
2025-07-21 13:22:44,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53868, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1660041010_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758642_17818, duration(ns): 24513195
2025-07-21 13:22:44,869 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758642_17818, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-21 13:22:48,781 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758642_17818 replica FinalizedReplica, blk_1073758642_17818, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758642 for deletion
2025-07-21 13:22:48,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758642_17818 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758642
2025-07-21 13:23:49,832 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758643_17819 src: /192.168.158.6:42064 dest: /192.168.158.4:9866
2025-07-21 13:23:49,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42064, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1359179534_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758643_17819, duration(ns): 22620755
2025-07-21 13:23:49,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758643_17819, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-21 13:23:54,781 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758643_17819 replica FinalizedReplica, blk_1073758643_17819, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758643 for deletion
2025-07-21 13:23:54,783 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758643_17819 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758643
2025-07-21 13:24:54,835 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758644_17820 src: /192.168.158.7:60298 dest: /192.168.158.4:9866
2025-07-21 13:24:54,854 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60298, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-486021824_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758644_17820, duration(ns): 17170920
2025-07-21 13:24:54,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758644_17820, type=LAST_IN_PIPELINE terminating
2025-07-21 13:24:57,784 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758644_17820 replica FinalizedReplica, blk_1073758644_17820, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758644 for deletion
2025-07-21 13:24:57,785 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758644_17820 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758644
2025-07-21 13:28:59,847 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758648_17824 src: /192.168.158.5:54198 dest: /192.168.158.4:9866
2025-07-21 13:28:59,865 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54198, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_512770050_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758648_17824, duration(ns): 16498980
2025-07-21 13:28:59,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758648_17824, type=LAST_IN_PIPELINE terminating
2025-07-21 13:29:03,792 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758648_17824 replica FinalizedReplica, blk_1073758648_17824, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758648 for deletion
2025-07-21 13:29:03,794 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758648_17824 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758648
2025-07-21 13:31:28,373 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: RECEIVED SIGNAL 15: SIGTERM
2025-07-21 13:31:28,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at dmidlkprdls04.svr.luc.edu/192.168.158.4
************************************************************/
2025-07-21 13:33:32,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************ STARTUP_MSG: Starting DataNode STARTUP_MSG: host = dmidlkprdls04.svr.luc.edu/192.168.158.4 STARTUP_MSG: args = [] STARTUP_MSG: version = 3.1.1.7.3.1.0-197 STARTUP_MSG: classpath = /var/run/cloudera-scm-agent/process/342-hdfs-DATANODE:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/aws-java-sdk-bundle-1.12.720.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-hdfs-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-plugin-classloader-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-yarn-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/azure-data-lake-store-sdk-2.3.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang3-3.12.0.jar:/opt/clou
dera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/li
b/hadoop/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jline-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsch-0.1.54.jar:/opt/clou
dera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jul-to-slf4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/logredactor-2.0.16.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.603712
44/lib/hadoop/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-reload4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-api-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/
opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/wildfly-openssl-2.
1.4.ClouderaFinal.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper-jute.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7
.3.1.p0.60371244/lib/hadoop/.//hadoop-kms.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//ozone-filesystem-hadoop3-1.3.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-thrift.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-scala_2.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-protobuf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-jackson.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/
.//parquet-hadoop-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-generator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-format-structures.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-encoding.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-column.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/avro-1.11.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.603712
44/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/par
cels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.603
71244/lib/hadoop-hdfs/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jline-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/json-simple-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-xdr-2.0.3.jar:/opt/clouder
a/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/leveldbjni-cldr-1.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.6037
1244/lib/hadoop-hdfs/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh
7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-jute-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-cl
ient.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//asm-5.0.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjweaver-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-storage-7.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.6037124
4/lib/hadoop-mapreduce/.//checker-compat-qual-2.5.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-slf4j-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-system-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//google-extensions-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-keyvault-core-1.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//accessors-smart-2.4.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img-3.1.1.7.3.1.0-19
7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient
-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-s
ls-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ojalgo-43.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//kafka-clients-2.8.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-core-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-abfs-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//forbiddenapis-3.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-intg-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-api-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//zstd-jni-1.4.9-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-i18n.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-s3-lib-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//javax.activation-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//bundle-2.23.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//json-smart-2.4.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-util-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-shell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-cloud-bindings.jar
:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-s3-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjrt-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/swagger-annotations-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/objenesis-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-client-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.activation-api-1.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-dataformat-yaml-2.9.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcutil-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcprov-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcpkix-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/HikariCP-java7-2.4.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/snakeyaml-2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jsonschema2pojo-core-1.0.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/joda-time-2.10.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jna-5.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-guice-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-module-jaxb-annotati
ons-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-json-provider-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-base-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-servlet-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/fst-2.50.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/ehcache-3.3.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/dnsjava-2.1.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/codemodel-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-7.3.1
-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-y
arn-server-timeline-pluginstorage-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager-1.0.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-3.1.1.7.3.1.0-197.jar:/opt/cloudera/cm/lib/plugins/event-publish-7.13.1-shaded.jar:/opt/cloudera/cm/lib/plugins/tt-instrumentation-7.13.1.jar
STARTUP_MSG: build = git@github.infra.cloudera.com:CDH/hadoop.git -r 31a42fb39494f541ffae15c3c61185deeeacca86; compiled by 'jenkins' on 2024-12-04T01:09Z
STARTUP_MSG: java = 1.8.0_432
************************************************************/
2025-07-21 13:33:32,162 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2025-07-21 13:33:32,555 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d1/dfs/dn
2025-07-21 13:33:32,561 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d2/dfs/dn
2025-07-21 13:33:32,562 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d3/dfs/dn
2025-07-21 13:33:32,562 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d4/dfs/dn
2025-07-21 13:33:32,726 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties
2025-07-21 13:33:32,845 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2025-07-21 13:33:32,845 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2025-07-21 13:33:33,248 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-21 13:33:33,278 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576
2025-07-21 13:33:33,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled.
2025-07-21 13:33:33,286 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is dmidlkprdls04.svr.luc.edu
2025-07-21 13:33:33,287 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-07-21 13:33:33,296 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 4294967296
2025-07-21 13:33:33,326 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /192.168.158.4:9866
2025-07-21 13:33:33,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-21 13:33:33,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-21 13:33:33,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2025-07-21 13:33:33,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2025-07-21 13:33:33,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Listening on UNIX domain socket: /var/run/hdfs-sockets/dn
2025-07-21 13:33:33,390 INFO org.eclipse.jetty.util.log: Logging initialized @2491ms to org.eclipse.jetty.util.log.Slf4jLog
2025-07-21 13:33:33,536 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-21 13:33:33,547 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined
2025-07-21 13:33:33,559 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2025-07-21 13:33:33,563 INFO org.apache.hadoop.security.HttpCrossOriginFilterInitializer: CORS filter not enabled. Please set hadoop.http.cross-origin.enabled to 'true' to enable it
2025-07-21 13:33:33,564 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context datanode
2025-07-21 13:33:33,564 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context logs
2025-07-21 13:33:33,564 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context static
2025-07-21 13:33:33,611 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 37817
2025-07-21 13:33:33,612 INFO org.eclipse.jetty.server.Server: jetty-9.4.54.v20240208; built: 2024-02-08T19:42:39.027Z; git: cef3fbd6d736a21e7d541a5db490381d95a2047d; jvm 1.8.0_432-b06
2025-07-21 13:33:33,668 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0
2025-07-21 13:33:33,668 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults
2025-07-21 13:33:33,671 INFO org.eclipse.jetty.server.session: node0 Scavenging every 660000ms
2025-07-21 13:33:33,700 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret
2025-07-21 13:33:33,705 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@1b11ef33{logs,/logs,file:///var/log/hadoop-hdfs/,AVAILABLE}
2025-07-21 13:33:33,706 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@2f2bf0e2{static,/static,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/static/,AVAILABLE}
2025-07-21 13:33:33,827 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@6ebd78d1{datanode,/,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode/,AVAILABLE}{file:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode}
2025-07-21 13:33:33,840 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@7fb9f71f{HTTP/1.1, (http/1.1)}{localhost:37817}
2025-07-21 13:33:33,840 INFO org.eclipse.jetty.server.Server: Started @2942ms
2025-07-21 13:33:34,123 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /192.168.158.4:9864
2025-07-21 13:33:34,133 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor
2025-07-21 13:33:34,135 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hdfs
2025-07-21 13:33:34,135 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
2025-07-21 13:33:34,204 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
2025-07-21 13:33:34,225 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867
2025-07-21 13:33:34,272 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /192.168.158.4:9867
2025-07-21 13:33:34,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null
2025-07-21 13:33:34,320 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices:
2025-07-21 13:33:34,337 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 starting to offer service
2025-07-21 13:33:34,345 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2025-07-21 13:33:34,345 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting
2025-07-21 13:33:35,528 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:33:36,530 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:33:37,531 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:33:38,533 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:33:39,535 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:33:40,537 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:33:41,539 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:33:42,540 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:33:43,542 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:33:44,544 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:33:45,546 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:33:46,547 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:33:47,549 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:33:48,551 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:33:49,553 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:33:50,554 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:33:51,556 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-21 13:33:51,578 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at 
org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-21 13:33:51,587 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at 
org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at 
org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at 
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-21 13:33:51,590 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at 
org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) Caused by: java.lang.NullPointerException: Storage not yet initialized at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921) at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at 
sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) ... 45 more 2025-07-21 13:33:51,925 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at 
sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at 
org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) at org.eclipse.jetty.server.Server.handle(Server.java:516) at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) at 
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:137) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) at java.lang.Thread.run(Thread.java:750) 2025-07-21 13:33:51,926 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200) at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276) at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193) at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117) at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54) at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237) at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83) at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647) at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678) at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341) at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319) at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700) at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at 
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:137)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-21 13:33:51,928 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:137)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 49 more
2025-07-21 13:33:52,559 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:33:53,561 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:33:54,563 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:33:55,565 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:33:55,620 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-21 13:33:55,622 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace:
java.lang.Exception
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
	at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
2025-07-21 13:33:55,623 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
	at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
	at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
	at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
	at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
	at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
	... 50 more
2025-07-21 13:33:56,566 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:33:57,568 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:33:58,570 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:33:59,571 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:00,573 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:01,575 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:02,577 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:03,578 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:04,580 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:05,582 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:06,584 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:07,587 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:08,589 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:09,590 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:10,592 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:11,594 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:12,596 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:13,597 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:14,599 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:15,601 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:16,602 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:17,604 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:18,606 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:19,607 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:20,609 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:21,611 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:22,613 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:23,615 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:24,617 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:24,620 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-21 13:34:30,623 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:31,625 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:32,626 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:33,628 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:34,630 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-21 13:34:35,632 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-21 13:34:36,633 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-21 13:34:37,635 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-21 13:34:38,637 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-21 13:34:39,638 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-21 13:34:40,640 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 10 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-21 13:34:41,642 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 11 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-21 13:34:42,643 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 12 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-21 13:34:43,645 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 13 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-21 13:34:44,647 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 14 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-21 13:34:45,648 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 15 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-21 13:34:46,650 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 16 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-21 13:34:47,652 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 17 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-21 13:34:48,654 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 18 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-21 13:34:49,655 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 19 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-21 13:34:50,657 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-21 13:34:51,659 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 21 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-21 13:34:52,660 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 22 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-21 13:34:53,662 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 23 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-21 13:34:54,664 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 24 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:55,610 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getNamenodeAddresses(DataNode.java:3151)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-21 13:34:55,614 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool ID needed, but service not yet registered with NN, trace: java.lang.Exception
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:216)
    at org.apache.hadoop.hdfs.server.datanode.BPOfferService.getBlockPoolId(BPOfferService.java:227)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.getActorInfoMap(BPServiceActor.java:200)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getBPServiceActorInfo(DataNode.java:3178)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
2025-07-21 13:34:55,616 ERROR org.apache.hadoop.jmx.JMXJsonServlet: getting attribute VolumeInfo of Hadoop:service=DataNode,name=DataNodeInfo threw an exception
javax.management.RuntimeMBeanException: java.lang.NullPointerException: Storage not yet initialized
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:839)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:852)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:651)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)
    at org.apache.hadoop.jmx.JMXJsonServlet.writeAttribute(JMXJsonServlet.java:341)
    at org.apache.hadoop.jmx.JMXJsonServlet.listBeans(JMXJsonServlet.java:319)
    at org.apache.hadoop.jmx.JMXJsonServlet.doGet(JMXJsonServlet.java:213)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:652)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1700)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:516)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException: Storage not yet initialized
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:921)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getVolumeInfo(DataNode.java:3192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:72)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)
    at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)
    at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)
    at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
    at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)
    at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)
    ... 50 more
2025-07-21 13:34:55,666 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 25 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:56,667 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 26 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:57,669 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 27 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:58,671 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 28 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:34:59,672 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:35:00,674 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022.
Already tried 30 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:35:01,676 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 31 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:35:02,677 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 32 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:35:03,679 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 33 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:35:04,681 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 34 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:35:05,682 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 35 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:35:06,684 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 36 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:35:07,686 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 37 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:35:08,687 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 38 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:35:09,689 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 39 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:35:10,691 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 40 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:35:11,692 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 41 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:35:12,694 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 42 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:35:13,696 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 43 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:35:14,698 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 44 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:35:15,699 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 45 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:35:16,701 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 46 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:35:17,703 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 47 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:35:18,704 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 48 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:35:19,706 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:35:19,709 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-21 13:35:25,711 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:35:26,713 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:35:27,715 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:35:28,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-21 13:35:29,315 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: RECEIVED SIGNAL 15: SIGTERM
2025-07-21 13:35:29,320 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at dmidlkprdls04.svr.luc.edu/192.168.158.4
************************************************************/
2025-07-21 13:36:06,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG: host = dmidlkprdls04.svr.luc.edu/192.168.158.4
STARTUP_MSG: args = []
STARTUP_MSG: version = 3.1.1.7.3.1.0-197
STARTUP_MSG: classpath =
/var/run/cloudera-scm-agent/process/350-hdfs-DATANODE:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/aws-java-sdk-bundle-1.12.720.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-hdfs-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-plugin-classloader-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-yarn-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/azure-data-lake-store-sdk-2.3.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/ha
doop/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-api-2.2.11.jar:/opt/cloud
era/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jline-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr311-a
pi-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jul-to-slf4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/logredactor-2.0.16.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-buffer-4.1.100.Final.jar:/o
pt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-reload4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-api-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-kqueue-4.1.100.Fina
l.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/wildfly-openssl-2.1.4.ClouderaFinal.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3
.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper-jute.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//ozone-filesystem-hadoop3-1.3.0.7.3.1.0-197.
jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-thrift.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-scala_2.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-protobuf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-jackson.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-generator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-format-structures.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-encoding.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-column.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/avro-1.11.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-logging-1.1.3.j
ar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.
60371244/lib/hadoop-hdfs/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j
line-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/json-simple-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/leveldbjni-cldr-1.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/par
cels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nett
y-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-jute-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.
3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//asm-5.0.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjweaver-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-storage-7.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//checker-compat-qual-2.5.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-slf4j-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.
//flogger-system-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//google-extensions-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-keyvault-core-1.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//accessors-smart-2.4.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-
client-jobclient.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming-3.1.1.7.3.1.0-197.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ojalgo-43.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//kafka-clients-2.8.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-core-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-abfs-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//forbiddenapis-3.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-intg-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-api-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//zstd-jni-1.4.9-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-i18n.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-s3-lib-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//javax.activation-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//bundle-2.23.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//json-smart-2.4.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-util-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-shell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-cloud-bindings.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-s3-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjrt-1.9.6.jar:/opt/cloudera/parcels/
CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/swagger-annotations-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/objenesis-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-client-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.activation-api-1.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-dataformat-yaml-2.9.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcutil-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcprov-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcpkix-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/HikariCP-java7-2.4.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/snakeyaml-2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jsonschema2pojo-core-1.0.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/joda-time-2.10.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jna-5.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-guice-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-module-jaxb-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-json-provider-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-base-2.12.7.jar:/opt/clo
udera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-servlet-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/fst-2.50.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/ehcache-3.3.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/dnsjava-2.1.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/codemodel-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371
244/lib/hadoop-yarn/.//hadoop-yarn-server-router.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop
-yarn/.//hadoop-yarn-server-sharedcachemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager-1.0.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-3.1.1.7.3.1.0-197.jar:/opt/cloudera/cm/lib/plugins/event-publish-7.13.1-shaded.jar:/opt/cloudera/cm/lib/plugins/tt-instrumentation-7.13.1.jar STARTUP_MSG: build = git@github.infra.cloudera.com:CDH/hadoop.git -r 31a42fb39494f541ffae15c3c61185deeeacca86; compiled by 'jenkins' on 2024-12-04T01:09Z STARTUP_MSG: java = 1.8.0_432 ************************************************************/ 2025-07-21 13:36:06,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT] 2025-07-21 13:36:07,295 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d1/dfs/dn 2025-07-21 13:36:07,301 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d2/dfs/dn 2025-07-21 13:36:07,302 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d3/dfs/dn 2025-07-21 13:36:07,302 INFO 
org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d4/dfs/dn 2025-07-21 13:36:07,464 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties 2025-07-21 13:36:07,573 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s). 2025-07-21 13:36:07,574 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started 2025-07-21 13:36:07,877 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling 2025-07-21 13:36:08,011 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576 2025-07-21 13:36:08,021 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled. 2025-07-21 13:36:08,022 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is dmidlkprdls04.svr.luc.edu 2025-07-21 13:36:08,023 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. 
Disabling file IO profiling 2025-07-21 13:36:08,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 4294967296 2025-07-21 13:36:08,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /192.168.158.4:9866 2025-07-21 13:36:08,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s 2025-07-21 13:36:08,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50 2025-07-21 13:36:08,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s 2025-07-21 13:36:08,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50 2025-07-21 13:36:08,070 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Listening on UNIX domain socket: /var/run/hdfs-sockets/dn 2025-07-21 13:36:08,126 INFO org.eclipse.jetty.util.log: Logging initialized @2463ms to org.eclipse.jetty.util.log.Slf4jLog 2025-07-21 13:36:08,260 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret 2025-07-21 13:36:08,269 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined 2025-07-21 13:36:08,278 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter) 2025-07-21 13:36:08,280 INFO org.apache.hadoop.security.HttpCrossOriginFilterInitializer: CORS filter not enabled. 
Please set hadoop.http.cross-origin.enabled to 'true' to enable it 2025-07-21 13:36:08,281 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context datanode 2025-07-21 13:36:08,282 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context logs 2025-07-21 13:36:08,282 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context static 2025-07-21 13:36:08,324 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 35121 2025-07-21 13:36:08,325 INFO org.eclipse.jetty.server.Server: jetty-9.4.54.v20240208; built: 2024-02-08T19:42:39.027Z; git: cef3fbd6d736a21e7d541a5db490381d95a2047d; jvm 1.8.0_432-b06 2025-07-21 13:36:08,365 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0 2025-07-21 13:36:08,365 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults 2025-07-21 13:36:08,368 INFO org.eclipse.jetty.server.session: node0 Scavenging every 660000ms 2025-07-21 13:36:08,402 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. 
Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret 2025-07-21 13:36:08,408 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@1b11ef33{logs,/logs,file:///var/log/hadoop-hdfs/,AVAILABLE} 2025-07-21 13:36:08,409 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@2f2bf0e2{static,/static,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/static/,AVAILABLE} 2025-07-21 13:36:08,525 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@6ebd78d1{datanode,/,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode/,AVAILABLE}{file:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode} 2025-07-21 13:36:08,546 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@7fb9f71f{HTTP/1.1, (http/1.1)}{localhost:35121} 2025-07-21 13:36:08,546 INFO org.eclipse.jetty.server.Server: Started @2883ms 2025-07-21 13:36:08,855 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /192.168.158.4:9864 2025-07-21 13:36:08,866 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor 2025-07-21 13:36:08,868 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hdfs 2025-07-21 13:36:08,868 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup 2025-07-21 13:36:08,938 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler 2025-07-21 13:36:08,960 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867 2025-07-21 13:36:09,009 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /192.168.158.4:9867 2025-07-21 13:36:09,053 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null 2025-07-21 13:36:09,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices: 2025-07-21 13:36:09,078 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 starting to offer service 2025-07-21 13:36:09,086 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting 2025-07-21 13:36:09,086 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting 2025-07-21 13:36:10,268 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-21 13:36:11,270 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-21 13:36:12,272 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-21 13:36:13,273 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-21 13:36:13,801 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-21 13:36:18,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode during handshakeBlock pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 2025-07-21 13:36:18,933 INFO org.apache.hadoop.hdfs.server.common.Storage: Using 4 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=4, dataDirs=4) 2025-07-21 13:36:18,939 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d1/dfs/dn/in_use.lock acquired by nodename 3305276@dmidlkprdls04.svr.luc.edu 2025-07-21 13:36:18,948 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d2/dfs/dn/in_use.lock acquired by nodename 3305276@dmidlkprdls04.svr.luc.edu 2025-07-21 13:36:18,949 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d3/dfs/dn/in_use.lock acquired by nodename 3305276@dmidlkprdls04.svr.luc.edu 2025-07-21 13:36:18,951 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d4/dfs/dn/in_use.lock acquired by nodename 3305276@dmidlkprdls04.svr.luc.edu 2025-07-21 13:36:18,974 INFO org.apache.hadoop.hdfs.server.common.Storage: Analyzing storage directories for bpid BP-1059995147-192.168.158.1-1752101929360 2025-07-21 13:36:18,974 INFO org.apache.hadoop.hdfs.server.common.Storage: Locking is disabled for /hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360 2025-07-21 13:36:18,993 INFO org.apache.hadoop.hdfs.server.common.Storage: Analyzing storage directories for bpid BP-1059995147-192.168.158.1-1752101929360 2025-07-21 13:36:18,994 INFO org.apache.hadoop.hdfs.server.common.Storage: Locking is disabled for /hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360 2025-07-21 
13:36:19,011 INFO org.apache.hadoop.hdfs.server.common.Storage: Analyzing storage directories for bpid BP-1059995147-192.168.158.1-1752101929360 2025-07-21 13:36:19,011 INFO org.apache.hadoop.hdfs.server.common.Storage: Locking is disabled for /hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360 2025-07-21 13:36:19,027 INFO org.apache.hadoop.hdfs.server.common.Storage: Analyzing storage directories for bpid BP-1059995147-192.168.158.1-1752101929360 2025-07-21 13:36:19,027 INFO org.apache.hadoop.hdfs.server.common.Storage: Locking is disabled for /hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360 2025-07-21 13:36:19,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Setting up storage: nsid=2068539957;bpid=BP-1059995147-192.168.158.1-1752101929360;lv=-57;nsInfo=lv=-64;cid=cluster59;nsid=2068539957;c=1752101929360;bpid=BP-1059995147-192.168.158.1-1752101929360;dnuuid=be50c32a-aa23-4b9d-aa7f-05816b6e5f1a 2025-07-21 13:36:19,040 INFO org.apache.hadoop.conf.Configuration.deprecation: No unit for dfs.datanode.lock-reporting-threshold-ms(300) assuming MILLISECONDS 2025-07-21 13:36:19,042 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: The datanode lock is a read write lock 2025-07-21 13:36:19,079 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added new volume: DS-c6caf9b4-0cd0-462e-a7af-39538ffb6d0e 2025-07-21 13:36:19,079 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added volume - [DISK]file:/hdfs/d1/dfs/dn, StorageType: DISK 2025-07-21 13:36:19,082 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added new volume: DS-ab1b4344-d9fe-4401-915a-b02983ca3944 2025-07-21 13:36:19,082 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added volume - [DISK]file:/hdfs/d2/dfs/dn, StorageType: DISK 2025-07-21 13:36:19,084 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added new volume: 
DS-f02a6d6f-472c-481a-aa41-d58991ac764f 2025-07-21 13:36:19,084 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added volume - [DISK]file:/hdfs/d3/dfs/dn, StorageType: DISK 2025-07-21 13:36:19,087 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added new volume: DS-e9eccc83-296b-4afa-bee5-915188e0d9a5 2025-07-21 13:36:19,087 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added volume - [DISK]file:/hdfs/d4/dfs/dn, StorageType: DISK 2025-07-21 13:36:19,093 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Registered FSDatasetState MBean 2025-07-21 13:36:19,100 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding block pool BP-1059995147-192.168.158.1-1752101929360 2025-07-21 13:36:19,101 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Scanning block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d1/dfs/dn... 2025-07-21 13:36:19,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Scanning block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d3/dfs/dn... 2025-07-21 13:36:19,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Scanning block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d4/dfs/dn... 2025-07-21 13:36:19,101 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Scanning block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d2/dfs/dn... 
2025-07-21 13:36:19,131 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Cached dfsUsed found for /hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current: 270553088 2025-07-21 13:36:19,132 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Cached dfsUsed found for /hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current: 270553088 2025-07-21 13:36:19,132 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Cached dfsUsed found for /hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current: 270553088 2025-07-21 13:36:19,134 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Cached dfsUsed found for /hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current: 204349440 2025-07-21 13:36:19,180 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time taken to scan block pool BP-1059995147-192.168.158.1-1752101929360 on /hdfs/d2/dfs/dn: 75ms 2025-07-21 13:36:19,180 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time taken to scan block pool BP-1059995147-192.168.158.1-1752101929360 on /hdfs/d1/dfs/dn: 78ms 2025-07-21 13:36:19,180 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time taken to scan block pool BP-1059995147-192.168.158.1-1752101929360 on /hdfs/d4/dfs/dn: 76ms 2025-07-21 13:36:19,183 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time taken to scan block pool BP-1059995147-192.168.158.1-1752101929360 on /hdfs/d3/dfs/dn: 80ms 2025-07-21 13:36:19,183 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Total time to scan all replicas for block pool BP-1059995147-192.168.158.1-1752101929360: 83ms 2025-07-21 13:36:19,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding replicas to map for block pool BP-1059995147-192.168.158.1-1752101929360 on volume 
/hdfs/d1/dfs/dn... 2025-07-21 13:36:19,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding replicas to map for block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d2/dfs/dn... 2025-07-21 13:36:19,187 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.BlockPoolSlice: Replica Cache file: /hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/replicas doesn't exist 2025-07-21 13:36:19,188 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding replicas to map for block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d4/dfs/dn... 2025-07-21 13:36:19,187 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.BlockPoolSlice: Replica Cache file: /hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/replicas doesn't exist 2025-07-21 13:36:19,188 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.BlockPoolSlice: Replica Cache file: /hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/replicas doesn't exist 2025-07-21 13:36:19,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding replicas to map for block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d3/dfs/dn... 
2025-07-21 13:36:19,189 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.BlockPoolSlice: Replica Cache file: /hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/replicas doesn't exist
2025-07-21 13:36:19,220 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time to add replicas to map for block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d4/dfs/dn: 32ms
2025-07-21 13:36:19,220 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time to add replicas to map for block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d2/dfs/dn: 33ms
2025-07-21 13:36:19,220 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time to add replicas to map for block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d1/dfs/dn: 33ms
2025-07-21 13:36:19,220 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time to add replicas to map for block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d3/dfs/dn: 31ms
2025-07-21 13:36:19,222 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Total time to add all replicas to map for block pool BP-1059995147-192.168.158.1-1752101929360: 37ms
2025-07-21 13:36:19,223 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for /hdfs/d1/dfs/dn
2025-07-21 13:36:19,239 INFO org.apache.hadoop.hdfs.server.datanode.checker.DatasetVolumeChecker: Scheduled health check for volume /hdfs/d1/dfs/dn
2025-07-21 13:36:19,242 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for /hdfs/d2/dfs/dn
2025-07-21 13:36:19,242 INFO org.apache.hadoop.hdfs.server.datanode.checker.DatasetVolumeChecker: Scheduled health check for volume /hdfs/d2/dfs/dn
2025-07-21 13:36:19,242 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for /hdfs/d3/dfs/dn
2025-07-21 13:36:19,243 INFO org.apache.hadoop.hdfs.server.datanode.checker.DatasetVolumeChecker: Scheduled health check for volume /hdfs/d3/dfs/dn
2025-07-21 13:36:19,243 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for /hdfs/d4/dfs/dn
2025-07-21 13:36:19,243 INFO org.apache.hadoop.hdfs.server.datanode.checker.DatasetVolumeChecker: Scheduled health check for volume /hdfs/d4/dfs/dn
2025-07-21 13:36:19,266 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d2/dfs/dn, DS-ab1b4344-d9fe-4401-915a-b02983ca3944): no suitable block pools found to scan. Waiting 793385982 ms.
2025-07-21 13:36:19,266 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d1/dfs/dn, DS-c6caf9b4-0cd0-462e-a7af-39538ffb6d0e): no suitable block pools found to scan. Waiting 793385982 ms.
2025-07-21 13:36:19,267 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d3/dfs/dn, DS-f02a6d6f-472c-481a-aa41-d58991ac764f): no suitable block pools found to scan. Waiting 793385981 ms.
2025-07-21 13:36:19,267 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d4/dfs/dn, DS-e9eccc83-296b-4afa-bee5-915188e0d9a5): no suitable block pools found to scan. Waiting 793385981 ms.
2025-07-21 13:36:19,277 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: Periodic Directory Tree Verification scan starting at 7/21/25 1:59 PM with interval of 21600000ms
2025-07-21 13:36:19,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool BP-1059995147-192.168.158.1-1752101929360 (Datanode Uuid be50c32a-aa23-4b9d-aa7f-05816b6e5f1a) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 beginning handshake with NN
2025-07-21 13:36:19,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool Block pool BP-1059995147-192.168.158.1-1752101929360 (Datanode Uuid be50c32a-aa23-4b9d-aa7f-05816b6e5f1a) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 successfully registered with NN
2025-07-21 13:36:19,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: For namenode dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
2025-07-21 13:36:19,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting IBR Task Handler.
2025-07-21 13:36:19,641 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x6cb5f0b7da23e673, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 9 msec to generate and 136 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-21 13:36:19,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-21 13:37:35,271 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758651_17827 src: /192.168.158.6:37734 dest: /192.168.158.4:9866
2025-07-21 13:37:35,488 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37734, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_675556231_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758651_17827, duration(ns): 69725397
2025-07-21 13:37:35,488 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758651_17827, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-21 13:45:44,891 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758659_17835 src: /192.168.158.5:34422 dest: /192.168.158.4:9866
2025-07-21 13:45:44,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34422, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-944802046_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758659_17835, duration(ns): 18796533
2025-07-21 13:45:44,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758659_17835, type=LAST_IN_PIPELINE terminating
2025-07-21 13:49:49,889 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758663_17839 src: /192.168.158.7:35466 dest: /192.168.158.4:9866
2025-07-21 13:49:49,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35466, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1587375159_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758663_17839, duration(ns): 19310486
2025-07-21 13:49:49,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758663_17839, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-21 13:52:59,890 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758666_17842 src: /192.168.158.9:38294 dest: /192.168.158.4:9866
2025-07-21 13:52:59,919 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38294, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1305700997_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758666_17842, duration(ns): 20837928
2025-07-21 13:52:59,920 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758666_17842, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 13:53:59,889 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758667_17843 src: /192.168.158.1:45360 dest: /192.168.158.4:9866
2025-07-21 13:53:59,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45360, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1787299687_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758667_17843, duration(ns): 23138074
2025-07-21 13:53:59,931 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758667_17843, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-21 13:55:59,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758669_17845 src: /192.168.158.8:48798 dest: /192.168.158.4:9866
2025-07-21 13:55:59,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48798, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_353569225_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758669_17845, duration(ns): 18674971
2025-07-21 13:55:59,928 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758669_17845, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-21 13:59:36,300 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 14, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
2025-07-21 14:01:59,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758675_17851 src: /192.168.158.6:36850 dest: /192.168.158.4:9866
2025-07-21 14:01:59,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36850, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_21914321_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758675_17851, duration(ns): 19740687
2025-07-21 14:01:59,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758675_17851, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-21 14:03:59,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758677_17853 src: /192.168.158.7:41714 dest: /192.168.158.4:9866
2025-07-21 14:03:59,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41714, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-158627220_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758677_17853, duration(ns): 17093555
2025-07-21 14:03:59,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758677_17853, type=LAST_IN_PIPELINE terminating
2025-07-21 14:07:04,937 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758680_17856 src: /192.168.158.6:33400 dest: /192.168.158.4:9866
2025-07-21 14:07:04,966 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33400, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_46581919_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758680_17856, duration(ns): 19954684
2025-07-21 14:07:04,966 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758680_17856, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-21 14:09:09,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758682_17858 src: /192.168.158.1:48852 dest: /192.168.158.4:9866
2025-07-21 14:09:09,966 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48852, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2028147633_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758682_17858, duration(ns): 25860778
2025-07-21 14:09:09,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758682_17858, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-21 14:11:09,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758684_17860 src: /192.168.158.1:42970 dest: /192.168.158.4:9866
2025-07-21 14:11:09,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42970, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_791204772_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758684_17860, duration(ns): 27166476
2025-07-21 14:11:09,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758684_17860, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-21 14:13:09,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758686_17862 src: /192.168.158.1:42152 dest: /192.168.158.4:9866
2025-07-21 14:13:09,965 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42152, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1147574728_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758686_17862, duration(ns): 25350317
2025-07-21 14:13:09,966 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758686_17862, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-21 14:14:14,918 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758687_17863 src: /192.168.158.6:34752 dest: /192.168.158.4:9866
2025-07-21 14:14:14,949 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34752, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-503254566_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758687_17863, duration(ns): 22398047
2025-07-21 14:14:14,950 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758687_17863, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-21 14:17:19,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758690_17866 src: /192.168.158.9:39680 dest: /192.168.158.4:9866
2025-07-21 14:17:19,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39680, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1649871603_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758690_17866, duration(ns): 17854736
2025-07-21 14:17:19,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758690_17866, type=LAST_IN_PIPELINE terminating
2025-07-21 14:18:19,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758691_17867 src: /192.168.158.1:47890 dest: /192.168.158.4:9866
2025-07-21 14:18:19,963 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47890, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1327904721_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758691_17867, duration(ns): 24537700
2025-07-21 14:18:19,964 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758691_17867, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-21 14:20:19,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758693_17869 src: /192.168.158.5:43480 dest: /192.168.158.4:9866
2025-07-21 14:20:19,952 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:43480, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2115818161_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758693_17869, duration(ns): 18951038
2025-07-21 14:20:19,953 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758693_17869, type=LAST_IN_PIPELINE terminating
2025-07-21 14:21:24,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758694_17870 src: /192.168.158.1:56232 dest: /192.168.158.4:9866
2025-07-21 14:21:24,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56232, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-714939490_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758694_17870, duration(ns): 23893900
2025-07-21 14:21:24,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758694_17870, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-21 14:25:34,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758698_17874 src: /192.168.158.1:38738 dest: /192.168.158.4:9866
2025-07-21 14:25:34,981 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38738, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_220016885_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758698_17874, duration(ns): 23910383
2025-07-21 14:25:34,982 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758698_17874, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-21 14:26:34,979 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758699_17875 src: /192.168.158.5:36440 dest: /192.168.158.4:9866
2025-07-21 14:26:35,000 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36440, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2040785881_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758699_17875, duration(ns): 17677223
2025-07-21 14:26:35,001 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758699_17875, type=LAST_IN_PIPELINE terminating
2025-07-21 14:27:34,966 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758700_17876 src: /192.168.158.1:48812 dest: /192.168.158.4:9866
2025-07-21 14:27:35,012 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48812, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1809590400_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758700_17876, duration(ns): 31791843
2025-07-21 14:27:35,012 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758700_17876, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-21 14:30:39,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758703_17879 src: /192.168.158.9:38040 dest: /192.168.158.4:9866
2025-07-21 14:30:39,993 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38040, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1908539011_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758703_17879, duration(ns): 17570366
2025-07-21 14:30:39,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758703_17879, type=LAST_IN_PIPELINE terminating
2025-07-21 14:32:39,988 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758705_17881 src: /192.168.158.8:41822 dest: /192.168.158.4:9866
2025-07-21 14:32:40,010 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41822, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1179408115_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758705_17881, duration(ns): 18571673
2025-07-21 14:32:40,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758705_17881, type=LAST_IN_PIPELINE terminating
2025-07-21 14:34:39,979 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758707_17883 src: /192.168.158.7:39360 dest: /192.168.158.4:9866
2025-07-21 14:34:40,012 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39360, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1778537992_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758707_17883, duration(ns): 24734891
2025-07-21 14:34:40,012 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758707_17883, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 14:36:16,545 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758690_17866 replica FinalizedReplica, blk_1073758690_17866, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758690 for deletion
2025-07-21 14:36:16,548 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758691_17867 replica FinalizedReplica, blk_1073758691_17867, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758691 for deletion
2025-07-21 14:36:16,549 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758690_17866 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758690
2025-07-21 14:36:16,550 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758691_17867 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758691
2025-07-21 14:36:16,550 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758659_17835 replica FinalizedReplica, blk_1073758659_17835, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758659 for deletion
2025-07-21 14:36:16,551 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758693_17869 replica FinalizedReplica, blk_1073758693_17869, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758693 for deletion
2025-07-21 14:36:16,552 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758659_17835 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758659
2025-07-21 14:36:16,552 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758694_17870 replica FinalizedReplica, blk_1073758694_17870, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758694 for deletion
2025-07-21 14:36:16,553 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758693_17869 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758693
2025-07-21 14:36:16,553 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758663_17839 replica FinalizedReplica, blk_1073758663_17839, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758663 for deletion
2025-07-21 14:36:16,553 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758694_17870 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758694
2025-07-21 14:36:16,553 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758698_17874 replica FinalizedReplica, blk_1073758698_17874, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758698 for deletion
2025-07-21 14:36:16,554 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758663_17839 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758663
2025-07-21 14:36:16,554 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758698_17874 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758698
2025-07-21 14:36:16,554 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758666_17842 replica FinalizedReplica, blk_1073758666_17842, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758666 for deletion
2025-07-21 14:36:16,555 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758666_17842 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758666
2025-07-21 14:36:16,555 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758699_17875 replica FinalizedReplica, blk_1073758699_17875, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758699 for deletion
2025-07-21 14:36:16,556 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758667_17843 replica FinalizedReplica, blk_1073758667_17843, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758667 for deletion
2025-07-21 14:36:16,556 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758699_17875 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758699
2025-07-21 14:36:16,556 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758700_17876 replica FinalizedReplica, blk_1073758700_17876, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758700 for deletion
2025-07-21 14:36:16,556 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758667_17843 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758667
2025-07-21 14:36:16,557 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758700_17876 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758700
2025-07-21 14:36:16,557 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758669_17845 replica FinalizedReplica, blk_1073758669_17845, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758669 for deletion
2025-07-21 14:36:16,557 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758703_17879 replica FinalizedReplica, blk_1073758703_17879, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758703 for deletion
2025-07-21 14:36:16,558 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758669_17845 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758669
2025-07-21 14:36:16,558 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758703_17879 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758703
2025-07-21 14:36:16,558 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758705_17881 replica FinalizedReplica, blk_1073758705_17881, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758705 for deletion
2025-07-21 14:36:16,559 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758705_17881 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758705
2025-07-21 14:36:16,559 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758675_17851 replica FinalizedReplica, blk_1073758675_17851, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758675 for deletion
2025-07-21 14:36:16,560 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758675_17851 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758675
2025-07-21 14:36:16,560 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758707_17883 replica FinalizedReplica, blk_1073758707_17883, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758707 for deletion
2025-07-21 14:36:16,561 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758707_17883 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758707
2025-07-21 14:36:16,561 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758677_17853 replica FinalizedReplica, blk_1073758677_17853, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758677 for deletion
2025-07-21 14:36:16,561 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758677_17853 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758677
2025-07-21 14:36:16,561 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758680_17856 replica FinalizedReplica, blk_1073758680_17856, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758680 for deletion
2025-07-21 14:36:16,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758682_17858 replica FinalizedReplica, blk_1073758682_17858, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758682 for deletion
2025-07-21 14:36:16,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758680_17856 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758680
2025-07-21 14:36:16,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758651_17827 replica FinalizedReplica, blk_1073758651_17827, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758651 for deletion
2025-07-21 14:36:16,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758682_17858 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758682 2025-07-21 14:36:16,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758684_17860 replica FinalizedReplica, blk_1073758684_17860, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758684 for deletion 2025-07-21 14:36:16,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758651_17827 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758651 2025-07-21 14:36:16,564 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758686_17862 replica FinalizedReplica, blk_1073758686_17862, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758686 for deletion 2025-07-21 14:36:16,564 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758684_17860 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758684 2025-07-21 14:36:16,564 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758686_17862 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758686 2025-07-21 14:36:16,564 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758687_17863 replica FinalizedReplica, blk_1073758687_17863, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758687 for deletion 2025-07-21 14:36:16,565 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758687_17863 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758687 2025-07-21 14:36:39,955 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758709_17885 src: /192.168.158.7:54032 dest: /192.168.158.4:9866 2025-07-21 14:36:39,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54032, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_839420715_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758709_17885, duration(ns): 17934391 2025-07-21 14:36:39,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758709_17885, type=LAST_IN_PIPELINE terminating 2025-07-21 14:36:43,536 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758709_17885 replica FinalizedReplica, blk_1073758709_17885, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758709 for deletion 2025-07-21 14:36:43,537 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758709_17885 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758709 2025-07-21 14:37:39,960 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758710_17886 src: /192.168.158.8:36542 dest: /192.168.158.4:9866 2025-07-21 14:37:39,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36542, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1011164281_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758710_17886, duration(ns): 19395882 2025-07-21 14:37:39,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758710_17886, type=LAST_IN_PIPELINE terminating 2025-07-21 14:37:46,538 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758710_17886 replica FinalizedReplica, blk_1073758710_17886, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758710 for deletion 2025-07-21 14:37:46,539 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758710_17886 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758710 2025-07-21 14:38:39,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758711_17887 src: /192.168.158.1:36430 dest: /192.168.158.4:9866 
2025-07-21 14:38:39,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36430, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-783657910_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758711_17887, duration(ns): 30533459 2025-07-21 14:38:39,996 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758711_17887, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-21 14:38:43,540 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758711_17887 replica FinalizedReplica, blk_1073758711_17887, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758711 for deletion 2025-07-21 14:38:43,541 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758711_17887 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758711 2025-07-21 14:39:44,969 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758712_17888 src: /192.168.158.1:41256 dest: /192.168.158.4:9866 2025-07-21 14:39:45,010 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41256, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_646813302_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758712_17888, duration(ns): 27914610 2025-07-21 14:39:45,011 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758712_17888, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-21 14:39:52,544 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758712_17888 replica FinalizedReplica, blk_1073758712_17888, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758712 for deletion 2025-07-21 14:39:52,545 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758712_17888 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758712 2025-07-21 14:41:49,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758714_17890 src: /192.168.158.8:56488 dest: /192.168.158.4:9866 2025-07-21 14:41:49,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56488, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1440807499_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758714_17890, duration(ns): 22726510 2025-07-21 14:41:49,988 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758714_17890, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-21 14:41:55,552 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758714_17890 replica FinalizedReplica, blk_1073758714_17890, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758714 for deletion 2025-07-21 14:41:55,553 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758714_17890 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758714 2025-07-21 14:43:01,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x6cb5f0b7da23e674, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 13 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5. 2025-07-21 14:43:01,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360 2025-07-21 14:43:54,962 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758716_17892 src: /192.168.158.6:47542 dest: /192.168.158.4:9866 2025-07-21 14:43:54,984 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47542, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-953918358_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758716_17892, duration(ns): 17885441 2025-07-21 14:43:54,986 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758716_17892, type=LAST_IN_PIPELINE terminating 2025-07-21 14:43:58,553 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758716_17892 replica FinalizedReplica, blk_1073758716_17892, 
FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758716 for deletion 2025-07-21 14:43:58,555 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758716_17892 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758716 2025-07-21 14:44:59,954 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758717_17893 src: /192.168.158.1:56950 dest: /192.168.158.4:9866 2025-07-21 14:44:59,993 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56950, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_830392357_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758717_17893, duration(ns): 25868054 2025-07-21 14:44:59,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758717_17893, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-21 14:45:04,558 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758717_17893 replica FinalizedReplica, blk_1073758717_17893, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758717 for deletion 2025-07-21 14:45:04,559 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 
blk_1073758717_17893 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758717 2025-07-21 14:45:59,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758718_17894 src: /192.168.158.8:52286 dest: /192.168.158.4:9866 2025-07-21 14:45:59,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52286, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1636630949_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758718_17894, duration(ns): 17597441 2025-07-21 14:45:59,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758718_17894, type=LAST_IN_PIPELINE terminating 2025-07-21 14:46:07,562 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758718_17894 replica FinalizedReplica, blk_1073758718_17894, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758718 for deletion 2025-07-21 14:46:07,564 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758718_17894 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758718 2025-07-21 14:46:59,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758719_17895 src: /192.168.158.9:44936 dest: /192.168.158.4:9866 2025-07-21 14:46:59,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44936, dest: 
/192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_142074968_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758719_17895, duration(ns): 17178838 2025-07-21 14:46:59,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758719_17895, type=LAST_IN_PIPELINE terminating 2025-07-21 14:47:04,565 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758719_17895 replica FinalizedReplica, blk_1073758719_17895, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758719 for deletion 2025-07-21 14:47:04,566 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758719_17895 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir1/blk_1073758719 2025-07-21 14:50:04,976 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758722_17898 src: /192.168.158.5:43498 dest: /192.168.158.4:9866 2025-07-21 14:50:04,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:43498, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1462267316_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758722_17898, duration(ns): 18404172 2025-07-21 14:50:04,998 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758722_17898, type=LAST_IN_PIPELINE terminating 2025-07-21 14:50:07,573 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758722_17898 replica FinalizedReplica, blk_1073758722_17898, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758722 for deletion 2025-07-21 14:50:07,574 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758722_17898 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758722 2025-07-21 14:51:09,966 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758723_17899 src: /192.168.158.1:45002 dest: /192.168.158.4:9866 2025-07-21 14:51:10,006 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45002, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-263032877_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758723_17899, duration(ns): 26897845 2025-07-21 14:51:10,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758723_17899, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-21 14:51:16,574 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758723_17899 replica FinalizedReplica, blk_1073758723_17899, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758723 for deletion 
2025-07-21 14:51:16,575 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758723_17899 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758723 2025-07-21 14:54:09,981 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758726_17902 src: /192.168.158.8:33630 dest: /192.168.158.4:9866 2025-07-21 14:54:10,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33630, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-810918301_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758726_17902, duration(ns): 17757610 2025-07-21 14:54:10,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758726_17902, type=LAST_IN_PIPELINE terminating 2025-07-21 14:54:13,587 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758726_17902 replica FinalizedReplica, blk_1073758726_17902, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758726 for deletion 2025-07-21 14:54:13,589 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758726_17902 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758726 2025-07-21 14:55:14,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758727_17903 src: /192.168.158.1:33156 
dest: /192.168.158.4:9866 2025-07-21 14:55:15,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33156, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1902540244_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758727_17903, duration(ns): 28884379 2025-07-21 14:55:15,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758727_17903, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-21 14:55:19,591 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758727_17903 replica FinalizedReplica, blk_1073758727_17903, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758727 for deletion 2025-07-21 14:55:19,593 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758727_17903 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758727 2025-07-21 14:56:19,979 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758728_17904 src: /192.168.158.1:32924 dest: /192.168.158.4:9866 2025-07-21 14:56:20,017 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:32924, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2085817873_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758728_17904, duration(ns): 24390660 2025-07-21 14:56:20,018 
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758728_17904, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-21 14:56:22,593 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758728_17904 replica FinalizedReplica, blk_1073758728_17904, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758728 for deletion 2025-07-21 14:56:22,594 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758728_17904 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758728 2025-07-21 14:57:19,977 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758729_17905 src: /192.168.158.1:54042 dest: /192.168.158.4:9866 2025-07-21 14:57:20,017 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54042, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1607621706_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758729_17905, duration(ns): 27501539 2025-07-21 14:57:20,018 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758729_17905, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-21 14:57:22,596 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758729_17905 replica FinalizedReplica, 
blk_1073758729_17905, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758729 for deletion
2025-07-21 14:57:22,598 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758729_17905 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758729
2025-07-21 15:01:25,007 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758733_17909 src: /192.168.158.7:57534 dest: /192.168.158.4:9866
2025-07-21 15:01:25,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57534, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-278631030_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758733_17909, duration(ns): 17898154
2025-07-21 15:01:25,029 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758733_17909, type=LAST_IN_PIPELINE terminating
2025-07-21 15:01:31,605 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758733_17909 replica FinalizedReplica, blk_1073758733_17909, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758733 for deletion
2025-07-21 15:01:31,607 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758733_17909 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758733
2025-07-21 15:03:30,000 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758735_17911 src: /192.168.158.5:35760 dest: /192.168.158.4:9866
2025-07-21 15:03:30,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35760, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2119194000_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758735_17911, duration(ns): 18928874
2025-07-21 15:03:30,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758735_17911, type=LAST_IN_PIPELINE terminating
2025-07-21 15:03:34,610 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758735_17911 replica FinalizedReplica, blk_1073758735_17911, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758735 for deletion
2025-07-21 15:03:34,611 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758735_17911 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758735
2025-07-21 15:05:34,996 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758737_17913 src: /192.168.158.7:33858 dest: /192.168.158.4:9866
2025-07-21 15:05:35,017 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33858, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-630312150_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758737_17913, duration(ns): 18079180
2025-07-21 15:05:35,018 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758737_17913, type=LAST_IN_PIPELINE terminating
2025-07-21 15:05:40,616 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758737_17913 replica FinalizedReplica, blk_1073758737_17913, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758737 for deletion
2025-07-21 15:05:40,617 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758737_17913 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758737
2025-07-21 15:07:35,000 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758739_17915 src: /192.168.158.5:41246 dest: /192.168.158.4:9866
2025-07-21 15:07:35,022 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41246, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-636532747_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758739_17915, duration(ns): 18528506
2025-07-21 15:07:35,023 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758739_17915, type=LAST_IN_PIPELINE terminating
2025-07-21 15:07:40,618 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758739_17915 replica FinalizedReplica, blk_1073758739_17915, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758739 for deletion
2025-07-21 15:07:40,619 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758739_17915 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758739
2025-07-21 15:09:34,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758741_17917 src: /192.168.158.5:47248 dest: /192.168.158.4:9866
2025-07-21 15:09:35,024 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47248, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-392793165_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758741_17917, duration(ns): 22140432
2025-07-21 15:09:35,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758741_17917, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 15:09:40,623 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758741_17917 replica FinalizedReplica, blk_1073758741_17917, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758741 for deletion
2025-07-21 15:09:40,626 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758741_17917 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758741
2025-07-21 15:12:45,005 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758744_17920 src: /192.168.158.7:52428 dest: /192.168.158.4:9866
2025-07-21 15:12:45,026 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52428, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_168483296_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758744_17920, duration(ns): 17935409
2025-07-21 15:12:45,027 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758744_17920, type=LAST_IN_PIPELINE terminating
2025-07-21 15:12:52,627 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758744_17920 replica FinalizedReplica, blk_1073758744_17920, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758744 for deletion
2025-07-21 15:12:52,629 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758744_17920 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758744
2025-07-21 15:13:50,004 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758745_17921 src: /192.168.158.9:53428 dest: /192.168.158.4:9866
2025-07-21 15:13:50,036 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53428, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1881421782_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758745_17921, duration(ns): 22848827
2025-07-21 15:13:50,038 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758745_17921, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 15:13:52,631 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758745_17921 replica FinalizedReplica, blk_1073758745_17921, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758745 for deletion
2025-07-21 15:13:52,632 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758745_17921 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758745
2025-07-21 15:15:50,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758747_17923 src: /192.168.158.5:40036 dest: /192.168.158.4:9866
2025-07-21 15:15:50,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40036, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_285921646_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758747_17923, duration(ns): 19424172
2025-07-21 15:15:50,065 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758747_17923, type=LAST_IN_PIPELINE terminating
2025-07-21 15:15:52,635 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758747_17923 replica FinalizedReplica, blk_1073758747_17923, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758747 for deletion
2025-07-21 15:15:52,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758747_17923 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758747
2025-07-21 15:16:50,017 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758748_17924 src: /192.168.158.9:53420 dest: /192.168.158.4:9866
2025-07-21 15:16:50,038 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53420, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-361669361_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758748_17924, duration(ns): 18504569
2025-07-21 15:16:50,039 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758748_17924, type=LAST_IN_PIPELINE terminating
2025-07-21 15:16:55,638 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758748_17924 replica FinalizedReplica, blk_1073758748_17924, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758748 for deletion
2025-07-21 15:16:55,639 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758748_17924 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758748
2025-07-21 15:21:00,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758752_17928 src: /192.168.158.7:43306 dest: /192.168.158.4:9866
2025-07-21 15:21:00,053 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43306, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-755725405_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758752_17928, duration(ns): 25312856
2025-07-21 15:21:00,054 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758752_17928, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 15:21:04,649 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758752_17928 replica FinalizedReplica, blk_1073758752_17928, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758752 for deletion
2025-07-21 15:21:04,651 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758752_17928 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758752
2025-07-21 15:22:00,029 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758753_17929 src: /192.168.158.6:56112 dest: /192.168.158.4:9866
2025-07-21 15:22:00,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56112, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-185089762_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758753_17929, duration(ns): 24065613
2025-07-21 15:22:00,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758753_17929, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 15:22:04,656 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758753_17929 replica FinalizedReplica, blk_1073758753_17929, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758753 for deletion
2025-07-21 15:22:04,657 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758753_17929 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758753
2025-07-21 15:26:10,013 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758757_17933 src: /192.168.158.6:37076 dest: /192.168.158.4:9866
2025-07-21 15:26:10,043 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37076, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1406545157_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758757_17933, duration(ns): 21775196
2025-07-21 15:26:10,043 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758757_17933, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-21 15:26:13,654 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758757_17933 replica FinalizedReplica, blk_1073758757_17933, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758757 for deletion
2025-07-21 15:26:13,655 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758757_17933 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758757
2025-07-21 15:27:10,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758758_17934 src: /192.168.158.7:60302 dest: /192.168.158.4:9866
2025-07-21 15:27:10,051 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60302, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_77498928_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758758_17934, duration(ns): 18444266
2025-07-21 15:27:10,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758758_17934, type=LAST_IN_PIPELINE terminating
2025-07-21 15:27:13,657 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758758_17934 replica FinalizedReplica, blk_1073758758_17934, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758758 for deletion
2025-07-21 15:27:13,659 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758758_17934 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758758
2025-07-21 15:28:10,020 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758759_17935 src: /192.168.158.5:53330 dest: /192.168.158.4:9866
2025-07-21 15:28:10,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53330, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1715532396_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758759_17935, duration(ns): 18455820
2025-07-21 15:28:10,042 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758759_17935, type=LAST_IN_PIPELINE terminating
2025-07-21 15:28:16,659 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758759_17935 replica FinalizedReplica, blk_1073758759_17935, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758759 for deletion
2025-07-21 15:28:16,660 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758759_17935 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758759
2025-07-21 15:29:10,025 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758760_17936 src: /192.168.158.8:40620 dest: /192.168.158.4:9866
2025-07-21 15:29:10,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40620, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_507161372_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758760_17936, duration(ns): 18056775
2025-07-21 15:29:10,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758760_17936, type=LAST_IN_PIPELINE terminating
2025-07-21 15:29:13,663 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758760_17936 replica FinalizedReplica, blk_1073758760_17936, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758760 for deletion
2025-07-21 15:29:13,664 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758760_17936 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758760
2025-07-21 15:33:20,028 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758764_17940 src: /192.168.158.9:51286 dest: /192.168.158.4:9866
2025-07-21 15:33:20,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51286, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1373344984_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758764_17940, duration(ns): 19003994
2025-07-21 15:33:20,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758764_17940, type=LAST_IN_PIPELINE terminating
2025-07-21 15:33:22,666 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758764_17940 replica FinalizedReplica, blk_1073758764_17940, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758764 for deletion
2025-07-21 15:33:22,667 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758764_17940 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758764
2025-07-21 15:35:20,022 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758766_17942 src: /192.168.158.1:50908 dest: /192.168.158.4:9866
2025-07-21 15:35:20,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50908, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_765636901_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758766_17942, duration(ns): 32516817
2025-07-21 15:35:20,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758766_17942, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-21 15:35:22,673 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758766_17942 replica FinalizedReplica, blk_1073758766_17942, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758766 for deletion
2025-07-21 15:35:22,674 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758766_17942 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758766
2025-07-21 15:37:20,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758768_17944 src: /192.168.158.6:38708 dest: /192.168.158.4:9866
2025-07-21 15:37:20,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38708, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_190329869_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758768_17944, duration(ns): 19665352
2025-07-21 15:37:20,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758768_17944, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-21 15:37:22,681 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758768_17944 replica FinalizedReplica, blk_1073758768_17944, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758768 for deletion
2025-07-21 15:37:22,682 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758768_17944 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758768
2025-07-21 15:38:20,026 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758769_17945 src: /192.168.158.1:52234 dest: /192.168.158.4:9866
2025-07-21 15:38:20,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52234, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1690265902_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758769_17945, duration(ns): 24804290
2025-07-21 15:38:20,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758769_17945, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-21 15:38:22,682 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758769_17945 replica FinalizedReplica, blk_1073758769_17945, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758769 for deletion
2025-07-21 15:38:22,684 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758769_17945 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758769
2025-07-21 15:40:20,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758771_17947 src: /192.168.158.1:58184 dest: /192.168.158.4:9866
2025-07-21 15:40:20,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58184, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-448090611_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758771_17947, duration(ns): 28677152
2025-07-21 15:40:20,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758771_17947, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-21 15:40:25,688 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758771_17947 replica FinalizedReplica, blk_1073758771_17947, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758771 for deletion
2025-07-21 15:40:25,690 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758771_17947 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758771
2025-07-21 15:41:20,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758772_17948 src: /192.168.158.8:55654 dest: /192.168.158.4:9866
2025-07-21 15:41:20,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55654, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1684745883_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758772_17948, duration(ns): 18725981
2025-07-21 15:41:20,068 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758772_17948, type=LAST_IN_PIPELINE terminating
2025-07-21 15:41:22,690 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758772_17948 replica FinalizedReplica, blk_1073758772_17948, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758772 for deletion
2025-07-21 15:41:22,692 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758772_17948 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758772
2025-07-21 15:44:25,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758775_17951 src: /192.168.158.5:44100 dest: /192.168.158.4:9866
2025-07-21 15:44:25,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44100, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-219058936_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758775_17951, duration(ns): 18635551
2025-07-21 15:44:25,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758775_17951, type=LAST_IN_PIPELINE terminating
2025-07-21 15:44:31,698 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758775_17951 replica FinalizedReplica, blk_1073758775_17951, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758775 for deletion
2025-07-21 15:44:31,699 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758775_17951 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758775
2025-07-21 15:49:30,053 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758780_17956 src: /192.168.158.5:44176 dest: /192.168.158.4:9866
2025-07-21 15:49:30,074 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44176, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1199021952_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758780_17956, duration(ns): 18669159
2025-07-21 15:49:30,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758780_17956, type=LAST_IN_PIPELINE terminating
2025-07-21 15:49:34,710 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758780_17956 replica FinalizedReplica, blk_1073758780_17956, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758780 for deletion
2025-07-21 15:49:34,711 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758780_17956 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758780
2025-07-21 15:51:30,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758782_17958 src: /192.168.158.1:54746 dest: /192.168.158.4:9866
2025-07-21 15:51:30,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54746, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1181436458_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758782_17958, duration(ns): 25116572
2025-07-21 15:51:30,098 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758782_17958, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-21 15:51:34,718 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758782_17958 replica FinalizedReplica, blk_1073758782_17958, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758782 for deletion
2025-07-21 15:51:34,719 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758782_17958 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758782
2025-07-21 15:52:35,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758783_17959 src: /192.168.158.5:48456 dest: /192.168.158.4:9866
2025-07-21 15:52:35,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48456, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-317724085_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758783_17959, duration(ns): 16764899
2025-07-21 15:52:35,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758783_17959, type=LAST_IN_PIPELINE terminating
2025-07-21 15:52:37,724 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758783_17959 replica FinalizedReplica, blk_1073758783_17959, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758783 for deletion
2025-07-21 15:52:37,725 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758783_17959 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758783
2025-07-21 15:55:40,069 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758786_17962 src: /192.168.158.6:52140 dest: /192.168.158.4:9866
2025-07-21 15:55:40,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52140, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1475138046_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758786_17962, duration(ns): 23152175
2025-07-21 15:55:40,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758786_17962, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-21 15:55:43,733 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758786_17962 replica FinalizedReplica, blk_1073758786_17962, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758786 for deletion
2025-07-21 15:55:43,735 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758786_17962 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758786
2025-07-21 16:00:55,103 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758791_17967 src: /192.168.158.1:43944 dest: /192.168.158.4:9866
2025-07-21 16:00:55,141 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43944, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2040138876_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758791_17967, duration(ns): 26252104
2025-07-21 16:00:55,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758791_17967, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-21 16:00:58,745 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758791_17967 replica FinalizedReplica,
blk_1073758791_17967, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758791 for deletion 2025-07-21 16:00:58,746 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758791_17967 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758791 2025-07-21 16:05:00,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758795_17971 src: /192.168.158.6:52408 dest: /192.168.158.4:9866 2025-07-21 16:05:00,114 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52408, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1774002384_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758795_17971, duration(ns): 18924664 2025-07-21 16:05:00,114 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758795_17971, type=LAST_IN_PIPELINE terminating 2025-07-21 16:05:04,752 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758795_17971 replica FinalizedReplica, blk_1073758795_17971, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758795 for deletion 2025-07-21 16:05:04,753 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758795_17971 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758795 2025-07-21 16:07:05,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758797_17973 src: /192.168.158.9:43016 dest: /192.168.158.4:9866 2025-07-21 16:07:05,114 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43016, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1040437243_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758797_17973, duration(ns): 17538320 2025-07-21 16:07:05,114 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758797_17973, type=LAST_IN_PIPELINE terminating 2025-07-21 16:07:07,755 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758797_17973 replica FinalizedReplica, blk_1073758797_17973, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758797 for deletion 2025-07-21 16:07:07,757 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758797_17973 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758797 2025-07-21 16:08:05,095 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758798_17974 src: /192.168.158.6:44852 dest: /192.168.158.4:9866 2025-07-21 16:08:05,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44852, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_210585980_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758798_17974, duration(ns): 23727651 2025-07-21 16:08:05,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758798_17974, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-21 16:08:07,756 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758798_17974 replica FinalizedReplica, blk_1073758798_17974, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758798 for deletion 2025-07-21 16:08:07,758 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758798_17974 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758798 2025-07-21 16:09:05,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758799_17975 src: /192.168.158.7:42042 dest: /192.168.158.4:9866 2025-07-21 16:09:05,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42042, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_706537221_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758799_17975, duration(ns): 19739344 2025-07-21 16:09:05,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758799_17975, type=LAST_IN_PIPELINE terminating 2025-07-21 16:09:07,759 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758799_17975 replica FinalizedReplica, blk_1073758799_17975, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758799 for deletion 2025-07-21 16:09:07,761 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758799_17975 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758799 2025-07-21 16:10:05,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758800_17976 src: /192.168.158.9:34038 dest: /192.168.158.4:9866 2025-07-21 16:10:05,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34038, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2034260595_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758800_17976, duration(ns): 20240203 2025-07-21 16:10:05,120 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758800_17976, type=LAST_IN_PIPELINE terminating 2025-07-21 16:10:07,760 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758800_17976 replica FinalizedReplica, blk_1073758800_17976, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758800 for deletion 2025-07-21 16:10:07,761 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758800_17976 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758800 2025-07-21 16:11:05,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758801_17977 src: /192.168.158.6:55036 dest: /192.168.158.4:9866 2025-07-21 16:11:05,106 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55036, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-491921042_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758801_17977, duration(ns): 21593057 2025-07-21 16:11:05,107 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758801_17977, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-21 16:11:10,761 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758801_17977 replica FinalizedReplica, blk_1073758801_17977, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758801 for deletion 2025-07-21 16:11:10,762 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758801_17977 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758801 2025-07-21 16:12:05,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758802_17978 src: 
/192.168.158.5:58188 dest: /192.168.158.4:9866 2025-07-21 16:12:05,104 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58188, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1625789341_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758802_17978, duration(ns): 17458142 2025-07-21 16:12:05,105 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758802_17978, type=LAST_IN_PIPELINE terminating 2025-07-21 16:12:10,763 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758802_17978 replica FinalizedReplica, blk_1073758802_17978, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758802 for deletion 2025-07-21 16:12:10,765 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758802_17978 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758802 2025-07-21 16:14:05,082 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758804_17980 src: /192.168.158.7:38224 dest: /192.168.158.4:9866 2025-07-21 16:14:05,110 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38224, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1311519642_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758804_17980, duration(ns): 21000910 2025-07-21 16:14:05,111 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758804_17980, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-21 16:14:07,764 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758804_17980 replica FinalizedReplica, blk_1073758804_17980, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758804 for deletion 2025-07-21 16:14:07,765 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758804_17980 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758804 2025-07-21 16:15:05,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758805_17981 src: /192.168.158.6:52404 dest: /192.168.158.4:9866 2025-07-21 16:15:05,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52404, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-494585635_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758805_17981, duration(ns): 22980757 2025-07-21 16:15:05,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758805_17981, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-21 16:15:07,767 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758805_17981 replica FinalizedReplica, blk_1073758805_17981, FINALIZED getNumBytes() = 56 
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758805 for deletion 2025-07-21 16:15:07,769 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758805_17981 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758805 2025-07-21 16:16:05,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758806_17982 src: /192.168.158.1:35952 dest: /192.168.158.4:9866 2025-07-21 16:16:05,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35952, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-667126607_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758806_17982, duration(ns): 23997624 2025-07-21 16:16:05,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758806_17982, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-21 16:16:07,770 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758806_17982 replica FinalizedReplica, blk_1073758806_17982, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758806 for deletion 2025-07-21 16:16:07,772 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758806_17982 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758806 2025-07-21 16:18:05,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758808_17984 src: /192.168.158.1:47312 dest: /192.168.158.4:9866 2025-07-21 16:18:05,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47312, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2044100840_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758808_17984, duration(ns): 25958854 2025-07-21 16:18:05,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758808_17984, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-21 16:18:10,771 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758808_17984 replica FinalizedReplica, blk_1073758808_17984, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758808 for deletion 2025-07-21 16:18:10,772 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758808_17984 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758808 2025-07-21 16:20:10,096 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758810_17986 src: /192.168.158.7:46332 dest: /192.168.158.4:9866 2025-07-21 16:20:10,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.7:46332, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1508676704_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758810_17986, duration(ns): 19263006 2025-07-21 16:20:10,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758810_17986, type=LAST_IN_PIPELINE terminating 2025-07-21 16:20:13,773 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758810_17986 replica FinalizedReplica, blk_1073758810_17986, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758810 for deletion 2025-07-21 16:20:13,774 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758810_17986 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758810 2025-07-21 16:21:10,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758811_17987 src: /192.168.158.1:52256 dest: /192.168.158.4:9866 2025-07-21 16:21:10,129 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52256, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1103520819_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758811_17987, duration(ns): 28379099 2025-07-21 16:21:10,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758811_17987, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-21 16:21:13,777 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758811_17987 replica FinalizedReplica, blk_1073758811_17987, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758811 for deletion 2025-07-21 16:21:13,778 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758811_17987 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758811 2025-07-21 16:22:10,095 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758812_17988 src: /192.168.158.9:56518 dest: /192.168.158.4:9866 2025-07-21 16:22:10,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56518, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_914305698_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758812_17988, duration(ns): 22988361 2025-07-21 16:22:10,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758812_17988, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-21 16:22:13,780 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758812_17988 replica FinalizedReplica, blk_1073758812_17988, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758812 for deletion 2025-07-21 16:22:13,781 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758812_17988 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758812 2025-07-21 16:24:10,095 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758814_17990 src: /192.168.158.5:40698 dest: /192.168.158.4:9866 2025-07-21 16:24:10,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40698, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1773184110_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758814_17990, duration(ns): 27195365 2025-07-21 16:24:10,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758814_17990, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-21 16:24:13,785 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758814_17990 replica FinalizedReplica, blk_1073758814_17990, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758814 for deletion 2025-07-21 16:24:13,786 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758814_17990 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758814 
2025-07-21 16:28:15,108 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758818_17994 src: /192.168.158.8:56206 dest: /192.168.158.4:9866 2025-07-21 16:28:15,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56206, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_365904924_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758818_17994, duration(ns): 21853365 2025-07-21 16:28:15,139 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758818_17994, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-21 16:28:19,794 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758818_17994 replica FinalizedReplica, blk_1073758818_17994, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758818 for deletion 2025-07-21 16:28:19,796 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758818_17994 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758818 2025-07-21 16:29:15,113 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758819_17995 src: /192.168.158.8:41270 dest: /192.168.158.4:9866 2025-07-21 16:29:15,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41270, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_730513651_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758819_17995, duration(ns): 21531677
2025-07-21 16:29:15,143 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758819_17995, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 16:29:19,796 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758819_17995 replica FinalizedReplica, blk_1073758819_17995, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758819 for deletion
2025-07-21 16:29:19,798 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758819_17995 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758819
2025-07-21 16:31:15,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758821_17997 src: /192.168.158.6:36290 dest: /192.168.158.4:9866
2025-07-21 16:31:15,135 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36290, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1145402946_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758821_17997, duration(ns): 16943208
2025-07-21 16:31:15,135 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758821_17997, type=LAST_IN_PIPELINE terminating
2025-07-21 16:31:19,803 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758821_17997 replica FinalizedReplica, blk_1073758821_17997, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758821 for deletion
2025-07-21 16:31:19,804 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758821_17997 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758821
2025-07-21 16:32:20,122 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758822_17998 src: /192.168.158.5:58704 dest: /192.168.158.4:9866
2025-07-21 16:32:20,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58704, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2006637138_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758822_17998, duration(ns): 20112085
2025-07-21 16:32:20,152 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758822_17998, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 16:32:22,806 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758822_17998 replica FinalizedReplica, blk_1073758822_17998, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758822 for deletion
2025-07-21 16:32:22,808 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758822_17998 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758822
2025-07-21 16:37:25,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758827_18003 src: /192.168.158.9:46520 dest: /192.168.158.4:9866
2025-07-21 16:37:25,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46520, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-423939897_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758827_18003, duration(ns): 18594403
2025-07-21 16:37:25,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758827_18003, type=LAST_IN_PIPELINE terminating
2025-07-21 16:37:31,814 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758827_18003 replica FinalizedReplica, blk_1073758827_18003, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758827 for deletion
2025-07-21 16:37:31,815 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758827_18003 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758827
2025-07-21 16:38:30,126 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758828_18004 src: /192.168.158.7:33740 dest: /192.168.158.4:9866
2025-07-21 16:38:30,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33740, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1742437327_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758828_18004, duration(ns): 22699015
2025-07-21 16:38:30,157 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758828_18004, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-21 16:38:34,817 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758828_18004 replica FinalizedReplica, blk_1073758828_18004, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758828 for deletion
2025-07-21 16:38:34,818 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758828_18004 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758828
2025-07-21 16:39:30,124 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758829_18005 src: /192.168.158.8:60942 dest: /192.168.158.4:9866
2025-07-21 16:39:30,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60942, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2065389200_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758829_18005, duration(ns): 20252564
2025-07-21 16:39:30,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758829_18005, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 16:39:34,818 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758829_18005 replica FinalizedReplica, blk_1073758829_18005, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758829 for deletion
2025-07-21 16:39:34,820 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758829_18005 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758829
2025-07-21 16:43:30,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758833_18009 src: /192.168.158.1:47562 dest: /192.168.158.4:9866
2025-07-21 16:43:30,170 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47562, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1210594260_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758833_18009, duration(ns): 25442466
2025-07-21 16:43:30,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758833_18009, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-21 16:43:34,828 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758833_18009 replica FinalizedReplica, blk_1073758833_18009, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758833 for deletion
2025-07-21 16:43:34,830 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758833_18009 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758833
2025-07-21 16:46:30,141 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758836_18012 src: /192.168.158.6:59052 dest: /192.168.158.4:9866
2025-07-21 16:46:30,162 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59052, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_929364307_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758836_18012, duration(ns): 17846862
2025-07-21 16:46:30,163 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758836_18012, type=LAST_IN_PIPELINE terminating
2025-07-21 16:46:37,836 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758836_18012 replica FinalizedReplica, blk_1073758836_18012, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758836 for deletion
2025-07-21 16:46:37,838 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758836_18012 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758836
2025-07-21 16:52:35,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758842_18018 src: /192.168.158.1:35506 dest: /192.168.158.4:9866
2025-07-21 16:52:35,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35506, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-356195446_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758842_18018, duration(ns): 25156053
2025-07-21 16:52:35,194 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758842_18018, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-21 16:52:40,849 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758842_18018 replica FinalizedReplica, blk_1073758842_18018, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758842 for deletion
2025-07-21 16:52:40,851 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758842_18018 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758842
2025-07-21 16:53:35,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758843_18019 src: /192.168.158.1:34990 dest: /192.168.158.4:9866
2025-07-21 16:53:35,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34990, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1416162773_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758843_18019, duration(ns): 25232530
2025-07-21 16:53:35,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758843_18019, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-21 16:53:37,852 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758843_18019 replica FinalizedReplica, blk_1073758843_18019, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758843 for deletion
2025-07-21 16:53:37,853 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758843_18019 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758843
2025-07-21 16:57:45,162 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758847_18023 src: /192.168.158.1:41830 dest: /192.168.158.4:9866
2025-07-21 16:57:45,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41830, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1853534664_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758847_18023, duration(ns): 26590218
2025-07-21 16:57:45,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758847_18023, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-21 16:57:46,857 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758847_18023 replica FinalizedReplica, blk_1073758847_18023, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758847 for deletion
2025-07-21 16:57:46,858 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758847_18023 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758847
2025-07-21 16:59:50,166 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758849_18025 src: /192.168.158.5:42608 dest: /192.168.158.4:9866
2025-07-21 16:59:50,195 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42608, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1624407526_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758849_18025, duration(ns): 21402070
2025-07-21 16:59:50,195 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758849_18025, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 16:59:52,859 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758849_18025 replica FinalizedReplica, blk_1073758849_18025, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758849 for deletion
2025-07-21 16:59:52,860 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758849_18025 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758849
2025-07-21 17:00:50,159 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758850_18026 src: /192.168.158.1:55354 dest: /192.168.158.4:9866
2025-07-21 17:00:50,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55354, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1725340359_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758850_18026, duration(ns): 24972960
2025-07-21 17:00:50,199 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758850_18026, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-21 17:00:52,859 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758850_18026 replica FinalizedReplica, blk_1073758850_18026, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758850 for deletion
2025-07-21 17:00:52,860 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758850_18026 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758850
2025-07-21 17:01:50,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758851_18027 src: /192.168.158.9:40258 dest: /192.168.158.4:9866
2025-07-21 17:01:50,191 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40258, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1916660097_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758851_18027, duration(ns): 17780566
2025-07-21 17:01:50,192 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758851_18027, type=LAST_IN_PIPELINE terminating
2025-07-21 17:01:52,861 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758851_18027 replica FinalizedReplica, blk_1073758851_18027, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758851 for deletion
2025-07-21 17:01:52,863 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758851_18027 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758851
2025-07-21 17:03:50,335 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758853_18029 src: /192.168.158.9:58838 dest: /192.168.158.4:9866
2025-07-21 17:03:50,357 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58838, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_570975876_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758853_18029, duration(ns): 19104965
2025-07-21 17:03:50,357 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758853_18029, type=LAST_IN_PIPELINE terminating
2025-07-21 17:03:55,865 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758853_18029 replica FinalizedReplica, blk_1073758853_18029, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758853 for deletion
2025-07-21 17:03:55,867 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758853_18029 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758853
2025-07-21 17:07:00,169 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758856_18032 src: /192.168.158.1:49512 dest: /192.168.158.4:9866
2025-07-21 17:07:00,208 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49512, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-540126917_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758856_18032, duration(ns): 25445972
2025-07-21 17:07:00,212 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758856_18032, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-21 17:07:04,873 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758856_18032 replica FinalizedReplica, blk_1073758856_18032, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758856 for deletion
2025-07-21 17:07:04,874 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758856_18032 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758856
2025-07-21 17:08:00,179 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758857_18033 src: /192.168.158.1:36386 dest: /192.168.158.4:9866
2025-07-21 17:08:00,217 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36386, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1010756719_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758857_18033, duration(ns): 26388050
2025-07-21 17:08:00,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758857_18033, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-21 17:08:01,875 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758857_18033 replica FinalizedReplica, blk_1073758857_18033, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758857 for deletion
2025-07-21 17:08:01,876 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758857_18033 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758857
2025-07-21 17:10:10,175 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758859_18035 src: /192.168.158.5:56820 dest: /192.168.158.4:9866
2025-07-21 17:10:10,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56820, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2074430852_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758859_18035, duration(ns): 24114701
2025-07-21 17:10:10,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758859_18035, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 17:10:13,882 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758859_18035 replica FinalizedReplica, blk_1073758859_18035, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758859 for deletion
2025-07-21 17:10:13,883 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758859_18035 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758859
2025-07-21 17:11:10,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758860_18036 src: /192.168.158.5:40000 dest: /192.168.158.4:9866
2025-07-21 17:11:10,217 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40000, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_857394339_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758860_18036, duration(ns): 25048867
2025-07-21 17:11:10,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758860_18036, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 17:11:13,883 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758860_18036 replica FinalizedReplica, blk_1073758860_18036, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758860 for deletion
2025-07-21 17:11:13,886 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758860_18036 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758860
2025-07-21 17:13:20,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758862_18038 src: /192.168.158.8:58620 dest: /192.168.158.4:9866
2025-07-21 17:13:20,206 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58620, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-388774752_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758862_18038, duration(ns): 20794618
2025-07-21 17:13:20,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758862_18038, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-21 17:13:25,889 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758862_18038 replica FinalizedReplica, blk_1073758862_18038, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758862 for deletion
2025-07-21 17:13:25,891 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758862_18038 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758862
2025-07-21 17:14:25,178 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758863_18039 src: /192.168.158.6:47832 dest: /192.168.158.4:9866
2025-07-21 17:14:25,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47832, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1796064172_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758863_18039, duration(ns): 23645944
2025-07-21 17:14:25,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758863_18039, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-21 17:14:31,892 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758863_18039 replica FinalizedReplica, blk_1073758863_18039, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758863 for deletion
2025-07-21 17:14:31,894 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758863_18039 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758863
2025-07-21 17:15:30,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758864_18040 src: /192.168.158.7:44564 dest: /192.168.158.4:9866
2025-07-21 17:15:30,213 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44564, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1559641760_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758864_18040, duration(ns): 22616272
2025-07-21 17:15:30,215 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758864_18040, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-21 17:15:31,894 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758864_18040 replica FinalizedReplica, blk_1073758864_18040, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758864 for deletion
2025-07-21 17:15:31,896 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758864_18040 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758864
2025-07-21 17:16:30,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758865_18041 src: /192.168.158.9:45892 dest: /192.168.158.4:9866
2025-07-21 17:16:30,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45892, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-409022964_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758865_18041, duration(ns): 17838434
2025-07-21 17:16:30,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758865_18041, type=LAST_IN_PIPELINE terminating
2025-07-21 17:16:34,900 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758865_18041 replica FinalizedReplica, blk_1073758865_18041, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758865 for deletion
2025-07-21 17:16:34,901 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758865_18041 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758865
2025-07-21 17:17:30,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758866_18042 src: /192.168.158.9:59988 dest: /192.168.158.4:9866
2025-07-21 17:17:30,216 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59988, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-430272035_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758866_18042, duration(ns): 24259697
2025-07-21 17:17:30,216 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758866_18042, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 17:17:31,900 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758866_18042 replica FinalizedReplica, blk_1073758866_18042, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758866 for deletion
2025-07-21 17:17:31,902 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758866_18042 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758866
2025-07-21 17:19:35,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758868_18044 src: /192.168.158.1:51812 dest: /192.168.158.4:9866
2025-07-21 17:19:35,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51812, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-502359358_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758868_18044, duration(ns): 26428652
2025-07-21 17:19:35,221 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758868_18044, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-21 17:19:37,905 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758868_18044 replica FinalizedReplica, blk_1073758868_18044, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758868 for deletion
2025-07-21 17:19:37,906 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758868_18044 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758868
2025-07-21 17:20:35,182 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758869_18045 src: /192.168.158.1:57296 dest: /192.168.158.4:9866
2025-07-21 17:20:35,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57296, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1725102121_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758869_18045, duration(ns): 24992656
2025-07-21 17:20:35,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758869_18045, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-21 17:20:37,908 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758869_18045 replica FinalizedReplica, blk_1073758869_18045,
FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758869 for deletion 2025-07-21 17:20:37,909 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758869_18045 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758869 2025-07-21 17:26:35,204 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758875_18051 src: /192.168.158.9:37314 dest: /192.168.158.4:9866 2025-07-21 17:26:35,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37314, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1901204689_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758875_18051, duration(ns): 21198933 2025-07-21 17:26:35,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758875_18051, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-21 17:26:37,925 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758875_18051 replica FinalizedReplica, blk_1073758875_18051, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758875 for deletion 2025-07-21 17:26:37,927 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758875_18051 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758875 2025-07-21 17:27:35,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758876_18052 src: /192.168.158.7:43244 dest: /192.168.158.4:9866 2025-07-21 17:27:35,228 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43244, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_644245539_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758876_18052, duration(ns): 20283295 2025-07-21 17:27:35,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758876_18052, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-21 17:27:37,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758876_18052 replica FinalizedReplica, blk_1073758876_18052, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758876 for deletion 2025-07-21 17:27:37,930 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758876_18052 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758876 2025-07-21 17:29:35,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758878_18054 src: /192.168.158.6:38312 dest: /192.168.158.4:9866 2025-07-21 17:29:35,222 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38312, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-713822554_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758878_18054, duration(ns): 21795726 2025-07-21 17:29:35,223 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758878_18054, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-21 17:29:37,935 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758878_18054 replica FinalizedReplica, blk_1073758878_18054, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758878 for deletion 2025-07-21 17:29:37,936 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758878_18054 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758878 2025-07-21 17:30:35,199 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758879_18055 src: /192.168.158.7:59812 dest: /192.168.158.4:9866 2025-07-21 17:30:35,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59812, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1662931386_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758879_18055, duration(ns): 16993419 2025-07-21 17:30:35,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758879_18055, type=LAST_IN_PIPELINE 
terminating 2025-07-21 17:30:40,937 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758879_18055 replica FinalizedReplica, blk_1073758879_18055, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758879 for deletion 2025-07-21 17:30:40,939 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758879_18055 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758879 2025-07-21 17:33:35,196 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758882_18058 src: /192.168.158.1:55596 dest: /192.168.158.4:9866 2025-07-21 17:33:35,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55596, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1238729725_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758882_18058, duration(ns): 25869748 2025-07-21 17:33:35,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758882_18058, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating 2025-07-21 17:33:37,943 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758882_18058 replica FinalizedReplica, blk_1073758882_18058, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758882 for deletion 2025-07-21 17:33:37,944 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758882_18058 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758882 2025-07-21 17:34:35,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758883_18059 src: /192.168.158.5:46516 dest: /192.168.158.4:9866 2025-07-21 17:34:35,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46516, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1423274943_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758883_18059, duration(ns): 18234513 2025-07-21 17:34:35,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758883_18059, type=LAST_IN_PIPELINE terminating 2025-07-21 17:34:37,947 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758883_18059 replica FinalizedReplica, blk_1073758883_18059, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758883 for deletion 2025-07-21 17:34:37,949 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758883_18059 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758883 2025-07-21 17:35:35,208 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758884_18060 src: /192.168.158.6:59214 dest: /192.168.158.4:9866 2025-07-21 17:35:35,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59214, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1274107956_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758884_18060, duration(ns): 22351957 2025-07-21 17:35:35,240 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758884_18060, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-21 17:35:40,948 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758884_18060 replica FinalizedReplica, blk_1073758884_18060, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758884 for deletion 2025-07-21 17:35:40,950 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758884_18060 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758884 2025-07-21 17:36:35,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758885_18061 src: /192.168.158.8:51104 dest: /192.168.158.4:9866 2025-07-21 17:36:35,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51104, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_358483540_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758885_18061, duration(ns): 21218079 2025-07-21 17:36:35,238 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758885_18061, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-21 17:36:40,950 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758885_18061 replica FinalizedReplica, blk_1073758885_18061, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758885 for deletion 2025-07-21 17:36:40,952 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758885_18061 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758885 2025-07-21 17:37:35,216 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758886_18062 src: /192.168.158.6:53806 dest: /192.168.158.4:9866 2025-07-21 17:37:35,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53806, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_326663077_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758886_18062, duration(ns): 18592756 2025-07-21 17:37:35,238 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758886_18062, type=LAST_IN_PIPELINE terminating 2025-07-21 17:37:40,953 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073758886_18062 replica FinalizedReplica, blk_1073758886_18062, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758886 for deletion 2025-07-21 17:37:40,955 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758886_18062 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758886 2025-07-21 17:41:40,215 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758890_18066 src: /192.168.158.7:55016 dest: /192.168.158.4:9866 2025-07-21 17:41:40,247 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55016, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1863556615_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758890_18066, duration(ns): 24855308 2025-07-21 17:41:40,248 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758890_18066, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-21 17:41:43,959 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758890_18066 replica FinalizedReplica, blk_1073758890_18066, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758890 for deletion 2025-07-21 17:41:43,961 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758890_18066 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758890 2025-07-21 17:42:40,211 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758891_18067 src: /192.168.158.1:50232 dest: /192.168.158.4:9866 2025-07-21 17:42:40,278 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50232, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_567982097_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758891_18067, duration(ns): 25779383 2025-07-21 17:42:40,279 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758891_18067, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-21 17:42:43,963 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758891_18067 replica FinalizedReplica, blk_1073758891_18067, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758891 for deletion 2025-07-21 17:42:43,964 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758891_18067 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758891 2025-07-21 17:45:40,221 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073758894_18070 src: /192.168.158.7:60378 dest: /192.168.158.4:9866 2025-07-21 17:45:40,249 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60378, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1365723101_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758894_18070, duration(ns): 20954964 2025-07-21 17:45:40,249 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758894_18070, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-21 17:45:43,969 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758894_18070 replica FinalizedReplica, blk_1073758894_18070, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758894 for deletion 2025-07-21 17:45:43,971 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758894_18070 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758894 2025-07-21 17:47:45,253 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758896_18072 src: /192.168.158.6:56498 dest: /192.168.158.4:9866 2025-07-21 17:47:45,287 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56498, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_412100782_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073758896_18072, duration(ns): 25524619 2025-07-21 17:47:45,288 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758896_18072, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-21 17:47:46,974 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758896_18072 replica FinalizedReplica, blk_1073758896_18072, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758896 for deletion 2025-07-21 17:47:46,975 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758896_18072 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758896 2025-07-21 17:48:45,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758897_18073 src: /192.168.158.9:55500 dest: /192.168.158.4:9866 2025-07-21 17:48:45,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55500, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1142574857_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758897_18073, duration(ns): 17461990 2025-07-21 17:48:45,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758897_18073, type=LAST_IN_PIPELINE terminating 2025-07-21 17:48:46,975 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758897_18073 replica 
FinalizedReplica, blk_1073758897_18073, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758897 for deletion 2025-07-21 17:48:46,977 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758897_18073 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758897 2025-07-21 17:49:45,228 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758898_18074 src: /192.168.158.9:60010 dest: /192.168.158.4:9866 2025-07-21 17:49:45,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60010, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-741065067_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758898_18074, duration(ns): 22324955 2025-07-21 17:49:45,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758898_18074, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-21 17:49:46,978 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758898_18074 replica FinalizedReplica, blk_1073758898_18074, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758898 for deletion 2025-07-21 17:49:46,979 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073758898_18074 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758898 2025-07-21 17:50:45,221 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758899_18075 src: /192.168.158.7:49574 dest: /192.168.158.4:9866 2025-07-21 17:50:45,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49574, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2007308076_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758899_18075, duration(ns): 22062337 2025-07-21 17:50:45,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758899_18075, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-21 17:50:46,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758899_18075 replica FinalizedReplica, blk_1073758899_18075, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758899 for deletion 2025-07-21 17:50:46,982 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758899_18075 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758899 2025-07-21 17:51:45,221 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758900_18076 src: /192.168.158.8:37440 dest: /192.168.158.4:9866 2025-07-21 17:51:45,250 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37440, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1261949669_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758900_18076, duration(ns): 21547660 2025-07-21 17:51:45,251 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758900_18076, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-21 17:51:46,986 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758900_18076 replica FinalizedReplica, blk_1073758900_18076, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758900 for deletion 2025-07-21 17:51:46,987 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758900_18076 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758900 2025-07-21 17:52:45,221 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758901_18077 src: /192.168.158.8:55842 dest: /192.168.158.4:9866 2025-07-21 17:52:45,249 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55842, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1800908993_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758901_18077, duration(ns): 21145773 2025-07-21 17:52:45,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073758901_18077, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-21 17:52:46,989 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758901_18077 replica FinalizedReplica, blk_1073758901_18077, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758901 for deletion 2025-07-21 17:52:46,991 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758901_18077 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758901 2025-07-21 17:56:45,228 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758905_18081 src: /192.168.158.1:55282 dest: /192.168.158.4:9866 2025-07-21 17:56:45,265 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55282, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1487777963_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758905_18081, duration(ns): 25527314 2025-07-21 17:56:45,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758905_18081, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-21 17:56:47,003 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758905_18081 replica FinalizedReplica, blk_1073758905_18081, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() 
= /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758905 for deletion 2025-07-21 17:56:47,005 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758905_18081 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758905 2025-07-21 17:57:45,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758906_18082 src: /192.168.158.8:46502 dest: /192.168.158.4:9866 2025-07-21 17:57:45,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46502, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-340586765_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758906_18082, duration(ns): 17234149 2025-07-21 17:57:45,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758906_18082, type=LAST_IN_PIPELINE terminating 2025-07-21 17:57:50,003 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758906_18082 replica FinalizedReplica, blk_1073758906_18082, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758906 for deletion 2025-07-21 17:57:50,004 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758906_18082 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758906 2025-07-21 
17:59:50,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758908_18084 src: /192.168.158.7:35590 dest: /192.168.158.4:9866 2025-07-21 17:59:50,275 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35590, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_905255569_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758908_18084, duration(ns): 21196356 2025-07-21 17:59:50,275 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758908_18084, type=LAST_IN_PIPELINE terminating 2025-07-21 17:59:56,018 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758908_18084 replica FinalizedReplica, blk_1073758908_18084, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758908 for deletion 2025-07-21 17:59:56,022 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758908_18084 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758908 2025-07-21 18:02:50,269 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758911_18087 src: /192.168.158.8:53342 dest: /192.168.158.4:9866 2025-07-21 18:02:50,290 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53342, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_23800150_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073758911_18087, duration(ns): 17723408 2025-07-21 18:02:50,290 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758911_18087, type=LAST_IN_PIPELINE terminating 2025-07-21 18:02:53,020 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758911_18087 replica FinalizedReplica, blk_1073758911_18087, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758911 for deletion 2025-07-21 18:02:53,022 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758911_18087 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758911 2025-07-21 18:04:55,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758913_18089 src: /192.168.158.7:33242 dest: /192.168.158.4:9866 2025-07-21 18:04:55,304 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33242, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_265108632_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758913_18089, duration(ns): 18462060 2025-07-21 18:04:55,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758913_18089, type=LAST_IN_PIPELINE terminating 2025-07-21 18:04:59,025 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758913_18089 replica FinalizedReplica, blk_1073758913_18089, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758913 for deletion 2025-07-21 18:04:59,026 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758913_18089 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758913 2025-07-21 18:05:55,263 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758914_18090 src: /192.168.158.1:33640 dest: /192.168.158.4:9866 2025-07-21 18:05:55,300 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33640, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_803349127_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758914_18090, duration(ns): 24695251 2025-07-21 18:05:55,301 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758914_18090, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-21 18:05:59,025 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758914_18090 replica FinalizedReplica, blk_1073758914_18090, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758914 for deletion 2025-07-21 18:05:59,026 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 
blk_1073758914_18090 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758914 2025-07-21 18:09:05,261 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758917_18093 src: /192.168.158.1:46900 dest: /192.168.158.4:9866 2025-07-21 18:09:05,297 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46900, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1949975407_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758917_18093, duration(ns): 24315938 2025-07-21 18:09:05,297 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758917_18093, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-21 18:09:08,036 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758917_18093 replica FinalizedReplica, blk_1073758917_18093, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758917 for deletion 2025-07-21 18:09:08,038 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758917_18093 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758917 2025-07-21 18:14:05,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758922_18098 src: /192.168.158.1:57100 dest: /192.168.158.4:9866 2025-07-21 18:14:05,302 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57100, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1159103300_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758922_18098, duration(ns): 24196346 2025-07-21 18:14:05,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758922_18098, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating 2025-07-21 18:14:11,045 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758922_18098 replica FinalizedReplica, blk_1073758922_18098, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758922 for deletion 2025-07-21 18:14:11,047 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758922_18098 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758922 2025-07-21 18:16:05,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758924_18100 src: /192.168.158.1:38652 dest: /192.168.158.4:9866 2025-07-21 18:16:05,318 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38652, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1559401501_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758924_18100, duration(ns): 25487132 2025-07-21 18:16:05,319 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758924_18100, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-21 18:16:08,052 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758924_18100 replica FinalizedReplica, blk_1073758924_18100, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758924 for deletion 2025-07-21 18:16:08,053 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758924_18100 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758924 2025-07-21 18:18:05,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758926_18102 src: /192.168.158.9:58824 dest: /192.168.158.4:9866 2025-07-21 18:18:05,332 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58824, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1013845181_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758926_18102, duration(ns): 22984102 2025-07-21 18:18:05,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758926_18102, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-21 18:18:11,056 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758926_18102 replica FinalizedReplica, blk_1073758926_18102, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758926 for deletion 2025-07-21 18:18:11,057 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758926_18102 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758926 2025-07-21 18:19:05,284 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758927_18103 src: /192.168.158.8:52494 dest: /192.168.158.4:9866 2025-07-21 18:19:05,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52494, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1295559154_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758927_18103, duration(ns): 21199559 2025-07-21 18:19:05,314 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758927_18103, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-21 18:19:11,057 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758927_18103 replica FinalizedReplica, blk_1073758927_18103, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758927 for deletion 2025-07-21 18:19:11,059 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758927_18103 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758927 2025-07-21 18:23:10,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758931_18107 src: /192.168.158.1:36656 dest: /192.168.158.4:9866 2025-07-21 18:23:10,341 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36656, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1230136974_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758931_18107, duration(ns): 24246636 2025-07-21 18:23:10,341 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758931_18107, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-21 18:23:14,067 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758931_18107 replica FinalizedReplica, blk_1073758931_18107, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758931 for deletion 2025-07-21 18:23:14,068 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758931_18107 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758931 2025-07-21 18:28:15,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758936_18112 src: /192.168.158.9:48496 dest: /192.168.158.4:9866 2025-07-21 18:28:15,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.9:48496, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2095504482_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758936_18112, duration(ns): 18018530 2025-07-21 18:28:15,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758936_18112, type=LAST_IN_PIPELINE terminating 2025-07-21 18:28:20,079 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758936_18112 replica FinalizedReplica, blk_1073758936_18112, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758936 for deletion 2025-07-21 18:28:20,081 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758936_18112 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758936 2025-07-21 18:29:15,295 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758937_18113 src: /192.168.158.1:41396 dest: /192.168.158.4:9866 2025-07-21 18:29:15,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41396, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1135732960_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758937_18113, duration(ns): 26167006 2025-07-21 18:29:15,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758937_18113, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-21 18:29:20,083 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758937_18113 replica FinalizedReplica, blk_1073758937_18113, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758937 for deletion 2025-07-21 18:29:20,084 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758937_18113 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758937 2025-07-21 18:30:15,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758938_18114 src: /192.168.158.7:46890 dest: /192.168.158.4:9866 2025-07-21 18:30:15,338 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46890, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_383362098_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758938_18114, duration(ns): 23904966 2025-07-21 18:30:15,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758938_18114, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-21 18:30:20,088 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758938_18114 replica FinalizedReplica, blk_1073758938_18114, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758938 for deletion 2025-07-21 18:30:20,089 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758938_18114 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758938 2025-07-21 18:33:15,318 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758941_18117 src: /192.168.158.5:37876 dest: /192.168.158.4:9866 2025-07-21 18:33:15,338 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37876, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1699627777_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758941_18117, duration(ns): 17181652 2025-07-21 18:33:15,338 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758941_18117, type=LAST_IN_PIPELINE terminating 2025-07-21 18:33:17,097 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758941_18117 replica FinalizedReplica, blk_1073758941_18117, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758941 for deletion 2025-07-21 18:33:17,098 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758941_18117 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758941 2025-07-21 18:35:15,309 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758943_18119 src: /192.168.158.8:36880 dest: /192.168.158.4:9866 2025-07-21 18:35:15,329 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36880, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1369670158_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758943_18119, duration(ns): 17368888 2025-07-21 18:35:15,330 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758943_18119, type=LAST_IN_PIPELINE terminating 2025-07-21 18:35:17,101 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758943_18119 replica FinalizedReplica, blk_1073758943_18119, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758943 for deletion 2025-07-21 18:35:17,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758943_18119 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758943 2025-07-21 18:37:25,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758945_18121 src: /192.168.158.7:36236 dest: /192.168.158.4:9866 2025-07-21 18:37:25,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36236, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-472231704_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073758945_18121, duration(ns): 16891166 2025-07-21 18:37:25,356 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758945_18121, type=LAST_IN_PIPELINE terminating 2025-07-21 18:37:26,105 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758945_18121 replica FinalizedReplica, blk_1073758945_18121, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758945 for deletion 2025-07-21 18:37:26,107 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758945_18121 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758945 2025-07-21 18:39:25,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758947_18123 src: /192.168.158.1:34656 dest: /192.168.158.4:9866 2025-07-21 18:39:25,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34656, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_379076010_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758947_18123, duration(ns): 23587869 2025-07-21 18:39:25,343 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758947_18123, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating 2025-07-21 18:39:26,108 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758947_18123 
replica FinalizedReplica, blk_1073758947_18123, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758947 for deletion 2025-07-21 18:39:26,110 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758947_18123 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758947 2025-07-21 18:42:30,323 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758950_18126 src: /192.168.158.8:42030 dest: /192.168.158.4:9866 2025-07-21 18:42:30,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42030, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1963446087_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758950_18126, duration(ns): 21964122 2025-07-21 18:42:30,353 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758950_18126, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-21 18:42:35,111 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758950_18126 replica FinalizedReplica, blk_1073758950_18126, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758950 for deletion 2025-07-21 18:42:35,113 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073758950_18126 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758950 2025-07-21 18:43:30,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758951_18127 src: /192.168.158.7:36906 dest: /192.168.158.4:9866 2025-07-21 18:43:30,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36906, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_125191223_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758951_18127, duration(ns): 21429106 2025-07-21 18:43:30,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758951_18127, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-21 18:43:32,113 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758951_18127 replica FinalizedReplica, blk_1073758951_18127, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758951 for deletion 2025-07-21 18:43:32,114 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758951_18127 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758951 2025-07-21 18:45:35,324 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758953_18129 src: /192.168.158.6:55526 dest: /192.168.158.4:9866 2025-07-21 18:45:35,352 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55526, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1475882063_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758953_18129, duration(ns): 21063282 2025-07-21 18:45:35,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758953_18129, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-21 18:45:38,115 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758953_18129 replica FinalizedReplica, blk_1073758953_18129, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758953 for deletion 2025-07-21 18:45:38,117 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758953_18129 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758953 2025-07-21 18:46:35,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758954_18130 src: /192.168.158.1:41298 dest: /192.168.158.4:9866 2025-07-21 18:46:35,348 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41298, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-63821572_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758954_18130, duration(ns): 24243613 2025-07-21 18:46:35,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073758954_18130, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-21 18:46:38,116 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758954_18130 replica FinalizedReplica, blk_1073758954_18130, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758954 for deletion
2025-07-21 18:46:38,117 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758954_18130 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758954
2025-07-21 18:48:35,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758956_18132 src: /192.168.158.7:41066 dest: /192.168.158.4:9866
2025-07-21 18:48:35,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41066, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1225396055_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758956_18132, duration(ns): 20495778
2025-07-21 18:48:35,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758956_18132, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-21 18:48:38,121 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758956_18132 replica FinalizedReplica, blk_1073758956_18132, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758956 for deletion
2025-07-21 18:48:38,122 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758956_18132 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758956
2025-07-21 18:50:35,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758958_18134 src: /192.168.158.1:47206 dest: /192.168.158.4:9866
2025-07-21 18:50:35,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47206, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1105913837_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758958_18134, duration(ns): 25120115
2025-07-21 18:50:35,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758958_18134, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-21 18:50:38,123 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758958_18134 replica FinalizedReplica, blk_1073758958_18134, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758958 for deletion
2025-07-21 18:50:38,125 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758958_18134 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758958
2025-07-21 18:52:35,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758960_18136 src: /192.168.158.9:35632 dest: /192.168.158.4:9866
2025-07-21 18:52:35,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35632, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1329116974_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758960_18136, duration(ns): 17600780
2025-07-21 18:52:35,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758960_18136, type=LAST_IN_PIPELINE terminating
2025-07-21 18:52:38,128 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758960_18136 replica FinalizedReplica, blk_1073758960_18136, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758960 for deletion
2025-07-21 18:52:38,129 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758960_18136 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758960
2025-07-21 18:56:40,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758964_18140 src: /192.168.158.8:32826 dest: /192.168.158.4:9866
2025-07-21 18:56:40,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:32826, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-14827332_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758964_18140, duration(ns): 22298158
2025-07-21 18:56:40,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758964_18140, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 18:56:41,138 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758964_18140 replica FinalizedReplica, blk_1073758964_18140, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758964 for deletion
2025-07-21 18:56:41,139 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758964_18140 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758964
2025-07-21 18:59:45,330 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758967_18143 src: /192.168.158.1:58768 dest: /192.168.158.4:9866
2025-07-21 18:59:45,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58768, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1598759430_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758967_18143, duration(ns): 22328084
2025-07-21 18:59:45,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758967_18143, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-21 18:59:47,146 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758967_18143 replica FinalizedReplica, blk_1073758967_18143, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758967 for deletion
2025-07-21 18:59:47,147 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758967_18143 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758967
2025-07-21 19:00:45,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758968_18144 src: /192.168.158.7:42320 dest: /192.168.158.4:9866
2025-07-21 19:00:45,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42320, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1876774038_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758968_18144, duration(ns): 21374927
2025-07-21 19:00:45,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758968_18144, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 19:00:50,148 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758968_18144 replica FinalizedReplica, blk_1073758968_18144, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758968 for deletion
2025-07-21 19:00:50,149 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758968_18144 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758968
2025-07-21 19:02:45,371 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758970_18146 src: /192.168.158.5:48690 dest: /192.168.158.4:9866
2025-07-21 19:02:45,392 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48690, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1080413897_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758970_18146, duration(ns): 18221726
2025-07-21 19:02:45,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758970_18146, type=LAST_IN_PIPELINE terminating
2025-07-21 19:02:47,153 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758970_18146 replica FinalizedReplica, blk_1073758970_18146, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758970 for deletion
2025-07-21 19:02:47,154 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758970_18146 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758970
2025-07-21 19:03:45,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758971_18147 src: /192.168.158.5:57738 dest: /192.168.158.4:9866
2025-07-21 19:03:45,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57738, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1035893807_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758971_18147, duration(ns): 18827300
2025-07-21 19:03:45,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758971_18147, type=LAST_IN_PIPELINE terminating
2025-07-21 19:03:50,157 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758971_18147 replica FinalizedReplica, blk_1073758971_18147, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758971 for deletion
2025-07-21 19:03:50,158 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758971_18147 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758971
2025-07-21 19:06:50,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758974_18150 src: /192.168.158.8:45322 dest: /192.168.158.4:9866
2025-07-21 19:06:50,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45322, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1077883819_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758974_18150, duration(ns): 20144517
2025-07-21 19:06:50,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758974_18150, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-21 19:06:53,165 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758974_18150 replica FinalizedReplica, blk_1073758974_18150, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758974 for deletion
2025-07-21 19:06:53,167 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758974_18150 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir2/blk_1073758974
2025-07-21 19:09:50,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758977_18153 src: /192.168.158.6:46184 dest: /192.168.158.4:9866
2025-07-21 19:09:50,404 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46184, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1157279926_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758977_18153, duration(ns): 22087567
2025-07-21 19:09:50,404 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758977_18153, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-21 19:09:53,170 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758977_18153 replica FinalizedReplica, blk_1073758977_18153, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073758977 for deletion
2025-07-21 19:09:53,171 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758977_18153 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073758977
2025-07-21 19:10:50,374 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758978_18154 src: /192.168.158.5:41448 dest: /192.168.158.4:9866
2025-07-21 19:10:50,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41448, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_723638752_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758978_18154, duration(ns): 16777222
2025-07-21 19:10:50,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758978_18154, type=LAST_IN_PIPELINE terminating
2025-07-21 19:10:56,171 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758978_18154 replica FinalizedReplica, blk_1073758978_18154, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073758978 for deletion
2025-07-21 19:10:56,172 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758978_18154 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073758978
2025-07-21 19:14:05,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758981_18157 src: /192.168.158.9:52730 dest: /192.168.158.4:9866
2025-07-21 19:14:05,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52730, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1867220943_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758981_18157, duration(ns): 24277225
2025-07-21 19:14:05,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758981_18157, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 19:14:08,175 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758981_18157 replica FinalizedReplica, blk_1073758981_18157, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073758981 for deletion
2025-07-21 19:14:08,177 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758981_18157 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073758981
2025-07-21 19:15:05,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758982_18158 src: /192.168.158.6:52894 dest: /192.168.158.4:9866
2025-07-21 19:15:05,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52894, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_443231843_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758982_18158, duration(ns): 16879523
2025-07-21 19:15:05,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758982_18158, type=LAST_IN_PIPELINE terminating
2025-07-21 19:15:11,178 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758982_18158 replica FinalizedReplica, blk_1073758982_18158, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073758982 for deletion
2025-07-21 19:15:11,179 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758982_18158 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073758982
2025-07-21 19:16:05,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758983_18159 src: /192.168.158.5:45024 dest: /192.168.158.4:9866
2025-07-21 19:16:05,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45024, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_923984887_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758983_18159, duration(ns): 21382297
2025-07-21 19:16:05,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758983_18159, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-21 19:16:11,181 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758983_18159 replica FinalizedReplica, blk_1073758983_18159, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073758983 for deletion
2025-07-21 19:16:11,183 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758983_18159 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073758983
2025-07-21 19:21:15,380 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758988_18164 src: /192.168.158.1:50576 dest: /192.168.158.4:9866
2025-07-21 19:21:15,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50576, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1042134825_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758988_18164, duration(ns): 23281568
2025-07-21 19:21:15,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758988_18164, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-21 19:21:17,195 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758988_18164 replica FinalizedReplica, blk_1073758988_18164, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073758988 for deletion
2025-07-21 19:21:17,196 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758988_18164 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073758988
2025-07-21 19:24:20,379 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758991_18167 src: /192.168.158.1:36556 dest: /192.168.158.4:9866
2025-07-21 19:24:20,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36556, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_238352670_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758991_18167, duration(ns): 24971102
2025-07-21 19:24:20,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758991_18167, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-21 19:24:23,199 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758991_18167 replica FinalizedReplica, blk_1073758991_18167, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073758991 for deletion
2025-07-21 19:24:23,200 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758991_18167 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073758991
2025-07-21 19:25:20,380 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758992_18168 src: /192.168.158.1:57726 dest: /192.168.158.4:9866
2025-07-21 19:25:20,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57726, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1180506459_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758992_18168, duration(ns): 24876284
2025-07-21 19:25:20,417 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758992_18168, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-21 19:25:23,201 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758992_18168 replica FinalizedReplica, blk_1073758992_18168, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073758992 for deletion
2025-07-21 19:25:23,203 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758992_18168 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073758992
2025-07-21 19:26:20,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758993_18169 src: /192.168.158.9:48640 dest: /192.168.158.4:9866
2025-07-21 19:26:20,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48640, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-837169816_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758993_18169, duration(ns): 22245888
2025-07-21 19:26:20,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758993_18169, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-21 19:26:23,205 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758993_18169 replica FinalizedReplica, blk_1073758993_18169, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073758993 for deletion
2025-07-21 19:26:23,206 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758993_18169 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073758993
2025-07-21 19:27:20,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758994_18170 src: /192.168.158.1:40294 dest: /192.168.158.4:9866
2025-07-21 19:27:20,425 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40294, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1930020717_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758994_18170, duration(ns): 25997577
2025-07-21 19:27:20,425 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758994_18170, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-21 19:27:23,207 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758994_18170 replica FinalizedReplica, blk_1073758994_18170, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073758994 for deletion
2025-07-21 19:27:23,209 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758994_18170 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073758994
2025-07-21 19:30:30,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073758997_18173 src: /192.168.158.5:42662 dest: /192.168.158.4:9866
2025-07-21 19:30:30,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42662, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1082711187_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073758997_18173, duration(ns): 22314026
2025-07-21 19:30:30,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073758997_18173, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 19:30:32,214 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073758997_18173 replica FinalizedReplica, blk_1073758997_18173, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073758997 for deletion
2025-07-21 19:30:32,215 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073758997_18173 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073758997
2025-07-21 19:33:30,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759000_18176 src: /192.168.158.6:44848 dest: /192.168.158.4:9866
2025-07-21 19:33:30,422 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44848, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_366987969_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759000_18176, duration(ns): 18162247
2025-07-21 19:33:30,422 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759000_18176, type=LAST_IN_PIPELINE terminating
2025-07-21 19:33:35,218 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759000_18176 replica FinalizedReplica, blk_1073759000_18176, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759000 for deletion
2025-07-21 19:33:35,219 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759000_18176 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759000
2025-07-21 19:34:30,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759001_18177 src: /192.168.158.1:56052 dest: /192.168.158.4:9866
2025-07-21 19:34:30,433 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56052, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-202390668_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759001_18177, duration(ns): 25783899
2025-07-21 19:34:30,434 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759001_18177, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-21 19:34:32,219 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759001_18177 replica FinalizedReplica, blk_1073759001_18177, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759001 for deletion
2025-07-21 19:34:32,221 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759001_18177 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759001
2025-07-21 19:36:30,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759003_18179 src: /192.168.158.5:42542 dest: /192.168.158.4:9866
2025-07-21 19:36:30,427 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42542, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_878686414_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759003_18179, duration(ns): 18993132
2025-07-21 19:36:30,428 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759003_18179, type=LAST_IN_PIPELINE terminating
2025-07-21 19:36:32,221 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759003_18179 replica FinalizedReplica, blk_1073759003_18179, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759003 for deletion
2025-07-21 19:36:32,222 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759003_18179 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759003
2025-07-21 19:37:35,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759004_18180 src: /192.168.158.1:55134 dest: /192.168.158.4:9866
2025-07-21 19:37:35,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55134, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1078047284_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759004_18180, duration(ns): 25543665
2025-07-21 19:37:35,440 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759004_18180, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-21 19:37:38,226 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759004_18180 replica FinalizedReplica, blk_1073759004_18180, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759004 for deletion
2025-07-21 19:37:38,228 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759004_18180 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759004
2025-07-21 19:45:45,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759012_18188 src: /192.168.158.1:34692 dest: /192.168.158.4:9866
2025-07-21 19:45:45,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34692, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-492514658_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759012_18188, duration(ns): 26599771
2025-07-21 19:45:45,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759012_18188, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-21 19:45:50,244 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759012_18188 replica FinalizedReplica, blk_1073759012_18188, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759012 for deletion
2025-07-21 19:45:50,245 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759012_18188 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759012 2025-07-21 19:46:45,509 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759013_18189 src: /192.168.158.5:50760 dest: /192.168.158.4:9866 2025-07-21 19:46:45,529 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50760, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1954601994_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759013_18189, duration(ns): 17355225 2025-07-21 19:46:45,529 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759013_18189, type=LAST_IN_PIPELINE terminating 2025-07-21 19:46:47,243 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759013_18189 replica FinalizedReplica, blk_1073759013_18189, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759013 for deletion 2025-07-21 19:46:47,244 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759013_18189 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759013 2025-07-21 19:47:50,507 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759014_18190 src: /192.168.158.7:58326 
dest: /192.168.158.4:9866 2025-07-21 19:47:50,535 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58326, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1288015136_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759014_18190, duration(ns): 20878456 2025-07-21 19:47:50,536 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759014_18190, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-21 19:47:56,247 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759014_18190 replica FinalizedReplica, blk_1073759014_18190, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759014 for deletion 2025-07-21 19:47:56,249 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759014_18190 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759014 2025-07-21 19:48:50,510 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759015_18191 src: /192.168.158.1:33228 dest: /192.168.158.4:9866 2025-07-21 19:48:50,547 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33228, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1240246808_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759015_18191, duration(ns): 25470342 2025-07-21 19:48:50,548 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759015_18191, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-21 19:48:53,250 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759015_18191 replica FinalizedReplica, blk_1073759015_18191, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759015 for deletion 2025-07-21 19:48:53,251 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759015_18191 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759015 2025-07-21 19:49:50,508 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759016_18192 src: /192.168.158.8:40332 dest: /192.168.158.4:9866 2025-07-21 19:49:50,535 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40332, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_507300512_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759016_18192, duration(ns): 20434930 2025-07-21 19:49:50,536 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759016_18192, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-21 19:49:53,256 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759016_18192 replica FinalizedReplica, blk_1073759016_18192, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759016 for deletion 2025-07-21 19:49:53,257 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759016_18192 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759016 2025-07-21 19:50:50,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759017_18193 src: /192.168.158.6:34646 dest: /192.168.158.4:9866 2025-07-21 19:50:50,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34646, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-167266064_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759017_18193, duration(ns): 18462249 2025-07-21 19:50:50,538 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759017_18193, type=LAST_IN_PIPELINE terminating 2025-07-21 19:50:53,257 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759017_18193 replica FinalizedReplica, blk_1073759017_18193, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759017 for deletion 2025-07-21 19:50:53,259 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759017_18193 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759017 2025-07-21 19:51:55,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759018_18194 src: /192.168.158.9:44266 dest: /192.168.158.4:9866 2025-07-21 19:51:55,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44266, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1747914868_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759018_18194, duration(ns): 16709987 2025-07-21 19:51:55,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759018_18194, type=LAST_IN_PIPELINE terminating 2025-07-21 19:51:59,261 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759018_18194 replica FinalizedReplica, blk_1073759018_18194, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759018 for deletion 2025-07-21 19:51:59,262 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759018_18194 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759018 2025-07-21 19:52:55,530 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759019_18195 src: /192.168.158.8:58140 dest: /192.168.158.4:9866 2025-07-21 19:52:55,552 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58140, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_267561668_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759019_18195, duration(ns): 18373921 2025-07-21 19:52:55,552 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759019_18195, type=LAST_IN_PIPELINE terminating 2025-07-21 19:52:56,264 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759019_18195 replica FinalizedReplica, blk_1073759019_18195, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759019 for deletion 2025-07-21 19:52:56,265 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759019_18195 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759019 2025-07-21 19:56:00,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759022_18198 src: /192.168.158.9:39726 dest: /192.168.158.4:9866 2025-07-21 19:56:00,568 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39726, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1136235364_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759022_18198, duration(ns): 20092162 2025-07-21 19:56:00,570 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759022_18198, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-21 19:56:02,274 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759022_18198 replica FinalizedReplica, blk_1073759022_18198, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759022 for deletion 2025-07-21 19:56:02,275 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759022_18198 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759022 2025-07-21 19:57:00,546 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759023_18199 src: /192.168.158.1:34792 dest: /192.168.158.4:9866 2025-07-21 19:57:00,584 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34792, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1528476851_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759023_18199, duration(ns): 27232788 2025-07-21 19:57:00,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759023_18199, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-21 19:57:02,277 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759023_18199 replica FinalizedReplica, blk_1073759023_18199, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759023 for deletion 
2025-07-21 19:57:02,278 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759023_18199 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759023 2025-07-21 19:59:00,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759025_18201 src: /192.168.158.9:56724 dest: /192.168.158.4:9866 2025-07-21 19:59:00,560 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56724, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1322650565_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759025_18201, duration(ns): 20533629 2025-07-21 19:59:00,560 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759025_18201, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-21 19:59:02,283 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759025_18201 replica FinalizedReplica, blk_1073759025_18201, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759025 for deletion 2025-07-21 19:59:02,284 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759025_18201 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759025 2025-07-21 19:59:36,288 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool 
BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0 2025-07-21 20:01:00,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759027_18203 src: /192.168.158.6:37006 dest: /192.168.158.4:9866 2025-07-21 20:01:00,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37006, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2053353369_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759027_18203, duration(ns): 19558606 2025-07-21 20:01:00,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759027_18203, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-21 20:01:02,287 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759027_18203 replica FinalizedReplica, blk_1073759027_18203, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759027 for deletion 2025-07-21 20:01:02,289 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759027_18203 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759027 2025-07-21 20:04:05,551 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759030_18206 src: /192.168.158.7:35744 dest: /192.168.158.4:9866 2025-07-21 20:04:05,573 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35744, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-18813_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759030_18206, duration(ns): 18997215 2025-07-21 20:04:05,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759030_18206, type=LAST_IN_PIPELINE terminating 2025-07-21 20:04:11,299 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759030_18206 replica FinalizedReplica, blk_1073759030_18206, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759030 for deletion 2025-07-21 20:04:11,300 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759030_18206 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759030 2025-07-21 20:05:10,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759031_18207 src: /192.168.158.1:53288 dest: /192.168.158.4:9866 2025-07-21 20:05:10,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53288, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1268526360_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759031_18207, duration(ns): 27717837 2025-07-21 20:05:10,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073759031_18207, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-21 20:05:11,300 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759031_18207 replica FinalizedReplica, blk_1073759031_18207, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759031 for deletion 2025-07-21 20:05:11,301 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759031_18207 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759031 2025-07-21 20:08:10,565 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759034_18210 src: /192.168.158.5:56288 dest: /192.168.158.4:9866 2025-07-21 20:08:10,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56288, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1499414916_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759034_18210, duration(ns): 23270700 2025-07-21 20:08:10,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759034_18210, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-21 20:08:11,311 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759034_18210 replica FinalizedReplica, blk_1073759034_18210, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() 
= /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759034 for deletion 2025-07-21 20:08:11,312 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759034_18210 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759034 2025-07-21 20:11:10,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759037_18213 src: /192.168.158.1:58276 dest: /192.168.158.4:9866 2025-07-21 20:11:10,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58276, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1619453483_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759037_18213, duration(ns): 25368500 2025-07-21 20:11:10,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759037_18213, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-21 20:11:11,315 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759037_18213 replica FinalizedReplica, blk_1073759037_18213, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759037 for deletion 2025-07-21 20:11:11,316 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759037_18213 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759037 2025-07-21 20:13:15,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759039_18215 src: /192.168.158.1:38992 dest: /192.168.158.4:9866 2025-07-21 20:13:15,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38992, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2042053419_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759039_18215, duration(ns): 25408810 2025-07-21 20:13:15,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759039_18215, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-21 20:13:17,317 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759039_18215 replica FinalizedReplica, blk_1073759039_18215, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759039 for deletion 2025-07-21 20:13:17,318 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759039_18215 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759039 2025-07-21 20:16:25,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759042_18218 src: /192.168.158.1:60156 dest: /192.168.158.4:9866 2025-07-21 20:16:25,591 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:60156, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_106131091_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759042_18218, duration(ns): 25574603
2025-07-21 20:16:25,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759042_18218, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-21 20:16:26,326 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759042_18218 replica FinalizedReplica, blk_1073759042_18218, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759042 for deletion
2025-07-21 20:16:26,327 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759042_18218 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759042
2025-07-21 20:25:50,587 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759051_18227 src: /192.168.158.5:38220 dest: /192.168.158.4:9866
2025-07-21 20:25:50,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38220, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_925207947_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759051_18227, duration(ns): 19926907
2025-07-21 20:25:50,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759051_18227, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-21 20:25:53,353 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759051_18227 replica FinalizedReplica, blk_1073759051_18227, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759051 for deletion
2025-07-21 20:25:53,355 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759051_18227 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759051
2025-07-21 20:28:50,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759054_18230 src: /192.168.158.7:44856 dest: /192.168.158.4:9866
2025-07-21 20:28:50,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44856, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2135276742_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759054_18230, duration(ns): 22024161
2025-07-21 20:28:50,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759054_18230, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 20:28:56,358 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759054_18230 replica FinalizedReplica, blk_1073759054_18230, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759054 for deletion
2025-07-21 20:28:56,359 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759054_18230 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759054
2025-07-21 20:34:00,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759059_18235 src: /192.168.158.1:57546 dest: /192.168.158.4:9866
2025-07-21 20:34:00,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57546, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_99573808_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759059_18235, duration(ns): 26854003
2025-07-21 20:34:00,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759059_18235, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-21 20:34:05,374 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759059_18235 replica FinalizedReplica, blk_1073759059_18235, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759059 for deletion
2025-07-21 20:34:05,375 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759059_18235 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759059
2025-07-21 20:36:00,599 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759061_18237 src: /192.168.158.7:38388 dest: /192.168.158.4:9866
2025-07-21 20:36:00,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:38388, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-842033481_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759061_18237, duration(ns): 18985691
2025-07-21 20:36:00,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759061_18237, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-21 20:36:05,380 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759061_18237 replica FinalizedReplica, blk_1073759061_18237, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759061 for deletion
2025-07-21 20:36:05,381 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759061_18237 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759061
2025-07-21 20:39:06,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759064_18240 src: /192.168.158.6:54294 dest: /192.168.158.4:9866
2025-07-21 20:39:06,307 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54294, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_845327253_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759064_18240, duration(ns): 19633528
2025-07-21 20:39:06,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759064_18240, type=LAST_IN_PIPELINE terminating
2025-07-21 20:39:11,388 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759064_18240 replica FinalizedReplica, blk_1073759064_18240, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759064 for deletion
2025-07-21 20:39:11,390 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759064_18240 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759064
2025-07-21 20:41:11,293 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759066_18242 src: /192.168.158.9:47960 dest: /192.168.158.4:9866
2025-07-21 20:41:11,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47960, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1814617568_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759066_18242, duration(ns): 17664775
2025-07-21 20:41:11,315 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759066_18242, type=LAST_IN_PIPELINE terminating
2025-07-21 20:41:14,393 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759066_18242 replica FinalizedReplica, blk_1073759066_18242, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759066 for deletion
2025-07-21 20:41:14,394 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759066_18242 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759066
2025-07-21 20:43:02,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x6cb5f0b7da23e675, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 7 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-21 20:43:02,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-21 20:44:16,280 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759069_18245 src: /192.168.158.1:48932 dest: /192.168.158.4:9866
2025-07-21 20:44:16,315 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48932, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1513356593_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759069_18245, duration(ns): 23850093
2025-07-21 20:44:16,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759069_18245, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-21 20:44:20,396 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759069_18245 replica FinalizedReplica, blk_1073759069_18245, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759069 for deletion
2025-07-21 20:44:20,398 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759069_18245 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759069
2025-07-21 20:45:16,288 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759070_18246 src: /192.168.158.1:60136 dest: /192.168.158.4:9866
2025-07-21 20:45:16,325 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-687761566_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759070_18246, duration(ns): 25713828
2025-07-21 20:45:16,325 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759070_18246, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-21 20:45:23,399 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759070_18246 replica FinalizedReplica, blk_1073759070_18246, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759070 for deletion
2025-07-21 20:45:23,400 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759070_18246 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759070
2025-07-21 20:47:16,299 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759072_18248 src: /192.168.158.6:60288 dest: /192.168.158.4:9866
2025-07-21 20:47:16,319 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60288, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1084573211_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759072_18248, duration(ns): 17176940
2025-07-21 20:47:16,319 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759072_18248, type=LAST_IN_PIPELINE terminating
2025-07-21 20:47:20,404 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759072_18248 replica FinalizedReplica, blk_1073759072_18248, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759072 for deletion
2025-07-21 20:47:20,405 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759072_18248 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759072
2025-07-21 20:49:16,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759074_18250 src: /192.168.158.8:36300 dest: /192.168.158.4:9866
2025-07-21 20:49:16,339 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36300, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_25677729_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759074_18250, duration(ns): 22230775
2025-07-21 20:49:16,340 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759074_18250, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 20:49:20,410 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759074_18250 replica FinalizedReplica, blk_1073759074_18250, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759074 for deletion
2025-07-21 20:49:20,411 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759074_18250 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759074
2025-07-21 20:52:21,307 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759077_18253 src: /192.168.158.1:35454 dest: /192.168.158.4:9866
2025-07-21 20:52:21,341 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35454, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-710629976_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759077_18253, duration(ns): 23723937
2025-07-21 20:52:21,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759077_18253, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-21 20:52:26,418 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759077_18253 replica FinalizedReplica, blk_1073759077_18253, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759077 for deletion
2025-07-21 20:52:26,419 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759077_18253 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759077
2025-07-21 20:54:21,313 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759079_18255 src: /192.168.158.8:46556 dest: /192.168.158.4:9866
2025-07-21 20:54:21,340 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46556, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1789258858_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759079_18255, duration(ns): 20638919
2025-07-21 20:54:21,340 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759079_18255, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 20:54:26,422 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759079_18255 replica FinalizedReplica, blk_1073759079_18255, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759079 for deletion
2025-07-21 20:54:26,423 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759079_18255 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759079
2025-07-21 20:55:21,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759080_18256 src: /192.168.158.1:59876 dest: /192.168.158.4:9866
2025-07-21 20:55:21,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59876, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-325468429_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759080_18256, duration(ns): 22961692
2025-07-21 20:55:21,342 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759080_18256, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-21 20:55:26,425 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759080_18256 replica FinalizedReplica, blk_1073759080_18256, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759080 for deletion
2025-07-21 20:55:26,426 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759080_18256 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759080
2025-07-21 20:59:31,326 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759084_18260 src: /192.168.158.8:56918 dest: /192.168.158.4:9866
2025-07-21 20:59:31,347 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56918, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1310636434_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759084_18260, duration(ns): 15764606
2025-07-21 20:59:31,347 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759084_18260, type=LAST_IN_PIPELINE terminating
2025-07-21 20:59:35,434 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759084_18260 replica FinalizedReplica, blk_1073759084_18260, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759084 for deletion
2025-07-21 20:59:35,435 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759084_18260 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759084
2025-07-21 21:01:31,320 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759086_18262 src: /192.168.158.8:43464 dest: /192.168.158.4:9866
2025-07-21 21:01:31,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43464, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1967487560_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759086_18262, duration(ns): 21457487
2025-07-21 21:01:31,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759086_18262, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-21 21:01:38,440 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759086_18262 replica FinalizedReplica, blk_1073759086_18262, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759086 for deletion
2025-07-21 21:01:38,441 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759086_18262 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759086
2025-07-21 21:02:36,319 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759087_18263 src: /192.168.158.1:59392 dest: /192.168.158.4:9866
2025-07-21 21:02:36,353 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59392, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-573695214_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759087_18263, duration(ns): 22754290
2025-07-21 21:02:36,353 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759087_18263, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-21 21:02:41,441 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759087_18263 replica FinalizedReplica, blk_1073759087_18263, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759087 for deletion
2025-07-21 21:02:41,442 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759087_18263 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759087
2025-07-21 21:05:46,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759090_18266 src: /192.168.158.7:59418 dest: /192.168.158.4:9866
2025-07-21 21:05:46,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59418, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-334779722_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759090_18266, duration(ns): 21709829
2025-07-21 21:05:46,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759090_18266, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-21 21:05:50,446 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759090_18266 replica FinalizedReplica, blk_1073759090_18266, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759090 for deletion
2025-07-21 21:05:50,448 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759090_18266 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759090
2025-07-21 21:07:56,323 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759092_18268 src: /192.168.158.1:50600 dest: /192.168.158.4:9866
2025-07-21 21:07:56,361 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50600, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2037607941_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759092_18268, duration(ns): 26878290
2025-07-21 21:07:56,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759092_18268, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-21 21:07:59,450 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759092_18268 replica FinalizedReplica, blk_1073759092_18268, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759092 for deletion
2025-07-21 21:07:59,451 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759092_18268 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759092
2025-07-21 21:08:56,337 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759093_18269 src: /192.168.158.6:48390 dest: /192.168.158.4:9866
2025-07-21 21:08:56,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48390, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1772909989_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759093_18269, duration(ns): 20206958
2025-07-21 21:08:56,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759093_18269, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-21 21:09:02,455 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759093_18269 replica FinalizedReplica, blk_1073759093_18269, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759093 for deletion
2025-07-21 21:09:02,457 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759093_18269 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759093
2025-07-21 21:09:56,326 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759094_18270 src: /192.168.158.1:49966 dest: /192.168.158.4:9866
2025-07-21 21:09:56,362 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49966, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-736173441_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759094_18270, duration(ns): 24923231
2025-07-21 21:09:56,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759094_18270, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-21 21:09:59,458 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759094_18270 replica FinalizedReplica, blk_1073759094_18270, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759094 for deletion
2025-07-21 21:09:59,459 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759094_18270 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759094
2025-07-21 21:10:56,318 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759095_18271 src: /192.168.158.1:40206 dest: /192.168.158.4:9866
2025-07-21 21:10:56,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40206, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_106699172_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759095_18271, duration(ns): 24674335
2025-07-21 21:10:56,357 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759095_18271, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-21 21:11:02,460 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759095_18271 replica FinalizedReplica, blk_1073759095_18271, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759095 for deletion
2025-07-21 21:11:02,462 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759095_18271 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759095
2025-07-21 21:11:56,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759096_18272 src: /192.168.158.9:34412 dest: /192.168.158.4:9866
2025-07-21 21:11:56,354 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34412, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1991228205_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759096_18272, duration(ns): 17226199
2025-07-21 21:11:56,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759096_18272, type=LAST_IN_PIPELINE terminating
2025-07-21 21:11:59,468 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759096_18272 replica FinalizedReplica, blk_1073759096_18272, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759096 for deletion
2025-07-21 21:11:59,469 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759096_18272 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759096
2025-07-21 21:15:11,332 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759099_18275 src: /192.168.158.9:47008 dest: /192.168.158.4:9866
2025-07-21 21:15:11,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47008, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1118289076_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759099_18275, duration(ns): 15386511
2025-07-21 21:15:11,350 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759099_18275, type=LAST_IN_PIPELINE terminating
2025-07-21 21:15:17,472 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759099_18275 replica FinalizedReplica, blk_1073759099_18275, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759099 for deletion
2025-07-21 21:15:17,473 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759099_18275 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759099
2025-07-21 21:16:16,331 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759100_18276 src: /192.168.158.5:43316 dest: /192.168.158.4:9866
2025-07-21 21:16:16,361 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:43316, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-4878558_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759100_18276, duration(ns): 23359982
2025-07-21 21:16:16,361 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759100_18276, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-21 21:16:20,477 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759100_18276 replica FinalizedReplica, blk_1073759100_18276, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759100 for deletion
2025-07-21 21:16:20,478 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759100_18276 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759100
2025-07-21 21:18:21,330 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759102_18278 src: /192.168.158.1:42672 dest: /192.168.158.4:9866
2025-07-21 21:18:21,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1192717897_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759102_18278, duration(ns): 26273797
2025-07-21 21:18:21,369 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759102_18278, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-21 21:18:29,480 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759102_18278 replica FinalizedReplica, blk_1073759102_18278, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759102 for deletion
2025-07-21 21:18:29,481 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759102_18278 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759102 2025-07-21 21:19:21,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759103_18279 src: /192.168.158.9:35970 dest: /192.168.158.4:9866 2025-07-21 21:19:21,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35970, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1407222588_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759103_18279, duration(ns): 16121221 2025-07-21 21:19:21,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759103_18279, type=LAST_IN_PIPELINE terminating 2025-07-21 21:19:26,483 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759103_18279 replica FinalizedReplica, blk_1073759103_18279, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759103 for deletion 2025-07-21 21:19:26,484 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759103_18279 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759103 2025-07-21 21:21:21,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759105_18281 src: /192.168.158.5:47252 
dest: /192.168.158.4:9866 2025-07-21 21:21:21,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47252, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-261244757_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759105_18281, duration(ns): 19389078 2025-07-21 21:21:21,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759105_18281, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-21 21:21:26,492 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759105_18281 replica FinalizedReplica, blk_1073759105_18281, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759105 for deletion 2025-07-21 21:21:26,493 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759105_18281 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759105 2025-07-21 21:23:21,348 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759107_18283 src: /192.168.158.7:58478 dest: /192.168.158.4:9866 2025-07-21 21:23:21,376 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58478, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1154990048_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759107_18283, duration(ns): 21292601 2025-07-21 21:23:21,376 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759107_18283, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-21 21:23:26,497 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759107_18283 replica FinalizedReplica, blk_1073759107_18283, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759107 for deletion 2025-07-21 21:23:26,498 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759107_18283 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759107 2025-07-21 21:24:21,343 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759108_18284 src: /192.168.158.1:32844 dest: /192.168.158.4:9866 2025-07-21 21:24:21,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:32844, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1274636568_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759108_18284, duration(ns): 23873965 2025-07-21 21:24:21,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759108_18284, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-21 21:24:29,500 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759108_18284 replica FinalizedReplica, blk_1073759108_18284, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759108 for deletion 2025-07-21 21:24:29,501 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759108_18284 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759108 2025-07-21 21:25:26,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759109_18285 src: /192.168.158.5:47866 dest: /192.168.158.4:9866 2025-07-21 21:25:26,380 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47866, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1023399102_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759109_18285, duration(ns): 17719262 2025-07-21 21:25:26,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759109_18285, type=LAST_IN_PIPELINE terminating 2025-07-21 21:25:29,503 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759109_18285 replica FinalizedReplica, blk_1073759109_18285, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759109 for deletion 2025-07-21 21:25:29,505 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759109_18285 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759109 2025-07-21 21:29:31,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759113_18289 src: /192.168.158.1:35654 dest: /192.168.158.4:9866 2025-07-21 21:29:31,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35654, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1018805128_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759113_18289, duration(ns): 23599268 2025-07-21 21:29:31,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759113_18289, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-21 21:29:38,514 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759113_18289 replica FinalizedReplica, blk_1073759113_18289, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759113 for deletion 2025-07-21 21:29:38,515 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759113_18289 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759113 2025-07-21 21:34:36,358 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759118_18294 src: /192.168.158.8:59750 dest: /192.168.158.4:9866 2025-07-21 21:34:36,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.8:59750, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1224110366_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759118_18294, duration(ns): 17252610 2025-07-21 21:34:36,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759118_18294, type=LAST_IN_PIPELINE terminating 2025-07-21 21:34:41,525 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759118_18294 replica FinalizedReplica, blk_1073759118_18294, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759118 for deletion 2025-07-21 21:34:41,527 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759118_18294 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759118 2025-07-21 21:36:46,362 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759120_18296 src: /192.168.158.1:33936 dest: /192.168.158.4:9866 2025-07-21 21:36:46,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33936, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1505989484_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759120_18296, duration(ns): 24351469 2025-07-21 21:36:46,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759120_18296, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-21 21:36:50,534 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759120_18296 replica FinalizedReplica, blk_1073759120_18296, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759120 for deletion 2025-07-21 21:36:50,535 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759120_18296 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759120 2025-07-21 21:39:47,059 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759123_18299 src: /192.168.158.6:51492 dest: /192.168.158.4:9866 2025-07-21 21:39:47,086 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51492, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1402554683_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759123_18299, duration(ns): 20674783 2025-07-21 21:39:47,087 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759123_18299, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-21 21:39:50,545 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759123_18299 replica FinalizedReplica, blk_1073759123_18299, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759123 for deletion 2025-07-21 21:39:50,546 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759123_18299 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759123 2025-07-21 21:45:57,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759129_18305 src: /192.168.158.5:34740 dest: /192.168.158.4:9866 2025-07-21 21:45:57,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34740, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1891069723_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759129_18305, duration(ns): 21691500 2025-07-21 21:45:57,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759129_18305, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-21 21:46:02,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759129_18305 replica FinalizedReplica, blk_1073759129_18305, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759129 for deletion 2025-07-21 21:46:02,564 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759129_18305 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759129 
2025-07-21 21:54:02,074 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759137_18313 src: /192.168.158.8:50102 dest: /192.168.158.4:9866 2025-07-21 21:54:02,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50102, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2047905998_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759137_18313, duration(ns): 17607226 2025-07-21 21:54:02,095 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759137_18313, type=LAST_IN_PIPELINE terminating 2025-07-21 21:54:05,576 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759137_18313 replica FinalizedReplica, blk_1073759137_18313, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759137 for deletion 2025-07-21 21:54:05,577 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759137_18313 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759137 2025-07-21 21:56:02,090 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759139_18315 src: /192.168.158.9:53598 dest: /192.168.158.4:9866 2025-07-21 21:56:02,116 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53598, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1916082945_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073759139_18315, duration(ns): 18935353 2025-07-21 21:56:02,117 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759139_18315, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-21 21:56:08,582 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759139_18315 replica FinalizedReplica, blk_1073759139_18315, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759139 for deletion 2025-07-21 21:56:08,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759139_18315 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759139 2025-07-21 21:57:02,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759140_18316 src: /192.168.158.9:33652 dest: /192.168.158.4:9866 2025-07-21 21:57:02,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33652, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_586460414_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759140_18316, duration(ns): 18351502 2025-07-21 21:57:02,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759140_18316, type=LAST_IN_PIPELINE terminating 2025-07-21 21:57:08,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759140_18316 replica 
FinalizedReplica, blk_1073759140_18316, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759140 for deletion 2025-07-21 21:57:08,585 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759140_18316 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759140 2025-07-21 21:59:07,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759142_18318 src: /192.168.158.8:34186 dest: /192.168.158.4:9866 2025-07-21 21:59:07,111 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34186, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2123039290_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759142_18318, duration(ns): 19715852 2025-07-21 21:59:07,112 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759142_18318, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-21 21:59:14,592 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759142_18318 replica FinalizedReplica, blk_1073759142_18318, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759142 for deletion 2025-07-21 21:59:14,593 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073759142_18318 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759142 2025-07-21 22:00:07,076 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759143_18319 src: /192.168.158.1:34620 dest: /192.168.158.4:9866 2025-07-21 22:00:07,112 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34620, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-356410616_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759143_18319, duration(ns): 24906671 2025-07-21 22:00:07,113 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759143_18319, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-21 22:00:14,595 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759143_18319 replica FinalizedReplica, blk_1073759143_18319, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759143 for deletion 2025-07-21 22:00:14,596 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759143_18319 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759143 2025-07-21 22:01:07,097 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759144_18320 src: /192.168.158.1:58976 dest: /192.168.158.4:9866 2025-07-21 22:01:07,133 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58976, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1292796534_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759144_18320, duration(ns): 24703326 2025-07-21 22:01:07,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759144_18320, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-21 22:01:11,598 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759144_18320 replica FinalizedReplica, blk_1073759144_18320, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759144 for deletion 2025-07-21 22:01:11,599 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759144_18320 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759144 2025-07-21 22:09:12,100 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759152_18328 src: /192.168.158.5:59278 dest: /192.168.158.4:9866 2025-07-21 22:09:12,119 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59278, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1628388877_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759152_18328, duration(ns): 16083908 2025-07-21 22:09:12,119 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759152_18328, type=LAST_IN_PIPELINE terminating
2025-07-21 22:09:14,654 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759152_18328 replica FinalizedReplica, blk_1073759152_18328, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759152 for deletion
2025-07-21 22:09:14,655 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759152_18328 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759152
2025-07-21 22:12:22,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759155_18331 src: /192.168.158.6:40562 dest: /192.168.158.4:9866
2025-07-21 22:12:22,101 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40562, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_669253832_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759155_18331, duration(ns): 19164010
2025-07-21 22:12:22,101 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759155_18331, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-21 22:12:26,660 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759155_18331 replica FinalizedReplica, blk_1073759155_18331, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759155 for deletion
2025-07-21 22:12:26,661 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759155_18331 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759155
2025-07-21 22:13:22,098 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759156_18332 src: /192.168.158.6:37790 dest: /192.168.158.4:9866
2025-07-21 22:13:22,117 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37790, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_990769971_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759156_18332, duration(ns): 16390788
2025-07-21 22:13:22,117 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759156_18332, type=LAST_IN_PIPELINE terminating
2025-07-21 22:13:26,663 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759156_18332 replica FinalizedReplica, blk_1073759156_18332, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759156 for deletion
2025-07-21 22:13:26,665 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759156_18332 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759156
2025-07-21 22:15:22,096 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759158_18334 src: /192.168.158.1:47430 dest: /192.168.158.4:9866
2025-07-21 22:15:22,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47430, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2030932157_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759158_18334, duration(ns): 22506861
2025-07-21 22:15:22,130 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759158_18334, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-21 22:15:26,668 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759158_18334 replica FinalizedReplica, blk_1073759158_18334, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759158 for deletion
2025-07-21 22:15:26,669 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759158_18334 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759158
2025-07-21 22:16:22,104 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759159_18335 src: /192.168.158.1:51308 dest: /192.168.158.4:9866
2025-07-21 22:16:22,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51308, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_210655972_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759159_18335, duration(ns): 23306159
2025-07-21 22:16:22,139 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759159_18335, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-21 22:16:29,673 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759159_18335 replica FinalizedReplica, blk_1073759159_18335, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759159 for deletion
2025-07-21 22:16:29,674 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759159_18335 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759159
2025-07-21 22:17:22,113 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759160_18336 src: /192.168.158.9:33308 dest: /192.168.158.4:9866
2025-07-21 22:17:22,132 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33308, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1819892101_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759160_18336, duration(ns): 17065588
2025-07-21 22:17:22,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759160_18336, type=LAST_IN_PIPELINE terminating
2025-07-21 22:17:26,673 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759160_18336 replica FinalizedReplica, blk_1073759160_18336, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759160 for deletion
2025-07-21 22:17:26,674 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759160_18336 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759160
2025-07-21 22:18:22,109 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759161_18337 src: /192.168.158.5:39050 dest: /192.168.158.4:9866
2025-07-21 22:18:22,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39050, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-269682898_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759161_18337, duration(ns): 21119663
2025-07-21 22:18:22,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759161_18337, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-21 22:18:29,679 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759161_18337 replica FinalizedReplica, blk_1073759161_18337, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759161 for deletion
2025-07-21 22:18:29,680 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759161_18337 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759161
2025-07-21 22:19:22,119 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759162_18338 src: /192.168.158.9:51328 dest: /192.168.158.4:9866
2025-07-21 22:19:22,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51328, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-265452627_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759162_18338, duration(ns): 16122815
2025-07-21 22:19:22,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759162_18338, type=LAST_IN_PIPELINE terminating
2025-07-21 22:19:26,684 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759162_18338 replica FinalizedReplica, blk_1073759162_18338, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759162 for deletion
2025-07-21 22:19:26,685 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759162_18338 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759162
2025-07-21 22:20:22,119 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759163_18339 src: /192.168.158.9:49434 dest: /192.168.158.4:9866
2025-07-21 22:20:22,141 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49434, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1850494498_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759163_18339, duration(ns): 18635340
2025-07-21 22:20:22,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759163_18339, type=LAST_IN_PIPELINE terminating
2025-07-21 22:20:29,688 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759163_18339 replica FinalizedReplica, blk_1073759163_18339, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759163 for deletion
2025-07-21 22:20:29,690 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759163_18339 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759163
2025-07-21 22:22:22,115 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759165_18341 src: /192.168.158.8:37936 dest: /192.168.158.4:9866
2025-07-21 22:22:22,133 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37936, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-695134639_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759165_18341, duration(ns): 15642379
2025-07-21 22:22:22,134 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759165_18341, type=LAST_IN_PIPELINE terminating
2025-07-21 22:22:26,692 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759165_18341 replica FinalizedReplica, blk_1073759165_18341, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759165 for deletion
2025-07-21 22:22:26,693 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759165_18341 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759165
2025-07-21 22:23:22,113 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759166_18342 src: /192.168.158.1:54068 dest: /192.168.158.4:9866
2025-07-21 22:23:22,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54068, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-733628752_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759166_18342, duration(ns): 22772104
2025-07-21 22:23:22,147 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759166_18342, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-21 22:23:29,695 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759166_18342 replica FinalizedReplica, blk_1073759166_18342, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759166 for deletion
2025-07-21 22:23:29,697 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759166_18342 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759166
2025-07-21 22:26:27,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759169_18345 src: /192.168.158.6:59608 dest: /192.168.158.4:9866
2025-07-21 22:26:27,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59608, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_165096359_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759169_18345, duration(ns): 17704385
2025-07-21 22:26:27,146 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759169_18345, type=LAST_IN_PIPELINE terminating
2025-07-21 22:26:29,702 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759169_18345 replica FinalizedReplica, blk_1073759169_18345, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759169 for deletion
2025-07-21 22:26:29,703 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759169_18345 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759169
2025-07-21 22:29:32,135 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759172_18348 src: /192.168.158.1:34268 dest: /192.168.158.4:9866
2025-07-21 22:29:32,173 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34268, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-685120354_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759172_18348, duration(ns): 27731565
2025-07-21 22:29:32,173 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759172_18348, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-21 22:29:35,707 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759172_18348 replica FinalizedReplica, blk_1073759172_18348, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759172 for deletion
2025-07-21 22:29:35,708 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759172_18348 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759172
2025-07-21 22:30:32,128 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759173_18349 src: /192.168.158.1:51780 dest: /192.168.158.4:9866
2025-07-21 22:30:32,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1922236124_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759173_18349, duration(ns): 21600509
2025-07-21 22:30:32,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759173_18349, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-21 22:30:35,712 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759173_18349 replica FinalizedReplica, blk_1073759173_18349, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759173 for deletion
2025-07-21 22:30:35,713 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759173_18349 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759173
2025-07-21 22:32:37,127 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759175_18351 src: /192.168.158.9:34186 dest: /192.168.158.4:9866
2025-07-21 22:32:37,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34186, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_754853343_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759175_18351, duration(ns): 22760976
2025-07-21 22:32:37,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759175_18351, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-21 22:32:41,720 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759175_18351 replica FinalizedReplica, blk_1073759175_18351, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759175 for deletion
2025-07-21 22:32:41,722 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759175_18351 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759175
2025-07-21 22:33:37,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759176_18352 src: /192.168.158.6:39912 dest: /192.168.158.4:9866
2025-07-21 22:33:37,176 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39912, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1228278492_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759176_18352, duration(ns): 18344522
2025-07-21 22:33:37,176 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759176_18352, type=LAST_IN_PIPELINE terminating
2025-07-21 22:33:41,725 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759176_18352 replica FinalizedReplica, blk_1073759176_18352, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759176 for deletion
2025-07-21 22:33:41,726 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759176_18352 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759176
2025-07-21 22:34:37,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759177_18353 src: /192.168.158.9:41330 dest: /192.168.158.4:9866
2025-07-21 22:34:37,170 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41330, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1306670347_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759177_18353, duration(ns): 16048667
2025-07-21 22:34:37,170 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759177_18353, type=LAST_IN_PIPELINE terminating
2025-07-21 22:34:41,724 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759177_18353 replica FinalizedReplica, blk_1073759177_18353, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759177 for deletion
2025-07-21 22:34:41,725 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759177_18353 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759177
2025-07-21 22:35:37,141 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759178_18354 src: /192.168.158.6:39910 dest: /192.168.158.4:9866
2025-07-21 22:35:37,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39910, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1178561419_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759178_18354, duration(ns): 36723255
2025-07-21 22:35:37,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759178_18354, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 22:35:44,726 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759178_18354 replica FinalizedReplica, blk_1073759178_18354, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759178 for deletion
2025-07-21 22:35:44,728 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759178_18354 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759178
2025-07-21 22:36:37,173 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759179_18355 src: /192.168.158.9:51778 dest: /192.168.158.4:9866
2025-07-21 22:36:37,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51778, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1232536654_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759179_18355, duration(ns): 17056802
2025-07-21 22:36:37,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759179_18355, type=LAST_IN_PIPELINE terminating
2025-07-21 22:36:41,730 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759179_18355 replica FinalizedReplica, blk_1073759179_18355, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759179 for deletion
2025-07-21 22:36:41,731 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759179_18355 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759179
2025-07-21 22:37:37,169 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759180_18356 src: /192.168.158.1:46804 dest: /192.168.158.4:9866
2025-07-21 22:37:37,207 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46804, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_180837799_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759180_18356, duration(ns): 27720690
2025-07-21 22:37:37,208 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759180_18356, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-21 22:37:44,734 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759180_18356 replica FinalizedReplica, blk_1073759180_18356, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759180 for deletion
2025-07-21 22:37:44,735 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759180_18356 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759180
2025-07-21 22:39:37,167 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759182_18358 src: /192.168.158.1:40458 dest: /192.168.158.4:9866
2025-07-21 22:39:37,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40458, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1260920146_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759182_18358, duration(ns): 23673106
2025-07-21 22:39:37,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759182_18358, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-21 22:39:41,735 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759182_18358 replica FinalizedReplica, blk_1073759182_18358, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759182 for deletion
2025-07-21 22:39:41,736 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759182_18358 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759182
2025-07-21 22:41:42,195 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759184_18360 src: /192.168.158.7:47158 dest: /192.168.158.4:9866
2025-07-21 22:41:42,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47158, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-104408938_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759184_18360, duration(ns): 16144084
2025-07-21 22:41:42,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759184_18360, type=LAST_IN_PIPELINE terminating
2025-07-21 22:41:44,741 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759184_18360 replica FinalizedReplica, blk_1073759184_18360, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759184 for deletion
2025-07-21 22:41:44,743 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759184_18360 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759184
2025-07-21 22:42:47,170 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759185_18361 src: /192.168.158.6:42030 dest: /192.168.158.4:9866
2025-07-21 22:42:47,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42030, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2025952187_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759185_18361, duration(ns): 16684797
2025-07-21 22:42:47,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759185_18361, type=LAST_IN_PIPELINE terminating
2025-07-21 22:42:50,743 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759185_18361 replica FinalizedReplica, blk_1073759185_18361, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759185 for deletion
2025-07-21 22:42:50,744 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759185_18361 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759185
2025-07-21 22:43:52,177 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759186_18362 src: /192.168.158.8:49058 dest: /192.168.158.4:9866
2025-07-21 22:43:52,196 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49058, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-320754305_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759186_18362, duration(ns): 16801757
2025-07-21 22:43:52,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759186_18362, type=LAST_IN_PIPELINE terminating
2025-07-21 22:43:56,743 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759186_18362 replica FinalizedReplica, blk_1073759186_18362, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759186 for deletion
2025-07-21 22:43:56,744 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759186_18362 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759186
2025-07-21 22:48:02,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759190_18366 src: /192.168.158.1:42216 dest: /192.168.158.4:9866
2025-07-21 22:48:02,206 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42216, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-95041687_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759190_18366, duration(ns): 25458738
2025-07-21 22:48:02,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759190_18366, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-21 22:48:05,756 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759190_18366 replica FinalizedReplica, blk_1073759190_18366, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759190 for deletion
2025-07-21 22:48:05,758 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759190_18366 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759190
2025-07-21 22:51:02,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759193_18369 src: /192.168.158.1:55040 dest: /192.168.158.4:9866
2025-07-21 22:51:02,208 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:55040, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-720713801_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759193_18369, duration(ns): 22787361
2025-07-21 22:51:02,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759193_18369, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-21 22:51:05,764 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759193_18369 replica FinalizedReplica, blk_1073759193_18369, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759193 for deletion
2025-07-21 22:51:05,765 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759193_18369 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759193
2025-07-21 22:52:02,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759194_18370 src: /192.168.158.6:52040 dest: /192.168.158.4:9866
2025-07-21 22:52:02,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52040, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1185447297_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759194_18370, duration(ns): 21572651
2025-07-21 22:52:02,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759194_18370, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-21 22:52:05,764 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759194_18370 replica FinalizedReplica, blk_1073759194_18370, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759194 for deletion
2025-07-21 22:52:05,765 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759194_18370 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759194
2025-07-21 22:55:07,176 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759197_18373 src: /192.168.158.9:45170 dest: /192.168.158.4:9866
2025-07-21 22:55:07,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45170, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_331883735_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073759197_18373, duration(ns): 18878729 2025-07-21 22:55:07,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759197_18373, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-21 22:55:11,773 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759197_18373 replica FinalizedReplica, blk_1073759197_18373, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759197 for deletion 2025-07-21 22:55:11,774 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759197_18373 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759197 2025-07-21 22:58:12,172 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759200_18376 src: /192.168.158.1:51920 dest: /192.168.158.4:9866 2025-07-21 22:58:12,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51920, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1675517647_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759200_18376, duration(ns): 25932327 2025-07-21 22:58:12,209 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759200_18376, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-21 22:58:14,783 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759200_18376 replica FinalizedReplica, blk_1073759200_18376, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759200 for deletion 2025-07-21 22:58:14,784 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759200_18376 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759200 2025-07-21 23:01:12,184 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759203_18379 src: /192.168.158.9:50716 dest: /192.168.158.4:9866 2025-07-21 23:01:12,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50716, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-443604225_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759203_18379, duration(ns): 20196421 2025-07-21 23:01:12,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759203_18379, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-21 23:01:17,788 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759203_18379 replica FinalizedReplica, blk_1073759203_18379, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759203 for deletion 2025-07-21 23:01:17,789 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759203_18379 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759203 2025-07-21 23:02:12,183 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759204_18380 src: /192.168.158.1:44148 dest: /192.168.158.4:9866 2025-07-21 23:02:12,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44148, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1092631380_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759204_18380, duration(ns): 25013135 2025-07-21 23:02:12,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759204_18380, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-21 23:02:17,792 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759204_18380 replica FinalizedReplica, blk_1073759204_18380, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759204 for deletion 2025-07-21 23:02:17,793 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759204_18380 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759204 2025-07-21 23:03:12,186 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073759205_18381 src: /192.168.158.9:56924 dest: /192.168.158.4:9866 2025-07-21 23:03:12,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56924, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_526132964_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759205_18381, duration(ns): 22101591 2025-07-21 23:03:12,215 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759205_18381, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-21 23:03:17,793 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759205_18381 replica FinalizedReplica, blk_1073759205_18381, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759205 for deletion 2025-07-21 23:03:17,795 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759205_18381 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759205 2025-07-21 23:05:22,186 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759207_18383 src: /192.168.158.6:55666 dest: /192.168.158.4:9866 2025-07-21 23:05:22,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55666, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_425726634_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073759207_18383, duration(ns): 21156778 2025-07-21 23:05:22,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759207_18383, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-21 23:05:29,800 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759207_18383 replica FinalizedReplica, blk_1073759207_18383, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759207 for deletion 2025-07-21 23:05:29,801 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759207_18383 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759207 2025-07-21 23:09:27,192 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759211_18387 src: /192.168.158.6:58922 dest: /192.168.158.4:9866 2025-07-21 23:09:27,217 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58922, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1212654204_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759211_18387, duration(ns): 18831256 2025-07-21 23:09:27,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759211_18387, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-21 23:09:29,810 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073759211_18387 replica FinalizedReplica, blk_1073759211_18387, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759211 for deletion 2025-07-21 23:09:29,811 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759211_18387 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759211 2025-07-21 23:10:27,194 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759212_18388 src: /192.168.158.1:45814 dest: /192.168.158.4:9866 2025-07-21 23:10:27,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45814, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1700209837_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759212_18388, duration(ns): 23204327 2025-07-21 23:10:27,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759212_18388, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-21 23:10:29,813 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759212_18388 replica FinalizedReplica, blk_1073759212_18388, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759212 for deletion 2025-07-21 23:10:29,814 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759212_18388 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759212 2025-07-21 23:11:27,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759213_18389 src: /192.168.158.1:45798 dest: /192.168.158.4:9866 2025-07-21 23:11:27,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45798, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1533014705_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759213_18389, duration(ns): 22712984 2025-07-21 23:11:27,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759213_18389, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-21 23:11:29,818 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759213_18389 replica FinalizedReplica, blk_1073759213_18389, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759213 for deletion 2025-07-21 23:11:29,819 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759213_18389 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759213 2025-07-21 23:12:27,208 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073759214_18390 src: /192.168.158.7:53324 dest: /192.168.158.4:9866 2025-07-21 23:12:27,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53324, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-475752001_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759214_18390, duration(ns): 16598515 2025-07-21 23:12:27,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759214_18390, type=LAST_IN_PIPELINE terminating 2025-07-21 23:12:29,821 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759214_18390 replica FinalizedReplica, blk_1073759214_18390, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759214 for deletion 2025-07-21 23:12:29,822 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759214_18390 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759214 2025-07-21 23:14:27,196 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759216_18392 src: /192.168.158.1:45994 dest: /192.168.158.4:9866 2025-07-21 23:14:27,230 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45994, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-893969915_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759216_18392, duration(ns): 23118576 
2025-07-21 23:14:27,230 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759216_18392, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-21 23:14:29,825 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759216_18392 replica FinalizedReplica, blk_1073759216_18392, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759216 for deletion 2025-07-21 23:14:29,826 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759216_18392 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759216 2025-07-21 23:20:47,203 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759222_18398 src: /192.168.158.1:45730 dest: /192.168.158.4:9866 2025-07-21 23:20:47,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45730, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-928606257_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759222_18398, duration(ns): 23588369 2025-07-21 23:20:47,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759222_18398, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-21 23:20:53,844 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759222_18398 replica 
FinalizedReplica, blk_1073759222_18398, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759222 for deletion 2025-07-21 23:20:53,845 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759222_18398 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759222 2025-07-21 23:21:47,205 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759223_18399 src: /192.168.158.8:57708 dest: /192.168.158.4:9866 2025-07-21 23:21:47,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57708, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-580818081_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759223_18399, duration(ns): 19177915 2025-07-21 23:21:47,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759223_18399, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-21 23:21:53,848 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759223_18399 replica FinalizedReplica, blk_1073759223_18399, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759223 for deletion 2025-07-21 23:21:53,849 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073759223_18399 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759223 2025-07-21 23:22:52,208 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759224_18400 src: /192.168.158.1:42032 dest: /192.168.158.4:9866 2025-07-21 23:22:52,241 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42032, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1729068387_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759224_18400, duration(ns): 22840206 2025-07-21 23:22:52,241 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759224_18400, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-21 23:22:53,852 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759224_18400 replica FinalizedReplica, blk_1073759224_18400, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759224 for deletion 2025-07-21 23:22:53,853 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759224_18400 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759224 2025-07-21 23:23:52,212 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759225_18401 src: /192.168.158.9:45130 dest: /192.168.158.4:9866 2025-07-21 23:23:52,231 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45130, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_10086137_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759225_18401, duration(ns): 17108065 2025-07-21 23:23:52,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759225_18401, type=LAST_IN_PIPELINE terminating 2025-07-21 23:23:56,852 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759225_18401 replica FinalizedReplica, blk_1073759225_18401, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759225 for deletion 2025-07-21 23:23:56,853 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759225_18401 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759225 2025-07-21 23:27:57,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759229_18405 src: /192.168.158.9:51142 dest: /192.168.158.4:9866 2025-07-21 23:27:57,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51142, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1281450349_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759229_18405, duration(ns): 17376214 2025-07-21 23:27:57,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073759229_18405, type=LAST_IN_PIPELINE terminating 2025-07-21 23:27:59,862 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759229_18405 replica FinalizedReplica, blk_1073759229_18405, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759229 for deletion 2025-07-21 23:27:59,863 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759229_18405 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759229 2025-07-21 23:29:57,212 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759231_18407 src: /192.168.158.1:59250 dest: /192.168.158.4:9866 2025-07-21 23:29:57,246 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59250, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-301519315_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759231_18407, duration(ns): 23251754 2025-07-21 23:29:57,246 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759231_18407, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-21 23:30:02,867 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759231_18407 replica FinalizedReplica, blk_1073759231_18407, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759231 for deletion 2025-07-21 23:30:02,868 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759231_18407 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir3/blk_1073759231 2025-07-21 23:30:57,215 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759232_18408 src: /192.168.158.8:35194 dest: /192.168.158.4:9866 2025-07-21 23:30:57,242 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35194, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_664548422_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759232_18408, duration(ns): 19969169 2025-07-21 23:30:57,242 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759232_18408, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-21 23:30:59,867 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759232_18408 replica FinalizedReplica, blk_1073759232_18408, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759232 for deletion 2025-07-21 23:30:59,868 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759232_18408 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759232 
2025-07-21 23:32:02,223 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759233_18409 src: /192.168.158.9:40482 dest: /192.168.158.4:9866
2025-07-21 23:32:02,244 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40482, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-94447477_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759233_18409, duration(ns): 19106307
2025-07-21 23:32:02,246 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759233_18409, type=LAST_IN_PIPELINE terminating
2025-07-21 23:32:05,870 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759233_18409 replica FinalizedReplica, blk_1073759233_18409, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759233 for deletion
2025-07-21 23:32:05,871 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759233_18409 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759233
2025-07-21 23:33:02,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759234_18410 src: /192.168.158.7:54402 dest: /192.168.158.4:9866
2025-07-21 23:33:02,248 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54402, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_840918604_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759234_18410, duration(ns): 18832774
2025-07-21 23:33:02,248 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759234_18410, type=LAST_IN_PIPELINE terminating
2025-07-21 23:33:08,873 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759234_18410 replica FinalizedReplica, blk_1073759234_18410, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759234 for deletion
2025-07-21 23:33:08,874 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759234_18410 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759234
2025-07-21 23:34:02,221 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759235_18411 src: /192.168.158.8:37790 dest: /192.168.158.4:9866
2025-07-21 23:34:02,247 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37790, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1640174515_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759235_18411, duration(ns): 19319079
2025-07-21 23:34:02,248 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759235_18411, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-21 23:34:08,873 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759235_18411 replica FinalizedReplica, blk_1073759235_18411, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759235 for deletion
2025-07-21 23:34:08,874 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759235_18411 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759235
2025-07-21 23:36:02,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759237_18413 src: /192.168.158.1:36042 dest: /192.168.158.4:9866
2025-07-21 23:36:02,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36042, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_907014892_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759237_18413, duration(ns): 26099732
2025-07-21 23:36:02,259 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759237_18413, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-21 23:36:08,881 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759237_18413 replica FinalizedReplica, blk_1073759237_18413, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759237 for deletion
2025-07-21 23:36:08,882 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759237_18413 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759237
2025-07-21 23:37:02,229 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759238_18414 src: /192.168.158.8:48892 dest: /192.168.158.4:9866
2025-07-21 23:37:02,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48892, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-242483387_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759238_18414, duration(ns): 18034331
2025-07-21 23:37:02,250 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759238_18414, type=LAST_IN_PIPELINE terminating
2025-07-21 23:37:08,884 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759238_18414 replica FinalizedReplica, blk_1073759238_18414, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759238 for deletion
2025-07-21 23:37:08,886 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759238_18414 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759238
2025-07-21 23:40:07,221 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759241_18417 src: /192.168.158.1:47202 dest: /192.168.158.4:9866
2025-07-21 23:40:07,255 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47202, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1565247543_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759241_18417, duration(ns): 23807657
2025-07-21 23:40:07,255 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759241_18417, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-21 23:40:08,897 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759241_18417 replica FinalizedReplica, blk_1073759241_18417, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759241 for deletion
2025-07-21 23:40:08,898 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759241_18417 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759241
2025-07-21 23:41:12,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759242_18418 src: /192.168.158.8:40554 dest: /192.168.158.4:9866
2025-07-21 23:41:12,253 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40554, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1592974660_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759242_18418, duration(ns): 19392337
2025-07-21 23:41:12,253 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759242_18418, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-21 23:41:14,902 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759242_18418 replica FinalizedReplica, blk_1073759242_18418, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759242 for deletion
2025-07-21 23:41:14,904 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759242_18418 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759242
2025-07-21 23:42:12,225 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759243_18419 src: /192.168.158.1:51958 dest: /192.168.158.4:9866
2025-07-21 23:42:12,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51958, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1165782341_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759243_18419, duration(ns): 24590148
2025-07-21 23:42:12,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759243_18419, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-21 23:42:17,902 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759243_18419 replica FinalizedReplica, blk_1073759243_18419, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759243 for deletion
2025-07-21 23:42:17,903 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759243_18419 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759243
2025-07-21 23:46:17,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759247_18423 src: /192.168.158.1:50404 dest: /192.168.158.4:9866
2025-07-21 23:46:17,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50404, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1742985814_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759247_18423, duration(ns): 22580675
2025-07-21 23:46:17,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759247_18423, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-21 23:46:20,911 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759247_18423 replica FinalizedReplica, blk_1073759247_18423, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759247 for deletion
2025-07-21 23:46:20,912 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759247_18423 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759247
2025-07-21 23:48:17,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759249_18425 src: /192.168.158.1:36612 dest: /192.168.158.4:9866
2025-07-21 23:48:17,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36612, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1724408692_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759249_18425, duration(ns): 21908608
2025-07-21 23:48:17,265 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759249_18425, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-21 23:48:23,914 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759249_18425 replica FinalizedReplica, blk_1073759249_18425, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759249 for deletion
2025-07-21 23:48:23,915 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759249_18425 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759249
2025-07-21 23:49:22,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759250_18426 src: /192.168.158.7:53216 dest: /192.168.158.4:9866
2025-07-21 23:49:22,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53216, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_893669077_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759250_18426, duration(ns): 21271321
2025-07-21 23:49:22,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759250_18426, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-21 23:49:26,918 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759250_18426 replica FinalizedReplica, blk_1073759250_18426, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759250 for deletion
2025-07-21 23:49:26,919 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759250_18426 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759250
2025-07-21 23:50:22,241 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759251_18427 src: /192.168.158.1:49292 dest: /192.168.158.4:9866
2025-07-21 23:50:22,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49292, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-868381018_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759251_18427, duration(ns): 21287038
2025-07-21 23:50:22,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759251_18427, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-21 23:50:23,920 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759251_18427 replica FinalizedReplica, blk_1073759251_18427, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759251 for deletion
2025-07-21 23:50:23,922 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759251_18427 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759251
2025-07-21 23:53:22,244 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759254_18430 src: /192.168.158.1:49520 dest: /192.168.158.4:9866
2025-07-21 23:53:22,275 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49520, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_11834294_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759254_18430, duration(ns): 21371093
2025-07-21 23:53:22,276 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759254_18430, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-21 23:53:23,927 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759254_18430 replica FinalizedReplica, blk_1073759254_18430, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759254 for deletion
2025-07-21 23:53:23,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759254_18430 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759254
2025-07-21 23:54:22,248 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759255_18431 src: /192.168.158.6:58162 dest: /192.168.158.4:9866
2025-07-21 23:54:22,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58162, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1561525975_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759255_18431, duration(ns): 16150411
2025-07-21 23:54:22,267 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759255_18431, type=LAST_IN_PIPELINE terminating
2025-07-21 23:54:23,928 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759255_18431 replica FinalizedReplica, blk_1073759255_18431, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759255 for deletion
2025-07-21 23:54:23,930 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759255_18431 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759255
2025-07-21 23:55:22,258 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759256_18432 src: /192.168.158.6:34104 dest: /192.168.158.4:9866
2025-07-21 23:55:22,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34104, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_469236297_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759256_18432, duration(ns): 17026228
2025-07-21 23:55:22,278 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759256_18432, type=LAST_IN_PIPELINE terminating
2025-07-21 23:55:23,929 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759256_18432 replica FinalizedReplica, blk_1073759256_18432, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759256 for deletion
2025-07-21 23:55:23,930 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759256_18432 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759256
2025-07-21 23:58:27,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759259_18435 src: /192.168.158.5:51744 dest: /192.168.158.4:9866
2025-07-21 23:58:27,275 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51744, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1475750450_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759259_18435, duration(ns): 17176875
2025-07-21 23:58:27,276 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759259_18435, type=LAST_IN_PIPELINE terminating
2025-07-21 23:58:32,939 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759259_18435 replica FinalizedReplica, blk_1073759259_18435, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759259 for deletion
2025-07-21 23:58:32,940 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759259_18435 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759259
2025-07-22 00:00:27,255 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759261_18437 src: /192.168.158.8:33978 dest: /192.168.158.4:9866
2025-07-22 00:00:27,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33978, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_553283128_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759261_18437, duration(ns): 19094178
2025-07-22 00:00:27,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759261_18437, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-22 00:00:32,941 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759261_18437 replica FinalizedReplica, blk_1073759261_18437, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759261 for deletion
2025-07-22 00:00:32,943 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759261_18437 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759261
2025-07-22 00:01:27,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759262_18438 src: /192.168.158.1:41194 dest: /192.168.158.4:9866
2025-07-22 00:01:27,288 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41194, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1286029114_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759262_18438, duration(ns): 21293099
2025-07-22 00:01:27,288 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759262_18438, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-22 00:01:29,944 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759262_18438 replica FinalizedReplica, blk_1073759262_18438, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759262 for deletion
2025-07-22 00:01:29,945 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759262_18438 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759262
2025-07-22 00:03:32,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759264_18440 src: /192.168.158.7:50618 dest: /192.168.158.4:9866
2025-07-22 00:03:32,300 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50618, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_794150720_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759264_18440, duration(ns): 20239015
2025-07-22 00:03:32,300 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759264_18440, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 00:03:35,948 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759264_18440 replica FinalizedReplica, blk_1073759264_18440, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759264 for deletion
2025-07-22 00:03:35,949 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759264_18440 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759264
2025-07-22 00:06:37,265 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759267_18443 src: /192.168.158.1:35230 dest: /192.168.158.4:9866
2025-07-22 00:06:37,296 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35230, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-415024292_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759267_18443, duration(ns): 21375607
2025-07-22 00:06:37,298 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759267_18443, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-22 00:06:38,956 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759267_18443 replica FinalizedReplica, blk_1073759267_18443, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759267 for deletion
2025-07-22 00:06:38,958 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759267_18443 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759267
2025-07-22 00:08:42,272 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759269_18445 src: /192.168.158.1:52756 dest: /192.168.158.4:9866
2025-07-22 00:08:42,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52756, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1577317570_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759269_18445, duration(ns): 22781137
2025-07-22 00:08:42,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759269_18445, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-22 00:08:47,963 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759269_18445 replica FinalizedReplica, blk_1073759269_18445, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759269 for deletion
2025-07-22 00:08:47,964 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759269_18445 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759269
2025-07-22 00:09:42,282 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759270_18446 src: /192.168.158.5:46352 dest: /192.168.158.4:9866
2025-07-22 00:09:42,304 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46352, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-633974016_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759270_18446, duration(ns): 20582461
2025-07-22 00:09:42,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759270_18446, type=LAST_IN_PIPELINE terminating
2025-07-22 00:09:47,963 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759270_18446 replica FinalizedReplica, blk_1073759270_18446, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759270 for deletion
2025-07-22 00:09:47,964 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759270_18446 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759270
2025-07-22 00:11:42,272 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759272_18448 src: /192.168.158.1:53216 dest: /192.168.158.4:9866
2025-07-22 00:11:42,307 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53216, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_58179460_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759272_18448, duration(ns): 24653055
2025-07-22 00:11:42,307 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759272_18448, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-22 00:11:47,967 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759272_18448 replica FinalizedReplica, blk_1073759272_18448, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759272 for deletion
2025-07-22 00:11:47,969 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759272_18448 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759272
2025-07-22 00:12:42,286 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759273_18449 src: /192.168.158.8:46210 dest: /192.168.158.4:9866
2025-07-22 00:12:42,315 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46210, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1886944906_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759273_18449, duration(ns): 22068797
2025-07-22 00:12:42,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759273_18449, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-22 00:12:44,968 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759273_18449 replica FinalizedReplica, blk_1073759273_18449, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759273 for deletion
2025-07-22 00:12:44,969 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759273_18449 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759273
2025-07-22 00:17:42,297 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759278_18454 src: /192.168.158.1:47156 dest: /192.168.158.4:9866
2025-07-22 00:17:42,330 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47156,
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1614300888_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759278_18454, duration(ns): 23050713 2025-07-22 00:17:42,330 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759278_18454, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-22 00:17:44,980 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759278_18454 replica FinalizedReplica, blk_1073759278_18454, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759278 for deletion 2025-07-22 00:17:44,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759278_18454 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759278 2025-07-22 00:18:42,293 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759279_18455 src: /192.168.158.1:46778 dest: /192.168.158.4:9866 2025-07-22 00:18:42,326 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46778, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1711608211_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759279_18455, duration(ns): 22734171 2025-07-22 00:18:42,326 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759279_18455, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-22 00:18:44,981 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759279_18455 replica FinalizedReplica, blk_1073759279_18455, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759279 for deletion 2025-07-22 00:18:44,983 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759279_18455 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759279 2025-07-22 00:19:42,332 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759280_18456 src: /192.168.158.8:57226 dest: /192.168.158.4:9866 2025-07-22 00:19:42,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:57226, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-965600440_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759280_18456, duration(ns): 17155665 2025-07-22 00:19:42,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759280_18456, type=LAST_IN_PIPELINE terminating 2025-07-22 00:19:47,986 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759280_18456 replica FinalizedReplica, blk_1073759280_18456, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759280 for deletion 2025-07-22 00:19:47,988 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759280_18456 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759280 2025-07-22 00:20:42,304 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759281_18457 src: /192.168.158.9:37960 dest: /192.168.158.4:9866 2025-07-22 00:20:42,324 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37960, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_137342033_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759281_18457, duration(ns): 17598797 2025-07-22 00:20:42,324 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759281_18457, type=LAST_IN_PIPELINE terminating 2025-07-22 00:20:44,986 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759281_18457 replica FinalizedReplica, blk_1073759281_18457, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759281 for deletion 2025-07-22 00:20:44,987 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759281_18457 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759281 2025-07-22 00:24:47,292 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759285_18461 src: /192.168.158.5:54282 dest: /192.168.158.4:9866 2025-07-22 00:24:47,319 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54282, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1539962005_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759285_18461, duration(ns): 20732022 2025-07-22 00:24:47,320 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759285_18461, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-22 00:24:50,998 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759285_18461 replica FinalizedReplica, blk_1073759285_18461, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759285 for deletion 2025-07-22 00:24:50,999 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759285_18461 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759285 2025-07-22 00:32:57,314 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759293_18469 src: /192.168.158.8:51150 dest: /192.168.158.4:9866 2025-07-22 00:32:57,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51150, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_720796678_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759293_18469, duration(ns): 17134445 2025-07-22 00:32:57,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759293_18469, type=LAST_IN_PIPELINE terminating 2025-07-22 00:33:00,020 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759293_18469 replica FinalizedReplica, blk_1073759293_18469, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759293 for deletion 2025-07-22 00:33:00,021 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759293_18469 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759293 2025-07-22 00:37:07,319 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759297_18473 src: /192.168.158.6:57464 dest: /192.168.158.4:9866 2025-07-22 00:37:07,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57464, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_219839293_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759297_18473, duration(ns): 19856944 2025-07-22 00:37:07,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759297_18473, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-22 00:37:09,033 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073759297_18473 replica FinalizedReplica, blk_1073759297_18473, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759297 for deletion 2025-07-22 00:37:09,034 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759297_18473 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759297 2025-07-22 00:38:07,355 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759298_18474 src: /192.168.158.8:56276 dest: /192.168.158.4:9866 2025-07-22 00:38:07,374 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56276, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_181474418_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759298_18474, duration(ns): 16729362 2025-07-22 00:38:07,374 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759298_18474, type=LAST_IN_PIPELINE terminating 2025-07-22 00:38:09,036 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759298_18474 replica FinalizedReplica, blk_1073759298_18474, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759298 for deletion 2025-07-22 00:38:09,037 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073759298_18474 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759298 2025-07-22 00:39:07,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759299_18475 src: /192.168.158.8:53152 dest: /192.168.158.4:9866 2025-07-22 00:39:07,343 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53152, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1983850270_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759299_18475, duration(ns): 20427256 2025-07-22 00:39:07,343 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759299_18475, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-22 00:39:12,039 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759299_18475 replica FinalizedReplica, blk_1073759299_18475, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759299 for deletion 2025-07-22 00:39:12,041 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759299_18475 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759299 2025-07-22 00:40:07,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759300_18476 src: /192.168.158.8:45460 dest: /192.168.158.4:9866 2025-07-22 00:40:07,336 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:45460, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1716806157_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759300_18476, duration(ns): 17606330 2025-07-22 00:40:07,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759300_18476, type=LAST_IN_PIPELINE terminating 2025-07-22 00:40:12,041 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759300_18476 replica FinalizedReplica, blk_1073759300_18476, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759300 for deletion 2025-07-22 00:40:12,044 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759300_18476 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759300 2025-07-22 00:47:12,328 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759307_18483 src: /192.168.158.1:41420 dest: /192.168.158.4:9866 2025-07-22 00:47:12,360 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41420, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1525626914_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759307_18483, duration(ns): 22150383 2025-07-22 00:47:12,361 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073759307_18483, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-22 00:47:15,066 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759307_18483 replica FinalizedReplica, blk_1073759307_18483, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759307 for deletion 2025-07-22 00:47:15,067 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759307_18483 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759307 2025-07-22 00:50:17,334 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759310_18486 src: /192.168.158.5:37812 dest: /192.168.158.4:9866 2025-07-22 00:50:17,361 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37812, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1760098325_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759310_18486, duration(ns): 20783955 2025-07-22 00:50:17,361 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759310_18486, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-22 00:50:21,076 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759310_18486 replica FinalizedReplica, blk_1073759310_18486, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() 
= /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759310 for deletion 2025-07-22 00:50:21,077 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759310_18486 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759310 2025-07-22 00:51:17,340 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759311_18487 src: /192.168.158.8:48252 dest: /192.168.158.4:9866 2025-07-22 00:51:17,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48252, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1120312289_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759311_18487, duration(ns): 21735085 2025-07-22 00:51:17,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759311_18487, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-22 00:51:21,078 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759311_18487 replica FinalizedReplica, blk_1073759311_18487, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759311 for deletion 2025-07-22 00:51:21,079 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759311_18487 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759311 2025-07-22 00:52:17,335 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759312_18488 src: /192.168.158.1:57568 dest: /192.168.158.4:9866 2025-07-22 00:52:17,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57568, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1802476426_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759312_18488, duration(ns): 23824800 2025-07-22 00:52:17,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759312_18488, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-22 00:52:21,079 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759312_18488 replica FinalizedReplica, blk_1073759312_18488, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759312 for deletion 2025-07-22 00:52:21,080 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759312_18488 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759312 2025-07-22 00:54:17,337 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759314_18490 src: /192.168.158.6:41296 dest: /192.168.158.4:9866 2025-07-22 00:54:17,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.6:41296, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-494614640_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759314_18490, duration(ns): 21563211 2025-07-22 00:54:17,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759314_18490, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-22 00:54:21,084 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759314_18490 replica FinalizedReplica, blk_1073759314_18490, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759314 for deletion 2025-07-22 00:54:21,085 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759314_18490 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759314 2025-07-22 00:57:17,343 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759317_18493 src: /192.168.158.7:32838 dest: /192.168.158.4:9866 2025-07-22 00:57:17,362 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:32838, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1171836334_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759317_18493, duration(ns): 16053460 2025-07-22 00:57:17,362 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759317_18493, 
type=LAST_IN_PIPELINE terminating 2025-07-22 00:57:21,091 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759317_18493 replica FinalizedReplica, blk_1073759317_18493, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759317 for deletion 2025-07-22 00:57:21,093 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759317_18493 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759317 2025-07-22 00:58:17,345 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759318_18494 src: /192.168.158.8:40772 dest: /192.168.158.4:9866 2025-07-22 00:58:17,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:40772, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1508936618_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759318_18494, duration(ns): 20838500 2025-07-22 00:58:17,373 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759318_18494, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-22 00:58:24,095 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759318_18494 replica FinalizedReplica, blk_1073759318_18494, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759318 for deletion 2025-07-22 00:58:24,096 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759318_18494 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759318 2025-07-22 00:59:17,344 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759319_18495 src: /192.168.158.1:48484 dest: /192.168.158.4:9866 2025-07-22 00:59:17,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48484, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1129460271_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759319_18495, duration(ns): 25895716 2025-07-22 00:59:17,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759319_18495, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-22 00:59:21,096 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759319_18495 replica FinalizedReplica, blk_1073759319_18495, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759319 for deletion 2025-07-22 00:59:21,098 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759319_18495 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759319
2025-07-22 01:00:17,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759320_18496 src: /192.168.158.6:51004 dest: /192.168.158.4:9866
2025-07-22 01:00:17,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51004, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_883766139_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759320_18496, duration(ns): 17016968
2025-07-22 01:00:17,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759320_18496, type=LAST_IN_PIPELINE terminating
2025-07-22 01:00:24,100 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759320_18496 replica FinalizedReplica, blk_1073759320_18496, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d3/dfs/dn
  getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759320 for deletion
2025-07-22 01:00:24,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759320_18496 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759320
2025-07-22 01:01:17,353 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759321_18497 src: /192.168.158.6:42648 dest: /192.168.158.4:9866
2025-07-22 01:01:17,374 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42648, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-791475036_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759321_18497, duration(ns): 17727176
2025-07-22 01:01:17,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759321_18497, type=LAST_IN_PIPELINE terminating
2025-07-22 01:01:21,100 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759321_18497 replica FinalizedReplica, blk_1073759321_18497, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d4/dfs/dn
  getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759321 for deletion
2025-07-22 01:01:21,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759321_18497 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759321
2025-07-22 01:02:22,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759322_18498 src: /192.168.158.6:53838 dest: /192.168.158.4:9866
2025-07-22 01:02:22,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53838, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-432714192_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759322_18498, duration(ns): 16477305
2025-07-22 01:02:22,395 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759322_18498, type=LAST_IN_PIPELINE terminating
2025-07-22 01:02:24,104 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759322_18498 replica FinalizedReplica, blk_1073759322_18498, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d1/dfs/dn
  getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759322 for deletion
2025-07-22 01:02:24,105 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759322_18498 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759322
2025-07-22 01:03:22,348 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759323_18499 src: /192.168.158.6:39298 dest: /192.168.158.4:9866
2025-07-22 01:03:22,374 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39298, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_592782823_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759323_18499, duration(ns): 20051284
2025-07-22 01:03:22,374 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759323_18499, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-22 01:03:27,106 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759323_18499 replica FinalizedReplica, blk_1073759323_18499, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d2/dfs/dn
  getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759323 for deletion
2025-07-22 01:03:27,107 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759323_18499 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759323
2025-07-22 01:04:22,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759324_18500 src: /192.168.158.1:33200 dest: /192.168.158.4:9866
2025-07-22 01:04:22,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33200, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1456733879_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759324_18500, duration(ns): 22686703
2025-07-22 01:04:22,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759324_18500, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-22 01:04:24,108 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759324_18500 replica FinalizedReplica, blk_1073759324_18500, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d3/dfs/dn
  getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759324 for deletion
2025-07-22 01:04:24,109 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759324_18500 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759324
2025-07-22 01:05:22,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759325_18501 src: /192.168.158.1:48024 dest: /192.168.158.4:9866
2025-07-22 01:05:22,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48024, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-687589430_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759325_18501, duration(ns): 24397377
2025-07-22 01:05:22,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759325_18501, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-22 01:05:24,113 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759325_18501 replica FinalizedReplica, blk_1073759325_18501, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d4/dfs/dn
  getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759325 for deletion
2025-07-22 01:05:24,114 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759325_18501 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759325
2025-07-22 01:06:22,361 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759326_18502 src: /192.168.158.1:43626 dest: /192.168.158.4:9866
2025-07-22 01:06:22,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43626, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2004904702_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759326_18502, duration(ns): 22497211
2025-07-22 01:06:22,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759326_18502, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-22 01:06:27,117 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759326_18502 replica FinalizedReplica, blk_1073759326_18502, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d1/dfs/dn
  getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759326 for deletion
2025-07-22 01:06:27,119 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759326_18502 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759326
2025-07-22 01:08:22,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759328_18504 src: /192.168.158.6:36344 dest: /192.168.158.4:9866
2025-07-22 01:08:22,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36344, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-880469867_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759328_18504, duration(ns): 19064527
2025-07-22 01:08:22,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759328_18504, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-22 01:08:24,124 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759328_18504 replica FinalizedReplica, blk_1073759328_18504, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d2/dfs/dn
  getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759328 for deletion
2025-07-22 01:08:24,125 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759328_18504 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759328
2025-07-22 01:10:22,374 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759330_18506 src: /192.168.158.6:60910 dest: /192.168.158.4:9866
2025-07-22 01:10:22,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60910, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1677635210_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759330_18506, duration(ns): 20949474
2025-07-22 01:10:22,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759330_18506, type=LAST_IN_PIPELINE terminating
2025-07-22 01:10:24,128 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759330_18506 replica FinalizedReplica, blk_1073759330_18506, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d3/dfs/dn
  getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759330 for deletion
2025-07-22 01:10:24,129 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759330_18506 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759330
2025-07-22 01:13:22,373 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759333_18509 src: /192.168.158.8:46676 dest: /192.168.158.4:9866
2025-07-22 01:13:22,391 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:46676, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1522650330_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759333_18509, duration(ns): 16392978
2025-07-22 01:13:22,392 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759333_18509, type=LAST_IN_PIPELINE terminating
2025-07-22 01:13:24,138 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759333_18509 replica FinalizedReplica, blk_1073759333_18509, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d4/dfs/dn
  getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759333 for deletion
2025-07-22 01:13:24,139 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759333_18509 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759333
2025-07-22 01:14:22,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759334_18510 src: /192.168.158.5:59360 dest: /192.168.158.4:9866
2025-07-22 01:14:22,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59360, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1576627933_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759334_18510, duration(ns): 18838800
2025-07-22 01:14:22,400 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759334_18510, type=LAST_IN_PIPELINE terminating
2025-07-22 01:14:24,141 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759334_18510 replica FinalizedReplica, blk_1073759334_18510, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d1/dfs/dn
  getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759334 for deletion
2025-07-22 01:14:24,142 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759334_18510 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759334
2025-07-22 01:15:22,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759335_18511 src: /192.168.158.9:53272 dest: /192.168.158.4:9866
2025-07-22 01:15:22,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53272, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1040900995_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759335_18511, duration(ns): 17053754
2025-07-22 01:15:22,398 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759335_18511, type=LAST_IN_PIPELINE terminating
2025-07-22 01:15:24,144 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759335_18511 replica FinalizedReplica, blk_1073759335_18511, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d2/dfs/dn
  getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759335 for deletion
2025-07-22 01:15:24,145 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759335_18511 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759335
2025-07-22 01:16:22,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759336_18512 src: /192.168.158.7:42376 dest: /192.168.158.4:9866
2025-07-22 01:16:22,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42376, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1322588669_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759336_18512, duration(ns): 21020769
2025-07-22 01:16:22,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759336_18512, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 01:16:27,144 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759336_18512 replica FinalizedReplica, blk_1073759336_18512, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d3/dfs/dn
  getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759336 for deletion
2025-07-22 01:16:27,146 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759336_18512 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759336
2025-07-22 01:18:22,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759338_18514 src: /192.168.158.8:50830 dest: /192.168.158.4:9866
2025-07-22 01:18:22,430 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50830, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_451711569_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759338_18514, duration(ns): 16283994
2025-07-22 01:18:22,430 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759338_18514, type=LAST_IN_PIPELINE terminating
2025-07-22 01:18:27,149 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759338_18514 replica FinalizedReplica, blk_1073759338_18514, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d4/dfs/dn
  getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759338 for deletion
2025-07-22 01:18:27,150 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759338_18514 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759338
2025-07-22 01:22:32,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759342_18518 src: /192.168.158.7:43504 dest: /192.168.158.4:9866
2025-07-22 01:22:32,408 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43504, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1021814240_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759342_18518, duration(ns): 19243486
2025-07-22 01:22:32,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759342_18518, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-22 01:22:36,161 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759342_18518 replica FinalizedReplica, blk_1073759342_18518, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d1/dfs/dn
  getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759342 for deletion
2025-07-22 01:22:36,162 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759342_18518 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759342
2025-07-22 01:24:32,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759344_18520 src: /192.168.158.1:47206 dest: /192.168.158.4:9866
2025-07-22 01:24:32,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47206, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2133100420_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759344_18520, duration(ns): 21872733
2025-07-22 01:24:32,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759344_18520, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-22 01:24:39,168 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759344_18520 replica FinalizedReplica, blk_1073759344_18520, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d2/dfs/dn
  getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759344 for deletion
2025-07-22 01:24:39,169 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759344_18520 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759344
2025-07-22 01:28:37,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759348_18524 src: /192.168.158.6:45742 dest: /192.168.158.4:9866
2025-07-22 01:28:37,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:45742, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2067119424_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759348_18524, duration(ns): 20204373
2025-07-22 01:28:37,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759348_18524, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-22 01:28:39,183 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759348_18524 replica FinalizedReplica, blk_1073759348_18524, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d3/dfs/dn
  getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759348 for deletion
2025-07-22 01:28:39,184 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759348_18524 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759348
2025-07-22 01:30:42,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759350_18526 src: /192.168.158.1:52270 dest: /192.168.158.4:9866
2025-07-22 01:30:42,426 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52270, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1196844501_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759350_18526, duration(ns): 25832051
2025-07-22 01:30:42,426 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759350_18526, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-22 01:30:45,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759350_18526 replica FinalizedReplica, blk_1073759350_18526, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d4/dfs/dn
  getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759350 for deletion
2025-07-22 01:30:45,187 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759350_18526 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759350
2025-07-22 01:33:47,386 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759353_18529 src: /192.168.158.1:38482 dest: /192.168.158.4:9866
2025-07-22 01:33:47,420 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38482, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_398223787_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759353_18529, duration(ns): 24202350
2025-07-22 01:33:47,421 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759353_18529, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-22 01:33:51,196 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759353_18529 replica FinalizedReplica, blk_1073759353_18529, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d1/dfs/dn
  getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759353 for deletion
2025-07-22 01:33:51,197 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759353_18529 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759353
2025-07-22 01:35:47,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759355_18531 src: /192.168.158.1:58124 dest: /192.168.158.4:9866
2025-07-22 01:35:47,419 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58124, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-279667212_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759355_18531, duration(ns): 23736177
2025-07-22 01:35:47,419 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759355_18531, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-22 01:35:54,204 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759355_18531 replica FinalizedReplica, blk_1073759355_18531, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d2/dfs/dn
  getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759355 for deletion
2025-07-22 01:35:54,206 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759355_18531 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759355
2025-07-22 01:36:47,402 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759356_18532 src: /192.168.158.5:52282 dest: /192.168.158.4:9866
2025-07-22 01:36:47,420 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52282, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_394315646_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759356_18532, duration(ns): 16139408
2025-07-22 01:36:47,421 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759356_18532, type=LAST_IN_PIPELINE terminating
2025-07-22 01:36:51,206 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759356_18532 replica FinalizedReplica, blk_1073759356_18532, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d3/dfs/dn
  getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759356 for deletion
2025-07-22 01:36:51,208 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759356_18532 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759356
2025-07-22 01:41:47,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759361_18537 src: /192.168.158.6:47600 dest: /192.168.158.4:9866
2025-07-22 01:41:47,431 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47600, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-303585792_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759361_18537, duration(ns): 22636404
2025-07-22 01:41:47,431 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759361_18537, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-22 01:41:51,218 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759361_18537 replica FinalizedReplica, blk_1073759361_18537, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d4/dfs/dn
  getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759361 for deletion
2025-07-22 01:41:51,219 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759361_18537 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759361
2025-07-22 01:42:47,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759362_18538 src: /192.168.158.8:32898 dest: /192.168.158.4:9866
2025-07-22 01:42:47,430 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:32898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1552409172_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759362_18538, duration(ns): 20831684
2025-07-22 01:42:47,430 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759362_18538, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-22 01:42:48,218 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759362_18538 replica FinalizedReplica, blk_1073759362_18538, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d1/dfs/dn
  getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759362 for deletion
2025-07-22 01:42:48,219 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759362_18538 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759362
2025-07-22 01:43:47,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759363_18539 src: /192.168.158.7:42640 dest: /192.168.158.4:9866
2025-07-22 01:43:47,433 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42640, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-296501875_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759363_18539, duration(ns): 21225309
2025-07-22 01:43:47,433 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759363_18539, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-22 01:43:48,221 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759363_18539 replica FinalizedReplica, blk_1073759363_18539, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d2/dfs/dn
  getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759363 for deletion
2025-07-22 01:43:48,222 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759363_18539 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759363
2025-07-22 01:44:47,406 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759364_18540 src: /192.168.158.9:49418 dest: /192.168.158.4:9866
2025-07-22 01:44:47,431 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49418, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1116503696_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759364_18540, duration(ns): 19134178
2025-07-22 01:44:47,432 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759364_18540, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-22 01:44:48,225 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759364_18540 replica FinalizedReplica, blk_1073759364_18540, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d3/dfs/dn
  getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759364 for deletion
2025-07-22 01:44:48,226 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759364_18540 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759364
2025-07-22 01:45:47,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759365_18541 src: /192.168.158.1:59492 dest: /192.168.158.4:9866
2025-07-22 01:45:47,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59492, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-795151051_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759365_18541, duration(ns): 24317629
2025-07-22 01:45:47,440 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759365_18541, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-22 01:45:48,229 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759365_18541 replica FinalizedReplica, blk_1073759365_18541, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d4/dfs/dn
  getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759365 for deletion
2025-07-22 01:45:48,230 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759365_18541 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759365
2025-07-22 01:49:57,412 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759369_18545 src: /192.168.158.8:39772 dest: /192.168.158.4:9866
2025-07-22 01:49:57,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39772, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_232106164_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759369_18545, duration(ns): 23570854
2025-07-22 01:49:57,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759369_18545, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-22 01:50:00,243 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759369_18545 replica FinalizedReplica, blk_1073759369_18545, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = 
/hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759369 for deletion 2025-07-22 01:50:00,244 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759369_18545 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759369 2025-07-22 01:51:57,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759371_18547 src: /192.168.158.8:41844 dest: /192.168.158.4:9866 2025-07-22 01:51:57,436 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:41844, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1874501602_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759371_18547, duration(ns): 18808840 2025-07-22 01:51:57,437 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759371_18547, type=LAST_IN_PIPELINE terminating 2025-07-22 01:52:03,247 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759371_18547 replica FinalizedReplica, blk_1073759371_18547, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759371 for deletion 2025-07-22 01:52:03,249 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759371_18547 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759371 2025-07-22 
01:56:12,419 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759375_18551 src: /192.168.158.1:37302 dest: /192.168.158.4:9866 2025-07-22 01:56:12,454 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37302, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-78786883_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759375_18551, duration(ns): 24552604 2025-07-22 01:56:12,454 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759375_18551, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-22 01:56:15,261 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759375_18551 replica FinalizedReplica, blk_1073759375_18551, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759375 for deletion 2025-07-22 01:56:15,263 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759375_18551 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759375 2025-07-22 01:59:36,287 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0 2025-07-22 02:00:17,436 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759379_18555 src: /192.168.158.5:38734 
dest: /192.168.158.4:9866 2025-07-22 02:00:17,455 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38734, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-88107984_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759379_18555, duration(ns): 16389297 2025-07-22 02:00:17,455 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759379_18555, type=LAST_IN_PIPELINE terminating 2025-07-22 02:00:18,269 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759379_18555 replica FinalizedReplica, blk_1073759379_18555, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759379 for deletion 2025-07-22 02:00:18,270 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759379_18555 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759379 2025-07-22 02:01:17,428 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759380_18556 src: /192.168.158.9:39168 dest: /192.168.158.4:9866 2025-07-22 02:01:17,454 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:39168, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1262548018_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759380_18556, duration(ns): 19845772 2025-07-22 02:01:17,455 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759380_18556, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-22 02:01:21,270 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759380_18556 replica FinalizedReplica, blk_1073759380_18556, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759380 for deletion 2025-07-22 02:01:21,272 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759380_18556 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759380 2025-07-22 02:02:22,437 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759381_18557 src: /192.168.158.7:46440 dest: /192.168.158.4:9866 2025-07-22 02:02:22,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46440, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_381325104_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759381_18557, duration(ns): 20659125 2025-07-22 02:02:22,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759381_18557, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-22 02:02:27,271 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759381_18557 replica FinalizedReplica, blk_1073759381_18557, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759381 for deletion 2025-07-22 02:02:27,272 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759381_18557 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759381 2025-07-22 02:03:22,425 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759382_18558 src: /192.168.158.1:57704 dest: /192.168.158.4:9866 2025-07-22 02:03:22,458 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57704, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1087092611_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759382_18558, duration(ns): 22625394 2025-07-22 02:03:22,458 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759382_18558, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-22 02:03:24,277 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759382_18558 replica FinalizedReplica, blk_1073759382_18558, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759382 for deletion 2025-07-22 02:03:24,278 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759382_18558 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759382 2025-07-22 02:04:22,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759383_18559 src: /192.168.158.5:57482 dest: /192.168.158.4:9866 2025-07-22 02:04:22,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57482, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_921558729_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759383_18559, duration(ns): 19780323 2025-07-22 02:04:22,465 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759383_18559, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-22 02:04:24,280 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759383_18559 replica FinalizedReplica, blk_1073759383_18559, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759383 for deletion 2025-07-22 02:04:24,281 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759383_18559 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759383 2025-07-22 02:08:22,432 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759387_18563 src: /192.168.158.1:59466 dest: /192.168.158.4:9866 2025-07-22 02:08:22,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59466, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1249822587_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759387_18563, duration(ns): 24726176 2025-07-22 02:08:22,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759387_18563, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-22 02:08:27,282 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759387_18563 replica FinalizedReplica, blk_1073759387_18563, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759387 for deletion 2025-07-22 02:08:27,283 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759387_18563 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759387 2025-07-22 02:10:22,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759389_18565 src: /192.168.158.9:37366 dest: /192.168.158.4:9866 2025-07-22 02:10:22,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37366, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-539692285_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759389_18565, duration(ns): 18567513 2025-07-22 02:10:22,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759389_18565, 
type=LAST_IN_PIPELINE terminating 2025-07-22 02:10:24,287 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759389_18565 replica FinalizedReplica, blk_1073759389_18565, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759389 for deletion 2025-07-22 02:10:24,288 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759389_18565 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759389 2025-07-22 02:11:22,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759390_18566 src: /192.168.158.1:53130 dest: /192.168.158.4:9866 2025-07-22 02:11:22,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53130, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_766676192_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759390_18566, duration(ns): 23878081 2025-07-22 02:11:22,477 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759390_18566, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-22 02:11:27,288 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759390_18566 replica FinalizedReplica, blk_1073759390_18566, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759390 for deletion 2025-07-22 02:11:27,290 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759390_18566 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759390 2025-07-22 02:17:22,457 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759396_18572 src: /192.168.158.7:39700 dest: /192.168.158.4:9866 2025-07-22 02:17:22,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39700, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_743822007_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759396_18572, duration(ns): 16837547 2025-07-22 02:17:22,477 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759396_18572, type=LAST_IN_PIPELINE terminating 2025-07-22 02:17:24,301 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759396_18572 replica FinalizedReplica, blk_1073759396_18572, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759396 for deletion 2025-07-22 02:17:24,302 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759396_18572 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759396 2025-07-22 02:20:27,463 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759399_18575 src: /192.168.158.9:43240 dest: /192.168.158.4:9866 2025-07-22 02:20:27,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43240, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2021469430_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759399_18575, duration(ns): 16855371 2025-07-22 02:20:27,483 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759399_18575, type=LAST_IN_PIPELINE terminating 2025-07-22 02:20:30,309 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759399_18575 replica FinalizedReplica, blk_1073759399_18575, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759399 for deletion 2025-07-22 02:20:30,310 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759399_18575 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759399 2025-07-22 02:22:27,479 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759401_18577 src: /192.168.158.6:48282 dest: /192.168.158.4:9866 2025-07-22 02:22:27,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48282, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1413797214_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073759401_18577, duration(ns): 18180528 2025-07-22 02:22:27,500 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759401_18577, type=LAST_IN_PIPELINE terminating 2025-07-22 02:22:30,315 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759401_18577 replica FinalizedReplica, blk_1073759401_18577, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759401 for deletion 2025-07-22 02:22:30,316 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759401_18577 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759401 2025-07-22 02:23:27,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759402_18578 src: /192.168.158.5:33578 dest: /192.168.158.4:9866 2025-07-22 02:23:27,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33578, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1089347853_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759402_18578, duration(ns): 20139238 2025-07-22 02:23:27,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759402_18578, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-22 02:23:30,319 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759402_18578 replica 
FinalizedReplica, blk_1073759402_18578, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759402 for deletion 2025-07-22 02:23:30,320 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759402_18578 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759402 2025-07-22 02:24:27,501 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759403_18579 src: /192.168.158.1:52432 dest: /192.168.158.4:9866 2025-07-22 02:24:27,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52432, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-387460471_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759403_18579, duration(ns): 29396349 2025-07-22 02:24:27,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759403_18579, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-22 02:24:33,323 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759403_18579 replica FinalizedReplica, blk_1073759403_18579, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759403 for deletion 2025-07-22 02:24:33,324 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073759403_18579 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759403 2025-07-22 02:26:27,501 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759405_18581 src: /192.168.158.5:36684 dest: /192.168.158.4:9866 2025-07-22 02:26:27,529 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36684, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-366584145_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759405_18581, duration(ns): 21654123 2025-07-22 02:26:27,529 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759405_18581, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-22 02:26:30,330 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759405_18581 replica FinalizedReplica, blk_1073759405_18581, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759405 for deletion 2025-07-22 02:26:30,331 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759405_18581 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759405 2025-07-22 02:28:32,501 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759407_18583 src: /192.168.158.1:42780 dest: /192.168.158.4:9866 2025-07-22 02:28:32,536 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42780, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2094739284_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759407_18583, duration(ns): 23976030
2025-07-22 02:28:32,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759407_18583, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-22 02:28:36,334 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759407_18583 replica FinalizedReplica, blk_1073759407_18583, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759407 for deletion
2025-07-22 02:28:36,335 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759407_18583 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759407
2025-07-22 02:29:32,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759408_18584 src: /192.168.158.1:34858 dest: /192.168.158.4:9866
2025-07-22 02:29:32,530 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34858, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-635480432_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759408_18584, duration(ns): 24767073
2025-07-22 02:29:32,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759408_18584, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-22 02:29:33,336 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759408_18584 replica FinalizedReplica, blk_1073759408_18584, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759408 for deletion
2025-07-22 02:29:33,337 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759408_18584 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759408
2025-07-22 02:30:37,501 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759409_18585 src: /192.168.158.1:53414 dest: /192.168.158.4:9866
2025-07-22 02:30:37,536 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53414, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1179590798_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759409_18585, duration(ns): 25462582
2025-07-22 02:30:37,536 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759409_18585, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-22 02:30:39,337 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759409_18585 replica FinalizedReplica, blk_1073759409_18585, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759409 for deletion
2025-07-22 02:30:39,338 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759409_18585 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759409
2025-07-22 02:31:37,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759410_18586 src: /192.168.158.7:48200 dest: /192.168.158.4:9866
2025-07-22 02:31:37,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48200, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-582403340_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759410_18586, duration(ns): 23154549
2025-07-22 02:31:37,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759410_18586, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 02:31:42,338 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759410_18586 replica FinalizedReplica, blk_1073759410_18586, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759410 for deletion
2025-07-22 02:31:42,339 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759410_18586 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759410
2025-07-22 02:34:42,540 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759413_18589 src: /192.168.158.6:51732 dest: /192.168.158.4:9866
2025-07-22 02:34:42,560 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51732, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1126980171_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759413_18589, duration(ns): 17758712
2025-07-22 02:34:42,560 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759413_18589, type=LAST_IN_PIPELINE terminating
2025-07-22 02:34:45,351 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759413_18589 replica FinalizedReplica, blk_1073759413_18589, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759413 for deletion
2025-07-22 02:34:45,353 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759413_18589 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759413
2025-07-22 02:35:42,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759414_18590 src: /192.168.158.7:47850 dest: /192.168.158.4:9866
2025-07-22 02:35:42,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47850, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1583338176_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759414_18590, duration(ns): 16777134
2025-07-22 02:35:42,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759414_18590, type=LAST_IN_PIPELINE terminating
2025-07-22 02:35:45,352 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759414_18590 replica FinalizedReplica, blk_1073759414_18590, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759414 for deletion
2025-07-22 02:35:45,353 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759414_18590 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759414
2025-07-22 02:36:47,536 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759415_18591 src: /192.168.158.6:52270 dest: /192.168.158.4:9866
2025-07-22 02:36:47,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52270, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-390271835_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759415_18591, duration(ns): 16608773
2025-07-22 02:36:47,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759415_18591, type=LAST_IN_PIPELINE terminating
2025-07-22 02:36:51,355 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759415_18591 replica FinalizedReplica, blk_1073759415_18591, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759415 for deletion
2025-07-22 02:36:51,357 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759415_18591 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759415
2025-07-22 02:37:47,513 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759416_18592 src: /192.168.158.7:46994 dest: /192.168.158.4:9866
2025-07-22 02:37:47,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:46994, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-747453199_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759416_18592, duration(ns): 17893910
2025-07-22 02:37:47,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759416_18592, type=LAST_IN_PIPELINE terminating
2025-07-22 02:37:48,361 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759416_18592 replica FinalizedReplica, blk_1073759416_18592, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759416 for deletion
2025-07-22 02:37:48,363 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759416_18592 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759416
2025-07-22 02:38:47,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759417_18593 src: /192.168.158.1:56194 dest: /192.168.158.4:9866
2025-07-22 02:38:47,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56194, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-241793240_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759417_18593, duration(ns): 25460090
2025-07-22 02:38:47,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759417_18593, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-22 02:38:51,361 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759417_18593 replica FinalizedReplica, blk_1073759417_18593, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759417 for deletion
2025-07-22 02:38:51,362 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759417_18593 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759417
2025-07-22 02:39:47,546 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759418_18594 src: /192.168.158.7:55286 dest: /192.168.158.4:9866
2025-07-22 02:39:47,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55286, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-862305326_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759418_18594, duration(ns): 19203126
2025-07-22 02:39:47,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759418_18594, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 02:39:48,362 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759418_18594 replica FinalizedReplica, blk_1073759418_18594, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759418 for deletion
2025-07-22 02:39:48,363 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759418_18594 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759418
2025-07-22 02:43:00,376 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x6cb5f0b7da23e676, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 0 msec to generate and 6 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-22 02:43:00,376 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-22 02:44:52,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759423_18599 src: /192.168.158.9:40414 dest: /192.168.158.4:9866
2025-07-22 02:44:52,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40414, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_338717104_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759423_18599, duration(ns): 16837306
2025-07-22 02:44:52,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759423_18599, type=LAST_IN_PIPELINE terminating
2025-07-22 02:44:57,376 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759423_18599 replica FinalizedReplica, blk_1073759423_18599, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759423 for deletion
2025-07-22 02:44:57,377 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759423_18599 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759423
2025-07-22 02:46:52,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759425_18601 src: /192.168.158.8:51116 dest: /192.168.158.4:9866
2025-07-22 02:46:52,552 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51116, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1852430423_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759425_18601, duration(ns): 20492879
2025-07-22 02:46:52,552 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759425_18601, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-22 02:46:54,380 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759425_18601 replica FinalizedReplica, blk_1073759425_18601, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759425 for deletion
2025-07-22 02:46:54,381 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759425_18601 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759425
2025-07-22 02:47:57,526 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759426_18602 src: /192.168.158.8:58124 dest: /192.168.158.4:9866
2025-07-22 02:47:57,547 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58124, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_748473477_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759426_18602, duration(ns): 18711530
2025-07-22 02:47:57,548 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759426_18602, type=LAST_IN_PIPELINE terminating
2025-07-22 02:48:00,384 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759426_18602 replica FinalizedReplica, blk_1073759426_18602, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759426 for deletion
2025-07-22 02:48:00,385 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759426_18602 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759426
2025-07-22 02:48:57,529 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759427_18603 src: /192.168.158.5:33940 dest: /192.168.158.4:9866
2025-07-22 02:48:57,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33940, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1217917698_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759427_18603, duration(ns): 20349259
2025-07-22 02:48:57,555 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759427_18603, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-22 02:49:03,387 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759427_18603 replica FinalizedReplica, blk_1073759427_18603, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759427 for deletion
2025-07-22 02:49:03,388 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759427_18603 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759427
2025-07-22 02:56:02,539 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759434_18610 src: /192.168.158.7:53334 dest: /192.168.158.4:9866
2025-07-22 02:56:02,558 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53334, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_689615160_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759434_18610, duration(ns): 16120574
2025-07-22 02:56:02,558 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759434_18610, type=LAST_IN_PIPELINE terminating
2025-07-22 02:56:06,438 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759434_18610 replica FinalizedReplica, blk_1073759434_18610, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759434 for deletion
2025-07-22 02:56:06,439 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759434_18610 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759434
2025-07-22 02:57:02,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759435_18611 src: /192.168.158.7:58566 dest: /192.168.158.4:9866
2025-07-22 02:57:02,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58566, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-882582932_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759435_18611, duration(ns): 23155969
2025-07-22 02:57:02,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759435_18611, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-22 02:57:03,439 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759435_18611 replica FinalizedReplica, blk_1073759435_18611, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759435 for deletion
2025-07-22 02:57:03,440 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759435_18611 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759435
2025-07-22 03:00:07,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759438_18614 src: /192.168.158.5:50748 dest: /192.168.158.4:9866
2025-07-22 03:00:07,563 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50748, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-318668937_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759438_18614, duration(ns): 18149101
2025-07-22 03:00:07,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759438_18614, type=LAST_IN_PIPELINE terminating
2025-07-22 03:00:09,450 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759438_18614 replica FinalizedReplica, blk_1073759438_18614, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759438 for deletion
2025-07-22 03:00:09,451 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759438_18614 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759438
2025-07-22 03:01:07,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759439_18615 src: /192.168.158.1:37940 dest: /192.168.158.4:9866
2025-07-22 03:01:07,576 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37940, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1378026980_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759439_18615, duration(ns): 23964861
2025-07-22 03:01:07,577 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759439_18615, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-22 03:01:09,455 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759439_18615 replica FinalizedReplica, blk_1073759439_18615, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759439 for deletion
2025-07-22 03:01:09,456 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759439_18615 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759439
2025-07-22 03:02:07,548 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759440_18616 src: /192.168.158.9:45918 dest: /192.168.158.4:9866
2025-07-22 03:02:07,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45918, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1157116519_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759440_18616, duration(ns): 19425402
2025-07-22 03:02:07,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759440_18616, type=LAST_IN_PIPELINE terminating
2025-07-22 03:02:09,459 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759440_18616 replica FinalizedReplica, blk_1073759440_18616, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759440 for deletion
2025-07-22 03:02:09,461 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759440_18616 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759440
2025-07-22 03:05:07,558 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759443_18619 src: /192.168.158.8:51868 dest: /192.168.158.4:9866
2025-07-22 03:05:07,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51868, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_574741346_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759443_18619, duration(ns): 17910729
2025-07-22 03:05:07,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759443_18619, type=LAST_IN_PIPELINE terminating
2025-07-22 03:05:12,465 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759443_18619 replica FinalizedReplica, blk_1073759443_18619, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759443 for deletion
2025-07-22 03:05:12,467 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759443_18619 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759443
2025-07-22 03:06:12,554 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759444_18620 src: /192.168.158.9:36808 dest: /192.168.158.4:9866
2025-07-22 03:06:12,584 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36808, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_363448143_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759444_18620, duration(ns): 23626968
2025-07-22 03:06:12,584 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759444_18620, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-22 03:06:18,473 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759444_18620 replica FinalizedReplica, blk_1073759444_18620, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759444 for deletion
2025-07-22 03:06:18,474 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759444_18620 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759444
2025-07-22 03:10:17,554 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759448_18624 src: /192.168.158.8:33712 dest: /192.168.158.4:9866
2025-07-22 03:10:17,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33712, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1372453567_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759448_18624, duration(ns): 21008944
2025-07-22 03:10:17,584 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759448_18624, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-22 03:10:18,480 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759448_18624 replica FinalizedReplica, blk_1073759448_18624, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759448 for deletion
2025-07-22 03:10:18,481 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759448_18624 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759448
2025-07-22 03:11:17,558 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759449_18625 src: /192.168.158.7:49574 dest: /192.168.158.4:9866
2025-07-22 03:11:17,589 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49574, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1343151438_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759449_18625, duration(ns): 23002708
2025-07-22 03:11:17,591 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759449_18625, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-22 03:11:18,483 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759449_18625 replica FinalizedReplica, blk_1073759449_18625, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759449 for deletion
2025-07-22 03:11:18,485 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759449_18625 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759449
2025-07-22 03:13:17,557 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759451_18627 src: /192.168.158.9:42414 dest: /192.168.158.4:9866
2025-07-22 03:13:17,584 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42414, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_204215936_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759451_18627, duration(ns): 19683109
2025-07-22 03:13:17,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759451_18627, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 03:13:18,487 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759451_18627 replica FinalizedReplica, blk_1073759451_18627, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759451 for deletion
2025-07-22 03:13:18,489 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759451_18627 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759451
2025-07-22 03:14:17,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759452_18628 src: /192.168.158.1:40552 dest: /192.168.158.4:9866
2025-07-22 03:14:17,594 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40552, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1711313871_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759452_18628, duration(ns): 24974536
2025-07-22 03:14:17,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759452_18628, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-22 03:14:18,487 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759452_18628 replica FinalizedReplica, blk_1073759452_18628, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759452 for deletion
2025-07-22 03:14:18,488 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759452_18628 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759452
2025-07-22 03:15:17,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759453_18629 src: /192.168.158.9:55450 dest: /192.168.158.4:9866
2025-07-22 03:15:17,594 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55450, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_731716481_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759453_18629, duration(ns): 20575984
2025-07-22 03:15:17,595 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759453_18629, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 03:15:18,491 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759453_18629 replica FinalizedReplica, blk_1073759453_18629, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759453 for deletion
2025-07-22 03:15:18,492 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759453_18629 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759453
2025-07-22 03:16:17,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759454_18630 src: /192.168.158.5:36888 dest: /192.168.158.4:9866
2025-07-22 03:16:17,588 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36888, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-97557400_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759454_18630, duration(ns): 20009206
2025-07-22 03:16:17,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759454_18630, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-22 03:16:18,492 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759454_18630 replica FinalizedReplica, blk_1073759454_18630, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759454 for deletion
2025-07-22 03:16:18,494 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759454_18630 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759454
2025-07-22 03:17:17,569 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759455_18631 src: /192.168.158.9:52412 dest: /192.168.158.4:9866
2025-07-22 03:17:17,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52412, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1916226468_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759455_18631, duration(ns): 20829736
2025-07-22 03:17:17,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759455_18631, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 03:17:18,493 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759455_18631 replica FinalizedReplica, blk_1073759455_18631, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759455 for deletion
2025-07-22 03:17:18,494 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759455_18631 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759455
2025-07-22 03:18:22,572 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759456_18632 src: /192.168.158.6:46326 dest: /192.168.158.4:9866
2025-07-22 03:18:22,599 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46326, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-963818683_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759456_18632, duration(ns): 20965699
2025-07-22 03:18:22,599 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759456_18632, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 03:18:27,496 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759456_18632 replica FinalizedReplica, blk_1073759456_18632, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759456 for deletion
2025-07-22 03:18:27,497 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759456_18632 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759456
2025-07-22 03:20:22,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759458_18634 src: /192.168.158.1:44010 dest: /192.168.158.4:9866
2025-07-22 03:20:22,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44010, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1176323891_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759458_18634, duration(ns): 25548031
2025-07-22 03:20:22,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759458_18634, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-22 03:20:24,504 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759458_18634 replica FinalizedReplica, blk_1073759458_18634, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759458 for deletion
2025-07-22 03:20:24,505 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759458_18634 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759458
2025-07-22 03:21:22,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759459_18635 src: /192.168.158.1:35864 dest: /192.168.158.4:9866
2025-07-22 03:21:22,599 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35864, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1615862498_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759459_18635, duration(ns): 23267154
2025-07-22 03:21:22,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759459_18635, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-22 03:21:24,507 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759459_18635 replica FinalizedReplica, blk_1073759459_18635, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759459 for deletion
2025-07-22 03:21:24,508 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759459_18635 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759459
2025-07-22 03:23:22,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759461_18637 src: /192.168.158.7:47464 dest: /192.168.158.4:9866
2025-07-22 03:23:22,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:47464, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1720955987_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759461_18637, duration(ns): 18965676
2025-07-22 03:23:22,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759461_18637, type=LAST_IN_PIPELINE terminating
2025-07-22 03:23:27,514 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759461_18637 replica FinalizedReplica, blk_1073759461_18637, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759461 for deletion
2025-07-22 03:23:27,515 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759461_18637 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759461
2025-07-22 03:24:22,574 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759462_18638 src: /192.168.158.5:48234 dest: /192.168.158.4:9866
2025-07-22 03:24:22,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48234, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1311267881_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759462_18638, duration(ns): 19635771
2025-07-22 03:24:22,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759462_18638, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 03:24:27,516 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759462_18638 replica FinalizedReplica, blk_1073759462_18638, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759462 for deletion
2025-07-22 03:24:27,518 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759462_18638 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759462
2025-07-22 03:28:27,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759466_18642 src: /192.168.158.6:59740 dest: /192.168.158.4:9866
2025-07-22 03:28:27,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59740, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_922435772_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759466_18642, duration(ns): 17122422
2025-07-22 03:28:27,602 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759466_18642, type=LAST_IN_PIPELINE terminating
2025-07-22 03:28:30,524 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759466_18642 replica FinalizedReplica, blk_1073759466_18642, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759466 for deletion
2025-07-22 03:28:30,525 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759466_18642 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759466
2025-07-22 03:32:37,591 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759470_18646 src: /192.168.158.9:42018 dest: /192.168.158.4:9866
2025-07-22 03:32:37,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42018, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-218810793_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759470_18646, duration(ns): 19440761
2025-07-22 03:32:37,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759470_18646, type=LAST_IN_PIPELINE terminating
2025-07-22 03:32:39,529 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759470_18646 replica FinalizedReplica, blk_1073759470_18646, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759470 for deletion
2025-07-22 03:32:39,531 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759470_18646 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759470
2025-07-22 03:34:37,596 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759472_18648 src: /192.168.158.5:54850 dest: /192.168.158.4:9866
2025-07-22 03:34:37,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:54850, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-427813725_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759472_18648, duration(ns): 19306316
2025-07-22 03:34:37,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759472_18648, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-22 03:34:39,534 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759472_18648 replica FinalizedReplica, blk_1073759472_18648, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759472 for deletion
2025-07-22 03:34:39,536 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759472_18648 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759472
2025-07-22 03:40:47,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759478_18654 src: /192.168.158.9:38404 dest: /192.168.158.4:9866
2025-07-22 03:40:47,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38404, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_55361760_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759478_18654, duration(ns): 22231245
2025-07-22 03:40:47,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759478_18654, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-22 03:40:54,542 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759478_18654 replica FinalizedReplica, blk_1073759478_18654, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759478 for deletion
2025-07-22 03:40:54,543 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759478_18654 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759478
2025-07-22 03:41:47,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759479_18655 src: /192.168.158.8:54704 dest: /192.168.158.4:9866
2025-07-22 03:41:47,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54704, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1064735751_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759479_18655, duration(ns): 23018222
2025-07-22 03:41:47,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759479_18655, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-22 03:41:54,549 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759479_18655 replica FinalizedReplica, blk_1073759479_18655, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759479 for deletion
2025-07-22 03:41:54,550 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759479_18655 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759479
2025-07-22 03:43:47,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759481_18657 src: /192.168.158.6:40654 dest: /192.168.158.4:9866
2025-07-22 03:43:47,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40654, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1617645417_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759481_18657, duration(ns): 22153383
2025-07-22 03:43:47,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759481_18657, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-22 03:43:51,556 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759481_18657 replica FinalizedReplica, blk_1073759481_18657, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759481 for deletion
2025-07-22 03:43:51,557 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759481_18657 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759481
2025-07-22 03:44:47,617 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759482_18658 src: /192.168.158.9:50416 dest: /192.168.158.4:9866
2025-07-22 03:44:47,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50416, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_626493899_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759482_18658, duration(ns): 17352672
2025-07-22 03:44:47,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759482_18658, type=LAST_IN_PIPELINE terminating
2025-07-22 03:44:51,557 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759482_18658 replica FinalizedReplica, blk_1073759482_18658, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759482 for deletion
2025-07-22 03:44:51,559 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759482_18658 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759482
2025-07-22 03:45:47,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759483_18659 src: /192.168.158.1:39494 dest: /192.168.158.4:9866
2025-07-22 03:45:47,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39494, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1594713502_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759483_18659, duration(ns): 23787984
2025-07-22 03:45:47,640 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759483_18659, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-22 03:45:54,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759483_18659 replica FinalizedReplica, blk_1073759483_18659, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759483 for deletion
2025-07-22 03:45:54,565 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759483_18659 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759483
2025-07-22 03:46:47,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759484_18660 src: /192.168.158.6:58828 dest: /192.168.158.4:9866
2025-07-22 03:46:47,638 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58828, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1112791713_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759484_18660, duration(ns): 19102142
2025-07-22 03:46:47,638 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759484_18660, type=LAST_IN_PIPELINE terminating
2025-07-22 03:46:54,564 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759484_18660 replica FinalizedReplica, blk_1073759484_18660, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759484 for deletion
2025-07-22 03:46:54,565 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759484_18660 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759484
2025-07-22 03:48:57,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759486_18662 src: /192.168.158.1:35520 dest: /192.168.158.4:9866
2025-07-22 03:48:57,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35520, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1857704200_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759486_18662, duration(ns): 23692808
2025-07-22 03:48:57,640 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759486_18662, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-22 03:49:03,568 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759486_18662 replica FinalizedReplica, blk_1073759486_18662, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759486 for deletion
2025-07-22 03:49:03,569 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759486_18662 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir4/blk_1073759486
2025-07-22 03:51:57,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759489_18665 src: /192.168.158.5:55396 dest: /192.168.158.4:9866
2025-07-22 03:51:57,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55396, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_427374614_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759489_18665, duration(ns): 21258321
2025-07-22 03:51:57,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759489_18665, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-22 03:52:03,573 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759489_18665 replica FinalizedReplica, blk_1073759489_18665, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759489 for deletion
2025-07-22 03:52:03,574 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759489_18665 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759489
2025-07-22 03:52:57,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759490_18666 src: /192.168.158.9:53484 dest: /192.168.158.4:9866
2025-07-22 03:52:57,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53484, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1365773312_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759490_18666, duration(ns): 17752410
2025-07-22 03:52:57,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759490_18666, type=LAST_IN_PIPELINE terminating
2025-07-22 03:53:03,576 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759490_18666 replica FinalizedReplica, blk_1073759490_18666, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759490 for deletion
2025-07-22 03:53:03,577 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759490_18666 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759490
2025-07-22 03:54:02,619 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759491_18667 src: /192.168.158.5:36258 dest: /192.168.158.4:9866
2025-07-22 03:54:02,640 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36258, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-386947078_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759491_18667, duration(ns): 17934874
2025-07-22 03:54:02,640 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759491_18667, type=LAST_IN_PIPELINE terminating
2025-07-22 03:54:06,578 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759491_18667 replica FinalizedReplica, blk_1073759491_18667, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759491 for deletion
2025-07-22 03:54:06,580 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759491_18667 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759491
2025-07-22 03:55:02,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759492_18668 src: /192.168.158.6:48314 dest: /192.168.158.4:9866
2025-07-22 03:55:02,646 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48314, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-559415209_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759492_18668, duration(ns): 17321554
2025-07-22 03:55:02,646 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759492_18668, type=LAST_IN_PIPELINE terminating
2025-07-22 03:55:09,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759492_18668 replica FinalizedReplica, blk_1073759492_18668, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759492 for deletion
2025-07-22 03:55:09,584 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759492_18668 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759492
2025-07-22 03:56:02,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759493_18669 src: /192.168.158.7:43364 dest: /192.168.158.4:9866
2025-07-22 03:56:02,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43364, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1196823302_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759493_18669, duration(ns): 20071697
2025-07-22 03:56:02,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759493_18669, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-22 03:56:06,585 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759493_18669 replica FinalizedReplica, blk_1073759493_18669, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759493 for deletion
2025-07-22 03:56:06,587 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759493_18669 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759493
2025-07-22 03:57:07,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759494_18670 src: /192.168.158.1:47304 dest: /192.168.158.4:9866
2025-07-22 03:57:07,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47304, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-994822212_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759494_18670, duration(ns): 25581733
2025-07-22 03:57:07,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759494_18670, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-22 03:57:12,587 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759494_18670 replica FinalizedReplica, blk_1073759494_18670, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759494 for deletion
2025-07-22 03:57:12,588 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759494_18670 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759494
2025-07-22 03:59:12,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759496_18672 src: /192.168.158.1:42582 dest: /192.168.158.4:9866
2025-07-22 03:59:12,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:42582, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1677001794_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759496_18672, duration(ns): 22245915
2025-07-22 03:59:12,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759496_18672, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating
2025-07-22 03:59:15,595 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759496_18672 replica FinalizedReplica, blk_1073759496_18672, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759496 for deletion
2025-07-22 03:59:15,596 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759496_18672 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759496
2025-07-22 04:00:12,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759497_18673 src: /192.168.158.8:44542 dest: /192.168.158.4:9866
2025-07-22 04:00:12,642 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:44542, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1293163990_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759497_18673, duration(ns): 15546286
2025-07-22 04:00:12,643 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759497_18673, type=LAST_IN_PIPELINE terminating
2025-07-22 04:00:15,600 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759497_18673 replica FinalizedReplica, blk_1073759497_18673, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759497 for deletion
2025-07-22 04:00:15,601 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759497_18673 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759497
2025-07-22 04:01:17,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759498_18674 src: /192.168.158.5:40954 dest: /192.168.158.4:9866
2025-07-22 04:01:17,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40954, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1099166056_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759498_18674, duration(ns): 18884644
2025-07-22 04:01:17,651 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759498_18674, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-22 04:01:24,606 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759498_18674 replica FinalizedReplica, blk_1073759498_18674, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759498 for deletion
2025-07-22 04:01:24,607 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759498_18674 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759498
2025-07-22 04:04:27,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759501_18677 src: /192.168.158.1:44580 dest: /192.168.158.4:9866
2025-07-22 04:04:27,647 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44580, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1626592168_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759501_18677, duration(ns): 22507319
2025-07-22 04:04:27,649 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759501_18677, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.8:9866] terminating
2025-07-22 04:04:33,612 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759501_18677 replica FinalizedReplica, blk_1073759501_18677, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759501 for deletion
2025-07-22 04:04:33,613 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759501_18677 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759501
2025-07-22 04:10:42,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759507_18683 src: /192.168.158.1:47220 dest: /192.168.158.4:9866
2025-07-22 04:10:42,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47220, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1020794806_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759507_18683, duration(ns): 23256238
2025-07-22 04:10:42,659 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759507_18683, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-22 04:10:48,625 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759507_18683 replica FinalizedReplica, blk_1073759507_18683, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759507 for deletion
2025-07-22 04:10:48,626 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759507_18683 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759507
2025-07-22 04:11:42,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759508_18684 src: /192.168.158.6:47424 dest: /192.168.158.4:9866
2025-07-22 04:11:42,656 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47424, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_484365697_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759508_18684, duration(ns): 19107906
2025-07-22 04:11:42,657 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759508_18684, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-22 04:11:45,626 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759508_18684 replica FinalizedReplica, blk_1073759508_18684, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759508 for deletion
2025-07-22 04:11:45,628 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759508_18684 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759508
2025-07-22 04:15:47,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759512_18688 src: /192.168.158.1:36582 dest: /192.168.158.4:9866
2025-07-22 04:15:47,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36582, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_49810796_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759512_18688, duration(ns): 24983448
2025-07-22 04:15:47,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759512_18688, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating
2025-07-22 04:15:51,638 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759512_18688 replica FinalizedReplica, blk_1073759512_18688, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759512 for deletion
2025-07-22 04:15:51,639 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759512_18688 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759512
2025-07-22 04:17:47,646 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759514_18690 src: /192.168.158.5:42662 dest: /192.168.158.4:9866
2025-07-22 04:17:47,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42662, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-145231035_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759514_18690, duration(ns): 17096426
2025-07-22 04:17:47,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759514_18690, type=LAST_IN_PIPELINE terminating
2025-07-22 04:17:51,641 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759514_18690 replica FinalizedReplica, blk_1073759514_18690, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759514 for deletion
2025-07-22 04:17:51,643 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759514_18690 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759514
2025-07-22 04:19:52,657 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759516_18692 src: /192.168.158.6:51070 dest: /192.168.158.4:9866
2025-07-22 04:19:52,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51070, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1614719913_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759516_18692, duration(ns): 20811395
2025-07-22 04:19:52,686 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759516_18692, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-22 04:20:00,646 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759516_18692 replica FinalizedReplica, blk_1073759516_18692, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759516 for deletion
2025-07-22 04:20:00,647 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759516_18692 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759516
2025-07-22 04:20:52,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759517_18693 src: /192.168.158.1:50284 dest: /192.168.158.4:9866
2025-07-22 04:20:52,673 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50284, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_242009139_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759517_18693, duration(ns): 25885549
2025-07-22 04:20:52,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759517_18693, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-22 04:20:57,647 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759517_18693 replica FinalizedReplica, blk_1073759517_18693, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759517 for deletion
2025-07-22 04:20:57,648 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759517_18693 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759517
2025-07-22 04:22:52,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759519_18695 src: /192.168.158.1:45084 dest: /192.168.158.4:9866
2025-07-22 04:22:52,673 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45084, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2080256027_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759519_18695, duration(ns): 23939263
2025-07-22 04:22:52,673 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759519_18695, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-22 04:23:00,656 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759519_18695 replica FinalizedReplica, blk_1073759519_18695, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759519 for deletion
2025-07-22 04:23:00,657 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759519_18695 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759519
2025-07-22 04:23:52,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759520_18696 src: /192.168.158.8:51002 dest: /192.168.158.4:9866
2025-07-22 04:23:52,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51002, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1425177230_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759520_18696, duration(ns): 19797488
2025-07-22 04:23:52,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759520_18696, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-22 04:24:00,661 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759520_18696 replica FinalizedReplica, blk_1073759520_18696, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759520 for deletion
2025-07-22 04:24:00,662 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759520_18696 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759520
2025-07-22 04:31:57,660 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759528_18704 src: /192.168.158.7:49842 dest: /192.168.158.4:9866
2025-07-22 04:31:57,685 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49842, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1475235266_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759528_18704, duration(ns): 18817982
2025-07-22 04:31:57,686 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759528_18704, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-22 04:32:03,679 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759528_18704 replica FinalizedReplica, blk_1073759528_18704, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759528 for deletion
2025-07-22 04:32:03,680 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759528_18704 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759528
2025-07-22 04:34:57,670 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759531_18707 src: /192.168.158.9:60042 dest: /192.168.158.4:9866
2025-07-22 04:34:57,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60042, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_795982965_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759531_18707, duration(ns): 19838254
2025-07-22 04:34:57,692 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759531_18707, type=LAST_IN_PIPELINE terminating
2025-07-22 04:35:00,686 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759531_18707 replica FinalizedReplica, blk_1073759531_18707, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759531 for deletion
2025-07-22 04:35:00,687 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759531_18707 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759531
2025-07-22 04:38:02,673 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759534_18710 src: /192.168.158.9:56248 dest: /192.168.158.4:9866
2025-07-22 04:38:02,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56248, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1546172252_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759534_18710, duration(ns): 18175936
2025-07-22 04:38:02,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759534_18710, type=LAST_IN_PIPELINE terminating
2025-07-22 04:38:09,693 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759534_18710 replica FinalizedReplica, blk_1073759534_18710, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759534 for deletion
2025-07-22 04:38:09,694 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759534_18710 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759534
2025-07-22 04:42:07,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759538_18714 src: /192.168.158.9:34498 dest: /192.168.158.4:9866
2025-07-22 04:42:07,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:34498, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1780417249_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759538_18714, duration(ns): 16010294
2025-07-22 04:42:07,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759538_18714, type=LAST_IN_PIPELINE terminating
2025-07-22 04:42:15,704 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759538_18714 replica FinalizedReplica, blk_1073759538_18714, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759538 for deletion
2025-07-22 04:42:15,706 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759538_18714 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759538
2025-07-22 04:43:07,680 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759539_18715 src: /192.168.158.1:41530 dest: /192.168.158.4:9866
2025-07-22 04:43:07,711 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41530, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-573538576_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759539_18715, duration(ns): 22133522
2025-07-22 04:43:07,712 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759539_18715, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-22 04:43:12,707 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759539_18715 replica FinalizedReplica, blk_1073759539_18715, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759539 for deletion
2025-07-22 04:43:12,708 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759539_18715 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759539
2025-07-22 04:44:12,676 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759540_18716 src: /192.168.158.9:38110 dest: /192.168.158.4:9866
2025-07-22 04:44:12,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38110, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-175435143_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759540_18716, duration(ns): 18905689
2025-07-22 04:44:12,701 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759540_18716, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 04:44:15,711 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759540_18716 replica FinalizedReplica, blk_1073759540_18716, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759540 for deletion
2025-07-22 04:44:15,712 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759540_18716 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759540
2025-07-22 04:45:12,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759541_18717 src: /192.168.158.9:51846 dest: /192.168.158.4:9866
2025-07-22 04:45:12,712 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51846, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1686145360_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759541_18717, duration(ns): 22026974
2025-07-22 04:45:12,713 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759541_18717, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-22 04:45:15,711 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759541_18717 replica FinalizedReplica, blk_1073759541_18717, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759541 for deletion
2025-07-22 04:45:15,713 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759541_18717 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759541
2025-07-22 04:46:12,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759542_18718 src: /192.168.158.1:52004 dest: /192.168.158.4:9866
2025-07-22 04:46:12,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52004, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_785651161_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759542_18718, duration(ns): 26077262
2025-07-22 04:46:12,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759542_18718, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-22 04:46:18,709 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759542_18718 replica FinalizedReplica, blk_1073759542_18718, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759542 for deletion
2025-07-22 04:46:18,711 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759542_18718 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759542
2025-07-22 04:47:17,710 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759543_18719 src: /192.168.158.8:37482 dest: /192.168.158.4:9866
2025-07-22 04:47:17,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37482, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1483727358_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759543_18719, duration(ns): 16877396
2025-07-22 04:47:17,730 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759543_18719, type=LAST_IN_PIPELINE terminating
2025-07-22 04:47:24,710 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759543_18719 replica FinalizedReplica, blk_1073759543_18719, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759543 for deletion
2025-07-22 04:47:24,711 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759543_18719 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759543
2025-07-22 04:50:17,687 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759546_18722 src: /192.168.158.1:58054 dest: /192.168.158.4:9866 2025-07-22 04:50:17,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58054, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-264894343_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759546_18722, duration(ns): 24157087 2025-07-22 04:50:17,721 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759546_18722, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating 2025-07-22 04:50:21,714 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759546_18722 replica FinalizedReplica, blk_1073759546_18722, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759546 for deletion 2025-07-22 04:50:21,716 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759546_18722 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759546 2025-07-22 04:51:17,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759547_18723 src: /192.168.158.5:59432 dest: /192.168.158.4:9866 2025-07-22 04:51:17,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:59432, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1523084413_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759547_18723, duration(ns): 16452155 2025-07-22 04:51:17,714 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759547_18723, type=LAST_IN_PIPELINE terminating 2025-07-22 04:51:24,718 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759547_18723 replica FinalizedReplica, blk_1073759547_18723, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759547 for deletion 2025-07-22 04:51:24,719 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759547_18723 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759547 2025-07-22 04:53:22,683 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759549_18725 src: /192.168.158.1:46768 dest: /192.168.158.4:9866 2025-07-22 04:53:22,715 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46768, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1983283673_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759549_18725, duration(ns): 22479395 2025-07-22 04:53:22,716 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759549_18725, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-22 04:53:27,726 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759549_18725 replica FinalizedReplica, blk_1073759549_18725, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759549 for deletion 2025-07-22 04:53:27,727 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759549_18725 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759549 2025-07-22 04:54:22,684 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759550_18726 src: /192.168.158.1:39286 dest: /192.168.158.4:9866 2025-07-22 04:54:22,720 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39286, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-62849240_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759550_18726, duration(ns): 25940336 2025-07-22 04:54:22,720 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759550_18726, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-22 04:54:27,729 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759550_18726 replica FinalizedReplica, blk_1073759550_18726, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759550 for deletion 
2025-07-22 04:54:27,730 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759550_18726 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759550 2025-07-22 04:55:27,691 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759551_18727 src: /192.168.158.1:39778 dest: /192.168.158.4:9866 2025-07-22 04:55:27,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39778, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_251866944_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759551_18727, duration(ns): 24501443 2025-07-22 04:55:27,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759551_18727, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-22 04:55:30,733 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759551_18727 replica FinalizedReplica, blk_1073759551_18727, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759551 for deletion 2025-07-22 04:55:30,735 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759551_18727 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759551 2025-07-22 04:56:32,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073759552_18728 src: /192.168.158.6:54186 dest: /192.168.158.4:9866 2025-07-22 04:56:32,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54186, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1647816330_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759552_18728, duration(ns): 19756129 2025-07-22 04:56:32,718 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759552_18728, type=LAST_IN_PIPELINE terminating 2025-07-22 04:56:36,738 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759552_18728 replica FinalizedReplica, blk_1073759552_18728, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759552 for deletion 2025-07-22 04:56:36,739 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759552_18728 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759552 2025-07-22 04:57:32,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759553_18729 src: /192.168.158.5:53636 dest: /192.168.158.4:9866 2025-07-22 04:57:32,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53636, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1052941240_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759553_18729, duration(ns): 21055453 
2025-07-22 04:57:32,723 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759553_18729, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-22 04:57:39,741 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759553_18729 replica FinalizedReplica, blk_1073759553_18729, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759553 for deletion 2025-07-22 04:57:39,742 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759553_18729 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759553 2025-07-22 04:58:32,696 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759554_18730 src: /192.168.158.6:46502 dest: /192.168.158.4:9866 2025-07-22 04:58:32,716 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46502, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1544332894_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759554_18730, duration(ns): 18102572 2025-07-22 04:58:32,716 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759554_18730, type=LAST_IN_PIPELINE terminating 2025-07-22 04:58:39,745 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759554_18730 replica FinalizedReplica, blk_1073759554_18730, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759554 for deletion 2025-07-22 04:58:39,747 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759554_18730 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759554 2025-07-22 05:00:37,699 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759556_18732 src: /192.168.158.9:54168 dest: /192.168.158.4:9866 2025-07-22 05:00:37,730 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:54168, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1916851028_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759556_18732, duration(ns): 24317654 2025-07-22 05:00:37,731 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759556_18732, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-22 05:00:45,749 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759556_18732 replica FinalizedReplica, blk_1073759556_18732, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759556 for deletion 2025-07-22 05:00:45,750 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759556_18732 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759556 2025-07-22 05:01:37,699 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759557_18733 src: /192.168.158.9:38320 dest: /192.168.158.4:9866 2025-07-22 05:01:37,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38320, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1736122500_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759557_18733, duration(ns): 19391427 2025-07-22 05:01:37,725 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759557_18733, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-22 05:01:42,751 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759557_18733 replica FinalizedReplica, blk_1073759557_18733, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759557 for deletion 2025-07-22 05:01:42,752 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759557_18733 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759557 2025-07-22 05:08:47,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759564_18740 src: /192.168.158.1:51040 dest: /192.168.158.4:9866 2025-07-22 05:08:47,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51040, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1973826710_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759564_18740, duration(ns): 23822414 2025-07-22 05:08:47,741 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759564_18740, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-22 05:08:54,773 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759564_18740 replica FinalizedReplica, blk_1073759564_18740, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759564 for deletion 2025-07-22 05:08:54,774 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759564_18740 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759564 2025-07-22 05:10:52,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759566_18742 src: /192.168.158.8:54860 dest: /192.168.158.4:9866 2025-07-22 05:10:52,733 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54860, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_820263489_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759566_18742, duration(ns): 20522925 2025-07-22 05:10:52,733 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759566_18742, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-22 05:11:00,779 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759566_18742 replica FinalizedReplica, blk_1073759566_18742, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759566 for deletion 2025-07-22 05:11:00,780 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759566_18742 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759566 2025-07-22 05:12:52,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759568_18744 src: /192.168.158.5:57042 dest: /192.168.158.4:9866 2025-07-22 05:12:52,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57042, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1649162221_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759568_18744, duration(ns): 17206625 2025-07-22 05:12:52,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759568_18744, type=LAST_IN_PIPELINE terminating 2025-07-22 05:12:57,787 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759568_18744 replica FinalizedReplica, blk_1073759568_18744, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759568 for deletion 2025-07-22 05:12:57,788 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759568_18744 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759568 2025-07-22 05:14:52,716 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759570_18746 src: /192.168.158.1:37036 dest: /192.168.158.4:9866 2025-07-22 05:14:52,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37036, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2131147766_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759570_18746, duration(ns): 24055426 2025-07-22 05:14:52,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759570_18746, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-22 05:14:57,793 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759570_18746 replica FinalizedReplica, blk_1073759570_18746, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759570 for deletion 2025-07-22 05:14:57,794 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759570_18746 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759570 2025-07-22 05:15:52,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759571_18747 src: /192.168.158.6:51978 dest: /192.168.158.4:9866 2025-07-22 05:15:52,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:51978, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1097786085_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759571_18747, duration(ns): 17002745 2025-07-22 05:15:52,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759571_18747, type=LAST_IN_PIPELINE terminating 2025-07-22 05:15:57,795 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759571_18747 replica FinalizedReplica, blk_1073759571_18747, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759571 for deletion 2025-07-22 05:15:57,797 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759571_18747 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759571 2025-07-22 05:16:52,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759572_18748 src: /192.168.158.6:37058 dest: /192.168.158.4:9866 2025-07-22 05:16:52,764 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37058, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-480947141_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759572_18748, duration(ns): 16452873 2025-07-22 05:16:52,764 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759572_18748, type=LAST_IN_PIPELINE terminating 2025-07-22 05:17:00,795 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759572_18748 replica FinalizedReplica, blk_1073759572_18748, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759572 for deletion 2025-07-22 05:17:00,796 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759572_18748 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759572 2025-07-22 05:17:52,719 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759573_18749 src: /192.168.158.1:44328 dest: /192.168.158.4:9866 2025-07-22 05:17:52,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44328, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1701632150_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759573_18749, duration(ns): 23254718 2025-07-22 05:17:52,752 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759573_18749, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating 2025-07-22 
05:17:57,800 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759573_18749 replica FinalizedReplica, blk_1073759573_18749, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759573 for deletion
2025-07-22 05:17:57,801 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759573_18749 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759573
2025-07-22 05:18:57,724 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759574_18750 src: /192.168.158.6:60600 dest: /192.168.158.4:9866
2025-07-22 05:18:57,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60600, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1435342194_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759574_18750, duration(ns): 16258671
2025-07-22 05:18:57,743 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759574_18750, type=LAST_IN_PIPELINE terminating
2025-07-22 05:19:00,805 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759574_18750 replica FinalizedReplica, blk_1073759574_18750, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759574 for deletion
2025-07-22 05:19:00,806 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759574_18750 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759574
2025-07-22 05:19:57,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759575_18751 src: /192.168.158.1:38064 dest: /192.168.158.4:9866
2025-07-22 05:19:57,756 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38064, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1870101736_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759575_18751, duration(ns): 24174521
2025-07-22 05:19:57,757 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759575_18751, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-22 05:20:03,806 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759575_18751 replica FinalizedReplica, blk_1073759575_18751, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759575 for deletion
2025-07-22 05:20:03,808 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759575_18751 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759575
2025-07-22 05:20:57,728 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759576_18752 src: /192.168.158.5:36234 dest: /192.168.158.4:9866
2025-07-22 05:20:57,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36234, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1919503731_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759576_18752, duration(ns): 19459971
2025-07-22 05:20:57,754 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759576_18752, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-22 05:21:03,810 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759576_18752 replica FinalizedReplica, blk_1073759576_18752, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759576 for deletion
2025-07-22 05:21:03,812 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759576_18752 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759576
2025-07-22 05:22:57,732 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759578_18754 src: /192.168.158.7:57900 dest: /192.168.158.4:9866
2025-07-22 05:22:57,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57900, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1903063923_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759578_18754, duration(ns): 16322790
2025-07-22 05:22:57,751 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759578_18754, type=LAST_IN_PIPELINE terminating
2025-07-22 05:23:03,815 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759578_18754 replica FinalizedReplica, blk_1073759578_18754, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759578 for deletion
2025-07-22 05:23:03,817 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759578_18754 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759578
2025-07-22 05:25:57,738 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759581_18757 src: /192.168.158.6:36510 dest: /192.168.158.4:9866
2025-07-22 05:25:57,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36510, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1997557168_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759581_18757, duration(ns): 17049345
2025-07-22 05:25:57,758 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759581_18757, type=LAST_IN_PIPELINE terminating
2025-07-22 05:26:03,825 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759581_18757 replica FinalizedReplica, blk_1073759581_18757, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759581 for deletion
2025-07-22 05:26:03,826 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759581_18757 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759581
2025-07-22 05:30:07,739 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759585_18761 src: /192.168.158.7:49388 dest: /192.168.158.4:9866
2025-07-22 05:30:07,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49388, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1857786622_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759585_18761, duration(ns): 21395352
2025-07-22 05:30:07,767 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759585_18761, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-22 05:30:15,836 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759585_18761 replica FinalizedReplica, blk_1073759585_18761, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759585 for deletion
2025-07-22 05:30:15,838 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759585_18761 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759585
2025-07-22 05:32:07,740 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759587_18763 src: /192.168.158.9:52182 dest: /192.168.158.4:9866
2025-07-22 05:32:07,765 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52182, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1758662705_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759587_18763, duration(ns): 19365362
2025-07-22 05:32:07,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759587_18763, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 05:32:15,842 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759587_18763 replica FinalizedReplica, blk_1073759587_18763, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759587 for deletion
2025-07-22 05:32:15,843 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759587_18763 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759587
2025-07-22 05:36:12,745 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759591_18767 src: /192.168.158.6:47692 dest: /192.168.158.4:9866
2025-07-22 05:36:12,771 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:47692, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_56684973_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759591_18767, duration(ns): 20627362
2025-07-22 05:36:12,772 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759591_18767, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 05:36:15,851 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759591_18767 replica FinalizedReplica, blk_1073759591_18767, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759591 for deletion
2025-07-22 05:36:15,852 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759591_18767 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759591
2025-07-22 05:37:12,753 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759592_18768 src: /192.168.158.1:33506 dest: /192.168.158.4:9866
2025-07-22 05:37:12,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33506, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2073867218_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759592_18768, duration(ns): 25332695
2025-07-22 05:37:12,788 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759592_18768, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-22 05:37:15,853 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759592_18768 replica FinalizedReplica, blk_1073759592_18768, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759592 for deletion
2025-07-22 05:37:15,854 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759592_18768 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759592
2025-07-22 05:39:12,750 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759594_18770 src: /192.168.158.1:41872 dest: /192.168.158.4:9866
2025-07-22 05:39:12,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41872, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1447308672_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759594_18770, duration(ns): 24748933
2025-07-22 05:39:12,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759594_18770, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-22 05:39:15,855 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759594_18770 replica FinalizedReplica, blk_1073759594_18770, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759594 for deletion
2025-07-22 05:39:15,856 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759594_18770 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759594
2025-07-22 05:46:17,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759601_18777 src: /192.168.158.5:40154 dest: /192.168.158.4:9866
2025-07-22 05:46:17,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40154, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-167321807_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759601_18777, duration(ns): 17524843
2025-07-22 05:46:17,805 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759601_18777, type=LAST_IN_PIPELINE terminating
2025-07-22 05:46:21,869 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759601_18777 replica FinalizedReplica, blk_1073759601_18777, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759601 for deletion
2025-07-22 05:46:21,870 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759601_18777 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759601
2025-07-22 05:50:17,780 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759605_18781 src: /192.168.158.1:34622 dest: /192.168.158.4:9866
2025-07-22 05:50:17,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34622, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_274170469_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759605_18781, duration(ns): 27129280
2025-07-22 05:50:17,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759605_18781, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-22 05:50:21,883 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759605_18781 replica FinalizedReplica, blk_1073759605_18781, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759605 for deletion
2025-07-22 05:50:21,885 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759605_18781 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759605
2025-07-22 05:51:17,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759606_18782 src: /192.168.158.7:59758 dest: /192.168.158.4:9866
2025-07-22 05:51:17,810 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59758, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-733770122_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759606_18782, duration(ns): 16706458
2025-07-22 05:51:17,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759606_18782, type=LAST_IN_PIPELINE terminating
2025-07-22 05:51:24,885 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759606_18782 replica FinalizedReplica, blk_1073759606_18782, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759606 for deletion
2025-07-22 05:51:24,886 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759606_18782 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759606
2025-07-22 05:52:17,794 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759607_18783 src: /192.168.158.1:51950 dest: /192.168.158.4:9866
2025-07-22 05:52:17,826 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51950, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1780762055_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759607_18783, duration(ns): 22277661
2025-07-22 05:52:17,826 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759607_18783, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-22 05:52:24,887 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759607_18783 replica FinalizedReplica, blk_1073759607_18783, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759607 for deletion
2025-07-22 05:52:24,888 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759607_18783 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759607
2025-07-22 05:54:17,800 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759609_18785 src: /192.168.158.1:49360 dest: /192.168.158.4:9866
2025-07-22 05:54:17,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49360, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1553723071_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759609_18785, duration(ns): 25482687
2025-07-22 05:54:17,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759609_18785, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-22 05:54:24,892 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759609_18785 replica FinalizedReplica, blk_1073759609_18785, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759609 for deletion
2025-07-22 05:54:24,893 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759609_18785 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759609
2025-07-22 05:57:17,793 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759612_18788 src: /192.168.158.9:51970 dest: /192.168.158.4:9866
2025-07-22 05:57:17,811 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51970, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-148138493_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759612_18788, duration(ns): 16459306
2025-07-22 05:57:17,812 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759612_18788, type=LAST_IN_PIPELINE terminating
2025-07-22 05:57:24,898 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759612_18788 replica FinalizedReplica, blk_1073759612_18788, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759612 for deletion
2025-07-22 05:57:24,899 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759612_18788 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759612
2025-07-22 06:03:22,786 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759618_18794 src: /192.168.158.5:44468 dest: /192.168.158.4:9866
2025-07-22 06:03:22,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44468, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2058065970_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759618_18794, duration(ns): 19537942
2025-07-22 06:03:22,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759618_18794, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-22 06:03:30,914 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759618_18794 replica FinalizedReplica, blk_1073759618_18794, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759618 for deletion
2025-07-22 06:03:30,915 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759618_18794 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759618
2025-07-22 06:05:22,791 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759620_18796 src: /192.168.158.9:35730 dest: /192.168.158.4:9866
2025-07-22 06:05:22,817 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:35730, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1646093237_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759620_18796, duration(ns): 20500137
2025-07-22 06:05:22,817 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759620_18796, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-22 06:05:24,922 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759620_18796 replica FinalizedReplica, blk_1073759620_18796, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759620 for deletion
2025-07-22 06:05:24,923 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759620_18796 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759620
2025-07-22 06:07:22,799 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759622_18798 src: /192.168.158.7:44166 dest: /192.168.158.4:9866
2025-07-22 06:07:22,821 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44166, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_765210673_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759622_18798, duration(ns): 20319437
2025-07-22 06:07:22,822 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759622_18798, type=LAST_IN_PIPELINE terminating
2025-07-22 06:07:24,929 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759622_18798 replica FinalizedReplica, blk_1073759622_18798, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759622 for deletion
2025-07-22 06:07:24,930 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759622_18798 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759622
2025-07-22 06:11:22,801 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759626_18802 src: /192.168.158.1:44184 dest: /192.168.158.4:9866
2025-07-22 06:11:22,834 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44184, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-387334594_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759626_18802, duration(ns): 23089315
2025-07-22 06:11:22,834 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759626_18802, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-22 06:11:24,934 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759626_18802 replica FinalizedReplica, blk_1073759626_18802, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759626 for deletion
2025-07-22 06:11:24,935 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759626_18802 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759626
2025-07-22 06:13:22,804 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759628_18804 src: /192.168.158.9:48274 dest: /192.168.158.4:9866
2025-07-22 06:13:22,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48274, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1468468335_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759628_18804, duration(ns): 19377057
2025-07-22 06:13:22,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759628_18804, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 06:13:24,938 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759628_18804 replica FinalizedReplica, blk_1073759628_18804, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759628 for deletion
2025-07-22 06:13:24,940 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759628_18804 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759628
2025-07-22 06:14:22,806 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759629_18805 src: /192.168.158.1:50598 dest: /192.168.158.4:9866
2025-07-22 06:14:22,840 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50598, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-22194583_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759629_18805, duration(ns): 24780711
2025-07-22 06:14:22,841 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759629_18805, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-22 06:14:24,939 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759629_18805 replica FinalizedReplica, blk_1073759629_18805, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759629 for deletion
2025-07-22 06:14:24,940 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759629_18805 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759629
2025-07-22 06:15:22,808 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759630_18806 src: /192.168.158.7:49480 dest: /192.168.158.4:9866
2025-07-22 06:15:22,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49480, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_739081313_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759630_18806, duration(ns): 21575753
2025-07-22 06:15:22,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759630_18806, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 06:15:27,943 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759630_18806 replica FinalizedReplica, blk_1073759630_18806, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759630 for deletion
2025-07-22 06:15:27,944 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759630_18806 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759630
2025-07-22 06:16:22,818 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759631_18807 src: /192.168.158.9:40958 dest: /192.168.158.4:9866
2025-07-22 06:16:22,837 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40958, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_459390553_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759631_18807, duration(ns): 15995487
2025-07-22 06:16:22,837 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759631_18807, type=LAST_IN_PIPELINE terminating
2025-07-22 06:16:24,945 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759631_18807 replica FinalizedReplica, blk_1073759631_18807, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759631 for deletion
2025-07-22 06:16:24,946 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759631_18807 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759631
2025-07-22 06:17:22,814 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759632_18808 src: /192.168.158.1:35754 dest: /192.168.158.4:9866
2025-07-22 06:17:22,847 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35754, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_899536079_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759632_18808, duration(ns): 22816707
2025-07-22 06:17:22,847 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759632_18808, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-22 06:17:24,948 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759632_18808 replica FinalizedReplica, blk_1073759632_18808, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759632 for deletion
2025-07-22 06:17:24,949 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759632_18808 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759632
2025-07-22 06:18:27,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759633_18809 src: /192.168.158.1:52176 dest: /192.168.158.4:9866
2025-07-22 06:18:27,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52176, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-370676269_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759633_18809, duration(ns): 24207253
2025-07-22 06:18:27,853 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759633_18809, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-22 06:18:33,948 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759633_18809 replica FinalizedReplica, blk_1073759633_18809, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759633 for deletion
2025-07-22 06:18:33,949 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759633_18809 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759633
2025-07-22 06:22:32,821 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759637_18813 src: /192.168.158.1:36818 dest: /192.168.158.4:9866
2025-07-22 06:22:32,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src:
/192.168.158.1:36818, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2107241681_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759637_18813, duration(ns): 22756053 2025-07-22 06:22:32,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759637_18813, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-22 06:22:36,957 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759637_18813 replica FinalizedReplica, blk_1073759637_18813, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759637 for deletion 2025-07-22 06:22:36,958 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759637_18813 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759637 2025-07-22 06:24:42,819 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759639_18815 src: /192.168.158.1:35066 dest: /192.168.158.4:9866 2025-07-22 06:24:42,852 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35066, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_302025831_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759639_18815, duration(ns): 22631235 2025-07-22 06:24:42,852 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073759639_18815, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-22 06:24:48,962 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759639_18815 replica FinalizedReplica, blk_1073759639_18815, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759639 for deletion 2025-07-22 06:24:48,963 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759639_18815 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759639 2025-07-22 06:29:47,827 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759644_18820 src: /192.168.158.1:48956 dest: /192.168.158.4:9866 2025-07-22 06:29:47,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:48956, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_728589072_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759644_18820, duration(ns): 23195166 2025-07-22 06:29:47,860 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759644_18820, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-22 06:29:51,979 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759644_18820 replica FinalizedReplica, blk_1073759644_18820, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759644 for deletion 2025-07-22 06:29:51,980 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759644_18820 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759644 2025-07-22 06:30:47,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759645_18821 src: /192.168.158.5:37910 dest: /192.168.158.4:9866 2025-07-22 06:30:47,856 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37910, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1009929074_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759645_18821, duration(ns): 17465744 2025-07-22 06:30:47,856 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759645_18821, type=LAST_IN_PIPELINE terminating 2025-07-22 06:30:51,982 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759645_18821 replica FinalizedReplica, blk_1073759645_18821, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759645 for deletion 2025-07-22 06:30:51,983 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759645_18821 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759645 2025-07-22 06:33:52,850 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759648_18824 src: /192.168.158.1:47246 dest: /192.168.158.4:9866 2025-07-22 06:33:52,883 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47246, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-260665890_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759648_18824, duration(ns): 23123472 2025-07-22 06:33:52,883 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759648_18824, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-22 06:33:54,992 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759648_18824 replica FinalizedReplica, blk_1073759648_18824, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759648 for deletion 2025-07-22 06:33:54,994 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759648_18824 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759648 2025-07-22 06:36:57,861 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759651_18827 src: /192.168.158.1:48592 dest: /192.168.158.4:9866 2025-07-22 06:36:57,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.1:48592, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1830856383_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759651_18827, duration(ns): 26873120 2025-07-22 06:36:57,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759651_18827, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-22 06:37:04,001 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759651_18827 replica FinalizedReplica, blk_1073759651_18827, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759651 for deletion 2025-07-22 06:37:04,002 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759651_18827 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759651 2025-07-22 06:39:02,875 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759653_18829 src: /192.168.158.8:58586 dest: /192.168.158.4:9866 2025-07-22 06:39:02,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:58586, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1823208439_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759653_18829, duration(ns): 20314672 2025-07-22 06:39:02,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073759653_18829, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-22 06:39:07,003 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759653_18829 replica FinalizedReplica, blk_1073759653_18829, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759653 for deletion 2025-07-22 06:39:07,004 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759653_18829 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759653 2025-07-22 06:40:02,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759654_18830 src: /192.168.158.1:56624 dest: /192.168.158.4:9866 2025-07-22 06:40:02,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:56624, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_306446397_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759654_18830, duration(ns): 23161937 2025-07-22 06:40:02,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759654_18830, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-22 06:40:07,008 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759654_18830 replica FinalizedReplica, blk_1073759654_18830, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759654 for deletion 2025-07-22 06:40:07,009 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759654_18830 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759654 2025-07-22 06:41:02,867 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759655_18831 src: /192.168.158.5:58638 dest: /192.168.158.4:9866 2025-07-22 06:41:02,893 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58638, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1668095345_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759655_18831, duration(ns): 19731551 2025-07-22 06:41:02,893 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759655_18831, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-22 06:41:07,008 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759655_18831 replica FinalizedReplica, blk_1073759655_18831, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759655 for deletion 2025-07-22 06:41:07,010 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759655_18831 URI 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759655 2025-07-22 06:43:02,859 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759657_18833 src: /192.168.158.8:37018 dest: /192.168.158.4:9866 2025-07-22 06:43:02,889 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:37018, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_981793478_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759657_18833, duration(ns): 23707381 2025-07-22 06:43:02,889 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759657_18833, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-22 06:43:07,011 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759657_18833 replica FinalizedReplica, blk_1073759657_18833, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759657 for deletion 2025-07-22 06:43:07,012 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759657_18833 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759657 2025-07-22 06:46:02,862 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759660_18836 src: /192.168.158.1:58072 dest: /192.168.158.4:9866 2025-07-22 06:46:02,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58072, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1653188562_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759660_18836, duration(ns): 22989810 2025-07-22 06:46:02,895 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759660_18836, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-22 06:46:07,022 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759660_18836 replica FinalizedReplica, blk_1073759660_18836, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759660 for deletion 2025-07-22 06:46:07,023 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759660_18836 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759660 2025-07-22 06:47:02,888 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759661_18837 src: /192.168.158.7:44264 dest: /192.168.158.4:9866 2025-07-22 06:47:02,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44264, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-823173919_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759661_18837, duration(ns): 21574231 2025-07-22 06:47:02,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759661_18837, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-22 06:47:07,026 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759661_18837 replica FinalizedReplica, blk_1073759661_18837, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759661 for deletion 2025-07-22 06:47:07,027 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759661_18837 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759661 2025-07-22 06:49:02,864 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759663_18839 src: /192.168.158.1:38758 dest: /192.168.158.4:9866 2025-07-22 06:49:02,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38758, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_693914961_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759663_18839, duration(ns): 25671435 2025-07-22 06:49:02,899 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759663_18839, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-22 06:49:07,029 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759663_18839 replica FinalizedReplica, blk_1073759663_18839, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759663 for deletion 2025-07-22 06:49:07,030 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759663_18839 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759663 2025-07-22 06:51:02,868 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759665_18841 src: /192.168.158.1:44324 dest: /192.168.158.4:9866 2025-07-22 06:51:02,901 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44324, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-734352111_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759665_18841, duration(ns): 24003589 2025-07-22 06:51:02,902 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759665_18841, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-22 06:51:07,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759665_18841 replica FinalizedReplica, blk_1073759665_18841, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759665 for deletion 2025-07-22 06:51:07,033 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759665_18841 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759665 2025-07-22 06:53:02,879 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759667_18843 src: /192.168.158.9:47818 dest: /192.168.158.4:9866 2025-07-22 06:53:02,897 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47818, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2021233655_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759667_18843, duration(ns): 16482686 2025-07-22 06:53:02,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759667_18843, type=LAST_IN_PIPELINE terminating 2025-07-22 06:53:10,041 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759667_18843 replica FinalizedReplica, blk_1073759667_18843, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759667 for deletion 2025-07-22 06:53:10,042 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759667_18843 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759667 2025-07-22 06:54:02,878 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759668_18844 src: /192.168.158.6:42508 dest: /192.168.158.4:9866 2025-07-22 06:54:02,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42508, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_552969435_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759668_18844, duration(ns): 16956853 2025-07-22 06:54:02,898 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759668_18844, type=LAST_IN_PIPELINE terminating 2025-07-22 06:54:07,046 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759668_18844 replica FinalizedReplica, blk_1073759668_18844, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759668 for deletion 2025-07-22 06:54:07,047 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759668_18844 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759668 2025-07-22 06:59:07,881 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759673_18849 src: /192.168.158.5:40002 dest: /192.168.158.4:9866 2025-07-22 06:59:07,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40002, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1242104271_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759673_18849, duration(ns): 16422633 2025-07-22 06:59:07,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759673_18849, type=LAST_IN_PIPELINE terminating 2025-07-22 06:59:10,061 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759673_18849 replica FinalizedReplica, blk_1073759673_18849, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759673 for deletion
2025-07-22 06:59:10,062 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759673_18849 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759673
2025-07-22 07:05:12,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759679_18855 src: /192.168.158.6:60252 dest: /192.168.158.4:9866
2025-07-22 07:05:12,911 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60252, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1446630664_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759679_18855, duration(ns): 19045918
2025-07-22 07:05:12,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759679_18855, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 07:05:16,077 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759679_18855 replica FinalizedReplica, blk_1073759679_18855, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759679 for deletion
2025-07-22 07:05:16,078 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759679_18855 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759679
2025-07-22 07:06:12,889 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759680_18856 src: /192.168.158.8:51352 dest: /192.168.158.4:9866
2025-07-22 07:06:12,916 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51352, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-48695488_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759680_18856, duration(ns): 20531826
2025-07-22 07:06:12,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759680_18856, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-22 07:06:16,078 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759680_18856 replica FinalizedReplica, blk_1073759680_18856, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759680 for deletion
2025-07-22 07:06:16,081 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759680_18856 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759680
2025-07-22 07:08:12,892 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759682_18858 src: /192.168.158.6:37916 dest: /192.168.158.4:9866
2025-07-22 07:08:12,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37916, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1372452020_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759682_18858, duration(ns): 19180510
2025-07-22 07:08:12,919 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759682_18858, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 07:08:19,083 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759682_18858 replica FinalizedReplica, blk_1073759682_18858, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759682 for deletion
2025-07-22 07:08:19,084 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759682_18858 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759682
2025-07-22 07:11:12,913 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759685_18861 src: /192.168.158.7:59402 dest: /192.168.158.4:9866
2025-07-22 07:11:12,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:59402, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-51755581_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759685_18861, duration(ns): 21336476
2025-07-22 07:11:12,943 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759685_18861, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-22 07:11:19,091 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759685_18861 replica FinalizedReplica, blk_1073759685_18861, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759685 for deletion
2025-07-22 07:11:19,092 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759685_18861 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759685
2025-07-22 07:13:12,905 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759687_18863 src: /192.168.158.9:46088 dest: /192.168.158.4:9866
2025-07-22 07:13:12,923 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46088, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1020486523_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759687_18863, duration(ns): 16570775
2025-07-22 07:13:12,924 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759687_18863, type=LAST_IN_PIPELINE terminating
2025-07-22 07:13:19,097 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759687_18863 replica FinalizedReplica, blk_1073759687_18863, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759687 for deletion
2025-07-22 07:13:19,098 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759687_18863 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759687
2025-07-22 07:14:12,907 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759688_18864 src: /192.168.158.6:54386 dest: /192.168.158.4:9866
2025-07-22 07:14:12,925 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:54386, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_264346484_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759688_18864, duration(ns): 15908642
2025-07-22 07:14:12,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759688_18864, type=LAST_IN_PIPELINE terminating
2025-07-22 07:14:19,100 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759688_18864 replica FinalizedReplica, blk_1073759688_18864, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759688 for deletion
2025-07-22 07:14:19,101 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759688_18864 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759688
2025-07-22 07:15:17,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759689_18865 src: /192.168.158.1:44270 dest: /192.168.158.4:9866
2025-07-22 07:15:17,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44270, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1570996534_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759689_18865, duration(ns): 23713863
2025-07-22 07:15:17,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759689_18865, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-22 07:15:22,101 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759689_18865 replica FinalizedReplica, blk_1073759689_18865, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759689 for deletion
2025-07-22 07:15:22,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759689_18865 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759689
2025-07-22 07:16:17,909 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759690_18866 src: /192.168.158.6:46164 dest: /192.168.158.4:9866
2025-07-22 07:16:17,929 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46164, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1397251843_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759690_18866, duration(ns): 17798071
2025-07-22 07:16:17,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759690_18866, type=LAST_IN_PIPELINE terminating
2025-07-22 07:16:25,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759690_18866 replica FinalizedReplica, blk_1073759690_18866, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759690 for deletion
2025-07-22 07:16:25,104 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759690_18866 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759690
2025-07-22 07:17:17,911 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759691_18867 src: /192.168.158.5:50286 dest: /192.168.158.4:9866
2025-07-22 07:17:17,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50286, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_182650600_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759691_18867, duration(ns): 23921963
2025-07-22 07:17:17,941 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759691_18867, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-22 07:17:22,103 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759691_18867 replica FinalizedReplica, blk_1073759691_18867, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759691 for deletion
2025-07-22 07:17:22,104 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759691_18867 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759691
2025-07-22 07:21:22,912 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759695_18871 src: /192.168.158.7:58832 dest: /192.168.158.4:9866
2025-07-22 07:21:22,940 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58832, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_882432424_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759695_18871, duration(ns): 21365893
2025-07-22 07:21:22,940 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759695_18871, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-22 07:21:25,113 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759695_18871 replica FinalizedReplica, blk_1073759695_18871, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759695 for deletion
2025-07-22 07:21:25,114 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759695_18871 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759695
2025-07-22 07:22:22,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759696_18872 src: /192.168.158.6:56344 dest: /192.168.158.4:9866
2025-07-22 07:22:22,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56344, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_638905807_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759696_18872, duration(ns): 20556888
2025-07-22 07:22:22,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759696_18872, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-22 07:22:25,114 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759696_18872 replica FinalizedReplica, blk_1073759696_18872, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759696 for deletion
2025-07-22 07:22:25,117 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759696_18872 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759696
2025-07-22 07:24:22,923 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759698_18874 src: /192.168.158.8:49688 dest: /192.168.158.4:9866
2025-07-22 07:24:22,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49688, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_471627200_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759698_18874, duration(ns): 19651896
2025-07-22 07:24:22,948 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759698_18874, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-22 07:24:28,118 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759698_18874 replica FinalizedReplica, blk_1073759698_18874, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759698 for deletion
2025-07-22 07:24:28,119 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759698_18874 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759698
2025-07-22 07:25:22,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759699_18875 src: /192.168.158.8:53926 dest: /192.168.158.4:9866
2025-07-22 07:25:22,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53926, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2055658391_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759699_18875, duration(ns): 19252056
2025-07-22 07:25:22,952 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759699_18875, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-22 07:25:25,118 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759699_18875 replica FinalizedReplica, blk_1073759699_18875, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759699 for deletion
2025-07-22 07:25:25,120 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759699_18875 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759699
2025-07-22 07:26:22,922 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759700_18876 src: /192.168.158.1:51176 dest: /192.168.158.4:9866
2025-07-22 07:26:22,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51176, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1268183488_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759700_18876, duration(ns): 25222678
2025-07-22 07:26:22,957 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759700_18876, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-22 07:26:28,123 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759700_18876 replica FinalizedReplica, blk_1073759700_18876, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759700 for deletion
2025-07-22 07:26:28,124 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759700_18876 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759700
2025-07-22 07:28:27,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759702_18878 src: /192.168.158.1:37756 dest: /192.168.158.4:9866
2025-07-22 07:28:27,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37756, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-493197763_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759702_18878, duration(ns): 22689074
2025-07-22 07:28:27,958 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759702_18878, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-22 07:28:31,129 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759702_18878 replica FinalizedReplica, blk_1073759702_18878, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759702 for deletion
2025-07-22 07:28:31,130 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759702_18878 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759702
2025-07-22 07:30:32,934 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759704_18880 src: /192.168.158.8:51204 dest: /192.168.158.4:9866
2025-07-22 07:30:32,953 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51204, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1403903376_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759704_18880, duration(ns): 16380463
2025-07-22 07:30:32,953 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759704_18880, type=LAST_IN_PIPELINE terminating
2025-07-22 07:30:37,135 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759704_18880 replica FinalizedReplica, blk_1073759704_18880, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759704 for deletion
2025-07-22 07:30:37,137 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759704_18880 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759704
2025-07-22 07:35:37,942 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759709_18885 src: /192.168.158.8:52218 dest: /192.168.158.4:9866
2025-07-22 07:35:37,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:52218, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_69854560_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759709_18885, duration(ns): 18331221
2025-07-22 07:35:37,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759709_18885, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-22 07:35:40,146 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759709_18885 replica FinalizedReplica, blk_1073759709_18885, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759709 for deletion
2025-07-22 07:35:40,147 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759709_18885 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759709
2025-07-22 07:36:42,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759710_18886 src: /192.168.158.8:39472 dest: /192.168.158.4:9866
2025-07-22 07:36:42,969 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:39472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1301996508_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759710_18886, duration(ns): 21212269
2025-07-22 07:36:42,969 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759710_18886, type=LAST_IN_PIPELINE terminating
2025-07-22 07:36:46,149 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759710_18886 replica FinalizedReplica, blk_1073759710_18886, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759710 for deletion
2025-07-22 07:36:46,150 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759710_18886 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759710
2025-07-22 07:38:42,944 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759712_18888 src: /192.168.158.5:39624 dest: /192.168.158.4:9866
2025-07-22 07:38:42,974 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39624, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_573054198_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759712_18888, duration(ns): 24205808
2025-07-22 07:38:42,975 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759712_18888, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 07:38:46,152 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759712_18888 replica FinalizedReplica, blk_1073759712_18888, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759712 for deletion
2025-07-22 07:38:46,153 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759712_18888 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759712
2025-07-22 07:39:47,945 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759713_18889 src: /192.168.158.7:40096 dest: /192.168.158.4:9866
2025-07-22 07:39:47,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40096, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-909222639_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759713_18889, duration(ns): 19866318
2025-07-22 07:39:47,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759713_18889, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 07:39:52,152 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759713_18889 replica FinalizedReplica, blk_1073759713_18889, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759713 for deletion
2025-07-22 07:39:52,154 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759713_18889 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759713
2025-07-22 07:42:57,949 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759716_18892 src: /192.168.158.7:35240 dest: /192.168.158.4:9866
2025-07-22 07:42:57,969 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35240, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_843888565_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759716_18892, duration(ns): 16788045
2025-07-22 07:42:57,971 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759716_18892, type=LAST_IN_PIPELINE terminating
2025-07-22 07:43:01,160 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759716_18892 replica FinalizedReplica, blk_1073759716_18892, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759716 for deletion
2025-07-22 07:43:01,162 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759716_18892 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759716
2025-07-22 07:43:57,946 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759717_18893 src: /192.168.158.1:35472 dest: /192.168.158.4:9866
2025-07-22 07:43:57,979 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1769411942_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759717_18893, duration(ns): 23733851
2025-07-22 07:43:57,981 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759717_18893, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-22 07:44:04,162 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759717_18893 replica FinalizedReplica, blk_1073759717_18893, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759717 for deletion
2025-07-22 07:44:04,163 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759717_18893 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759717
2025-07-22 07:44:57,947 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759718_18894 src: /192.168.158.7:60576 dest: /192.168.158.4:9866
2025-07-22 07:44:57,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:60576, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-556084994_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759718_18894, duration(ns): 19680591
2025-07-22 07:44:57,973 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759718_18894, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-22 07:45:01,163 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759718_18894 replica FinalizedReplica, blk_1073759718_18894, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759718 for deletion
2025-07-22 07:45:01,164 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759718_18894 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759718
2025-07-22 07:47:57,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759721_18897 src: /192.168.158.1:53504 dest: /192.168.158.4:9866
2025-07-22 07:47:57,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53504, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1345542725_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759721_18897, duration(ns): 23548775
2025-07-22 07:47:57,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759721_18897, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-22 07:48:01,169 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759721_18897 replica FinalizedReplica, blk_1073759721_18897, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759721 for deletion
2025-07-22 07:48:01,171 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759721_18897 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759721
2025-07-22 07:48:57,956 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759722_18898 src: /192.168.158.6:39494 dest: /192.168.158.4:9866
2025-07-22 07:48:57,975 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:39494, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1818830829_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759722_18898, duration(ns): 17279762
2025-07-22 07:48:57,975 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759722_18898, type=LAST_IN_PIPELINE terminating
2025-07-22 07:49:04,171 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759722_18898 replica FinalizedReplica, blk_1073759722_18898, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759722 for deletion
2025-07-22 07:49:04,172 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759722_18898 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759722
2025-07-22 07:49:57,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759723_18899 src: /192.168.158.1:51640 dest: /192.168.158.4:9866
2025-07-22 07:49:57,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src:
/192.168.158.1:51640, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-803446785_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759723_18899, duration(ns): 22664813 2025-07-22 07:49:57,985 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759723_18899, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-22 07:50:01,172 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759723_18899 replica FinalizedReplica, blk_1073759723_18899, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759723 for deletion 2025-07-22 07:50:01,173 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759723_18899 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759723 2025-07-22 07:51:57,960 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759725_18901 src: /192.168.158.1:33054 dest: /192.168.158.4:9866 2025-07-22 07:51:57,993 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33054, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1929433129_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759725_18901, duration(ns): 24006700 2025-07-22 07:51:57,995 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073759725_18901, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-22 07:52:01,177 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759725_18901 replica FinalizedReplica, blk_1073759725_18901, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759725 for deletion 2025-07-22 07:52:01,179 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759725_18901 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759725 2025-07-22 07:52:57,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759726_18902 src: /192.168.158.6:59676 dest: /192.168.158.4:9866 2025-07-22 07:52:57,987 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:59676, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1935387191_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759726_18902, duration(ns): 20044136 2025-07-22 07:52:57,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759726_18902, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-22 07:53:04,181 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759726_18902 replica FinalizedReplica, blk_1073759726_18902, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() 
= /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759726 for deletion 2025-07-22 07:53:04,182 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759726_18902 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759726 2025-07-22 07:53:57,959 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759727_18903 src: /192.168.158.1:60094 dest: /192.168.158.4:9866 2025-07-22 07:53:57,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60094, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-66175517_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759727_18903, duration(ns): 24942017 2025-07-22 07:53:57,996 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759727_18903, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-22 07:54:04,184 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759727_18903 replica FinalizedReplica, blk_1073759727_18903, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759727 for deletion 2025-07-22 07:54:04,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759727_18903 URI 
file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759727 2025-07-22 07:55:02,964 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759728_18904 src: /192.168.158.6:37728 dest: /192.168.158.4:9866 2025-07-22 07:55:02,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:37728, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1287943112_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759728_18904, duration(ns): 16829446 2025-07-22 07:55:02,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759728_18904, type=LAST_IN_PIPELINE terminating 2025-07-22 07:55:07,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759728_18904 replica FinalizedReplica, blk_1073759728_18904, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759728 for deletion 2025-07-22 07:55:07,187 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759728_18904 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759728 2025-07-22 07:59:12,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759732_18908 src: /192.168.158.6:44480 dest: /192.168.158.4:9866 2025-07-22 07:59:12,993 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:44480, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1406834534_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759732_18908, duration(ns): 18728014 2025-07-22 07:59:12,994 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759732_18908, type=LAST_IN_PIPELINE terminating 2025-07-22 07:59:19,192 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759732_18908 replica FinalizedReplica, blk_1073759732_18908, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759732 for deletion 2025-07-22 07:59:19,193 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759732_18908 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759732 2025-07-22 07:59:36,283 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: BlockPool BP-1059995147-192.168.158.1-1752101929360 Total blocks: 8, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0 2025-07-22 08:00:17,972 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759733_18909 src: /192.168.158.6:46518 dest: /192.168.158.4:9866 2025-07-22 08:00:17,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46518, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_893328592_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759733_18909, duration(ns): 16851047 2025-07-22 08:00:17,991 
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759733_18909, type=LAST_IN_PIPELINE terminating 2025-07-22 08:00:22,196 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759733_18909 replica FinalizedReplica, blk_1073759733_18909, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759733 for deletion 2025-07-22 08:00:22,197 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759733_18909 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759733 2025-07-22 08:06:27,979 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759739_18915 src: /192.168.158.7:49812 dest: /192.168.158.4:9866 2025-07-22 08:06:28,005 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:49812, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1679828457_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759739_18915, duration(ns): 19913961 2025-07-22 08:06:28,005 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759739_18915, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-22 08:06:31,204 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759739_18915 replica FinalizedReplica, blk_1073759739_18915, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 
getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759739 for deletion 2025-07-22 08:06:31,206 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759739_18915 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759739 2025-07-22 08:07:32,980 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759740_18916 src: /192.168.158.9:46494 dest: /192.168.158.4:9866 2025-07-22 08:07:33,000 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46494, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1257049455_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759740_18916, duration(ns): 17565592 2025-07-22 08:07:33,001 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759740_18916, type=LAST_IN_PIPELINE terminating 2025-07-22 08:07:37,205 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759740_18916 replica FinalizedReplica, blk_1073759740_18916, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759740 for deletion 2025-07-22 08:07:37,207 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759740_18916 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir5/blk_1073759740 
2025-07-22 08:14:47,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759747_18923 src: /192.168.158.9:43782 dest: /192.168.158.4:9866 2025-07-22 08:14:48,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:43782, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1620486076_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759747_18923, duration(ns): 18741053 2025-07-22 08:14:48,004 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759747_18923, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-22 08:14:52,228 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759747_18923 replica FinalizedReplica, blk_1073759747_18923, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759747 for deletion 2025-07-22 08:14:52,229 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759747_18923 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759747 2025-07-22 08:17:47,989 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759750_18926 src: /192.168.158.5:48644 dest: /192.168.158.4:9866 2025-07-22 08:17:48,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48644, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_588880503_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759750_18926, duration(ns): 21034544 2025-07-22 08:17:48,016 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759750_18926, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-22 08:17:52,236 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759750_18926 replica FinalizedReplica, blk_1073759750_18926, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759750 for deletion 2025-07-22 08:17:52,237 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759750_18926 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759750 2025-07-22 08:18:47,992 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759751_18927 src: /192.168.158.5:58490 dest: /192.168.158.4:9866 2025-07-22 08:18:48,012 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58490, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_869575377_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759751_18927, duration(ns): 18425227 2025-07-22 08:18:48,013 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759751_18927, type=LAST_IN_PIPELINE terminating 2025-07-22 08:18:52,238 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: 
Scheduling blk_1073759751_18927 replica FinalizedReplica, blk_1073759751_18927, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759751 for deletion 2025-07-22 08:18:52,239 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759751_18927 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759751 2025-07-22 08:19:47,990 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759752_18928 src: /192.168.158.6:57402 dest: /192.168.158.4:9866 2025-07-22 08:19:48,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:57402, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-385945932_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759752_18928, duration(ns): 18716071 2025-07-22 08:19:48,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759752_18928, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-22 08:19:52,240 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759752_18928 replica FinalizedReplica, blk_1073759752_18928, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759752 for deletion 2025-07-22 08:19:52,242 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759752_18928 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759752 2025-07-22 08:21:52,991 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759754_18930 src: /192.168.158.5:36778 dest: /192.168.158.4:9866 2025-07-22 08:21:53,017 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:36778, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1401568076_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759754_18930, duration(ns): 20943997 2025-07-22 08:21:53,017 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759754_18930, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-22 08:21:55,245 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759754_18930 replica FinalizedReplica, blk_1073759754_18930, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759754 for deletion 2025-07-22 08:21:55,246 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759754_18930 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759754 2025-07-22 08:23:52,997 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759756_18932 src: 
/192.168.158.1:52858 dest: /192.168.158.4:9866 2025-07-22 08:23:53,029 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52858, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-744098996_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759756_18932, duration(ns): 22830440 2025-07-22 08:23:53,030 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759756_18932, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-22 08:23:58,250 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759756_18932 replica FinalizedReplica, blk_1073759756_18932, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759756 for deletion 2025-07-22 08:23:58,251 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759756_18932 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759756 2025-07-22 08:27:58,014 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759760_18936 src: /192.168.158.1:47698 dest: /192.168.158.4:9866 2025-07-22 08:27:58,047 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47698, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-764165008_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759760_18936, duration(ns): 23502025 
2025-07-22 08:27:58,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759760_18936, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating 2025-07-22 08:28:01,264 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759760_18936 replica FinalizedReplica, blk_1073759760_18936, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759760 for deletion 2025-07-22 08:28:01,265 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759760_18936 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759760 2025-07-22 08:29:58,017 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759762_18938 src: /192.168.158.1:38806 dest: /192.168.158.4:9866 2025-07-22 08:29:58,048 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38806, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1080923283_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759762_18938, duration(ns): 21780843 2025-07-22 08:29:58,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759762_18938, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-22 08:30:01,269 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759762_18938 replica 
FinalizedReplica, blk_1073759762_18938, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759762 for deletion 2025-07-22 08:30:01,271 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759762_18938 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759762 2025-07-22 08:31:58,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759764_18940 src: /192.168.158.1:46904 dest: /192.168.158.4:9866 2025-07-22 08:31:58,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46904, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1597818474_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759764_18940, duration(ns): 21694133 2025-07-22 08:31:58,046 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759764_18940, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-22 08:32:01,274 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759764_18940 replica FinalizedReplica, blk_1073759764_18940, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759764 for deletion 2025-07-22 08:32:01,275 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073759764_18940 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759764
2025-07-22 08:36:03,032 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759768_18944 src: /192.168.158.7:55612 dest: /192.168.158.4:9866
2025-07-22 08:36:03,051 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55612, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1269518853_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759768_18944, duration(ns): 17483219
2025-07-22 08:36:03,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759768_18944, type=LAST_IN_PIPELINE terminating
2025-07-22 08:36:07,285 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759768_18944 replica FinalizedReplica, blk_1073759768_18944, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759768 for deletion
2025-07-22 08:36:07,286 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759768_18944 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759768
2025-07-22 08:37:03,031 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759769_18945 src: /192.168.158.5:50638 dest: /192.168.158.4:9866
2025-07-22 08:37:03,058 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:50638, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_382863881_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759769_18945, duration(ns): 20748526
2025-07-22 08:37:03,058 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759769_18945, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 08:37:07,289 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759769_18945 replica FinalizedReplica, blk_1073759769_18945, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759769 for deletion
2025-07-22 08:37:07,291 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759769_18945 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759769
2025-07-22 08:40:08,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759772_18948 src: /192.168.158.6:55500 dest: /192.168.158.4:9866
2025-07-22 08:40:08,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:55500, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1585058022_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759772_18948, duration(ns): 19713855
2025-07-22 08:40:08,086 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759772_18948, type=LAST_IN_PIPELINE terminating
2025-07-22 08:40:13,294 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759772_18948 replica FinalizedReplica, blk_1073759772_18948, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759772 for deletion
2025-07-22 08:40:13,295 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759772_18948 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759772
2025-07-22 08:43:01,308 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0x6cb5f0b7da23e677, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 1 msec to generate and 5 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-22 08:43:01,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-22 08:43:08,044 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759775_18951 src: /192.168.158.1:51134 dest: /192.168.158.4:9866
2025-07-22 08:43:08,078 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51134, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1346145595_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759775_18951, duration(ns): 24846796
2025-07-22 08:43:08,078 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759775_18951, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-22 08:43:10,305 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759775_18951 replica FinalizedReplica, blk_1073759775_18951, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759775 for deletion
2025-07-22 08:43:10,306 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759775_18951 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759775
2025-07-22 08:44:13,041 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759776_18952 src: /192.168.158.7:42748 dest: /192.168.158.4:9866
2025-07-22 08:44:13,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42748, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-673347683_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759776_18952, duration(ns): 17076597
2025-07-22 08:44:13,061 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759776_18952, type=LAST_IN_PIPELINE terminating
2025-07-22 08:44:16,309 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759776_18952 replica FinalizedReplica, blk_1073759776_18952, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759776 for deletion
2025-07-22 08:44:16,311 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759776_18952 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759776
2025-07-22 08:45:13,039 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759777_18953 src: /192.168.158.9:42450 dest: /192.168.158.4:9866
2025-07-22 08:45:13,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42450, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1655268838_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759777_18953, duration(ns): 17445453
2025-07-22 08:45:13,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759777_18953, type=LAST_IN_PIPELINE terminating
2025-07-22 08:45:16,312 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759777_18953 replica FinalizedReplica, blk_1073759777_18953, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759777 for deletion
2025-07-22 08:45:16,313 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759777_18953 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759777
2025-07-22 08:47:13,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759779_18955 src: /192.168.158.8:33758 dest: /192.168.158.4:9866
2025-07-22 08:47:13,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33758, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2123775369_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759779_18955, duration(ns): 20642429
2025-07-22 08:47:13,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759779_18955, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-22 08:47:16,320 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759779_18955 replica FinalizedReplica, blk_1073759779_18955, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759779 for deletion
2025-07-22 08:47:16,321 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759779_18955 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759779
2025-07-22 08:48:18,044 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759780_18956 src: /192.168.158.7:39572 dest: /192.168.158.4:9866
2025-07-22 08:48:18,066 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39572, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1673907893_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759780_18956, duration(ns): 19919256
2025-07-22 08:48:18,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759780_18956, type=LAST_IN_PIPELINE terminating
2025-07-22 08:48:25,323 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759780_18956 replica FinalizedReplica, blk_1073759780_18956, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759780 for deletion
2025-07-22 08:48:25,324 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759780_18956 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759780
2025-07-22 08:50:18,049 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759782_18958 src: /192.168.158.1:60356 dest: /192.168.158.4:9866
2025-07-22 08:50:18,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60356, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1024978597_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759782_18958, duration(ns): 24724516
2025-07-22 08:50:18,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759782_18958, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-22 08:50:25,329 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759782_18958 replica FinalizedReplica, blk_1073759782_18958, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759782 for deletion
2025-07-22 08:50:25,330 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759782_18958 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759782
2025-07-22 08:51:18,056 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759783_18959 src: /192.168.158.6:43126 dest: /192.168.158.4:9866
2025-07-22 08:51:18,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:43126, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_26821645_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759783_18959, duration(ns): 18564750
2025-07-22 08:51:18,077 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759783_18959, type=LAST_IN_PIPELINE terminating
2025-07-22 08:51:22,331 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759783_18959 replica FinalizedReplica, blk_1073759783_18959, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759783 for deletion
2025-07-22 08:51:22,333 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759783_18959 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759783
2025-07-22 08:55:23,055 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759787_18963 src: /192.168.158.1:36380 dest: /192.168.158.4:9866
2025-07-22 08:55:23,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36380, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1335644389_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759787_18963, duration(ns): 24570309
2025-07-22 08:55:23,089 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759787_18963, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-22 08:55:25,345 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759787_18963 replica FinalizedReplica, blk_1073759787_18963, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759787 for deletion
2025-07-22 08:55:25,346 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759787_18963 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759787
2025-07-22 08:57:33,050 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759789_18965 src: /192.168.158.8:35414 dest: /192.168.158.4:9866
2025-07-22 08:57:33,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:35414, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1037261133_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759789_18965, duration(ns): 19567528
2025-07-22 08:57:33,075 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759789_18965, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-22 08:57:37,349 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759789_18965 replica FinalizedReplica, blk_1073759789_18965, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759789 for deletion
2025-07-22 08:57:37,350 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759789_18965 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759789
2025-07-22 08:58:33,071 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759790_18966 src: /192.168.158.6:56244 dest: /192.168.158.4:9866
2025-07-22 08:58:33,091 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56244, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-664180491_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759790_18966, duration(ns): 18152706
2025-07-22 08:58:33,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759790_18966, type=LAST_IN_PIPELINE terminating
2025-07-22 08:58:40,349 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759790_18966 replica FinalizedReplica, blk_1073759790_18966, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759790 for deletion
2025-07-22 08:58:40,351 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759790_18966 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759790
2025-07-22 08:59:33,055 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759791_18967 src: /192.168.158.9:58118 dest: /192.168.158.4:9866
2025-07-22 08:59:33,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:58118, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1689752576_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759791_18967, duration(ns): 19350183
2025-07-22 08:59:33,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759791_18967, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-22 08:59:37,353 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759791_18967 replica FinalizedReplica, blk_1073759791_18967, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759791 for deletion
2025-07-22 08:59:37,354 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759791_18967 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759791
2025-07-22 09:01:33,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759793_18969 src: /192.168.158.9:59898 dest: /192.168.158.4:9866
2025-07-22 09:01:33,087 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59898, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1862624414_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759793_18969, duration(ns): 20695749
2025-07-22 09:01:33,087 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759793_18969, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-22 09:01:40,357 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759793_18969 replica FinalizedReplica, blk_1073759793_18969, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759793 for deletion
2025-07-22 09:01:40,359 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759793_18969 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759793
2025-07-22 09:03:33,064 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759795_18971 src: /192.168.158.6:40148 dest: /192.168.158.4:9866
2025-07-22 09:03:33,083 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40148, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_312037238_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759795_18971, duration(ns): 17815434
2025-07-22 09:03:33,084 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759795_18971, type=LAST_IN_PIPELINE terminating
2025-07-22 09:03:37,364 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759795_18971 replica FinalizedReplica, blk_1073759795_18971, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759795 for deletion
2025-07-22 09:03:37,365 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759795_18971 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759795
2025-07-22 09:04:33,055 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759796_18972 src: /192.168.158.1:52114 dest: /192.168.158.4:9866
2025-07-22 09:04:33,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52114, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_393939851_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759796_18972, duration(ns): 23024211
2025-07-22 09:04:33,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759796_18972, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-22 09:04:37,365 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759796_18972 replica FinalizedReplica, blk_1073759796_18972, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759796 for deletion
2025-07-22 09:04:37,367 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759796_18972 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759796
2025-07-22 09:05:33,060 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759797_18973 src: /192.168.158.9:44844 dest: /192.168.158.4:9866
2025-07-22 09:05:33,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44844, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_317711115_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759797_18973, duration(ns): 17179794
2025-07-22 09:05:33,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759797_18973, type=LAST_IN_PIPELINE terminating
2025-07-22 09:05:37,368 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759797_18973 replica FinalizedReplica, blk_1073759797_18973, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759797 for deletion
2025-07-22 09:05:37,370 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759797_18973 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759797
2025-07-22 09:06:33,063 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759798_18974 src: /192.168.158.8:36726 dest: /192.168.158.4:9866
2025-07-22 09:06:33,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36726, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2075271019_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759798_18974, duration(ns): 18577738
2025-07-22 09:06:33,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759798_18974, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-22 09:06:37,375 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759798_18974 replica FinalizedReplica, blk_1073759798_18974, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759798 for deletion
2025-07-22 09:06:37,375 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759798_18974 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759798
2025-07-22 09:08:38,067 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759800_18976 src: /192.168.158.9:53044 dest: /192.168.158.4:9866
2025-07-22 09:08:38,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53044, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1785493501_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759800_18976, duration(ns): 16132066
2025-07-22 09:08:38,086 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759800_18976, type=LAST_IN_PIPELINE terminating
2025-07-22 09:08:40,381 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759800_18976 replica FinalizedReplica, blk_1073759800_18976, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759800 for deletion
2025-07-22 09:08:40,383 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759800_18976 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759800
2025-07-22 09:10:48,062 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759802_18978 src: /192.168.158.5:39304 dest: /192.168.158.4:9866
2025-07-22 09:10:48,087 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39304, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-198027576_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759802_18978, duration(ns): 19183309
2025-07-22 09:10:48,088 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759802_18978, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 09:10:55,388 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759802_18978 replica FinalizedReplica, blk_1073759802_18978, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759802 for deletion
2025-07-22 09:10:55,389 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759802_18978 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759802
2025-07-22 09:13:53,072 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759805_18981 src: /192.168.158.7:41788 dest: /192.168.158.4:9866
2025-07-22 09:13:53,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41788, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1212392177_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759805_18981, duration(ns): 17315388
2025-07-22 09:13:53,092 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759805_18981, type=LAST_IN_PIPELINE terminating
2025-07-22 09:13:55,399 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759805_18981 replica FinalizedReplica, blk_1073759805_18981, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759805 for deletion
2025-07-22 09:13:55,400 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759805_18981 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759805
2025-07-22 09:15:53,074 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759807_18983 src: /192.168.158.8:49336 dest: /192.168.158.4:9866
2025-07-22 09:15:53,094 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49336, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1588604118_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759807_18983, duration(ns): 18496773
2025-07-22 09:15:53,095 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759807_18983, type=LAST_IN_PIPELINE terminating
2025-07-22 09:15:58,404 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759807_18983 replica FinalizedReplica, blk_1073759807_18983, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759807 for deletion
2025-07-22 09:15:58,406 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759807_18983 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759807
2025-07-22 09:19:53,080 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759811_18987 src: /192.168.158.8:50968 dest: /192.168.158.4:9866
2025-07-22 09:19:53,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50968, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2069946247_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759811_18987, duration(ns): 17007059
2025-07-22 09:19:53,099 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759811_18987, type=LAST_IN_PIPELINE terminating
2025-07-22 09:19:55,411 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759811_18987 replica FinalizedReplica, blk_1073759811_18987, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759811 for deletion
2025-07-22 09:19:55,412 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759811_18987 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759811
2025-07-22 09:22:53,112 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759814_18990 src: /192.168.158.9:36618 dest: /192.168.158.4:9866
2025-07-22 09:22:53,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:36618, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1244867226_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759814_18990, duration(ns): 19089883
2025-07-22 09:22:53,139 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759814_18990, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-22 09:22:55,419 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759814_18990 replica FinalizedReplica, blk_1073759814_18990, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759814 for deletion
2025-07-22 09:22:55,421 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759814_18990 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759814
2025-07-22 09:23:58,117 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759815_18991 src: /192.168.158.5:56808 dest: /192.168.158.4:9866
2025-07-22 09:23:58,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:56808, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1317007881_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759815_18991, duration(ns): 17775247
2025-07-22 09:23:58,138 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759815_18991, type=LAST_IN_PIPELINE terminating
2025-07-22 09:24:01,419 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759815_18991 replica FinalizedReplica, blk_1073759815_18991, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759815 for deletion
2025-07-22 09:24:01,421 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759815_18991 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759815
2025-07-22 09:25:03,115 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759816_18992 src: /192.168.158.9:49612 dest: /192.168.158.4:9866 2025-07-22 09:25:03,141 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49612, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1906817065_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759816_18992, duration(ns): 20202882 2025-07-22 09:25:03,141 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759816_18992, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-22 09:25:07,422 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759816_18992 replica FinalizedReplica, blk_1073759816_18992, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759816 for deletion 2025-07-22 09:25:07,423 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759816_18992 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759816 2025-07-22 09:27:08,113 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759818_18994 src: /192.168.158.1:38168 dest: /192.168.158.4:9866 2025-07-22 09:27:08,148 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:38168, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1172980926_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759818_18994, duration(ns): 24762742 2025-07-22 09:27:08,148 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759818_18994, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-22 09:27:10,427 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759818_18994 replica FinalizedReplica, blk_1073759818_18994, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759818 for deletion 2025-07-22 09:27:10,428 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759818_18994 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759818 2025-07-22 09:28:08,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759819_18995 src: /192.168.158.7:44672 dest: /192.168.158.4:9866 2025-07-22 09:28:08,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:44672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1641053885_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759819_18995, duration(ns): 19265668 2025-07-22 09:28:08,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759819_18995, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-22 09:28:10,433 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759819_18995 replica FinalizedReplica, blk_1073759819_18995, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759819 for deletion 2025-07-22 09:28:10,434 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759819_18995 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759819 2025-07-22 09:32:13,136 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759823_18999 src: /192.168.158.9:37560 dest: /192.168.158.4:9866 2025-07-22 09:32:13,156 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:37560, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2023310536_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759823_18999, duration(ns): 17551780 2025-07-22 09:32:13,157 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759823_18999, type=LAST_IN_PIPELINE terminating 2025-07-22 09:32:16,448 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759823_18999 replica FinalizedReplica, blk_1073759823_18999, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759823 for deletion 2025-07-22 09:32:16,449 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759823_18999 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759823 2025-07-22 09:33:13,137 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759824_19000 src: /192.168.158.6:56472 dest: /192.168.158.4:9866 2025-07-22 09:33:13,163 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56472, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1071885937_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759824_19000, duration(ns): 20270815 2025-07-22 09:33:13,163 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759824_19000, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-22 09:33:19,450 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759824_19000 replica FinalizedReplica, blk_1073759824_19000, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759824 for deletion 2025-07-22 09:33:19,451 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759824_19000 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759824 2025-07-22 09:34:13,144 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759825_19001 src: 
/192.168.158.5:44824 dest: /192.168.158.4:9866 2025-07-22 09:34:13,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:44824, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1694552235_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759825_19001, duration(ns): 17783785 2025-07-22 09:34:13,164 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759825_19001, type=LAST_IN_PIPELINE terminating 2025-07-22 09:34:19,454 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759825_19001 replica FinalizedReplica, blk_1073759825_19001, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759825 for deletion 2025-07-22 09:34:19,455 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759825_19001 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759825 2025-07-22 09:36:13,142 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759827_19003 src: /192.168.158.7:36378 dest: /192.168.158.4:9866 2025-07-22 09:36:13,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:36378, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_865015213_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759827_19003, duration(ns): 17295073 2025-07-22 09:36:13,162 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759827_19003, type=LAST_IN_PIPELINE terminating 2025-07-22 09:36:16,459 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759827_19003 replica FinalizedReplica, blk_1073759827_19003, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759827 for deletion 2025-07-22 09:36:16,460 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759827_19003 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759827 2025-07-22 09:37:13,140 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759828_19004 src: /192.168.158.1:53260 dest: /192.168.158.4:9866 2025-07-22 09:37:13,172 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53260, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-29330345_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759828_19004, duration(ns): 21859275 2025-07-22 09:37:13,173 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759828_19004, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating 2025-07-22 09:37:16,459 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759828_19004 replica FinalizedReplica, blk_1073759828_19004, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 
getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759828 for deletion 2025-07-22 09:37:16,460 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759828_19004 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759828 2025-07-22 09:40:18,150 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759831_19007 src: /192.168.158.9:40308 dest: /192.168.158.4:9866 2025-07-22 09:40:18,176 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40308, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1099789180_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759831_19007, duration(ns): 19830339 2025-07-22 09:40:18,176 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759831_19007, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-22 09:40:22,464 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759831_19007 replica FinalizedReplica, blk_1073759831_19007, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759831 for deletion 2025-07-22 09:40:22,466 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759831_19007 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759831 2025-07-22 09:41:18,151 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759832_19008 src: /192.168.158.1:51136 dest: /192.168.158.4:9866 2025-07-22 09:41:18,186 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51136, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1385436402_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759832_19008, duration(ns): 25786055 2025-07-22 09:41:18,186 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759832_19008, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-22 09:41:25,474 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759832_19008 replica FinalizedReplica, blk_1073759832_19008, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759832 for deletion 2025-07-22 09:41:25,475 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759832_19008 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759832 2025-07-22 09:43:18,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759834_19010 src: /192.168.158.7:46450 dest: /192.168.158.4:9866 2025-07-22 09:43:18,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.7:46450, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1116737793_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759834_19010, duration(ns): 17541396 2025-07-22 09:43:18,175 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759834_19010, type=LAST_IN_PIPELINE terminating 2025-07-22 09:43:25,478 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759834_19010 replica FinalizedReplica, blk_1073759834_19010, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759834 for deletion 2025-07-22 09:43:25,480 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759834_19010 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759834 2025-07-22 09:46:18,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759837_19013 src: /192.168.158.1:37664 dest: /192.168.158.4:9866 2025-07-22 09:46:18,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37664, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_934127618_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759837_19013, duration(ns): 23577114 2025-07-22 09:46:18,188 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759837_19013, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating 2025-07-22 09:46:22,487 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759837_19013 replica FinalizedReplica, blk_1073759837_19013, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759837 for deletion 2025-07-22 09:46:22,488 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759837_19013 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759837 2025-07-22 09:48:18,149 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759839_19015 src: /192.168.158.1:34068 dest: /192.168.158.4:9866 2025-07-22 09:48:18,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34068, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_100907785_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759839_19015, duration(ns): 22326325 2025-07-22 09:48:18,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759839_19015, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-22 09:48:22,494 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759839_19015 replica FinalizedReplica, blk_1073759839_19015, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759839 for deletion 2025-07-22 09:48:22,495 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759839_19015 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759839 2025-07-22 09:50:23,158 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759841_19017 src: /192.168.158.1:59344 dest: /192.168.158.4:9866 2025-07-22 09:50:23,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59344, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_748190162_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759841_19017, duration(ns): 22458372 2025-07-22 09:50:23,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759841_19017, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-22 09:50:28,497 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759841_19017 replica FinalizedReplica, blk_1073759841_19017, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759841 for deletion 2025-07-22 09:50:28,499 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759841_19017 URI 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759841 2025-07-22 09:52:28,174 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759843_19019 src: /192.168.158.9:40008 dest: /192.168.158.4:9866 2025-07-22 09:52:28,192 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40008, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1773614774_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759843_19019, duration(ns): 16118461 2025-07-22 09:52:28,193 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759843_19019, type=LAST_IN_PIPELINE terminating 2025-07-22 09:52:31,502 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759843_19019 replica FinalizedReplica, blk_1073759843_19019, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759843 for deletion 2025-07-22 09:52:31,503 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759843_19019 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759843 2025-07-22 09:54:33,181 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759845_19021 src: /192.168.158.6:60438 dest: /192.168.158.4:9866 2025-07-22 09:54:33,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60438, dest: /192.168.158.4:9866, bytes: 56, op: 
HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1462315731_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759845_19021, duration(ns): 16982699 2025-07-22 09:54:33,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759845_19021, type=LAST_IN_PIPELINE terminating 2025-07-22 09:54:37,508 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759845_19021 replica FinalizedReplica, blk_1073759845_19021, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759845 for deletion 2025-07-22 09:54:37,510 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759845_19021 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759845 2025-07-22 09:56:38,172 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759847_19023 src: /192.168.158.9:52060 dest: /192.168.158.4:9866 2025-07-22 09:56:38,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52060, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1757850100_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759847_19023, duration(ns): 15633938 2025-07-22 09:56:38,190 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759847_19023, type=LAST_IN_PIPELINE terminating 2025-07-22 09:56:40,514 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759847_19023 replica FinalizedReplica, blk_1073759847_19023, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759847 for deletion
2025-07-22 09:56:40,515 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759847_19023 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759847
2025-07-22 10:01:43,223 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759852_19028 src: /192.168.158.5:41756 dest: /192.168.158.4:9866
2025-07-22 10:01:43,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:41756, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1442970364_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759852_19028, duration(ns): 18159819
2025-07-22 10:01:43,244 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759852_19028, type=LAST_IN_PIPELINE terminating
2025-07-22 10:01:46,526 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759852_19028 replica FinalizedReplica, blk_1073759852_19028, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759852 for deletion
2025-07-22 10:01:46,527 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759852_19028 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759852
2025-07-22 10:08:43,187 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759859_19035 src: /192.168.158.1:51652 dest: /192.168.158.4:9866
2025-07-22 10:08:43,221 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51652, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_685782192_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759859_19035, duration(ns): 24014955
2025-07-22 10:08:43,221 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759859_19035, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-22 10:08:46,539 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759859_19035 replica FinalizedReplica, blk_1073759859_19035, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759859 for deletion
2025-07-22 10:08:46,541 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759859_19035 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759859
2025-07-22 10:09:43,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759860_19036 src: /192.168.158.7:42570 dest: /192.168.158.4:9866
2025-07-22 10:09:43,221 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42570, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1104277063_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759860_19036, duration(ns): 23849201
2025-07-22 10:09:43,222 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759860_19036, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 10:09:46,543 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759860_19036 replica FinalizedReplica, blk_1073759860_19036, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759860 for deletion
2025-07-22 10:09:46,544 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759860_19036 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759860
2025-07-22 10:10:48,202 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759861_19037 src: /192.168.158.9:46400 dest: /192.168.158.4:9866
2025-07-22 10:10:48,222 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46400, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1415153213_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759861_19037, duration(ns): 17989247
2025-07-22 10:10:48,223 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759861_19037, type=LAST_IN_PIPELINE terminating
2025-07-22 10:10:52,548 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759861_19037 replica FinalizedReplica, blk_1073759861_19037, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759861 for deletion
2025-07-22 10:10:52,549 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759861_19037 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759861
2025-07-22 10:11:53,198 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759862_19038 src: /192.168.158.8:55398 dest: /192.168.158.4:9866
2025-07-22 10:11:53,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55398, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1662605772_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759862_19038, duration(ns): 18506835
2025-07-22 10:11:53,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759862_19038, type=LAST_IN_PIPELINE terminating
2025-07-22 10:11:55,550 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759862_19038 replica FinalizedReplica, blk_1073759862_19038, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759862 for deletion
2025-07-22 10:11:55,551 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759862_19038 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759862
2025-07-22 10:13:53,200 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759864_19040 src: /192.168.158.5:45596 dest: /192.168.158.4:9866
2025-07-22 10:13:53,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45596, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-358691809_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759864_19040, duration(ns): 19837057
2025-07-22 10:13:53,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759864_19040, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 10:13:58,555 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759864_19040 replica FinalizedReplica, blk_1073759864_19040, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759864 for deletion
2025-07-22 10:13:58,558 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759864_19040 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759864
2025-07-22 10:15:58,201 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759866_19042 src: /192.168.158.7:35384 dest: /192.168.158.4:9866
2025-07-22 10:15:58,226 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:35384, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1986279617_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759866_19042, duration(ns): 19095386
2025-07-22 10:15:58,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759866_19042, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-22 10:16:01,560 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759866_19042 replica FinalizedReplica, blk_1073759866_19042, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759866 for deletion
2025-07-22 10:16:01,561 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759866_19042 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759866
2025-07-22 10:18:03,212 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759868_19044 src: /192.168.158.7:57186 dest: /192.168.158.4:9866
2025-07-22 10:18:03,230 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:57186, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1462266760_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759868_19044, duration(ns): 16417679
2025-07-22 10:18:03,231 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759868_19044, type=LAST_IN_PIPELINE terminating
2025-07-22 10:18:04,563 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759868_19044 replica FinalizedReplica, blk_1073759868_19044, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759868 for deletion
2025-07-22 10:18:04,564 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759868_19044 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759868
2025-07-22 10:19:03,206 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759869_19045 src: /192.168.158.1:41754 dest: /192.168.158.4:9866
2025-07-22 10:19:03,240 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:41754, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_712973732_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759869_19045, duration(ns): 24006725
2025-07-22 10:19:03,240 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759869_19045, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-22 10:19:04,565 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759869_19045 replica FinalizedReplica, blk_1073759869_19045, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759869 for deletion
2025-07-22 10:19:04,566 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759869_19045 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759869
2025-07-22 10:22:13,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759872_19048 src: /192.168.158.7:58332 dest: /192.168.158.4:9866
2025-07-22 10:22:13,238 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:58332, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2108860490_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759872_19048, duration(ns): 18164548
2025-07-22 10:22:13,239 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759872_19048, type=LAST_IN_PIPELINE terminating
2025-07-22 10:22:19,570 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759872_19048 replica FinalizedReplica, blk_1073759872_19048, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759872 for deletion
2025-07-22 10:22:19,571 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759872_19048 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759872
2025-07-22 10:24:18,217 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759874_19050 src: /192.168.158.5:43256 dest: /192.168.158.4:9866
2025-07-22 10:24:18,235 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:43256, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-701768328_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759874_19050, duration(ns): 16429559
2025-07-22 10:24:18,236 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759874_19050, type=LAST_IN_PIPELINE terminating
2025-07-22 10:24:19,575 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759874_19050 replica FinalizedReplica, blk_1073759874_19050, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759874 for deletion
2025-07-22 10:24:19,576 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759874_19050 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759874
2025-07-22 10:26:18,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759876_19052 src: /192.168.158.1:58710 dest: /192.168.158.4:9866
2025-07-22 10:26:18,246 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58710, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2003636098_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759876_19052, duration(ns): 21981800
2025-07-22 10:26:18,246 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759876_19052, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-22 10:26:22,581 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759876_19052 replica FinalizedReplica, blk_1073759876_19052, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759876 for deletion
2025-07-22 10:26:22,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759876_19052 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759876
2025-07-22 10:27:23,210 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759877_19053 src: /192.168.158.1:35124 dest: /192.168.158.4:9866
2025-07-22 10:27:23,244 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35124, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1067520270_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759877_19053, duration(ns): 24159837
2025-07-22 10:27:23,244 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759877_19053, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-22 10:27:25,583 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759877_19053 replica FinalizedReplica, blk_1073759877_19053, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759877 for deletion
2025-07-22 10:27:25,584 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759877_19053 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759877
2025-07-22 10:28:23,214 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759878_19054 src: /192.168.158.6:36940 dest: /192.168.158.4:9866
2025-07-22 10:28:23,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:36940, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_744727367_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759878_19054, duration(ns): 16300247
2025-07-22 10:28:23,233 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759878_19054, type=LAST_IN_PIPELINE terminating
2025-07-22 10:28:25,586 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759878_19054 replica FinalizedReplica, blk_1073759878_19054, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759878 for deletion
2025-07-22 10:28:25,587 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759878_19054 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759878
2025-07-22 10:29:23,220 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759879_19055 src: /192.168.158.5:38976 dest: /192.168.158.4:9866
2025-07-22 10:29:23,244 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38976, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1135269884_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759879_19055, duration(ns): 18491551
2025-07-22 10:29:23,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759879_19055, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-22 10:29:25,589 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759879_19055 replica FinalizedReplica, blk_1073759879_19055, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759879 for deletion
2025-07-22 10:29:25,590 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759879_19055 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759879
2025-07-22 10:30:23,212 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759880_19056 src: /192.168.158.1:58332 dest: /192.168.158.4:9866
2025-07-22 10:30:23,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58332, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1607139650_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759880_19056, duration(ns): 21422468
2025-07-22 10:30:23,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759880_19056, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-22 10:30:25,591 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759880_19056 replica FinalizedReplica, blk_1073759880_19056, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759880 for deletion
2025-07-22 10:30:25,593 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759880_19056 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759880
2025-07-22 10:32:23,218 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759882_19058 src: /192.168.158.1:53498 dest: /192.168.158.4:9866
2025-07-22 10:32:23,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53498, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_10780523_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759882_19058, duration(ns): 24560561
2025-07-22 10:32:23,252 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759882_19058, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-22 10:32:28,592 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759882_19058 replica FinalizedReplica, blk_1073759882_19058, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759882 for deletion
2025-07-22 10:32:28,593 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759882_19058 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759882
2025-07-22 10:35:23,228 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759885_19061 src: /192.168.158.8:50846 dest: /192.168.158.4:9866
2025-07-22 10:35:23,255 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50846, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1649330360_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759885_19061, duration(ns): 20664999
2025-07-22 10:35:23,255 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759885_19061, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-22 10:35:28,600 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759885_19061 replica FinalizedReplica, blk_1073759885_19061, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759885 for deletion
2025-07-22 10:35:28,602 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759885_19061 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759885
2025-07-22 10:36:23,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759886_19062 src: /192.168.158.5:53254 dest: /192.168.158.4:9866
2025-07-22 10:36:23,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53254, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2067122324_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759886_19062, duration(ns): 17183351
2025-07-22 10:36:23,257 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759886_19062, type=LAST_IN_PIPELINE terminating
2025-07-22 10:36:28,602 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759886_19062 replica FinalizedReplica, blk_1073759886_19062, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759886 for deletion
2025-07-22 10:36:28,604 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759886_19062 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759886
2025-07-22 10:37:23,237 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759887_19063 src: /192.168.158.9:45112 dest: /192.168.158.4:9866
2025-07-22 10:37:23,263 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:45112, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_326581452_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759887_19063, duration(ns): 20406593
2025-07-22 10:37:23,263 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759887_19063, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-22 10:37:28,606 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759887_19063 replica FinalizedReplica, blk_1073759887_19063, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759887 for deletion
2025-07-22 10:37:28,607 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759887_19063 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759887
2025-07-22 10:39:23,230 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759889_19065 src: /192.168.158.1:50272 dest: /192.168.158.4:9866
2025-07-22 10:39:23,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50272, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_969739170_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759889_19065, duration(ns): 23022949
2025-07-22 10:39:23,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759889_19065, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-22 10:39:25,610 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759889_19065 replica FinalizedReplica, blk_1073759889_19065, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759889 for deletion
2025-07-22 10:39:25,611 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759889_19065 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759889
2025-07-22 10:42:28,234 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759892_19068 src: /192.168.158.1:44554 dest: /192.168.158.4:9866
2025-07-22 10:42:28,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44554, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1612363879_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759892_19068, duration(ns): 22407867
2025-07-22 10:42:28,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759892_19068, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-22 10:42:34,620 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759892_19068 replica FinalizedReplica, blk_1073759892_19068, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759892 for deletion
2025-07-22 10:42:34,622 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759892_19068 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759892
2025-07-22 10:45:28,232 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759895_19071 src: /192.168.158.5:35454 dest: /192.168.158.4:9866
2025-07-22 10:45:28,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35454, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-189048351_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759895_19071, duration(ns): 22792234
2025-07-22 10:45:28,262 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759895_19071, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-22 10:45:31,631 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759895_19071 replica FinalizedReplica, blk_1073759895_19071, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759895 for deletion
2025-07-22 10:45:31,633 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759895_19071 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759895
2025-07-22 10:46:28,235 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759896_19072 src: /192.168.158.1:59590 dest: /192.168.158.4:9866
2025-07-22 10:46:28,268 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59590, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1975641612_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759896_19072, duration(ns): 23632886
2025-07-22 10:46:28,268 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759896_19072, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-22 10:46:31,634 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759896_19072 replica FinalizedReplica, blk_1073759896_19072, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759896 for deletion
2025-07-22 10:46:31,635 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759896_19072 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759896
2025-07-22 10:47:33,242 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759897_19073 src: /192.168.158.1:40482 dest: /192.168.158.4:9866
2025-07-22 10:47:33,276 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40482, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1634091247_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759897_19073, duration(ns): 24431412
2025-07-22 10:47:33,277 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759897_19073, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.9:9866] terminating
2025-07-22 10:47:34,636 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759897_19073 replica FinalizedReplica, blk_1073759897_19073, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759897 for deletion
2025-07-22 10:47:34,638 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759897_19073 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759897
2025-07-22 10:48:38,240 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759898_19074 src: /192.168.158.1:34422 dest: /192.168.158.4:9866
2025-07-22 10:48:38,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34422, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1645674113_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759898_19074, duration(ns): 23822054
2025-07-22 10:48:38,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759898_19074, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-22 10:48:40,639 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759898_19074 replica FinalizedReplica, blk_1073759898_19074, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759898 for deletion
2025-07-22 10:48:40,640 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759898_19074 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759898
2025-07-22 10:50:43,245 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759900_19076 src: /192.168.158.5:49198 dest: /192.168.158.4:9866
2025-07-22 10:50:43,272 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:49198, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1304457992_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073759900_19076, duration(ns): 20743556 2025-07-22 10:50:43,272 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759900_19076, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-22 10:50:49,644 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759900_19076 replica FinalizedReplica, blk_1073759900_19076, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759900 for deletion 2025-07-22 10:50:49,645 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759900_19076 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759900 2025-07-22 10:51:43,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759901_19077 src: /192.168.158.6:34856 dest: /192.168.158.4:9866 2025-07-22 10:51:43,280 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:34856, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-952334001_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759901_19077, duration(ns): 17644038 2025-07-22 10:51:43,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759901_19077, type=LAST_IN_PIPELINE terminating 2025-07-22 10:51:46,647 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759901_19077 replica 
FinalizedReplica, blk_1073759901_19077, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759901 for deletion 2025-07-22 10:51:46,648 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759901_19077 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759901 2025-07-22 10:54:48,259 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759904_19080 src: /192.168.158.8:54112 dest: /192.168.158.4:9866 2025-07-22 10:54:48,286 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54112, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-984563704_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759904_19080, duration(ns): 20586543 2025-07-22 10:54:48,286 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759904_19080, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-22 10:54:52,652 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759904_19080 replica FinalizedReplica, blk_1073759904_19080, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759904 for deletion 2025-07-22 10:54:52,653 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted 
BP-1059995147-192.168.158.1-1752101929360 blk_1073759904_19080 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759904 2025-07-22 10:55:48,247 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759905_19081 src: /192.168.158.5:51910 dest: /192.168.158.4:9866 2025-07-22 10:55:48,272 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:51910, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-830306488_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759905_19081, duration(ns): 19640606 2025-07-22 10:55:48,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759905_19081, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating 2025-07-22 10:55:49,654 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759905_19081 replica FinalizedReplica, blk_1073759905_19081, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759905 for deletion 2025-07-22 10:55:49,655 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759905_19081 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759905 2025-07-22 10:59:58,288 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759909_19085 src: /192.168.158.9:49042 dest: /192.168.158.4:9866 2025-07-22 10:59:58,307 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:49042, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1646771972_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759909_19085, duration(ns): 16470923 2025-07-22 10:59:58,307 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759909_19085, type=LAST_IN_PIPELINE terminating 2025-07-22 11:00:01,669 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759909_19085 replica FinalizedReplica, blk_1073759909_19085, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759909 for deletion 2025-07-22 11:00:01,670 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759909_19085 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759909 2025-07-22 11:04:03,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759913_19089 src: /192.168.158.9:38598 dest: /192.168.158.4:9866 2025-07-22 11:04:03,283 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:38598, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1825529085_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759913_19089, duration(ns): 16323967 2025-07-22 11:04:03,283 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073759913_19089, type=LAST_IN_PIPELINE terminating 2025-07-22 11:04:04,678 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759913_19089 replica FinalizedReplica, blk_1073759913_19089, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759913 for deletion 2025-07-22 11:04:04,679 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759913_19089 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759913 2025-07-22 11:06:08,272 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759915_19091 src: /192.168.158.1:47914 dest: /192.168.158.4:9866 2025-07-22 11:06:08,305 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47914, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-560006409_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759915_19091, duration(ns): 24283347 2025-07-22 11:06:08,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759915_19091, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-22 11:06:10,681 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759915_19091 replica FinalizedReplica, blk_1073759915_19091, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759915 for deletion 2025-07-22 11:06:10,682 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759915_19091 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759915 2025-07-22 11:07:08,260 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759916_19092 src: /192.168.158.9:59722 dest: /192.168.158.4:9866 2025-07-22 11:07:08,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59722, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1341791251_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759916_19092, duration(ns): 19525007 2025-07-22 11:07:08,285 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759916_19092, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-22 11:07:10,686 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759916_19092 replica FinalizedReplica, blk_1073759916_19092, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759916 for deletion 2025-07-22 11:07:10,687 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759916_19092 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759916 
2025-07-22 11:08:08,270 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759917_19093 src: /192.168.158.5:57770 dest: /192.168.158.4:9866 2025-07-22 11:08:08,297 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:57770, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1035458864_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759917_19093, duration(ns): 20705689 2025-07-22 11:08:08,297 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759917_19093, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-22 11:08:10,689 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759917_19093 replica FinalizedReplica, blk_1073759917_19093, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759917 for deletion 2025-07-22 11:08:10,690 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759917_19093 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759917 2025-07-22 11:09:13,270 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759918_19094 src: /192.168.158.1:47508 dest: /192.168.158.4:9866 2025-07-22 11:09:13,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47508, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1272587999_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759918_19094, duration(ns): 23466862 2025-07-22 11:09:13,304 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759918_19094, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.9:9866] terminating 2025-07-22 11:09:16,693 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759918_19094 replica FinalizedReplica, blk_1073759918_19094, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759918 for deletion 2025-07-22 11:09:16,694 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759918_19094 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759918 2025-07-22 11:10:18,264 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759919_19095 src: /192.168.158.7:48568 dest: /192.168.158.4:9866 2025-07-22 11:10:18,289 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48568, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1031959474_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759919_19095, duration(ns): 19530162 2025-07-22 11:10:18,290 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759919_19095, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-22 11:10:19,693 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759919_19095 replica FinalizedReplica, blk_1073759919_19095, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759919 for deletion 2025-07-22 11:10:19,694 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759919_19095 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759919 2025-07-22 11:15:23,275 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759924_19100 src: /192.168.158.5:40012 dest: /192.168.158.4:9866 2025-07-22 11:15:23,302 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40012, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-190571099_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759924_19100, duration(ns): 21174742 2025-07-22 11:15:23,303 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759924_19100, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-22 11:15:28,701 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759924_19100 replica FinalizedReplica, blk_1073759924_19100, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759924 for deletion 2025-07-22 11:15:28,703 
INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759924_19100 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759924 2025-07-22 11:17:23,275 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759926_19102 src: /192.168.158.1:43418 dest: /192.168.158.4:9866 2025-07-22 11:17:23,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43418, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1272401794_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759926_19102, duration(ns): 22107942 2025-07-22 11:17:23,306 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759926_19102, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-22 11:17:28,706 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759926_19102 replica FinalizedReplica, blk_1073759926_19102, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759926 for deletion 2025-07-22 11:17:28,707 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759926_19102 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759926 2025-07-22 11:19:23,291 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073759928_19104 src: /192.168.158.9:56888 dest: /192.168.158.4:9866 2025-07-22 11:19:23,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:56888, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-305075377_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759928_19104, duration(ns): 17112674 2025-07-22 11:19:23,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759928_19104, type=LAST_IN_PIPELINE terminating 2025-07-22 11:19:25,710 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759928_19104 replica FinalizedReplica, blk_1073759928_19104, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759928 for deletion 2025-07-22 11:19:25,711 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759928_19104 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759928 2025-07-22 11:23:33,289 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759932_19108 src: /192.168.158.9:48248 dest: /192.168.158.4:9866 2025-07-22 11:23:33,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:48248, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_953040174_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759932_19108, duration(ns): 17953176 
2025-07-22 11:23:33,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759932_19108, type=LAST_IN_PIPELINE terminating 2025-07-22 11:23:34,719 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759932_19108 replica FinalizedReplica, blk_1073759932_19108, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759932 for deletion 2025-07-22 11:23:34,721 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759932_19108 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759932 2025-07-22 11:24:33,298 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759933_19109 src: /192.168.158.7:50020 dest: /192.168.158.4:9866 2025-07-22 11:24:33,317 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:50020, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1091652397_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759933_19109, duration(ns): 17003411 2025-07-22 11:24:33,318 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759933_19109, type=LAST_IN_PIPELINE terminating 2025-07-22 11:24:34,722 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759933_19109 replica FinalizedReplica, blk_1073759933_19109, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759933 for deletion 2025-07-22 11:24:34,723 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759933_19109 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759933 2025-07-22 11:28:33,310 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759937_19113 src: /192.168.158.5:37424 dest: /192.168.158.4:9866 2025-07-22 11:28:33,330 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:37424, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1990845079_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759937_19113, duration(ns): 17165763 2025-07-22 11:28:33,331 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759937_19113, type=LAST_IN_PIPELINE terminating 2025-07-22 11:28:34,732 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759937_19113 replica FinalizedReplica, blk_1073759937_19113, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759937 for deletion 2025-07-22 11:28:34,733 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759937_19113 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759937 2025-07-22 
11:30:38,319 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759939_19115 src: /192.168.158.7:33500 dest: /192.168.158.4:9866
2025-07-22 11:30:38,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33500, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-924861769_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759939_19115, duration(ns): 20733076
2025-07-22 11:30:38,346 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759939_19115, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-22 11:30:40,738 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759939_19115 replica FinalizedReplica, blk_1073759939_19115, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d1/dfs/dn
  getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759939 for deletion
2025-07-22 11:30:40,739 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759939_19115 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759939
2025-07-22 11:34:38,309 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759943_19119 src: /192.168.158.1:58328 dest: /192.168.158.4:9866
2025-07-22 11:34:38,340 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58328, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1150926328_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759943_19119, duration(ns): 22307520
2025-07-22 11:34:38,341 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759943_19119, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-22 11:34:40,747 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759943_19119 replica FinalizedReplica, blk_1073759943_19119, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d2/dfs/dn
  getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759943 for deletion
2025-07-22 11:34:40,748 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759943_19119 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759943
2025-07-22 11:35:38,317 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759944_19120 src: /192.168.158.8:33788 dest: /192.168.158.4:9866
2025-07-22 11:35:38,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:33788, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_346614652_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759944_19120, duration(ns): 17073378
2025-07-22 11:35:38,337 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759944_19120, type=LAST_IN_PIPELINE terminating
2025-07-22 11:35:40,751 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759944_19120 replica FinalizedReplica, blk_1073759944_19120, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d3/dfs/dn
  getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759944 for deletion
2025-07-22 11:35:40,752 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759944_19120 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759944
2025-07-22 11:37:38,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759946_19122 src: /192.168.158.5:55864 dest: /192.168.158.4:9866
2025-07-22 11:37:38,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55864, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-560154278_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759946_19122, duration(ns): 21235460
2025-07-22 11:37:38,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759946_19122, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-22 11:37:40,755 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759946_19122 replica FinalizedReplica, blk_1073759946_19122, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d4/dfs/dn
  getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759946 for deletion
2025-07-22 11:37:40,756 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759946_19122 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759946
2025-07-22 11:40:38,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759949_19125 src: /192.168.158.6:41598 dest: /192.168.158.4:9866
2025-07-22 11:40:38,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:41598, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1252268829_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759949_19125, duration(ns): 20875718
2025-07-22 11:40:38,350 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759949_19125, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-22 11:40:43,763 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759949_19125 replica FinalizedReplica, blk_1073759949_19125, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d1/dfs/dn
  getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759949 for deletion
2025-07-22 11:40:43,764 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759949_19125 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759949
2025-07-22 11:41:43,328 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759950_19126 src: /192.168.158.6:58030 dest: /192.168.158.4:9866
2025-07-22 11:41:43,347 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58030, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-637810922_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759950_19126, duration(ns): 16673336
2025-07-22 11:41:43,347 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759950_19126, type=LAST_IN_PIPELINE terminating
2025-07-22 11:41:46,766 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759950_19126 replica FinalizedReplica, blk_1073759950_19126, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d2/dfs/dn
  getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759950 for deletion
2025-07-22 11:41:46,768 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759950_19126 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759950
2025-07-22 11:42:43,318 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759951_19127 src: /192.168.158.1:37690 dest: /192.168.158.4:9866
2025-07-22 11:42:43,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37690, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-940532707_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759951_19127, duration(ns): 24132349
2025-07-22 11:42:43,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759951_19127, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-22 11:42:46,770 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759951_19127 replica FinalizedReplica, blk_1073759951_19127, FINALIZED
  getNumBytes() = 56
  getBytesOnDisk() = 56
  getVisibleLength()= 56
  getVolume() = /hdfs/d3/dfs/dn
  getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759951 for deletion
2025-07-22 11:42:46,771 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759951_19127 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759951
2025-07-22 11:43:38,594 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: RECEIVED SIGNAL 15: SIGTERM
2025-07-22 11:43:38,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at dmidlkprdls04.svr.luc.edu/192.168.158.4
************************************************************/
2025-07-22 11:43:43,385 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG: host = dmidlkprdls04.svr.luc.edu/192.168.158.4
STARTUP_MSG: args = []
STARTUP_MSG: version = 3.1.1.7.3.1.0-197
STARTUP_MSG: classpath = 
/var/run/cloudera-scm-agent/process/360-hdfs-DATANODE:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/aws-java-sdk-bundle-1.12.720.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-hdfs-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-plugin-classloader-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/ranger-yarn-plugin-shim-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/azure-data-lake-store-sdk-2.3.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/ha
doop/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-api-2.2.11.jar:/opt/cloud
era/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jline-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jsr311-a
pi-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/jul-to-slf4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/logredactor-2.0.16.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-buffer-4.1.100.Final.jar:/o
pt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-reload4j-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/slf4j-api-1.7.36.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-kqueue-4.1.100.Fina
l.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/wildfly-openssl-2.1.4.ClouderaFinal.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3
.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/zookeeper-jute.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//ozone-filesystem-hadoop3-1.3.0.7.3.1.0-197.
jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-kms-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-common-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-datalake-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-azure-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-aws-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-auth-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//hadoop-annotations-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-thrift.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-scala_2.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-protobuf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-pig-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-jackson.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-hadoop-bundle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-generator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-format-structures.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-encoding.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-column.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-cascading.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop/.//parquet-avro.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/audience-annotations-0.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/avro-1.11.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/checker-qual-3.33.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-codec-1.15.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-compress-1.23.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-configuration2-2.10.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-io-2.11.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-logging-1.1.3.j
ar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-math3-3.6.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-net-3.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/commons-text-1.10.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-client-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-framework-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/curator-recipes-5.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/gson-2.9.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/guava-32.0.1-jre.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpclient-4.5.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/httpcore-4.4.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j2objc-annotations-2.8.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-databind-2.12.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-jaxrs-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jackson-xc-1.9.13.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.
60371244/lib/hadoop-hdfs/lib/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-core-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-json-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-server-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jersey-servlet-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jettison-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-http-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-io-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-security-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-server-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-servlet-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-util-ajax-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-webapp-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jetty-xml-9.4.54.v20240208.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/j
line-3.22.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsch-0.1.54.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/json-simple-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/jsr311-api-1.1.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-admin-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-client-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-common-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-core-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-crypto-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-identity-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-server-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-simplekdc-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerb-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-asn1-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-config-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-pkix-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-util-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/kerby-xdr-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/leveldbjni-cldr-1.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/par
cels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/lz4-java-1.7.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/metrics-core-3.2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-all-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-buffer-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-haproxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-http2-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-memcache-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-mqtt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-redis-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-smtp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-socks-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-stomp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-codec-xml-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nett
y-handler-proxy-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-handler-ssl-ocsp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/re2j-1.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-sctp-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-rxtx-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-unix-common-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-kqueue-4.1.100.Final-osx-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-native-epoll-4.1.100.Final-linux-aarch_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-kqueue-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-classes-epoll-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-x86_64.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-native-macos-4.1.100.Final-osx-aarch_64.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-classes-macos-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-dns-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-resolver-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/nimbus-jose-jwt-9.37.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/woodstox-core-5.4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/token-provider-2.0.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/stax2-api-4.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/snappy-java-1.1.10.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/reload4j-1.2.22.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/zookeeper-jute-3.8.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/lib/netty-transport-udt-4.1.100.Final.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.
3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-nfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-httpfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/.//hadoop-hdfs-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//asm-5.0.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjweaver-1.9.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-storage-7.0.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//checker-compat-qual-2.5.3.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-slf4j-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.
//flogger-system-backend-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//google-extensions-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//flogger-0.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//azure-keyvault-core-1.0.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//accessors-smart-2.4.9.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gcs-connector-2.1.2.7.3.1.0-197-shaded.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-fs2img.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/C
DH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-kafka.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-
client-jobclient.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-resourceestimator.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-rumen.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming-3.1.1.7.3.1.0-197.jar:/opt/
cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ojalgo-43.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//kafka-clients-2.8.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-core-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-abfs-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//forbiddenapis-3.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-intg-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//log4j-api-2.18.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//zstd-jni-1.4.9-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-i18n.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-s3-lib-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//javax.activation-1.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//bundle-2.23.5.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//json-smart-2.4.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-util-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-shell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//gateway-cloud-bindings.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//ranger-raz-hook-s3-2.4.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-mapreduce/.//aspectjrt-1.9.6.jar:/opt/cloudera/parcels/
CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/./:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/swagger-annotations-1.5.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/objenesis-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-client-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jakarta.activation-api-1.2.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-dataformat-yaml-2.9.10.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcutil-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcprov-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/bcpkix-jdk18on-1.78.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/HikariCP-java7-2.4.12.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/snakeyaml-2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jsonschema2pojo-core-1.0.2.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/joda-time-2.10.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jna-5.2.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jersey-guice-1.19.4.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-module-jaxb-annotations-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-json-provider-2.12.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/jackson-jaxrs-base-2.12.7.jar:/opt/clo
udera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-servlet-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/guice-4.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/fst-2.50.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/ehcache-3.3.1.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/dnsjava-2.1.7.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/codemodel-2.6.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371
244/lib/hadoop-yarn/.//hadoop-yarn-server-router.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-registry-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-client-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-core-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-services-api-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-tests-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop
-yarn/.//hadoop-yarn-server-sharedcachemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-router-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-common-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-3.1.1.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-cloud-resourcemanager-1.0.0.7.3.1.0-197.jar:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-3.1.1.7.3.1.0-197.jar:/opt/cloudera/cm/lib/plugins/event-publish-7.13.1-shaded.jar:/opt/cloudera/cm/lib/plugins/tt-instrumentation-7.13.1.jar STARTUP_MSG: build = git@github.infra.cloudera.com:CDH/hadoop.git -r 31a42fb39494f541ffae15c3c61185deeeacca86; compiled by 'jenkins' on 2024-12-04T01:09Z STARTUP_MSG: java = 1.8.0_432 ************************************************************/ 2025-07-22 11:43:43,479 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT] 2025-07-22 11:43:43,821 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d1/dfs/dn 2025-07-22 11:43:43,827 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d2/dfs/dn 2025-07-22 11:43:43,828 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d3/dfs/dn 2025-07-22 11:43:43,828 INFO 
org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/hdfs/d4/dfs/dn 2025-07-22 11:43:43,935 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties 2025-07-22 11:43:44,041 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s). 2025-07-22 11:43:44,041 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started 2025-07-22 11:43:44,344 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling 2025-07-22 11:43:44,370 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576 2025-07-22 11:43:44,377 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled. 2025-07-22 11:43:44,378 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is dmidlkprdls04.svr.luc.edu 2025-07-22 11:43:44,379 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. 
Disabling file IO profiling 2025-07-22 11:43:44,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 4294967296 2025-07-22 11:43:44,524 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /192.168.158.4:9866 2025-07-22 11:43:44,528 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s 2025-07-22 11:43:44,528 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50 2025-07-22 11:43:44,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s 2025-07-22 11:43:44,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50 2025-07-22 11:43:44,533 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Listening on UNIX domain socket: /var/run/hdfs-sockets/dn 2025-07-22 11:43:44,590 INFO org.eclipse.jetty.util.log: Logging initialized @2442ms to org.eclipse.jetty.util.log.Slf4jLog 2025-07-22 11:43:44,718 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret 2025-07-22 11:43:44,727 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined 2025-07-22 11:43:44,737 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter) 2025-07-22 11:43:44,739 INFO org.apache.hadoop.security.HttpCrossOriginFilterInitializer: CORS filter not enabled. 
Please set hadoop.http.cross-origin.enabled to 'true' to enable it 2025-07-22 11:43:44,740 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context datanode 2025-07-22 11:43:44,741 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context logs 2025-07-22 11:43:44,741 INFO org.apache.hadoop.http.HttpServer2: Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context static 2025-07-22 11:43:44,789 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 35755 2025-07-22 11:43:44,791 INFO org.eclipse.jetty.server.Server: jetty-9.4.54.v20240208; built: 2024-02-08T19:42:39.027Z; git: cef3fbd6d736a21e7d541a5db490381d95a2047d; jvm 1.8.0_432-b06 2025-07-22 11:43:44,847 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0 2025-07-22 11:43:44,847 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults 2025-07-22 11:43:44,850 INFO org.eclipse.jetty.server.session: node0 Scavenging every 600000ms 2025-07-22 11:43:44,879 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. 
Reason: Could not read signature secret file: /var/lib/hadoop-hdfs/hadoop-http-auth-signature-secret 2025-07-22 11:43:44,884 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@48c40605{logs,/logs,file:///var/log/hadoop-hdfs/,AVAILABLE} 2025-07-22 11:43:44,886 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@6cea706c{static,/static,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/static/,AVAILABLE} 2025-07-22 11:43:44,996 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@2c444798{datanode,/,file:///opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode/,AVAILABLE}{file:/opt/cloudera/parcels/CDH-7.3.1-1.cdh7.3.1.p0.60371244/lib/hadoop-hdfs/webapps/datanode} 2025-07-22 11:43:45,009 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@532721fd{HTTP/1.1, (http/1.1)}{localhost:35755} 2025-07-22 11:43:45,009 INFO org.eclipse.jetty.server.Server: Started @2861ms 2025-07-22 11:43:45,287 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /192.168.158.4:9864 2025-07-22 11:43:45,298 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor 2025-07-22 11:43:45,300 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hdfs 2025-07-22 11:43:45,300 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup 2025-07-22 11:43:45,371 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler 2025-07-22 11:43:45,393 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867 2025-07-22 11:43:45,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /192.168.158.4:9867 2025-07-22 11:43:45,480 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null 2025-07-22 11:43:45,490 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices: 2025-07-22 11:43:45,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 starting to offer service 2025-07-22 11:43:45,513 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting 2025-07-22 11:43:45,513 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting 2025-07-22 11:43:46,712 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-22 11:43:47,714 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-22 11:43:48,716 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-22 11:43:49,717 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS) 2025-07-22 11:43:50,719 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022. 
Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
2025-07-22 11:43:51,043 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-22 11:43:56,171 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode during handshakeBlock pool (Datanode Uuid unassigned) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022
2025-07-22 11:43:56,175 INFO org.apache.hadoop.hdfs.server.common.Storage: Using 4 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=4, dataDirs=4)
2025-07-22 11:43:56,182 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d1/dfs/dn/in_use.lock acquired by nodename 3561321@dmidlkprdls04.svr.luc.edu
2025-07-22 11:43:56,191 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d2/dfs/dn/in_use.lock acquired by nodename 3561321@dmidlkprdls04.svr.luc.edu
2025-07-22 11:43:56,192 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d3/dfs/dn/in_use.lock acquired by nodename 3561321@dmidlkprdls04.svr.luc.edu
2025-07-22 11:43:56,194 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /hdfs/d4/dfs/dn/in_use.lock acquired by nodename 3561321@dmidlkprdls04.svr.luc.edu
2025-07-22 11:43:56,221 INFO org.apache.hadoop.hdfs.server.common.Storage: Analyzing storage directories for bpid BP-1059995147-192.168.158.1-1752101929360
2025-07-22 11:43:56,221 INFO org.apache.hadoop.hdfs.server.common.Storage: Locking is disabled for /hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360
2025-07-22 11:43:56,244 INFO org.apache.hadoop.hdfs.server.common.Storage: Analyzing storage directories for bpid BP-1059995147-192.168.158.1-1752101929360
2025-07-22 11:43:56,244 INFO org.apache.hadoop.hdfs.server.common.Storage: Locking is disabled for /hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360
2025-07-22 11:43:56,263 INFO org.apache.hadoop.hdfs.server.common.Storage: Analyzing storage directories for bpid BP-1059995147-192.168.158.1-1752101929360
2025-07-22 11:43:56,263 INFO org.apache.hadoop.hdfs.server.common.Storage: Locking is disabled for /hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360
2025-07-22 11:43:56,280 INFO org.apache.hadoop.hdfs.server.common.Storage: Analyzing storage directories for bpid BP-1059995147-192.168.158.1-1752101929360
2025-07-22 11:43:56,280 INFO org.apache.hadoop.hdfs.server.common.Storage: Locking is disabled for /hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360
2025-07-22 11:43:56,281 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Setting up storage: nsid=2068539957;bpid=BP-1059995147-192.168.158.1-1752101929360;lv=-57;nsInfo=lv=-64;cid=cluster59;nsid=2068539957;c=1752101929360;bpid=BP-1059995147-192.168.158.1-1752101929360;dnuuid=be50c32a-aa23-4b9d-aa7f-05816b6e5f1a
2025-07-22 11:43:56,294 INFO org.apache.hadoop.conf.Configuration.deprecation: No unit for dfs.datanode.lock-reporting-threshold-ms(300) assuming MILLISECONDS
2025-07-22 11:43:56,296 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: The datanode lock is a read write lock
2025-07-22 11:43:56,334 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added new volume: DS-c6caf9b4-0cd0-462e-a7af-39538ffb6d0e
2025-07-22 11:43:56,334 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added volume - [DISK]file:/hdfs/d1/dfs/dn, StorageType: DISK
2025-07-22 11:43:56,336 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added new volume: DS-ab1b4344-d9fe-4401-915a-b02983ca3944
2025-07-22 11:43:56,336 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added volume - [DISK]file:/hdfs/d2/dfs/dn, StorageType: DISK
2025-07-22 11:43:56,337 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added new volume: DS-f02a6d6f-472c-481a-aa41-d58991ac764f
2025-07-22 11:43:56,337 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added volume - [DISK]file:/hdfs/d3/dfs/dn, StorageType: DISK
2025-07-22 11:43:56,340 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added new volume: DS-e9eccc83-296b-4afa-bee5-915188e0d9a5
2025-07-22 11:43:56,340 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added volume - [DISK]file:/hdfs/d4/dfs/dn, StorageType: DISK
2025-07-22 11:43:56,349 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Registered FSDatasetState MBean
2025-07-22 11:43:56,355 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-22 11:43:56,356 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Scanning block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d1/dfs/dn...
2025-07-22 11:43:56,356 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Scanning block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d3/dfs/dn...
2025-07-22 11:43:56,356 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Scanning block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d2/dfs/dn...
2025-07-22 11:43:56,357 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Scanning block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d4/dfs/dn...
2025-07-22 11:43:56,383 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Cached dfsUsed found for /hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current: 270557184
2025-07-22 11:43:56,385 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Cached dfsUsed found for /hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current: 270557184
2025-07-22 11:43:56,384 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Cached dfsUsed found for /hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current: 204353536
2025-07-22 11:43:56,387 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Cached dfsUsed found for /hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current: 270557184
2025-07-22 11:43:56,428 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time taken to scan block pool BP-1059995147-192.168.158.1-1752101929360 on /hdfs/d2/dfs/dn: 68ms
2025-07-22 11:43:56,430 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time taken to scan block pool BP-1059995147-192.168.158.1-1752101929360 on /hdfs/d3/dfs/dn: 72ms
2025-07-22 11:43:56,430 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time taken to scan block pool BP-1059995147-192.168.158.1-1752101929360 on /hdfs/d1/dfs/dn: 72ms
2025-07-22 11:43:56,430 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time taken to scan block pool BP-1059995147-192.168.158.1-1752101929360 on /hdfs/d4/dfs/dn: 70ms
2025-07-22 11:43:56,431 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Total time to scan all replicas for block pool BP-1059995147-192.168.158.1-1752101929360: 76ms
2025-07-22 11:43:56,433 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding replicas to map for block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d1/dfs/dn...
2025-07-22 11:43:56,433 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding replicas to map for block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d3/dfs/dn...
2025-07-22 11:43:56,434 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.BlockPoolSlice: Replica Cache file: /hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/replicas doesn't exist
2025-07-22 11:43:56,434 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding replicas to map for block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d4/dfs/dn...
2025-07-22 11:43:56,433 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding replicas to map for block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d2/dfs/dn...
2025-07-22 11:43:56,435 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.BlockPoolSlice: Replica Cache file: /hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/replicas doesn't exist
2025-07-22 11:43:56,435 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.BlockPoolSlice: Replica Cache file: /hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/replicas doesn't exist
2025-07-22 11:43:56,434 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.BlockPoolSlice: Replica Cache file: /hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/replicas doesn't exist
2025-07-22 11:43:56,469 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time to add replicas to map for block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d4/dfs/dn: 34ms
2025-07-22 11:43:56,471 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time to add replicas to map for block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d1/dfs/dn: 37ms
2025-07-22 11:43:56,471 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time to add replicas to map for block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d3/dfs/dn: 37ms
2025-07-22 11:43:56,471 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time to add replicas to map for block pool BP-1059995147-192.168.158.1-1752101929360 on volume /hdfs/d2/dfs/dn: 36ms
2025-07-22 11:43:56,472 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Total time to add all replicas to map for block pool BP-1059995147-192.168.158.1-1752101929360: 40ms
2025-07-22 11:43:56,473 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for /hdfs/d1/dfs/dn
2025-07-22 11:43:56,486 INFO org.apache.hadoop.hdfs.server.datanode.checker.DatasetVolumeChecker: Scheduled health check for volume /hdfs/d1/dfs/dn
2025-07-22 11:43:56,489 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for /hdfs/d2/dfs/dn
2025-07-22 11:43:56,489 INFO org.apache.hadoop.hdfs.server.datanode.checker.DatasetVolumeChecker: Scheduled health check for volume /hdfs/d2/dfs/dn
2025-07-22 11:43:56,489 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for /hdfs/d3/dfs/dn
2025-07-22 11:43:56,490 INFO org.apache.hadoop.hdfs.server.datanode.checker.DatasetVolumeChecker: Scheduled health check for volume /hdfs/d3/dfs/dn
2025-07-22 11:43:56,490 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for /hdfs/d4/dfs/dn
2025-07-22 11:43:56,491 INFO org.apache.hadoop.hdfs.server.datanode.checker.DatasetVolumeChecker: Scheduled health check for volume /hdfs/d4/dfs/dn
2025-07-22 11:43:56,511 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d2/dfs/dn, DS-ab1b4344-d9fe-4401-915a-b02983ca3944): no suitable block pools found to scan. Waiting 713728737 ms.
2025-07-22 11:43:56,512 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d1/dfs/dn, DS-c6caf9b4-0cd0-462e-a7af-39538ffb6d0e): no suitable block pools found to scan. Waiting 713728737 ms.
2025-07-22 11:43:56,512 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d4/dfs/dn, DS-e9eccc83-296b-4afa-bee5-915188e0d9a5): no suitable block pools found to scan. Waiting 713728736 ms.
2025-07-22 11:43:56,513 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/hdfs/d3/dfs/dn, DS-f02a6d6f-472c-481a-aa41-d58991ac764f): no suitable block pools found to scan. Waiting 713728735 ms.
2025-07-22 11:43:56,522 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: Periodic Directory Tree Verification scan starting at 7/22/25 4:31 PM with interval of 21600000ms
2025-07-22 11:43:56,531 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool BP-1059995147-192.168.158.1-1752101929360 (Datanode Uuid be50c32a-aa23-4b9d-aa7f-05816b6e5f1a) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 beginning handshake with NN
2025-07-22 11:43:56,626 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool Block pool BP-1059995147-192.168.158.1-1752101929360 (Datanode Uuid be50c32a-aa23-4b9d-aa7f-05816b6e5f1a) service to dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 successfully registered with NN
2025-07-22 11:43:56,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: For namenode dmidlkprdls01.svr.luc.edu/192.168.158.1:8022 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
2025-07-22 11:43:56,629 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting IBR Task Handler.
2025-07-22 11:43:56,891 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0xef850f8f8a47806a, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 9 msec to generate and 132 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-22 11:43:56,893 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-22 11:48:08,458 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759955_19131 src: /192.168.158.7:42714 dest: /192.168.158.4:9866
2025-07-22 11:48:08,507 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:42714, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1525989941_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759955_19131, duration(ns): 24045012
2025-07-22 11:48:08,508 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759955_19131, type=LAST_IN_PIPELINE terminating
2025-07-22 11:49:08,369 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759956_19132 src: /192.168.158.6:52250 dest: /192.168.158.4:9866
2025-07-22 11:49:08,390 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:52250, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2021699495_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759956_19132, duration(ns): 17380575
2025-07-22 11:49:08,391 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759956_19132, type=LAST_IN_PIPELINE terminating
2025-07-22 11:50:08,351 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759957_19133 src: /192.168.158.8:34934 dest: /192.168.158.4:9866
2025-07-22 11:50:08,370 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:34934, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1135711121_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759957_19133, duration(ns): 16571922
2025-07-22 11:50:08,371 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759957_19133, type=LAST_IN_PIPELINE terminating
2025-07-22 11:52:13,353 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759959_19135 src: /192.168.158.1:52170 dest: /192.168.158.4:9866
2025-07-22 11:52:13,428 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52170, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_746923045_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759959_19135, duration(ns): 24707557
2025-07-22 11:52:13,428 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759959_19135, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-22 11:53:13,352 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759960_19136 src: /192.168.158.8:60638 dest: /192.168.158.4:9866
2025-07-22 11:53:13,371 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:60638, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1843820726_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759960_19136, duration(ns): 16355059
2025-07-22 11:53:13,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759960_19136, type=LAST_IN_PIPELINE terminating
2025-07-22 11:57:13,357 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759964_19140 src: /192.168.158.1:51228 dest: /192.168.158.4:9866
2025-07-22 11:57:13,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51228, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_106628987_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759964_19140, duration(ns): 22779419
2025-07-22 11:57:13,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759964_19140, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.5:9866] terminating
2025-07-22 11:59:23,381 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759966_19142 src: /192.168.158.7:53642 dest: /192.168.158.4:9866
2025-07-22 11:59:23,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53642, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_305216921_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759966_19142, duration(ns): 18783442
2025-07-22 11:59:23,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759966_19142, type=LAST_IN_PIPELINE terminating
2025-07-22 12:00:23,368 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759967_19143 src: /192.168.158.1:54348 dest: /192.168.158.4:9866
2025-07-22 12:00:23,408 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54348, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_832136843_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759967_19143, duration(ns): 26444125
2025-07-22 12:00:23,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759967_19143, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-22 12:02:23,363 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759969_19145 src: /192.168.158.6:33510 dest: /192.168.158.4:9866
2025-07-22 12:02:23,383 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:33510, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-84030737_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759969_19145, duration(ns): 16588187
2025-07-22 12:02:23,384 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759969_19145, type=LAST_IN_PIPELINE terminating
2025-07-22 12:05:23,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759972_19148 src: /192.168.158.9:52162 dest: /192.168.158.4:9866
2025-07-22 12:05:23,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:52162, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-721835930_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759972_19148, duration(ns): 17715718
2025-07-22 12:05:23,396 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759972_19148, type=LAST_IN_PIPELINE terminating
2025-07-22 12:06:23,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759973_19149 src: /192.168.158.5:58578 dest: /192.168.158.4:9866
2025-07-22 12:06:23,392 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:58578, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-232440247_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759973_19149, duration(ns): 17080197
2025-07-22 12:06:23,393 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759973_19149, type=LAST_IN_PIPELINE terminating
2025-07-22 12:07:23,364 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759974_19150 src: /192.168.158.1:36350 dest: /192.168.158.4:9866
2025-07-22 12:07:23,403 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36350, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1559721297_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759974_19150, duration(ns): 25214205
2025-07-22 12:07:23,405 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759974_19150, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating
2025-07-22 12:08:23,366 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759975_19151 src: /192.168.158.1:58212 dest: /192.168.158.4:9866
2025-07-22 12:08:23,404 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:58212, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_210594354_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759975_19151, duration(ns): 23422041
2025-07-22 12:08:23,404 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759975_19151, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-22 12:12:28,382 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759979_19155 src: /192.168.158.6:49844 dest: /192.168.158.4:9866
2025-07-22 12:12:28,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:49844, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-486370855_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759979_19155, duration(ns): 20803200
2025-07-22 12:12:28,411 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759979_19155, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 12:13:28,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759980_19156 src: /192.168.158.1:34110 dest: /192.168.158.4:9866
2025-07-22 12:13:28,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34110, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1311495727_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759980_19156, duration(ns): 24949610
2025-07-22 12:13:28,414 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759980_19156, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating
2025-07-22 12:17:33,388 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759984_19160 src: /192.168.158.7:40782 dest: /192.168.158.4:9866
2025-07-22 12:17:33,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:40782, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1180981509_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759984_19160, duration(ns): 17982690
2025-07-22 12:17:33,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759984_19160, type=LAST_IN_PIPELINE terminating
2025-07-22 12:19:38,394 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759986_19162 src: /192.168.158.7:54908 dest: /192.168.158.4:9866
2025-07-22 12:19:38,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54908, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1221724049_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759986_19162, duration(ns): 17095244
2025-07-22 12:19:38,415 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759986_19162, type=LAST_IN_PIPELINE terminating
2025-07-22 12:20:43,397 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759987_19163 src: /192.168.158.9:51756 dest: /192.168.158.4:9866
2025-07-22 12:20:43,419 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:51756, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1178199679_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759987_19163, duration(ns): 18401966
2025-07-22 12:20:43,420 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759987_19163, type=LAST_IN_PIPELINE terminating
2025-07-22 12:25:53,409 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759992_19168 src: /192.168.158.6:60988 dest: /192.168.158.4:9866
2025-07-22 12:25:53,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60988, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1178590717_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759992_19168, duration(ns): 21247798
2025-07-22 12:25:53,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759992_19168, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-22 12:26:58,410 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759993_19169 src: /192.168.158.7:54072 dest: /192.168.158.4:9866
2025-07-22 12:26:58,432 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54072, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1202148655_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759993_19169, duration(ns): 17901185
2025-07-22 12:26:58,433 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759993_19169, type=LAST_IN_PIPELINE terminating
2025-07-22 12:28:58,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759995_19171 src: /192.168.158.9:42154 dest: /192.168.158.4:9866
2025-07-22 12:28:58,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:42154, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-840772984_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759995_19171, duration(ns): 18783237
2025-07-22 12:28:58,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759995_19171, type=LAST_IN_PIPELINE terminating
2025-07-22 12:31:03,408 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073759997_19173 src: /192.168.158.1:54956 dest: /192.168.158.4:9866
2025-07-22 12:31:03,446 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54956, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-863283221_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073759997_19173, duration(ns): 24684987
2025-07-22 12:31:03,446 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073759997_19173, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-22 12:35:03,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760001_19177 src: /192.168.158.1:39686 dest: /192.168.158.4:9866
2025-07-22 12:35:03,442 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39686, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1194491661_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760001_19177, duration(ns): 27107832
2025-07-22 12:35:03,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760001_19177, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-22 12:38:03,427 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760004_19180 src: /192.168.158.9:60324 dest: /192.168.158.4:9866
2025-07-22 12:38:03,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:60324, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_912021515_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760004_19180, duration(ns): 18611696
2025-07-22 12:38:03,450 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760004_19180, type=LAST_IN_PIPELINE terminating
2025-07-22 12:39:03,420 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760005_19181 src: /192.168.158.1:45852 dest: /192.168.158.4:9866
2025-07-22 12:39:03,459 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:45852, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1303743612_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760005_19181, duration(ns): 24131625
2025-07-22 12:39:03,459 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760005_19181, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-22 12:42:08,429 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760008_19184 src: /192.168.158.7:34350 dest: /192.168.158.4:9866
2025-07-22 12:42:08,450 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:34350, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_8237624_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760008_19184, duration(ns): 16824210
2025-07-22 12:42:08,450 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760008_19184, type=LAST_IN_PIPELINE terminating
2025-07-22 12:43:08,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760009_19185 src: /192.168.158.6:38358 dest: /192.168.158.4:9866
2025-07-22 12:43:08,471 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38358, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-180532604_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760009_19185, duration(ns): 16407565
2025-07-22 12:43:08,471 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760009_19185, type=LAST_IN_PIPELINE terminating
2025-07-22 12:43:53,823 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760001_19177 replica FinalizedReplica, blk_1073760001_19177, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760001 for deletion
2025-07-22 12:43:53,826 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760004_19180 replica FinalizedReplica, blk_1073760004_19180, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760004 for deletion
2025-07-22 12:43:53,827 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760001_19177 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760001
2025-07-22 12:43:53,828 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760004_19180 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760004
2025-07-22 12:43:53,828 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760005_19181 replica FinalizedReplica, blk_1073760005_19181, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760005 for deletion
2025-07-22 12:43:53,829 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760008_19184 replica FinalizedReplica, blk_1073760008_19184, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760008 for deletion
2025-07-22 12:43:53,830 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760005_19181 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760005
2025-07-22 12:43:53,830 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760009_19185 replica FinalizedReplica, blk_1073760009_19185, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760009 for deletion
2025-07-22 12:43:53,831 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760008_19184 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760008
2025-07-22 12:43:53,831 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759955_19131 replica FinalizedReplica, blk_1073759955_19131, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759955 for deletion
2025-07-22 12:43:53,831 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760009_19185 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760009
2025-07-22 12:43:53,832 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759955_19131 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759955
2025-07-22 12:43:53,831 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759956_19132 replica FinalizedReplica, blk_1073759956_19132, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759956 for deletion
2025-07-22 12:43:53,833 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759956_19132 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759956
2025-07-22 12:43:53,833 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759957_19133 replica FinalizedReplica, blk_1073759957_19133, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759957 for deletion
2025-07-22 12:43:53,834 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759959_19135 replica FinalizedReplica, blk_1073759959_19135, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759959 for deletion
2025-07-22 12:43:53,834 INFO
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759957_19133 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759957 2025-07-22 12:43:53,834 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759960_19136 replica FinalizedReplica, blk_1073759960_19136, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759960 for deletion 2025-07-22 12:43:53,834 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759959_19135 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759959 2025-07-22 12:43:53,835 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759960_19136 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759960 2025-07-22 12:43:53,835 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759964_19140 replica FinalizedReplica, blk_1073759964_19140, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759964 for deletion 2025-07-22 12:43:53,835 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759966_19142 replica FinalizedReplica, blk_1073759966_19142, FINALIZED getNumBytes() = 56 
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759966 for deletion 2025-07-22 12:43:53,836 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759964_19140 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759964 2025-07-22 12:43:53,836 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759967_19143 replica FinalizedReplica, blk_1073759967_19143, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759967 for deletion 2025-07-22 12:43:53,836 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759966_19142 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759966 2025-07-22 12:43:53,837 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759967_19143 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759967 2025-07-22 12:43:53,837 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759969_19145 replica FinalizedReplica, blk_1073759969_19145, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759969 for deletion 2025-07-22 12:43:53,838 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759969_19145 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759969 2025-07-22 12:43:53,838 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759972_19148 replica FinalizedReplica, blk_1073759972_19148, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759972 for deletion 2025-07-22 12:43:53,839 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759972_19148 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759972 2025-07-22 12:43:53,839 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759973_19149 replica FinalizedReplica, blk_1073759973_19149, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759973 for deletion 2025-07-22 12:43:53,840 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759973_19149 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759973 2025-07-22 12:43:53,840 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759974_19150 replica FinalizedReplica, blk_1073759974_19150, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759974 for deletion 2025-07-22 12:43:53,840 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759975_19151 replica FinalizedReplica, blk_1073759975_19151, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759975 for deletion 2025-07-22 12:43:53,841 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759974_19150 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759974 2025-07-22 12:43:53,841 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759975_19151 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759975 2025-07-22 12:43:53,841 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759979_19155 replica FinalizedReplica, blk_1073759979_19155, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759979 for deletion 2025-07-22 12:43:53,842 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759980_19156 replica FinalizedReplica, blk_1073759980_19156, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759980 for deletion 2025-07-22 12:43:53,843 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759979_19155 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759979 2025-07-22 12:43:53,843 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759984_19160 replica FinalizedReplica, blk_1073759984_19160, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759984 for deletion 2025-07-22 12:43:53,843 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759980_19156 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759980 2025-07-22 12:43:53,844 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759984_19160 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759984 2025-07-22 12:43:53,844 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759986_19162 replica FinalizedReplica, blk_1073759986_19162, FINALIZED getNumBytes() = 56 
getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759986 for deletion 2025-07-22 12:43:53,845 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759987_19163 replica FinalizedReplica, blk_1073759987_19163, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759987 for deletion 2025-07-22 12:43:53,845 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759986_19162 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759986 2025-07-22 12:43:53,845 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759987_19163 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759987 2025-07-22 12:43:53,845 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759992_19168 replica FinalizedReplica, blk_1073759992_19168, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759992 for deletion 2025-07-22 12:43:53,846 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759993_19169 replica FinalizedReplica, blk_1073759993_19169, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = 
/hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759993 for deletion 2025-07-22 12:43:53,846 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759992_19168 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759992 2025-07-22 12:43:53,847 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759995_19171 replica FinalizedReplica, blk_1073759995_19171, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759995 for deletion 2025-07-22 12:43:53,847 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759993_19169 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759993 2025-07-22 12:43:53,848 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759995_19171 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759995 2025-07-22 12:43:53,848 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073759997_19173 replica FinalizedReplica, blk_1073759997_19173, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759997 for deletion 2025-07-22 12:43:53,848 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073759997_19173 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir6/blk_1073759997 2025-07-22 12:44:13,431 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760010_19186 src: /192.168.158.8:48454 dest: /192.168.158.4:9866 2025-07-22 12:44:13,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48454, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_313306373_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760010_19186, duration(ns): 22640163 2025-07-22 12:44:13,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760010_19186, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-22 12:44:17,813 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760010_19186 replica FinalizedReplica, blk_1073760010_19186, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760010 for deletion 2025-07-22 12:44:17,814 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760010_19186 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760010 2025-07-22 12:46:13,436 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760012_19188 src: 
/192.168.158.8:53940 dest: /192.168.158.4:9866 2025-07-22 12:46:13,458 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53940, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1894426739_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760012_19188, duration(ns): 19188450 2025-07-22 12:46:13,459 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760012_19188, type=LAST_IN_PIPELINE terminating 2025-07-22 12:46:17,819 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760012_19188 replica FinalizedReplica, blk_1073760012_19188, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760012 for deletion 2025-07-22 12:46:17,821 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760012_19188 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760012 2025-07-22 12:54:28,439 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760020_19196 src: /192.168.158.5:47184 dest: /192.168.158.4:9866 2025-07-22 12:54:28,471 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:47184, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1009801698_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760020_19196, duration(ns): 24281023 2025-07-22 12:54:28,473 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760020_19196, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-22 12:54:32,833 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760020_19196 replica FinalizedReplica, blk_1073760020_19196, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760020 for deletion 2025-07-22 12:54:32,835 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760020_19196 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760020 2025-07-22 12:56:33,451 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760022_19198 src: /192.168.158.9:44410 dest: /192.168.158.4:9866 2025-07-22 12:56:33,474 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:44410, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1623681832_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760022_19198, duration(ns): 19837012 2025-07-22 12:56:33,475 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760022_19198, type=LAST_IN_PIPELINE terminating 2025-07-22 12:56:38,841 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760022_19198 replica FinalizedReplica, blk_1073760022_19198, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() 
= /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760022 for deletion 2025-07-22 12:56:38,843 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760022_19198 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760022 2025-07-22 12:57:33,449 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760023_19199 src: /192.168.158.5:53356 dest: /192.168.158.4:9866 2025-07-22 12:57:33,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:53356, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1570633158_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760023_19199, duration(ns): 17658888 2025-07-22 12:57:33,470 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760023_19199, type=LAST_IN_PIPELINE terminating 2025-07-22 12:57:38,843 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760023_19199 replica FinalizedReplica, blk_1073760023_19199, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760023 for deletion 2025-07-22 12:57:38,847 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760023_19199 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760023 2025-07-22 
12:58:33,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760024_19200 src: /192.168.158.7:55680 dest: /192.168.158.4:9866 2025-07-22 12:58:33,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55680, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2025089912_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760024_19200, duration(ns): 17447603 2025-07-22 12:58:33,474 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760024_19200, type=LAST_IN_PIPELINE terminating 2025-07-22 12:58:38,846 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760024_19200 replica FinalizedReplica, blk_1073760024_19200, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760024 for deletion 2025-07-22 12:58:38,848 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760024_19200 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760024 2025-07-22 12:59:33,455 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760025_19201 src: /192.168.158.9:53576 dest: /192.168.158.4:9866 2025-07-22 12:59:33,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:53576, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_523339538_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073760025_19201, duration(ns): 17654252 2025-07-22 12:59:33,477 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760025_19201, type=LAST_IN_PIPELINE terminating 2025-07-22 12:59:38,851 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760025_19201 replica FinalizedReplica, blk_1073760025_19201, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760025 for deletion 2025-07-22 12:59:38,853 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760025_19201 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760025 2025-07-22 13:00:38,445 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760026_19202 src: /192.168.158.1:49438 dest: /192.168.158.4:9866 2025-07-22 13:00:38,484 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49438, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-826978502_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760026_19202, duration(ns): 25628453 2025-07-22 13:00:38,487 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760026_19202, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating 2025-07-22 13:00:41,852 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling 
blk_1073760026_19202 replica FinalizedReplica, blk_1073760026_19202, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760026 for deletion
2025-07-22 13:00:41,853 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760026_19202 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760026
2025-07-22 13:05:43,471 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760031_19207 src: /192.168.158.5:40102 dest: /192.168.158.4:9866
2025-07-22 13:05:43,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:40102, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_44485191_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760031_19207, duration(ns): 23159071
2025-07-22 13:05:43,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760031_19207, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-22 13:05:50,867 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760031_19207 replica FinalizedReplica, blk_1073760031_19207, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760031 for deletion
2025-07-22 13:05:50,868 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760031_19207 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760031
2025-07-22 13:06:43,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760032_19208 src: /192.168.158.9:55636 dest: /192.168.158.4:9866
2025-07-22 13:06:43,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:55636, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_936370424_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760032_19208, duration(ns): 26637766
2025-07-22 13:06:43,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760032_19208, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-22 13:06:50,868 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760032_19208 replica FinalizedReplica, blk_1073760032_19208, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760032 for deletion
2025-07-22 13:06:50,870 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760032_19208 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760032
2025-07-22 13:08:43,462 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760034_19210 src: /192.168.158.1:57690 dest: /192.168.158.4:9866
2025-07-22 13:08:43,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57690, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1644409219_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760034_19210, duration(ns): 28729214
2025-07-22 13:08:43,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760034_19210, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating
2025-07-22 13:08:50,878 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760034_19210 replica FinalizedReplica, blk_1073760034_19210, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760034 for deletion
2025-07-22 13:08:50,880 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760034_19210 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760034
2025-07-22 13:09:56,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0xef850f8f8a47806b, containing 4 storage report(s), of which we sent 4. The reports had 8 total blocks and used 1 RPC(s). This took 1 msec to generate and 5 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2025-07-22 13:09:56,887 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-1059995147-192.168.158.1-1752101929360
2025-07-22 13:16:53,474 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760042_19218 src: /192.168.158.5:45022 dest: /192.168.158.4:9866
2025-07-22 13:16:53,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:45022, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_388396249_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760042_19218, duration(ns): 23062864
2025-07-22 13:16:53,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760042_19218, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-22 13:16:59,887 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760042_19218 replica FinalizedReplica, blk_1073760042_19218, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760042 for deletion
2025-07-22 13:16:59,889 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760042_19218 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760042
2025-07-22 13:17:53,467 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760043_19219 src: /192.168.158.1:51860 dest: /192.168.158.4:9866
2025-07-22 13:17:53,511 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:51860, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2064536944_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760043_19219, duration(ns): 30250535
2025-07-22 13:17:53,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760043_19219, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-22 13:17:59,894 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760043_19219 replica FinalizedReplica, blk_1073760043_19219, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760043 for deletion
2025-07-22 13:17:59,896 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760043_19219 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760043
2025-07-22 13:19:58,471 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760045_19221 src: /192.168.158.5:46252 dest: /192.168.158.4:9866
2025-07-22 13:19:58,501 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:46252, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1849184631_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760045_19221, duration(ns): 22135887
2025-07-22 13:19:58,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760045_19221, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 13:20:05,893 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760045_19221 replica FinalizedReplica, blk_1073760045_19221, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760045 for deletion
2025-07-22 13:20:05,894 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760045_19221 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760045
2025-07-22 13:23:03,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760048_19224 src: /192.168.158.7:39026 dest: /192.168.158.4:9866
2025-07-22 13:23:03,508 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39026, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1813571817_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760048_19224, duration(ns): 18344411
2025-07-22 13:23:03,508 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760048_19224, type=LAST_IN_PIPELINE terminating
2025-07-22 13:23:08,896 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760048_19224 replica FinalizedReplica, blk_1073760048_19224, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760048 for deletion
2025-07-22 13:23:08,897 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760048_19224 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760048
2025-07-22 13:24:03,480 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760049_19225 src: /192.168.158.6:48314 dest: /192.168.158.4:9866
2025-07-22 13:24:03,510 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:48314, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1053585600_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760049_19225, duration(ns): 21952941
2025-07-22 13:24:03,510 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760049_19225, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-22 13:24:08,898 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760049_19225 replica FinalizedReplica, blk_1073760049_19225, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760049 for deletion
2025-07-22 13:24:08,900 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760049_19225 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760049
2025-07-22 13:25:03,494 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760050_19226 src: /192.168.158.8:48190 dest: /192.168.158.4:9866
2025-07-22 13:25:03,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:48190, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1124385260_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760050_19226, duration(ns): 20351865
2025-07-22 13:25:03,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760050_19226, type=LAST_IN_PIPELINE terminating
2025-07-22 13:25:11,899 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760050_19226 replica FinalizedReplica, blk_1073760050_19226, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760050 for deletion
2025-07-22 13:25:11,900 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760050_19226 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760050
2025-07-22 13:26:03,490 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760051_19227 src: /192.168.158.8:51242 dest: /192.168.158.4:9866
2025-07-22 13:26:03,511 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:51242, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_640112876_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760051_19227, duration(ns): 17728269
2025-07-22 13:26:03,511 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760051_19227, type=LAST_IN_PIPELINE terminating
2025-07-22 13:26:11,902 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760051_19227 replica FinalizedReplica, blk_1073760051_19227, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760051 for deletion
2025-07-22 13:26:11,903 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760051_19227 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760051
2025-07-22 13:28:08,496 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760053_19229 src: /192.168.158.6:46408 dest: /192.168.158.4:9866
2025-07-22 13:28:08,527 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:46408, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-301106255_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760053_19229, duration(ns): 22322621
2025-07-22 13:28:08,528 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760053_19229, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 13:28:11,905 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760053_19229 replica FinalizedReplica, blk_1073760053_19229, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760053 for deletion
2025-07-22 13:28:11,906 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760053_19229 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760053
2025-07-22 13:29:13,487 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760054_19230 src: /192.168.158.6:53124 dest: /192.168.158.4:9866
2025-07-22 13:29:13,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:53124, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_750275497_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760054_19230, duration(ns): 22796534
2025-07-22 13:29:13,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760054_19230, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-22 13:29:17,906 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760054_19230 replica FinalizedReplica, blk_1073760054_19230, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760054 for deletion
2025-07-22 13:29:17,908 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760054_19230 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760054
2025-07-22 13:31:13,498 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760056_19232 src: /192.168.158.8:43692 dest: /192.168.158.4:9866
2025-07-22 13:31:13,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43692, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_268379887_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760056_19232, duration(ns): 18873269
2025-07-22 13:31:13,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760056_19232, type=LAST_IN_PIPELINE terminating
2025-07-22 13:31:20,911 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760056_19232 replica FinalizedReplica, blk_1073760056_19232, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760056 for deletion
2025-07-22 13:31:20,912 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760056_19232 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760056
2025-07-22 13:32:13,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760057_19233 src: /192.168.158.7:52766 dest: /192.168.158.4:9866
2025-07-22 13:32:13,526 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:52766, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1947917645_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760057_19233, duration(ns): 22534968
2025-07-22 13:32:13,527 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760057_19233, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-22 13:32:20,914 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760057_19233 replica FinalizedReplica, blk_1073760057_19233, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760057 for deletion
2025-07-22 13:32:20,915 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760057_19233 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760057
2025-07-22 13:33:13,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760058_19234 src: /192.168.158.5:39348 dest: /192.168.158.4:9866
2025-07-22 13:33:13,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39348, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1010353143_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760058_19234, duration(ns): 19504273
2025-07-22 13:33:13,526 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760058_19234, type=LAST_IN_PIPELINE terminating
2025-07-22 13:33:20,917 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760058_19234 replica FinalizedReplica, blk_1073760058_19234, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760058 for deletion
2025-07-22 13:33:20,919 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760058_19234 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760058
2025-07-22 13:35:18,505 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760060_19236 src: /192.168.158.6:38168 dest: /192.168.158.4:9866
2025-07-22 13:35:18,535 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:38168, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-356811198_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760060_19236, duration(ns): 22548605
2025-07-22 13:35:18,536 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760060_19236, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 13:35:26,923 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760060_19236 replica FinalizedReplica, blk_1073760060_19236, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760060 for deletion
2025-07-22 13:35:26,925 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760060_19236 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760060
2025-07-22 13:36:23,492 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760061_19237 src: /192.168.158.1:44946 dest: /192.168.158.4:9866
2025-07-22 13:36:23,535 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:44946, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_471406071_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760061_19237, duration(ns): 29126713
2025-07-22 13:36:23,536 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760061_19237, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating
2025-07-22 13:36:26,923 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760061_19237 replica FinalizedReplica, blk_1073760061_19237, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760061 for deletion
2025-07-22 13:36:26,925 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760061_19237 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760061
2025-07-22 13:39:28,497 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760064_19240 src: /192.168.158.1:39268 dest: /192.168.158.4:9866
2025-07-22 13:39:28,542 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:39268, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1918006915_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760064_19240, duration(ns): 30868985
2025-07-22 13:39:28,543 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760064_19240, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-22 13:39:32,931 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760064_19240 replica FinalizedReplica, blk_1073760064_19240, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760064 for deletion
2025-07-22 13:39:32,932 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760064_19240 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760064
2025-07-22 13:43:28,513 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760068_19244 src: /192.168.158.5:34774 dest: /192.168.158.4:9866
2025-07-22 13:43:28,538 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:34774, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-311321727_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760068_19244, duration(ns): 21175314
2025-07-22 13:43:28,538 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760068_19244, type=LAST_IN_PIPELINE terminating
2025-07-22 13:43:32,940 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760068_19244 replica FinalizedReplica, blk_1073760068_19244, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760068 for deletion
2025-07-22 13:43:32,942 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760068_19244 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760068
2025-07-22 13:44:28,509 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760069_19245 src: /192.168.158.9:40420 dest: /192.168.158.4:9866
2025-07-22 13:44:28,531 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:40420, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1485147201_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760069_19245, duration(ns): 18507358
2025-07-22 13:44:28,531 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760069_19245, type=LAST_IN_PIPELINE terminating
2025-07-22 13:44:35,942 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760069_19245 replica FinalizedReplica, blk_1073760069_19245, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760069 for deletion
2025-07-22 13:44:35,943 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760069_19245 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760069
2025-07-22 13:45:33,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760070_19246 src: /192.168.158.9:33748 dest: /192.168.158.4:9866
2025-07-22 13:45:33,553 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:33748, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1455951618_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760070_19246, duration(ns): 24222407
2025-07-22 13:45:33,554 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760070_19246, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 13:45:38,943 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760070_19246 replica FinalizedReplica, blk_1073760070_19246, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760070 for deletion
2025-07-22 13:45:38,944 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760070_19246 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760070
2025-07-22 13:47:38,518 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760072_19248 src: /192.168.158.9:41340 dest: /192.168.158.4:9866
2025-07-22 13:47:38,540 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:41340, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-199302934_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760072_19248, duration(ns): 19672938
2025-07-22 13:47:38,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760072_19248, type=LAST_IN_PIPELINE terminating
2025-07-22 13:47:41,948 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760072_19248 replica FinalizedReplica, blk_1073760072_19248, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760072 for deletion
2025-07-22 13:47:41,949 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760072_19248 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760072
2025-07-22 13:48:43,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760073_19249 src: /192.168.158.9:50366 dest: /192.168.158.4:9866
2025-07-22 13:48:43,540 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:50366, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1807453185_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760073_19249, duration(ns): 21364490
2025-07-22 13:48:43,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760073_19249, type=LAST_IN_PIPELINE terminating
2025-07-22 13:48:47,951 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760073_19249 replica FinalizedReplica, blk_1073760073_19249, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760073 for deletion
2025-07-22 13:48:47,952 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760073_19249 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760073
2025-07-22 13:50:48,520 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760075_19251 src: /192.168.158.9:47598 dest: /192.168.158.4:9866
2025-07-22 13:50:48,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:47598, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1483511787_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760075_19251, duration(ns): 18366705
2025-07-22 13:50:48,541 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760075_19251, type=LAST_IN_PIPELINE terminating
2025-07-22 13:50:53,957 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760075_19251 replica FinalizedReplica, blk_1073760075_19251, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760075 for deletion
2025-07-22 13:50:53,958 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760075_19251 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760075
2025-07-22 13:54:48,527 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760079_19255 src: /192.168.158.8:42302 dest: /192.168.158.4:9866
2025-07-22 13:54:48,560 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42302, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_266364356_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760079_19255, duration(ns): 24856288
2025-07-22 13:54:48,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760079_19255, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-22 13:54:53,967 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760079_19255 replica FinalizedReplica, blk_1073760079_19255, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760079 for deletion
2025-07-22 13:54:53,968 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760079_19255 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760079
2025-07-22 13:56:58,527 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760081_19257 src: /192.168.158.1:34842 dest: /192.168.158.4:9866
2025-07-22 13:56:58,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34842, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1591608471_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760081_19257, duration(ns): 26349565
2025-07-22 13:56:58,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760081_19257, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.6:9866] terminating
2025-07-22 13:57:02,972 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760081_19257 replica FinalizedReplica, blk_1073760081_19257, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760081 for deletion
2025-07-22 13:57:02,974 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760081_19257 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760081 2025-07-22 14:01:03,530 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760085_19261 src: /192.168.158.8:56564 dest: /192.168.158.4:9866 2025-07-22 14:01:03,559 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:56564, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_125753374_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760085_19261, duration(ns): 21325854 2025-07-22 14:01:03,560 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760085_19261, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-22 14:01:08,987 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760085_19261 replica FinalizedReplica, blk_1073760085_19261, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760085 for deletion 2025-07-22 14:01:08,989 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760085_19261 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760085 2025-07-22 14:05:13,530 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760089_19265 src: /192.168.158.7:33510 dest: /192.168.158.4:9866 2025-07-22 14:05:13,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33510, 
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-473585863_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760089_19265, duration(ns): 23076022 2025-07-22 14:05:13,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760089_19265, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating 2025-07-22 14:05:21,001 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760089_19265 replica FinalizedReplica, blk_1073760089_19265, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760089 for deletion 2025-07-22 14:05:21,002 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760089_19265 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760089 2025-07-22 14:06:18,528 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760090_19266 src: /192.168.158.1:43160 dest: /192.168.158.4:9866 2025-07-22 14:06:18,569 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43160, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1924719115_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760090_19266, duration(ns): 26932429 2025-07-22 14:06:18,569 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760090_19266, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-22 14:06:24,004 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760090_19266 replica FinalizedReplica, blk_1073760090_19266, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760090 for deletion 2025-07-22 14:06:24,006 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760090_19266 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760090 2025-07-22 14:08:28,535 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760092_19268 src: /192.168.158.1:35154 dest: /192.168.158.4:9866 2025-07-22 14:08:28,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35154, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1272357339_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760092_19268, duration(ns): 26066783 2025-07-22 14:08:28,574 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760092_19268, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-22 14:08:33,010 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760092_19268 replica FinalizedReplica, blk_1073760092_19268, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760092 for deletion 2025-07-22 14:08:33,012 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760092_19268 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760092 2025-07-22 14:10:33,554 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760094_19270 src: /192.168.158.8:53338 dest: /192.168.158.4:9866 2025-07-22 14:10:33,576 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53338, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1190974165_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760094_19270, duration(ns): 19241677 2025-07-22 14:10:33,576 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760094_19270, type=LAST_IN_PIPELINE terminating 2025-07-22 14:10:36,013 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760094_19270 replica FinalizedReplica, blk_1073760094_19270, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760094 for deletion 2025-07-22 14:10:36,014 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760094_19270 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760094 2025-07-22 14:11:38,541 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760095_19271 src: /192.168.158.1:52306 dest: /192.168.158.4:9866 2025-07-22 14:11:38,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52306, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1445965277_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760095_19271, duration(ns): 27523266 2025-07-22 14:11:38,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760095_19271, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-22 14:11:45,016 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760095_19271 replica FinalizedReplica, blk_1073760095_19271, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760095 for deletion 2025-07-22 14:11:45,018 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760095_19271 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760095 2025-07-22 14:12:38,547 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760096_19272 src: /192.168.158.1:40156 dest: /192.168.158.4:9866 2025-07-22 14:12:38,585 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:40156, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2131269097_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760096_19272, duration(ns): 26211078 2025-07-22 14:12:38,586 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760096_19272, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.7:9866] terminating 2025-07-22 14:12:42,020 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760096_19272 replica FinalizedReplica, blk_1073760096_19272, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760096 for deletion 2025-07-22 14:12:42,021 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760096_19272 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760096 2025-07-22 14:13:38,550 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760097_19273 src: /192.168.158.7:41346 dest: /192.168.158.4:9866 2025-07-22 14:13:38,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:41346, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1038963603_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760097_19273, duration(ns): 21516698 2025-07-22 14:13:38,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760097_19273, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-22 14:13:45,022 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760097_19273 replica FinalizedReplica, blk_1073760097_19273, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760097 for deletion 2025-07-22 14:13:45,024 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760097_19273 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760097 2025-07-22 14:14:38,549 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760098_19274 src: /192.168.158.7:53598 dest: /192.168.158.4:9866 2025-07-22 14:14:38,570 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53598, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-604528274_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760098_19274, duration(ns): 18389078 2025-07-22 14:14:38,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760098_19274, type=LAST_IN_PIPELINE terminating 2025-07-22 14:14:45,028 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760098_19274 replica FinalizedReplica, blk_1073760098_19274, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760098 for deletion 2025-07-22 14:14:45,029 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760098_19274 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760098 2025-07-22 14:16:38,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760100_19276 src: /192.168.158.6:56402 dest: /192.168.158.4:9866 2025-07-22 14:16:38,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:56402, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1032899220_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760100_19276, duration(ns): 19116749 2025-07-22 14:16:38,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760100_19276, type=LAST_IN_PIPELINE terminating 2025-07-22 14:16:42,029 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760100_19276 replica FinalizedReplica, blk_1073760100_19276, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760100 for deletion 2025-07-22 14:16:42,031 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760100_19276 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760100 2025-07-22 14:19:48,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760103_19279 src: /192.168.158.7:53672 dest: /192.168.158.4:9866 
2025-07-22 14:19:48,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:53672, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1771974479_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760103_19279, duration(ns): 19607899 2025-07-22 14:19:48,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760103_19279, type=LAST_IN_PIPELINE terminating 2025-07-22 14:19:51,037 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760103_19279 replica FinalizedReplica, blk_1073760103_19279, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760103 for deletion 2025-07-22 14:19:51,038 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760103_19279 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760103 2025-07-22 14:21:48,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760105_19281 src: /192.168.158.8:42724 dest: /192.168.158.4:9866 2025-07-22 14:21:48,620 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:42724, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1845613120_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760105_19281, duration(ns): 19780639 2025-07-22 14:21:48,620 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073760105_19281, type=LAST_IN_PIPELINE terminating 2025-07-22 14:21:51,040 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760105_19281 replica FinalizedReplica, blk_1073760105_19281, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760105 for deletion 2025-07-22 14:21:51,042 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760105_19281 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760105 2025-07-22 14:22:48,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760106_19282 src: /192.168.158.1:46982 dest: /192.168.158.4:9866 2025-07-22 14:22:48,615 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46982, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2043735061_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760106_19282, duration(ns): 24917029 2025-07-22 14:22:48,616 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760106_19282, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating 2025-07-22 14:22:51,043 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760106_19282 replica FinalizedReplica, blk_1073760106_19282, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = 
file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760106 for deletion 2025-07-22 14:22:51,045 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760106_19282 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760106 2025-07-22 14:23:48,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760107_19283 src: /192.168.158.9:46298 dest: /192.168.158.4:9866 2025-07-22 14:23:48,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:46298, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-5918369_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760107_19283, duration(ns): 21104881 2025-07-22 14:23:48,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760107_19283, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-22 14:23:51,046 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760107_19283 replica FinalizedReplica, blk_1073760107_19283, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760107 for deletion 2025-07-22 14:23:51,048 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760107_19283 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760107 
2025-07-22 14:26:48,557 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760110_19286 src: /192.168.158.1:32892 dest: /192.168.158.4:9866 2025-07-22 14:26:48,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:32892, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-967869965_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760110_19286, duration(ns): 28202241 2025-07-22 14:26:48,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760110_19286, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating 2025-07-22 14:26:51,058 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760110_19286 replica FinalizedReplica, blk_1073760110_19286, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760110 for deletion 2025-07-22 14:26:51,059 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760110_19286 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760110 2025-07-22 14:27:53,562 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760111_19287 src: /192.168.158.1:52132 dest: /192.168.158.4:9866 2025-07-22 14:27:53,598 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:52132, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1359449139_236, 
offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760111_19287, duration(ns): 24220312 2025-07-22 14:27:53,599 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760111_19287, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.5:9866] terminating 2025-07-22 14:27:57,060 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760111_19287 replica FinalizedReplica, blk_1073760111_19287, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760111 for deletion 2025-07-22 14:27:57,062 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760111_19287 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760111 2025-07-22 14:28:58,580 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760112_19288 src: /192.168.158.5:35090 dest: /192.168.158.4:9866 2025-07-22 14:28:58,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:35090, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2043788839_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760112_19288, duration(ns): 22604420 2025-07-22 14:28:58,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760112_19288, type=LAST_IN_PIPELINE terminating 2025-07-22 14:29:06,060 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760112_19288 replica FinalizedReplica, blk_1073760112_19288, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760112 for deletion
2025-07-22 14:29:06,061 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760112_19288 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760112
2025-07-22 14:29:58,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760113_19289 src: /192.168.158.7:56112 dest: /192.168.158.4:9866
2025-07-22 14:29:58,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:56112, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1392134635_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760113_19289, duration(ns): 18915747
2025-07-22 14:29:58,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760113_19289, type=LAST_IN_PIPELINE terminating
2025-07-22 14:30:03,062 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760113_19289 replica FinalizedReplica, blk_1073760113_19289, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760113 for deletion
2025-07-22 14:30:03,064 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760113_19289 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760113
2025-07-22 14:30:58,564 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760114_19290 src: /192.168.158.8:54880 dest: /192.168.158.4:9866
2025-07-22 14:30:58,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:54880, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-384155120_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760114_19290, duration(ns): 24323301
2025-07-22 14:30:58,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760114_19290, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-22 14:31:06,064 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760114_19290 replica FinalizedReplica, blk_1073760114_19290, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760114 for deletion
2025-07-22 14:31:06,065 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760114_19290 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760114
2025-07-22 14:32:58,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760116_19292 src: /192.168.158.1:35300 dest: /192.168.158.4:9866
2025-07-22 14:32:58,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:35300, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1865046476_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760116_19292, duration(ns): 30382975
2025-07-22 14:32:58,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760116_19292, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.7:9866] terminating
2025-07-22 14:33:06,072 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760116_19292 replica FinalizedReplica, blk_1073760116_19292, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760116 for deletion
2025-07-22 14:33:06,073 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760116_19292 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760116
2025-07-22 14:35:03,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760118_19294 src: /192.168.158.1:53660 dest: /192.168.158.4:9866
2025-07-22 14:35:03,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:53660, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2090451610_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760118_19294, duration(ns): 26416646
2025-07-22 14:35:03,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760118_19294, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.8:9866] terminating
2025-07-22 14:35:06,076 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760118_19294 replica FinalizedReplica, blk_1073760118_19294, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760118 for deletion
2025-07-22 14:35:06,077 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760118_19294 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760118
2025-07-22 14:38:03,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760121_19297 src: /192.168.158.5:60130 dest: /192.168.158.4:9866
2025-07-22 14:38:03,624 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:60130, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-109897659_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760121_19297, duration(ns): 19435303
2025-07-22 14:38:03,624 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760121_19297, type=LAST_IN_PIPELINE terminating
2025-07-22 14:38:06,082 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760121_19297 replica FinalizedReplica, blk_1073760121_19297, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760121 for deletion
2025-07-22 14:38:06,084 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760121_19297 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760121
2025-07-22 14:40:08,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760123_19299 src: /192.168.158.7:54284 dest: /192.168.158.4:9866
2025-07-22 14:40:08,628 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:54284, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_851710959_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760123_19299, duration(ns): 17780955
2025-07-22 14:40:08,632 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760123_19299, type=LAST_IN_PIPELINE terminating
2025-07-22 14:40:12,134 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760123_19299 replica FinalizedReplica, blk_1073760123_19299, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760123 for deletion
2025-07-22 14:40:12,135 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760123_19299 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760123
2025-07-22 14:42:13,576 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760125_19301 src: /192.168.158.7:39224 dest: /192.168.158.4:9866
2025-07-22 14:42:13,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:39224, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_340248173_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760125_19301, duration(ns): 22706989
2025-07-22 14:42:13,606 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760125_19301, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-22 14:42:18,097 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760125_19301 replica FinalizedReplica, blk_1073760125_19301, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760125 for deletion
2025-07-22 14:42:18,098 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760125_19301 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760125
2025-07-22 14:43:13,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760126_19302 src: /192.168.158.5:38788 dest: /192.168.158.4:9866
2025-07-22 14:43:13,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:38788, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1500780814_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760126_19302, duration(ns): 22829106
2025-07-22 14:43:13,612 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760126_19302, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating
2025-07-22 14:43:21,102 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760126_19302 replica FinalizedReplica, blk_1073760126_19302, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760126 for deletion
2025-07-22 14:43:21,104 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760126_19302 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760126
2025-07-22 14:46:13,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760129_19305 src: /192.168.158.7:43686 dest: /192.168.158.4:9866
2025-07-22 14:46:13,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:43686, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_927600267_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760129_19305, duration(ns): 17470381
2025-07-22 14:46:13,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760129_19305, type=LAST_IN_PIPELINE terminating
2025-07-22 14:46:18,113 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760129_19305 replica FinalizedReplica, blk_1073760129_19305, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760129 for deletion
2025-07-22 14:46:18,115 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760129_19305 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760129
2025-07-22 14:47:13,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760130_19306 src: /192.168.158.1:43130 dest: /192.168.158.4:9866
2025-07-22 14:47:13,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43130, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1496050810_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760130_19306, duration(ns): 28314426
2025-07-22 14:47:13,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760130_19306, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.6:9866] terminating
2025-07-22 14:47:21,117 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760130_19306 replica FinalizedReplica, blk_1073760130_19306, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760130 for deletion
2025-07-22 14:47:21,118 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760130_19306 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760130
2025-07-22 14:48:13,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760131_19307 src: /192.168.158.5:55248 dest: /192.168.158.4:9866
2025-07-22 14:48:13,611 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:55248, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2113419603_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760131_19307, duration(ns): 20078644
2025-07-22 14:48:13,611 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760131_19307, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-22 14:48:21,118 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760131_19307 replica FinalizedReplica, blk_1073760131_19307, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760131 for deletion
2025-07-22 14:48:21,120 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760131_19307 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760131
2025-07-22 14:49:13,591 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760132_19308 src: /192.168.158.5:39396 dest: /192.168.158.4:9866
2025-07-22 14:49:13,624 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:39396, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-795861720_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760132_19308, duration(ns): 24995954
2025-07-22 14:49:13,624 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760132_19308, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-22 14:49:18,120 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760132_19308 replica FinalizedReplica, blk_1073760132_19308, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760132 for deletion
2025-07-22 14:49:18,122 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760132_19308 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760132
2025-07-22 14:50:13,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760133_19309 src: /192.168.158.1:57212 dest: /192.168.158.4:9866
2025-07-22 14:50:13,671 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:57212, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1460995333_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760133_19309, duration(ns): 24739044
2025-07-22 14:50:13,671 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760133_19309, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.9:9866] terminating
2025-07-22 14:50:18,122 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760133_19309 replica FinalizedReplica, blk_1073760133_19309, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760133 for deletion
2025-07-22 14:50:18,125 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760133_19309 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760133
2025-07-22 14:51:13,593 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760134_19310 src: /192.168.158.5:42996 dest: /192.168.158.4:9866
2025-07-22 14:51:13,613 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:42996, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1334684972_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760134_19310, duration(ns): 17613378
2025-07-22 14:51:13,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760134_19310, type=LAST_IN_PIPELINE terminating
2025-07-22 14:51:21,126 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760134_19310 replica FinalizedReplica, blk_1073760134_19310, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760134 for deletion
2025-07-22 14:51:21,128 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760134_19310 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760134
2025-07-22 14:53:18,601 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760136_19312 src: /192.168.158.1:54452 dest: /192.168.158.4:9866
2025-07-22 14:53:18,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:54452, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1718757806_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760136_19312, duration(ns): 25378499
2025-07-22 14:53:18,639 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760136_19312, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-22 14:53:21,129 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760136_19312 replica FinalizedReplica, blk_1073760136_19312, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760136 for deletion
2025-07-22 14:53:21,130 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760136_19312 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760136
2025-07-22 14:58:23,597 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760141_19317 src: /192.168.158.1:60182 dest: /192.168.158.4:9866
2025-07-22 14:58:23,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:60182, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_405055680_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760141_19317, duration(ns): 25955188
2025-07-22 14:58:23,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760141_19317, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-22 14:58:27,144 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760141_19317 replica FinalizedReplica, blk_1073760141_19317, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760141 for deletion
2025-07-22 14:58:27,145 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760141_19317 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760141
2025-07-22 14:59:23,605 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760142_19318 src: /192.168.158.1:47516 dest: /192.168.158.4:9866
2025-07-22 14:59:23,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:47516, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-527359811_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760142_19318, duration(ns): 27049986
2025-07-22 14:59:23,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760142_19318, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-22 14:59:27,147 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760142_19318 replica FinalizedReplica, blk_1073760142_19318, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760142 for deletion
2025-07-22 14:59:27,148 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760142_19318 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760142
2025-07-22 15:00:23,600 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760143_19319 src: /192.168.158.1:33778 dest: /192.168.158.4:9866
2025-07-22 15:00:23,638 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33778, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_321631452_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760143_19319, duration(ns): 25144211
2025-07-22 15:00:23,638 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760143_19319, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating
2025-07-22 15:00:27,150 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760143_19319 replica FinalizedReplica, blk_1073760143_19319, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760143 for deletion
2025-07-22 15:00:27,151 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760143_19319 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760143
2025-07-22 15:03:23,607 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760146_19322 src: /192.168.158.8:49762 dest: /192.168.158.4:9866
2025-07-22 15:03:23,636 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:49762, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1180627472_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760146_19322, duration(ns): 21834765
2025-07-22 15:03:23,637 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760146_19322, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.7:9866] terminating
2025-07-22 15:03:27,158 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760146_19322 replica FinalizedReplica, blk_1073760146_19322, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760146 for deletion
2025-07-22 15:03:27,159 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760146_19322 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760146
2025-07-22 15:04:23,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760147_19323 src: /192.168.158.1:34604 dest: /192.168.158.4:9866
2025-07-22 15:04:23,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34604, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-648784597_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760147_19323, duration(ns): 27179438
2025-07-22 15:04:23,654 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760147_19323, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating
2025-07-22 15:04:27,163 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760147_19323 replica FinalizedReplica, blk_1073760147_19323, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760147 for deletion
2025-07-22 15:04:27,164 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760147_19323 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760147
2025-07-22 15:05:23,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760148_19324 src: /192.168.158.6:40438 dest: /192.168.158.4:9866
2025-07-22 15:05:23,638 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:40438, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1252772225_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760148_19324, duration(ns): 17675217
2025-07-22 15:05:23,638 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760148_19324, type=LAST_IN_PIPELINE terminating
2025-07-22 15:05:27,164 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760148_19324 replica FinalizedReplica, blk_1073760148_19324, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760148 for deletion
2025-07-22 15:05:27,165 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760148_19324 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760148
2025-07-22 15:07:23,614 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760150_19326 src: /192.168.158.1:36538 dest: /192.168.158.4:9866
2025-07-22 15:07:23,656 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:36538, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-242461210_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760150_19326, duration(ns): 29186782
2025-07-22 15:07:23,656 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760150_19326, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.7:9866, 192.168.158.6:9866] terminating
2025-07-22 15:07:30,169 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760150_19326 replica FinalizedReplica, blk_1073760150_19326, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760150 for deletion
2025-07-22 15:07:30,170 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760150_19326 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760150
2025-07-22 15:08:23,624 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760151_19327 src: /192.168.158.7:48200 dest: /192.168.158.4:9866
2025-07-22 15:08:23,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:48200, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1987977173_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760151_19327, duration(ns): 17288477
2025-07-22 15:08:23,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760151_19327, type=LAST_IN_PIPELINE terminating
2025-07-22 15:08:27,170 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760151_19327 replica FinalizedReplica, blk_1073760151_19327, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760151 for deletion
2025-07-22 15:08:27,171 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760151_19327 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760151
2025-07-22 15:10:28,608 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760153_19329 src: /192.168.158.1:34354 dest: /192.168.158.4:9866
2025-07-22 15:10:28,644 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34354, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1867411869_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760153_19329, duration(ns): 24283493
2025-07-22 15:10:28,645 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760153_19329, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.6:9866, 192.168.158.8:9866] terminating
2025-07-22 15:10:36,174 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760153_19329 replica FinalizedReplica, blk_1073760153_19329, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760153 for deletion
2025-07-22 15:10:36,176 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760153_19329 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760153
2025-07-22 15:11:28,622 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760154_19330 src: /192.168.158.8:36394 dest: /192.168.158.4:9866
2025-07-22 15:11:28,642 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:36394, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1059007925_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760154_19330, duration(ns): 15445870
2025-07-22 15:11:28,642 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760154_19330, type=LAST_IN_PIPELINE terminating
2025-07-22 15:11:33,176 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760154_19330 replica FinalizedReplica, blk_1073760154_19330, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760154 for deletion
2025-07-22 15:11:33,178 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760154_19330 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760154
2025-07-22 15:12:28,618 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760155_19331 src: /192.168.158.7:33014 dest: /192.168.158.4:9866
2025-07-22 15:12:28,647 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:33014, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_947121275_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760155_19331, duration(ns): 21571979
2025-07-22 15:12:28,647 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760155_19331, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-22 15:12:33,177 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760155_19331 replica FinalizedReplica, blk_1073760155_19331, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760155 for deletion
2025-07-22 15:12:33,178 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760155_19331 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760155
2025-07-22 15:14:28,617 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760157_19333 src: /192.168.158.1:43780 dest: /192.168.158.4:9866
2025-07-22 15:14:28,655 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:43780,
dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1079075425_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760157_19333, duration(ns): 24562363 2025-07-22 15:14:28,656 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760157_19333, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.6:9866] terminating 2025-07-22 15:14:33,183 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760157_19333 replica FinalizedReplica, blk_1073760157_19333, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760157 for deletion 2025-07-22 15:14:33,186 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760157_19333 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760157 2025-07-22 15:15:28,617 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760158_19334 src: /192.168.158.1:46714 dest: /192.168.158.4:9866 2025-07-22 15:15:28,654 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:46714, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1073960663_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760158_19334, duration(ns): 24460718 2025-07-22 15:15:28,655 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760158_19334, 
type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-22 15:15:33,188 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760158_19334 replica FinalizedReplica, blk_1073760158_19334, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760158 for deletion 2025-07-22 15:15:33,189 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760158_19334 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760158 2025-07-22 15:22:38,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760165_19341 src: /192.168.158.1:33506 dest: /192.168.158.4:9866 2025-07-22 15:22:38,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:33506, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_191626601_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760165_19341, duration(ns): 25151857 2025-07-22 15:22:38,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760165_19341, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.5:9866] terminating 2025-07-22 15:22:45,207 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760165_19341 replica FinalizedReplica, blk_1073760165_19341, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = 
file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760165 for deletion 2025-07-22 15:22:45,209 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760165_19341 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760165 2025-07-22 15:23:38,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760166_19342 src: /192.168.158.7:55834 dest: /192.168.158.4:9866 2025-07-22 15:23:38,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.7:55834, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1448498585_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760166_19342, duration(ns): 20868513 2025-07-22 15:23:38,665 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760166_19342, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.8:9866] terminating 2025-07-22 15:23:45,207 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760166_19342 replica FinalizedReplica, blk_1073760166_19342, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760166 for deletion 2025-07-22 15:23:45,208 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760166_19342 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760166 
2025-07-22 15:24:43,633 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760167_19343 src: /192.168.158.5:33086 dest: /192.168.158.4:9866 2025-07-22 15:24:43,661 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:33086, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1339835443_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760167_19343, duration(ns): 21451355 2025-07-22 15:24:43,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760167_19343, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating 2025-07-22 15:24:51,211 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760167_19343 replica FinalizedReplica, blk_1073760167_19343, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760167 for deletion 2025-07-22 15:24:51,212 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760167_19343 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760167 2025-07-22 15:27:48,634 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760170_19346 src: /192.168.158.1:37888 dest: /192.168.158.4:9866 2025-07-22 15:27:48,675 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:37888, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1330100776_236, offset: 0, srvID: 
be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760170_19346, duration(ns): 29210341 2025-07-22 15:27:48,675 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760170_19346, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.8:9866, 192.168.158.9:9866] terminating 2025-07-22 15:27:54,215 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760170_19346 replica FinalizedReplica, blk_1073760170_19346, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760170 for deletion 2025-07-22 15:27:54,217 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760170_19346 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760170 2025-07-22 15:28:53,695 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760171_19347 src: /192.168.158.6:42540 dest: /192.168.158.4:9866 2025-07-22 15:28:53,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:42540, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1540733970_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760171_19347, duration(ns): 18759154 2025-07-22 15:28:53,718 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760171_19347, type=LAST_IN_PIPELINE terminating 2025-07-22 15:29:00,216 INFO 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760171_19347 replica FinalizedReplica, blk_1073760171_19347, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760171 for deletion 2025-07-22 15:29:00,218 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760171_19347 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760171 2025-07-22 15:29:58,664 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760172_19348 src: /192.168.158.1:49368 dest: /192.168.158.4:9866 2025-07-22 15:29:58,702 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:49368, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-790090790_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760172_19348, duration(ns): 26550448 2025-07-22 15:29:58,703 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760172_19348, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating 2025-07-22 15:30:06,219 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760172_19348 replica FinalizedReplica, blk_1073760172_19348, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760172 for deletion 
2025-07-22 15:30:06,220 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760172_19348 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760172 2025-07-22 15:33:58,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760176_19352 src: /192.168.158.1:50224 dest: /192.168.158.4:9866 2025-07-22 15:33:58,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50224, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1545540653_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760176_19352, duration(ns): 26107111 2025-07-22 15:33:58,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760176_19352, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.5:9866] terminating 2025-07-22 15:34:03,224 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760176_19352 replica FinalizedReplica, blk_1073760176_19352, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760176 for deletion 2025-07-22 15:34:03,226 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760176_19352 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760176 2025-07-22 15:34:58,675 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving 
BP-1059995147-192.168.158.1-1752101929360:blk_1073760177_19353 src: /192.168.158.8:55044 dest: /192.168.158.4:9866 2025-07-22 15:34:58,706 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:55044, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1415330787_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760177_19353, duration(ns): 24715567 2025-07-22 15:34:58,707 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760177_19353, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating 2025-07-22 15:35:03,225 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760177_19353 replica FinalizedReplica, blk_1073760177_19353, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760177 for deletion 2025-07-22 15:35:03,228 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760177_19353 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760177 2025-07-22 15:35:58,665 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760178_19354 src: /192.168.158.8:43450 dest: /192.168.158.4:9866 2025-07-22 15:35:58,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:43450, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_570314700_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: 
BP-1059995147-192.168.158.1-1752101929360:blk_1073760178_19354, duration(ns): 21185468 2025-07-22 15:35:58,689 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760178_19354, type=LAST_IN_PIPELINE terminating 2025-07-22 15:36:03,228 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760178_19354 replica FinalizedReplica, blk_1073760178_19354, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760178 for deletion 2025-07-22 15:36:03,230 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760178_19354 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760178 2025-07-22 15:36:58,660 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760179_19355 src: /192.168.158.6:60510 dest: /192.168.158.4:9866 2025-07-22 15:36:58,680 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:60510, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_987614195_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760179_19355, duration(ns): 16826202 2025-07-22 15:36:58,680 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760179_19355, type=LAST_IN_PIPELINE terminating 2025-07-22 15:37:03,227 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760179_19355 replica FinalizedReplica, blk_1073760179_19355, FINALIZED 
getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760179 for deletion 2025-07-22 15:37:03,228 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760179_19355 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760179 2025-07-22 15:39:03,662 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760181_19357 src: /192.168.158.8:53072 dest: /192.168.158.4:9866 2025-07-22 15:39:03,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:53072, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2108482972_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760181_19357, duration(ns): 16998971 2025-07-22 15:39:03,682 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760181_19357, type=LAST_IN_PIPELINE terminating 2025-07-22 15:39:09,231 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760181_19357 replica FinalizedReplica, blk_1073760181_19357, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760181 for deletion 2025-07-22 15:39:09,232 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760181_19357 URI 
file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760181 2025-07-22 15:41:03,653 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760183_19359 src: /192.168.158.1:59974 dest: /192.168.158.4:9866 2025-07-22 15:41:03,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59974, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-167848452_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760183_19359, duration(ns): 28185229 2025-07-22 15:41:03,693 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760183_19359, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.5:9866, 192.168.158.7:9866] terminating 2025-07-22 15:41:06,239 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760183_19359 replica FinalizedReplica, blk_1073760183_19359, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d2/dfs/dn getBlockURI() = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760183 for deletion 2025-07-22 15:41:06,240 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760183_19359 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760183 2025-07-22 15:42:03,668 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760184_19360 src: /192.168.158.8:48670 dest: /192.168.158.4:9866 2025-07-22 15:42:03,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: 
/192.168.158.8:48670, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1642714735_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760184_19360, duration(ns): 15741679 2025-07-22 15:42:03,687 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760184_19360, type=LAST_IN_PIPELINE terminating 2025-07-22 15:42:06,243 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760184_19360 replica FinalizedReplica, blk_1073760184_19360, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d3/dfs/dn getBlockURI() = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760184 for deletion 2025-07-22 15:42:06,244 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760184_19360 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760184 2025-07-22 15:43:03,697 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760185_19361 src: /192.168.158.8:50806 dest: /192.168.158.4:9866 2025-07-22 15:43:03,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.8:50806, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-744644888_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760185_19361, duration(ns): 23927091 2025-07-22 15:43:03,729 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760185_19361, type=HAS_DOWNSTREAM_IN_PIPELINE, 
downstreams=1:[192.168.158.9:9866] terminating 2025-07-22 15:43:09,242 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760185_19361 replica FinalizedReplica, blk_1073760185_19361, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d4/dfs/dn getBlockURI() = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760185 for deletion 2025-07-22 15:43:09,243 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760185_19361 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760185 2025-07-22 15:46:03,673 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760188_19364 src: /192.168.158.5:52252 dest: /192.168.158.4:9866 2025-07-22 15:46:03,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:52252, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_585520450_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760188_19364, duration(ns): 17779124 2025-07-22 15:46:03,694 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760188_19364, type=LAST_IN_PIPELINE terminating 2025-07-22 15:46:06,249 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760188_19364 replica FinalizedReplica, blk_1073760188_19364, FINALIZED getNumBytes() = 56 getBytesOnDisk() = 56 getVisibleLength()= 56 getVolume() = /hdfs/d1/dfs/dn getBlockURI() = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760188 for 
deletion
2025-07-22 15:46:06,251 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760188_19364 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760188
2025-07-22 15:47:08,665 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760189_19365 src: /192.168.158.1:59700 dest: /192.168.158.4:9866
2025-07-22 15:47:08,704 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:59700, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-8356367_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760189_19365, duration(ns): 27133400
2025-07-22 15:47:08,704 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760189_19365, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.7:9866] terminating
2025-07-22 15:47:15,250 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760189_19365 replica FinalizedReplica, blk_1073760189_19365, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760189 for deletion
2025-07-22 15:47:15,251 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760189_19365 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760189
2025-07-22 15:50:13,673 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760192_19368 src: /192.168.158.6:58790 dest: /192.168.158.4:9866
2025-07-22 15:50:13,730 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.6:58790, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1472651418_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760192_19368, duration(ns): 23314805
2025-07-22 15:50:13,730 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760192_19368, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.5:9866] terminating
2025-07-22 15:50:18,260 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760192_19368 replica FinalizedReplica, blk_1073760192_19368, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760192 for deletion
2025-07-22 15:50:18,262 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760192_19368 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760192
2025-07-22 15:52:13,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760194_19370 src: /192.168.158.1:50544 dest: /192.168.158.4:9866
2025-07-22 15:52:13,719 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:50544, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-477698792_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760194_19370, duration(ns): 29420803
2025-07-22 15:52:13,722 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760194_19370, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-22 15:52:18,264 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760194_19370 replica FinalizedReplica, blk_1073760194_19370, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d4/dfs/dn
  getBlockURI()     = file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760194 for deletion
2025-07-22 15:52:18,265 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760194_19370 URI file:/hdfs/d4/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760194
2025-07-22 15:53:13,674 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760195_19371 src: /192.168.158.1:34170 dest: /192.168.158.4:9866
2025-07-22 15:53:13,711 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.1:34170, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_610738213_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760195_19371, duration(ns): 25482172
2025-07-22 15:53:13,712 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760195_19371, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=2:[192.168.158.9:9866, 192.168.158.8:9866] terminating
2025-07-22 15:53:18,265 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760195_19371 replica FinalizedReplica, blk_1073760195_19371, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d1/dfs/dn
  getBlockURI()     = file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760195 for deletion
2025-07-22 15:53:18,295 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760195_19371 URI file:/hdfs/d1/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760195
2025-07-22 15:54:13,677 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760196_19372 src: /192.168.158.5:48002 dest: /192.168.158.4:9866
2025-07-22 15:54:13,708 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.5:48002, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1576861447_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760196_19372, duration(ns): 23860601
2025-07-22 15:54:13,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760196_19372, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.9:9866] terminating
2025-07-22 15:54:21,264 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760196_19372 replica FinalizedReplica, blk_1073760196_19372, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d2/dfs/dn
  getBlockURI()     = file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760196 for deletion
2025-07-22 15:54:21,266 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760196_19372 URI file:/hdfs/d2/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760196
2025-07-22 15:55:13,679 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1059995147-192.168.158.1-1752101929360:blk_1073760197_19373 src: /192.168.158.9:59448 dest: /192.168.158.4:9866
2025-07-22 15:55:13,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /192.168.158.9:59448, dest: /192.168.158.4:9866, bytes: 56, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1158301554_236, offset: 0, srvID: be50c32a-aa23-4b9d-aa7f-05816b6e5f1a, blockid: BP-1059995147-192.168.158.1-1752101929360:blk_1073760197_19373, duration(ns): 22701426
2025-07-22 15:55:13,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-1059995147-192.168.158.1-1752101929360:blk_1073760197_19373, type=HAS_DOWNSTREAM_IN_PIPELINE, downstreams=1:[192.168.158.6:9866] terminating
2025-07-22 15:55:18,266 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Scheduling blk_1073760197_19373 replica FinalizedReplica, blk_1073760197_19371, FINALIZED
  getNumBytes()     = 56
  getBytesOnDisk()  = 56
  getVisibleLength()= 56
  getVolume()       = /hdfs/d3/dfs/dn
  getBlockURI()     = file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760197 for deletion
2025-07-22 15:55:18,267 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-1059995147-192.168.158.1-1752101929360 blk_1073760197_19373 URI file:/hdfs/d3/dfs/dn/current/BP-1059995147-192.168.158.1-1752101929360/current/finalized/subdir0/subdir7/blk_1073760197